MadSci Network: Computer Science
Query:

Re: How can the computer differentiate between just 8 series of 0 and 1?

Date: Mon Apr 23 14:58:54 2001
Posted By: Valdis Kletnieks, Staff, Computing Center, Virginia Tech Computing Center
Area of science: Computer Science
ID: 987268809.Cs
Message:

Well Mikael, you actually almost figured out the answer by yourself when
you said that you have to pause between one letter and the next.  Yes, you
need a pause or some other signal.  Since there are two major ways that
computers handle this, and I'm not sure which you mean, I'll explain both...

The first sort of signalling is called asynchronous, and is usually used in
conjunction with input/output devices such as tape drives, modems, and
similar hardware.  It relies on the fact that on the external medium (the
disk surface, the magnetic tape, the phone line) there are usually actually
*three* states (a signal indicating "0", a signal indicating "1", and no
detectable signal).  (I am intentionally skipping over things like phase
encoding in modems and so on.)  Then at the start of each chunk of
information there is a "preamble", and at the end there's a "trailer".
On older/slower modems, you will often find "start" and "stop" bits -
that's what they do.  So for example, you might be using 1 start and 2
stop bits on a modem - so what is actually transmitted for the 8-bit
sequence 01101101 would be:

(silence) (start) (0) (1) (1) (0) (1) (1) (0) (1) (stop) (stop)
(silence)... (next start bit)

and your modem would listen for *11* pieces of info.  Similar techniques
are used in disk drives and tape drives to indicate the start/end of blocks
of data - the start/stop bits serve the same purpose as the "pause" in a
telegraph message.
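
To make the framing concrete, here's a little sketch in Python.  It's
just an illustration under the assumptions above (1 start bit, 8 data
bits, 2 stop bits) - the function names and string markers are made up,
not anything a real modem exposes.  One note: the bits are written
most-significant-first to match the example above, though real serial
hardware actually sends the least-significant bit first.

  IDLE = "(silence)"   # the third line state: no detectable signal

  def frame_byte(byte):
      """Wrap one 8-bit value in start/stop bits - 11 signals in total."""
      data = [str((byte >> i) & 1) for i in range(7, -1, -1)]
      return ["(start)"] + data + ["(stop)", "(stop)"]

  def transmit(data):
      """Produce the whole line signal, with idle gaps between characters."""
      signal = [IDLE]
      for byte in data:
          signal += frame_byte(byte)
          signal.append(IDLE)   # the "pause" between one character and the next
      return signal

  print(" ".join(transmit([0b01101101])))
  # (silence) (start) 0 1 1 0 1 1 0 1 (stop) (stop) (silence)

The receiver just waits through the silence, and the start bit tells it
"the next 8 things you hear are data".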

The second sort of signalling is "clocked" or "synchronous", where we don't
worry about pauses because an external signal keeps everything in sync.
The metaphor here would be the strong beat of disco/dance music - it's very
hard to get out of step unless you're *really* uncool (which, if you're
royalty or a politician, gets you an unflattering clip on TV).  Here,
in addition to the actual signal being sent, there's a *second*, totally
separate clocking signal that basically chants out "Bit!" "Bit!" "Bit!",
and the hardware that is sending/receiving the bits just takes the bits at
the specified times.  This is how it's usually done inside the computer,
and it's where things like "bus speed" come from - that refers to how fast
the clock signal for the bus is.  (The bus clock is usually different from
the CPU clock, but usually a simple multiple like 3, 4, 5, or 3/2, so the
CPU and bus can synchronize fairly regularly - if the CPU clock were 8.684
times as fast as the bus clock, things would be ugly to design; a CPU
running at 9 times the bus speed is a LOT easier.)
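
Here's the same idea as a rough Python sketch - again, made-up names and
not any real bus protocol, just the "chanting clock" scheme: the receiver
samples the data line on every rising clock edge, so no start/stop
framing or pauses are needed.

  def send(bits):
      """Drive the (clock, data) lines: data is valid while the clock is high."""
      for b in bits:
          yield (0, b)   # clock low: data line settles to the next bit
          yield (1, b)   # clock high: "Bit!" - sample now

  def receive(line):
      """Collect the data line's value on each rising clock edge."""
      out, prev = [], 0
      for clock, data in line:
          if clock == 1 and prev == 0:   # rising edge detected
              out.append(data)
          prev = clock
      return out

  bits = [0, 1, 1, 0, 1, 1, 0, 1]   # the same example byte, 01101101
  print(receive(send(bits)))        # [0, 1, 1, 0, 1, 1, 0, 1]

Notice there's no idle state at all - as long as both ends hear the same
clock, every tick carries exactly one bit.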


