MadSci Network: Astronomy


There are several parts to your question, but the answer comes down to "how closely do you want to know the answer?"

For this we need to introduce the difference between magnitude (crudely speaking, how many zeroes there are after the first digit, or more precisely what power of ten we're talking about) and precision. Precision answers two questions:

- How many meaningful digits are there after the first digit? (How many significant figures, or "sig figs"?)
- What are the values of those sig figs?

The way one finds the number of stars, or galaxies, is the same way one might estimate the size of a very large crowd of people: you might look at every single person and click a counter each time -- but that would take a mighty long time! -- or you might do as the astronomers do: take a sample of known size (say, 1/100 of the floor space of a packed concert hall) and count every person or star in that sample. Then you just multiply your count by the denominator of the fraction (here, 100) to get the total number. Of course, the result is not as accurate as if you had counted every single individual, but it *will* be close; the law of large numbers guarantees it.
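The sampling procedure above can be sketched in a few lines of Python. The "concert hall" here is entirely made up for illustration: 100 patches of floor with a hidden true attendance, of which we count only ten patches and scale up.

```python
import random

# A hypothetical concert hall: 100 equal patches of floor, each holding
# some number of people. We pretend not to know the true total.
random.seed(42)
patches = [random.randint(80, 120) for _ in range(100)]
true_total = sum(patches)

# Count ten patches (1/10 of the floor) and multiply by 10 --
# equivalently, average the patches counted and multiply by 100.
sample_mean = sum(patches[:10]) / 10
estimate = sample_mean * 100

# The estimate lands close to the true total without counting everyone.
print(true_total, estimate)
```

Counting more patches tightens the estimate: the expected error shrinks roughly as one over the square root of the sample size.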

The surface temperature of the sun is another example. This can actually be found with some precision, because the behavior of incandescent objects (that is, objects so hot they glow) is well-understood. So we measure the light coming from the sun, in particular the wavelength at which the most light comes, and from experimental measurements on earthbound incandescent objects we can give a pretty good estimate of the sun's surface temperature. But even if we didn't happen to have the value for an object just the same color as the sun, we could estimate: suppose you knew that a blue-hot object was 10,000° and a red-hot object was 3,000°. Since the sun is yellow-hot, we know its surface must be between 3,000° and 10,000°. So we know that an answer of 500° or 50,000° is *wrong*. This means, to use your example, that we can usually tell the difference between 400,000 and 1,400,000 degrees!
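The "wavelength at which the most light comes" method has a standard quantitative form, Wien's displacement law: the peak wavelength of a glowing body times its temperature is a constant, roughly 2.898 × 10^-3 m·K. A minimal sketch (the 500 nm peak for the sun is a round-number assumption):

```python
# Wien's displacement law: lambda_max * T = b, where b is Wien's
# displacement constant, about 2.898e-3 m*K.
WIEN_B = 2.898e-3  # m*K

def temperature_from_peak(lambda_max_m):
    """Surface temperature (kelvin) of an incandescent body,
    given the wavelength (meters) at which it emits most light."""
    return WIEN_B / lambda_max_m

# The sun's spectrum peaks near 500 nm, in green-yellow light.
t_sun = temperature_from_peak(500e-9)
print(round(t_sun))  # roughly 5800 K
```

This is the quantitative version of the interpolation argument above: a 300 nm (blue) peak gives nearly 10,000 K, a 1000 nm (deep red) peak about 3,000 K, and the sun's yellow peak falls in between.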

Finally, your clock question should pretty well answer itself by now: 1 second in a thousand years is a trivial error (I calculate it to be one part in 3 × 10^10!) and doesn't affect the magnitudes of the time one is measuring. Actually, most radioactive dating methods are far less accurate than one part in 10^10; but even 1 part per thousand is not bad.
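That "one part in 3 × 10^10" figure is easy to check: it is just one second divided by the number of seconds in a thousand years.

```python
# One second of error over a thousand years, as a fractional error.
seconds_per_year = 365.25 * 24 * 3600          # about 3.16e7 s
seconds_per_millennium = 1000 * seconds_per_year
fractional_error = 1 / seconds_per_millennium
print(f"{fractional_error:.1e}")               # about 3.2e-11
```

So the clock's error is about one part in 3 × 10^10, several orders of magnitude smaller than the roughly part-per-thousand uncertainty of the dating methods themselves.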

If you want to find more information on where astronomers get particular numbers, I recommend the astronomy section in your library, or the MAD Scientist Astronomy Links.

Dan Berger

Bluffton College

http://cs.bluffton.edu/~berger/

Try the links in the MadSci Library for more information on Astronomy.


© 1997, Washington University Medical School
