MadSci Network: Science History
Query:

Re: Who came up with the idea of significant digits?

Date: Sat Jan 15 12:14:53 2000
Posted By: Mark Huber, Post-doc/Fellow, Statistics, Stanford University
Area of science: Science History
ID: 943374376.Sh
Message:

Dear Tillie,

Take a Roman numeral like CCLXX. One thing about numbers written in this form is that every C, L, and X counts towards the value. Put another way, every digit signifies something. In our modern Arabic numerals, this number would be written 270. The 0 at the end adds no value of its own; it is just a placeholder that increases the value of the non-zero digits to its left. Before the introduction of zero, every digit was a significant digit; it was only when zeros came on the scene that this began to change.
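To make the contrast concrete, here is a minimal Python sketch (the function name and details are my own illustration, not part of the original answer) that evaluates a Roman numeral by summing the value every symbol contributes:

```python
# In Roman numerals every symbol adds (or subtracts) value, while in
# positional notation a trailing zero contributes no value of its own.
ROMAN_VALUES = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}

def roman_to_int(numeral: str) -> int:
    total = 0
    for i, ch in enumerate(numeral):
        value = ROMAN_VALUES[ch]
        # A smaller symbol before a larger one (e.g. IV) subtracts instead.
        if i + 1 < len(numeral) and ROMAN_VALUES[numeral[i + 1]] > value:
            total -= value
        else:
            total += value
    return total

print(roman_to_int("CCLXX"))  # 270 -- each of the five symbols carries value
```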

Around 1400 is the earliest the Oxford English Dictionary (2nd edition) records anyone using "significant figures" to mean the nonzero digits of a number, some two to three hundred years after Arabic numerals entered Europe (see http://www.britannica.com and search for "Arabic numerals"). So in its earliest use, the term simply meant those figures to the left of the right-most zeros.

As we move towards modern times, the idea of significant figures took on a new meaning. It became a quick way to indicate the accuracy of a measurement. If I write 1800 meters, then I have only measured the distance to roughly the nearest 100 meters, but if I write 1837.35 meters, then I have 6 sig figs of accuracy and have measured to the nearest hundredth of a meter.
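As a rough sketch of the counting convention being used here (the function is my own illustration; note that trailing zeros in a number like 1800 written without a decimal point are conventionally read as not significant, which is exactly why 1800 signals less accuracy than 1837.35):

```python
# Rough sketch of the usual counting rule for a number written as a string:
# leading zeros never count; trailing zeros count only when a decimal
# point is present, so "1800" is read here as having 2 sig figs.
def count_sig_figs(s: str) -> int:
    digits = s.lstrip("+-")
    if "." in digits:
        digits = digits.replace(".", "").lstrip("0")  # leading zeros don't count
        return len(digits)                            # trailing zeros do count
    digits = digits.lstrip("0")
    return len(digits.rstrip("0"))                    # trailing zeros don't count

print(count_sig_figs("1800"))     # 2
print(count_sig_figs("1837.35"))  # 6
```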

This is an example of round-off error, where errors in measurements or in our knowledge of a number are explicitly recorded via significant figures. The first person to seriously deal with round-off error and its effects on computation was Carl Friedrich Gauss (1777-1855). One of the top three mathematicians of all time (Euler and Archimedes are usually accorded the other two top spots), Gauss was the first to look seriously at how limiting the number of sig figs used in trigonometric and logarithmic tables affected computation. In Theoria Motus, he writes, "Since none of the numbers...admit absolute precision, but are all to a certain extent approximate only, the results of all calculations performed by the aid of these numbers can only be approximately true." Gauss went on to put his money where his mouth was and worked out the mathematics of error analysis, a form of which survives as our rules for determining the number of sig figs after a multiplication, addition, or square root operation.
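To give a flavor of what that error analysis looks like, here is a short Python sketch of first-order error propagation in the Gaussian tradition (the function names and example numbers are mine; the sig-fig rules taught today can be seen as coarse, rule-of-thumb versions of formulas like these):

```python
import math

# First-order error propagation: for f(x, y) with small independent
# errors sx and sy, the error in f is approximately
#     sf = sqrt((df/dx * sx)**2 + (df/dy * sy)**2).

def err_sum(sx: float, sy: float) -> float:
    # f = x + y: absolute errors combine in quadrature
    return math.hypot(sx, sy)

def err_product(x: float, sx: float, y: float, sy: float) -> float:
    # f = x * y: relative errors combine in quadrature
    return abs(x * y) * math.hypot(sx / x, sy / y)

def err_sqrt(x: float, sx: float) -> float:
    # f = sqrt(x): df/dx = 1 / (2 * sqrt(x))
    return sx / (2 * math.sqrt(x))

# e.g. adding two lengths each known to the nearest hundredth of a meter
print(err_sum(0.005, 0.005))                    # ~0.007 m
# a precise length times a crude factor: the crude 2.5% error dominates
print(err_product(1837.35, 0.005, 2.0, 0.05))   # ~92, i.e. ~2.5% of 3674.7
```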

Mark

