|MadSci Network: Science History|
I was not able to find much about the historical accuracy of thermometers, except that each new technology brings increases in accuracy and precision. Early (1500s and 1600s) thermometers measured the change in density of a liquid or the expansion of a gas as the temperature changed. These were highly inaccurate, since they were also affected by changes in atmospheric pressure.
Gabriel Fahrenheit invented the alcohol thermometer in 1709 and the first mercury thermometer in 1714. The change from alcohol to mercury as the working fluid increased accuracy, since mercury's coefficient of thermal expansion is nearly constant with temperature while alcohol's varies. Because of this limitation of alcohol, the specifics of each thermometer (the size and perfection of the glass tube, etc.) were very important. For this reason, alcohol thermometers were highly precise but not very accurate: each thermometer gave the same result each time it was used, but two different thermometers would likely give different answers. Advances in manufacturing that produced more uniform glass tubes also helped.
By the late 1800s, new technologies including metal resistance thermometers, bimetallic temperature indicators, and thermocouples increased the accuracy and the range over which temperatures could be measured.
The latest advances are non-contact infrared detectors, semiconductor thermistors, and fiber-optic temperature sensors. Temperatures are now routinely measured to ~0.001 degree C, although this usually requires that special care be taken during the measurement.
Try the links in the MadSci Library for more information on Science History.