|MadSci Network: Engineering|
There is not a great difference. However, "accuracy" is generally used in a less formal sense than "precision," which, in sciences involving the measurement of magnitudes such as distances or temperatures, is usually stated as a percentage of the measured magnitude. For example, you might say that the distance between two towns is 125 miles with a precision of 2%: the exact value in miles is unknown, but you know it is a number between 122.5 and 127.5 miles. In this example you may say that 125 is an accurate measure of the distance (or extremely accurate, or not so accurate).
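The percentage example above can be sketched in a few lines of Python; the function name and numbers here are only illustrative, not part of the original answer.

```python
# A minimal sketch: turn a measured value and a percentage precision
# into the interval of possible true values, as in the 125-mile example.

def precision_interval(measured, precision_pct):
    """Return the (low, high) bounds implied by a percentage precision."""
    margin = measured * precision_pct / 100.0
    return measured - margin, measured + margin

low, high = precision_interval(125, 2)
print(low, high)  # 122.5 127.5
```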
Bob Peeples adds:
In statistics, there are separate measures of precision and accuracy. Accuracy is the difference between the specification you are trying to achieve and the average of your measurements. Precision is measured by the standard deviation, which describes how widely your samples vary.
In layman's terms, if you miss the target every time but your shots are tightly grouped, you are being precise but not accurate. If your shots are scattered all over the place but the average shot is dead on target, you are being accurate but not precise.
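The shooting analogy can be made concrete with a short Python sketch using the statistics described above: the mean miss distance plays the role of accuracy and the standard deviation plays the role of precision. The shot values below are invented purely for illustration.

```python
# A hypothetical illustration of the shooting analogy: the target is at
# position 0. One shooter is precise but not accurate (tight group, off
# target); the other is accurate but not precise (scattered, averaging
# on target). All numbers are made up for the example.
from statistics import mean, stdev

target = 0.0
precise_not_accurate = [5.1, 5.0, 4.9, 5.2, 4.8]    # tight group, far from 0
accurate_not_precise = [-4.0, 3.0, -2.0, 5.0, -2.0]  # scattered, mean is 0

for name, shots in [("precise, not accurate", precise_not_accurate),
                    ("accurate, not precise", accurate_not_precise)]:
    bias = mean(shots) - target   # accuracy: average distance off target
    spread = stdev(shots)         # precision: how widely the shots vary
    print(f"{name}: bias={bias:.2f}, spread={spread:.2f}")
```

Running this shows the first group has a large bias but a small spread, and the second the reverse, matching the two cases in the analogy.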
Try the links in the MadSci Library for more information on Engineering.