10.04 Summary
- Many texts define the error in a measurement to be the difference between the nominal value (at the centre of the uncertainty interval) and the true value. This:
  - leads to inconsistent definitions of accuracy that conflict with common sense (in fact, some texts avoid the term accuracy altogether because of this!)
  - leads to the conclusion that precision has no effect on accuracy
- On the science campus, the error is defined as the difference between the true value and the furthest limit of the uncertainty interval. This:
  - leads to a sensible definition of accuracy
  - explains the effect of precision on accuracy (see the sketch after this list)
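
As a concrete illustration of the contrast, here is a minimal Python sketch. The function names and the (10.25 ± 0.25) measurement with true value 10.0 are hypothetical, invented only to make the difference visible:

```python
# A sketch contrasting the two error definitions above. All names and
# numbers are hypothetical, chosen only to make the contrast visible.

def centre_error(nominal, true_value):
    # Error as many texts define it: the nominal value (centre of the
    # uncertainty interval) minus the true value. The width of the
    # interval never enters, so precision cannot affect this error.
    return nominal - true_value

def worst_case_error(nominal, half_width, true_value):
    # Error as defined here: the distance from the true value to the
    # furthest limit of the uncertainty interval. A wider interval
    # (poorer precision) pushes the furthest limit further away,
    # so precision directly affects this error.
    lower = nominal - half_width
    upper = nominal + half_width
    return max(abs(true_value - lower), abs(true_value - upper))

# Hypothetical measurement: (10.25 ± 0.25) units, true value 10.0.
nominal, half_width, true_value = 10.25, 0.25, 10.0
print(centre_error(nominal, true_value))                  # 0.25
print(worst_case_error(nominal, half_width, true_value))  # 0.5

# Doubling the interval width (poorer precision) leaves the first
# error unchanged but worsens the worst-case error:
print(worst_case_error(nominal, 0.5, true_value))         # 0.75
```

Widening the uncertainty interval changes nothing under the first definition but grows the error under the second, which is exactly why only the second definition lets precision influence accuracy.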