In some situations, the precision that the resolution of a measuring instrument could provide cannot be achieved because of the measurement technique used.
Consider the following example: a stopwatch is used to time an event that lasts exactly 10.00 seconds. Assume that the stopwatch is perfectly calibrated and has a resolution of 0.01 seconds. The stopwatch is operated manually, so due to human reaction time there will be a slight delay when starting and stopping it. Let's say that the reaction time is about 0.2 seconds.
Now, if this reaction time were always constant, the measurement would still give the correct value (i.e. the stopwatch would start timing 0.2 seconds after the start of the event but would also stop 0.2 seconds after the event finished, so it would still record a time of 10.00 seconds). However, human reaction time is certainly not constant!
Let's say that the reaction time can vary by up to 0.1 seconds. So the stopwatch may be started 0.2 seconds after the event starts but then stopped 0.3 seconds after it finishes, which would give a measured value of 10.10 seconds.
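The effect of this variable delay can be seen in a simple simulation. This is an illustrative sketch, not part of the original example: it assumes a true duration of 10.00 s, a mean reaction delay of 0.2 s at both the start and the stop, and a random variation of up to 0.1 s either way.

```python
import random

TRUE_DURATION = 10.00  # seconds; the event's exact length

def manual_timing():
    # Human reaction delay: about 0.2 s, varying by up to 0.1 s either way.
    start_delay = 0.2 + random.uniform(-0.1, 0.1)  # delay after event starts
    stop_delay = 0.2 + random.uniform(-0.1, 0.1)   # delay after event ends
    raw = (TRUE_DURATION + stop_delay) - start_delay
    return round(raw, 2)  # stopwatch resolution: 0.01 s

random.seed(1)
readings = [manual_timing() for _ in range(10)]
print(readings)
```

Because the start and stop delays vary independently, repeated readings scatter within about +/- 0.2 s of the true value even though each reading displays two decimal places.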
As well as the problem of a variable delay in starting and stopping the stopwatch, it is even possible that the operator could trigger the stopwatch prematurely by, say, trying to anticipate the moment the event ends in an attempt to reduce reaction time.
The net result is that the variation in the ability to synchronise the triggering of the stopwatch with the start and finish of the event introduces an uncertainty in the measurement that is larger than the instrument's resolution alone would allow.
Practically, a measurement made with a manually operated stopwatch should be given an uncertainty of +/- 0.2 seconds rather than the +/- 0.01 seconds the instrument is capable of. Therefore the time would be recorded only to within two tenths of a second (e.g. 10.0 +/- 0.2 s) rather than a hundredth of a second (e.g. 10.00 s).
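The rule applied here, that the recorded precision should match the uncertainty rather than the instrument's resolution, can be sketched as a small helper. The function name and formatting are hypothetical, chosen for illustration:

```python
import math

def record(value, uncertainty):
    # Keep only as many decimal places as the uncertainty justifies.
    decimals = max(0, -math.floor(math.log10(uncertainty)))
    return f"{value:.{decimals}f} +/- {uncertainty:.{decimals}f} s"

print(record(10.0, 0.2))   # manual timing    -> 10.0 +/- 0.2 s
print(record(10.0, 0.01))  # automatic timing -> 10.00 +/- 0.01 s
```

With a +/- 0.2 s uncertainty the value is written to one decimal place; only when the uncertainty shrinks to +/- 0.01 s are two decimal places justified.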
Electronic sensors can be used to start and stop the stopwatch automatically. This could enable the full precision of the stopwatch to be achieved.
(Note: no physical device can react instantly, so there will still be a time delay in triggering the timer, although it will be much smaller than before. This delay will also be much more consistent than the variability of human reaction time, so overall its effect will be negligible compared to the uncertainty due to the instrument's resolution.)
To further illustrate the effect of measuring technique, consider this second theoretical example. Imagine that you have a ruler with a scale resolution of 0.1mm (for the moment, let us ignore the fact that it would be impossible to read such a scale, at least without a magnifying glass!). To take a measurement you need to align the end of the ruler with one end of the object you are measuring. In reality you are never going to be able to align it perfectly. Let's say that you are only confident that you have aligned the ruler to within +/- 1mm of the end of the object (in reality you can achieve a little better than this). The uncertainty in aligning the ruler is larger than the resolution of the ruler. Therefore you would only be justified in recording your measurement to within +/- 1mm rather than the +/- 0.1mm resolution given on the ruler's scale.
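Why the ruler's resolution can simply be ignored here can be made quantitative. One common convention (not stated in the text above) is to combine independent uncertainties in quadrature; under that assumption, using the numbers from the ruler example:

```python
import math

resolution = 0.1  # mm, the ruler's scale resolution
alignment = 1.0   # mm, uncertainty in aligning the ruler's end

# Combining independent uncertainties in quadrature (a common convention,
# assumed here for illustration): the alignment term dominates completely.
combined = math.sqrt(resolution**2 + alignment**2)
print(f"{combined:.2f} mm")  # -> 1.00 mm
```

The combined uncertainty is about 1.005 mm, indistinguishable in practice from the 1 mm alignment uncertainty alone, which is why the measurement is recorded to +/- 1mm.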