The resolution of a measuring instrument is the smallest change in value that it can discriminate (i.e. tell the difference between), and this limits the precision of any measurement made with it. Consider a digital display that indicates the water level in a tank as it fills. The display has 3 digits, covering a range of levels from 0cm up to 999cm.
Imagine the water level has just reached 100cm and is still rising. As the water height increases above 100cm (e.g. 100.1cm, 100.2cm, etc.) the display cannot show this change because it has no fourth digit, so it continues to indicate 100cm. When the water level reaches 100.5cm the display changes to read 101cm. It continues to display this value until the water level rises to 101.5cm, at which point the display changes to 102cm.
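The behaviour described above can be sketched in a few lines of Python. This is an illustrative model only (the function name `displayed_level` is invented here): it assumes the display rounds half-up to the nearest whole centimetre, as in the walk-through.

```python
def displayed_level(true_cm):
    # Model of a 3-digit display with 1cm resolution: round the true
    # water level half-up to the nearest whole centimetre.
    # (int(x + 0.5) rounds half-up for positive x; Python's built-in
    # round() would round 100.5 down to 100, which is not what the
    # display described here does.)
    return int(true_cm + 0.5)

# Rising water level: the reading stays at 100 until 100.5cm is reached.
for level in [100.0, 100.2, 100.4, 100.5, 101.0, 101.4, 101.5]:
    print(f"true level = {level:5.1f} cm -> display reads {displayed_level(level)} cm")
```

Running this shows the reading jumping from 100 to 101 at 100.5cm, and from 101 to 102 at 101.5cm, matching the description above.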
This illustrates two points:
The resolution of the display is 1cm, i.e. the true value had to change by 1cm (from 100.5cm to 101.5cm) before the display indicated any increase in height (the change from 101cm to 102cm).
The uncertainty in the displayed value is +/- 0.5cm, e.g. when the meter indicates 101cm the true value can be anywhere from 100.5cm up to just below 101.5cm, i.e. within +/- 0.5cm of the value displayed.
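The relationship between resolution and uncertainty generalises: for any reading, the true value lies within half the resolution either side of it. A minimal sketch of this rule (the helper name `true_value_bounds` is invented for illustration):

```python
def true_value_bounds(reading, resolution=1.0):
    # A reading R from an instrument with the given resolution means the
    # true value lies in the half-open interval [R - res/2, R + res/2),
    # i.e. the uncertainty is +/- half the resolution.
    half = resolution / 2
    return (reading - half, reading + half)

low, high = true_value_bounds(101)   # the 1cm-resolution display above
print(f"display reads 101 cm -> true level is between {low} cm and just below {high} cm")
```

For the 3-digit display in the example this gives the interval 100.5cm to 101.5cm, i.e. an uncertainty of +/- 0.5cm.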