Let's get into the subject of measurement by making a (thought) measurement. Suppose you are working with a simple pendulum - a heavy mass suspended from a light string (see the diagram at right) - and you need to know the length of the pendulum. You hold a meter stick up against the pendulum, line everything up very carefully, and say to yourself and your lab partner "25.45 centimeters". Then you carefully record this measurement in your data table.
Your text does an excellent job of explaining units and standards of measurement, and we will do some lab activities that should help you solidify the concepts. I just have a few comments about units and the measuring process...
Being a physicist, you realize that science cannot be done "on automatic pilot" - you have to think about what you are doing. You would, perhaps, ask yourself,
"How much do I trust this measurement? How much confidence do I have in it? If I would do this measurement again, would I get 25.45 cm again - or something close? If so, how close? What if my lab partner had made this measurement with some other meter stick, would it have made a difference? What if an experienced, professional physicist measured this length with some fancy, expensive measuring device - would she get 25.45 cm, or something close?"
The questions you were asking above all relate to how much confidence you are willing to place in a measurement. "Precision" is the term that expresses that confidence. A "high precision" measurement is one in which you are very confident that the value lies within a narrow range - in other words, you are very confident that an additional measurement would produce a value very close to the previous ones. Physicists want to quantify the precision of their measurements - that is, to assign a number to their confidence in a measurement.
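One common way to put a number on that confidence is to repeat the measurement several times and look at the spread of the results. The sketch below illustrates this idea with made-up pendulum-length readings (the numbers are illustrative, not real data): the mean serves as the best estimate of the length, and the sample standard deviation quantifies the spread - a small spread means high precision.

```python
import statistics

# Hypothetical repeated measurements of the pendulum length, in cm.
# (Illustrative numbers only - not real data.)
lengths_cm = [25.45, 25.43, 25.47, 25.44, 25.46]

# The mean of the repeated measurements is our best estimate of the length.
mean_cm = statistics.mean(lengths_cm)

# The sample standard deviation measures how tightly the repeated
# measurements cluster: a small value indicates high precision.
spread_cm = statistics.stdev(lengths_cm)

print(f"length = {mean_cm:.2f} cm, spread = {spread_cm:.3f} cm")
```

Here the readings cluster within a few hundredths of a centimeter of 25.45 cm, so you could report the length as 25.45 cm with a spread of roughly 0.02 cm - a compact, numerical statement of how much you trust the measurement.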