# ACCURACY AND PRECISION

We know that accuracy of measurement is very important for manufacturing a quality product.

Accuracy is the degree of agreement of the measured dimension with its true magnitude.

It can also be defined as the maximum amount by which the result differs from the true value, or as the nearness of the measured value to its true value, often expressed as a percentage.

True value may be defined as the mean of an infinite number of measured values, when the average deviation due to the various contributing factors tends to zero.

In practice, realization of the true value is not possible due to uncertainties of the measuring process; hence it cannot be determined experimentally.

Positive and negative deviations from the true value are not equal and will not cancel each other. One would never know whether the quantity being measured is the true value of the quantity or not.

Precision is the degree of repeatability of the measuring process. It is the degree of agreement of repeated measurements of a quantity made by the same method, under similar conditions. In other words, precision is the repeatability of the measuring process.

The ability of the measuring instrument to repeat the same results during the act of measurement for the same quantity is known as repeatability.

Repeatability is random in nature and, by itself, does not assure accuracy, though it is a desirable characteristic. Precision refers to the consistent reproducibility of a measurement. Reproducibility is normally specified in terms of a scale reading over a given period of time. If an instrument is not precise, it will give different results for the same dimension on repeated readings. In most measurements, precision assumes more significance than accuracy.

It is important to note that the scale used for the measurement must be appropriate and conform to an internationally accepted standard.

It is essential to know the difference between precision and accuracy.

Accuracy gives information regarding how far the measured value is from the true value, whereas precision indicates the quality of measurement, without giving any assurance that the measurement is correct. These concepts are directly related to random and systematic measurement errors.

The figure also clearly depicts the difference between precision and accuracy, wherein several measurements are made on a component using different types of instruments and the results are plotted.

It can clearly be seen from the figure that precision is not a single measurement but is associated with a process or set of measurements. Normally, in any set of measurements performed by the same instrument on the same component, individual measurements are distributed around the mean value, and precision is the agreement of these values with each other.
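As a minimal numerical sketch of this distinction (the readings and instrument names below are illustrative, not from the text): precision is the scatter of repeated readings about their mean, while accuracy is the closeness of that mean to the true value.

```python
import statistics

true_value = 25.00  # mm, assumed true dimension (illustrative)

# Repeated readings from two hypothetical instruments:
instrument_a = [25.11, 25.12, 25.11, 25.13, 25.12]  # precise but inaccurate
instrument_b = [24.90, 25.10, 24.95, 25.05, 25.00]  # accurate on average, less precise

for name, readings in [("A", instrument_a), ("B", instrument_b)]:
    mean = statistics.mean(readings)
    spread = statistics.stdev(readings)  # precision: scatter about the mean
    error = mean - true_value            # accuracy: offset from the true value
    print(f"Instrument {name}: mean={mean:.3f}, stdev={spread:.3f}, error={error:+.3f}")
```

Instrument A repeats almost the same reading every time (small scatter) yet its mean is offset from the true value; instrument B scatters more but its mean agrees with the true value, which mirrors the point that repeatability by itself does not assure accuracy.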

The difference between the true value and the mean value of the set of readings on the same component is termed an error. Error can also be defined as the difference between the indicated value and the true value of the quantity measured:

E = Vm − Vt

where E is the error, Vm the measured value, and Vt the true value. The value of E is also known as the absolute error. For example, when the weight being measured is of the order of 1 kg, an error of ±2 g can be neglected, but the same error of ±2 g becomes very significant while measuring a weight of 10 g. Thus, for the same value of error, its significance increases as the quantity being measured becomes smaller. Hence, error expressed as a percentage is known as relative error. Relative error is the ratio of the error to the true value of the quantity being measured. Accuracy of an instrument can also be expressed as % error. If an instrument measures Vm instead of Vt, then

% error = (Vm − Vt)/Vt × 100
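The absolute and relative error definitions above can be sketched directly (the 1 kg and 10 g figures echo the worked example; the function names are illustrative):

```python
def absolute_error(measured, true):
    """Absolute error E = Vm - Vt."""
    return measured - true

def relative_error_pct(measured, true):
    """Relative error as a percentage: (Vm - Vt) / Vt * 100."""
    return (measured - true) / true * 100.0

# The same +/-2 g absolute error matters far more for a small quantity:
print(relative_error_pct(1002.0, 1000.0))  # ~0.2% on a 1 kg weight
print(relative_error_pct(12.0, 10.0))      # ~20% on a 10 g weight
```

The identical 2 g absolute error yields a hundredfold larger relative error on the 10 g weight, which is exactly why relative error, not absolute error, governs significance.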

Accuracy of an instrument is always assessed in terms of error. The instrument is more accurate if the magnitude of the error is low.

It is essential to evaluate the magnitude of error by other means, as the true value of the quantity being measured is seldom known because of the uncertainty associated with the measuring process.

In order to estimate the uncertainty of the measuring process, one needs to consider the systematic and constant errors along with the other factors that contribute to the uncertainty due to the scatter of results about the mean.

Consequently, when precision is an important criterion, mating components are manufactured in a single plant and measurements are obtained with the same standards and internal measuring precision, to accomplish interchangeability of manufacture.

If mating components are manufactured at different plants and assembled elsewhere, the accuracy of the measurements of the two plants with respect to the true standard value becomes significant.

In order to maintain the quality of manufactured components, accuracy of measurement is an important characteristic. Therefore, it becomes essential to know the different factors that affect accuracy. The sense factor affects accuracy of measurement, be it the sense of feel or of sight.

In instruments having a scale and a pointer, the accuracy of measurement depends upon the threshold effect, that is, whether the pointer is just moving or just not moving.

Since accuracy of measurement is always associated with some error, it is essential to design the measuring equipment and the methods used for measurement in such a way that the error of measurement is minimized.

Two terms are associated with accuracy, especially when one strives for higher accuracy in measuring equipment: sensitivity and consistency.

The ratio of the change in instrument indication to the change in the quantity being measured is termed sensitivity. In other words, it is the ability of the measuring equipment to detect small variations in the quantity being measured.
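The ratio defining sensitivity can be sketched as follows (the dial-indicator numbers are assumed for illustration, not from the text):

```python
def sensitivity(delta_indication, delta_quantity):
    """Sensitivity = change in instrument indication / change in measured quantity."""
    return delta_indication / delta_quantity

# A hypothetical dial indicator whose pointer moves 10 scale divisions
# when the measured dimension changes by 0.1 mm:
print(sensitivity(10.0, 0.1))  # ~100 divisions of indication per mm
```

A higher ratio means a smaller change in the measured quantity produces a visible change in the indication, i.e., the instrument detects smaller variations.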

When efforts are made to incorporate higher accuracy in measuring equipment, its sensitivity increases. The permitted degree of sensitivity determines the accuracy of the instrument.

An instrument cannot be more accurate than the permitted degree of sensitivity.

It is pertinent to mention here that the unnecessary use of an instrument more sensitive than the measurement requires is a disadvantage.

When successive readings of the measured quantity obtained from the measuring instrument are the same every time, the equipment is said to be consistent. A highly accurate instrument possesses both sensitivity and consistency.

A highly sensitive instrument need not be consistent, and the degree of consistency determines the accuracy of the instrument. An instrument that is both consistent and sensitive need not be accurate, because its scale may have been calibrated with a wrong standard. Errors of measurement in such instruments will be constant, and can be taken care of by calibration.

It is also important to note that as the magnification increases, the range of measurement decreases and, at the same time, the sensitivity increases. Temperature variations affect such an instrument, and more skill is required to handle it.

Range is defined as the difference between the lower and higher values that an instrument is able to measure. If an instrument has a scale reading of 0.01–100 mm, then the range of the instrument is 0.01–100 mm, that is, the difference between the maximum and minimum values.
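The range computation is a simple difference, sketched here with the scale limits from the example above:

```python
def instrument_range(scale_min, scale_max):
    """Range: difference between the highest and lowest measurable values."""
    return scale_max - scale_min

# Scale reading of 0.01-100 mm from the example in the text:
print(instrument_range(0.01, 100.0))  # span of ~99.99 mm
```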

## Accuracy and Cost

It can be observed from the figure that as the requirement of accuracy increases, the cost increases exponentially. If the tolerance of a component is to be measured, then the accuracy requirement will normally be 10% of the tolerance value.
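The 10%-of-tolerance rule of thumb stated above can be sketched numerically (the tolerance value below is an assumed example):

```python
def required_accuracy(tolerance):
    """Rule of thumb from the text: measuring accuracy ~ 10% of the tolerance."""
    return 0.10 * tolerance

# A component with a total tolerance band of 0.1 mm (assumed example):
print(required_accuracy(0.1))  # ~0.01 mm of required measuring accuracy
```

Tightening the tolerance by a factor of ten thus tightens the required measuring accuracy by the same factor, which is where the exponential cost growth bites.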

Demanding high accuracy unless it is absolutely required is not viable, as it increases the cost of the measuring equipment and hence the inspection cost. Furthermore, higher accuracy increases sensitivity. Therefore, in practice, when designing measuring equipment, the desired/required accuracy is balanced against cost considerations, taking into account the quality and reliability of the component/product and the inspection cost.