As the industrial world continues to become more competitive, measurement tolerances will become tighter. At the same time, customers insist that products be produced at a lower cost. In this environment, the production floor of Six Sigma companies must continually improve the certainty of measurements without increasing costs.
Most companies are at the early stages of a learning curve on the concept of measurement uncertainty, so the competitive standard is not yet truly defined. Companies ahead of the curve are beginning to understand the concept, and as they do, competitors and customers will begin to learn as well.
To increase measurement ability, it helps to understand the concepts behind element 4.11 of QS 9000. The opening paragraph states: “Inspection, measuring and test equipment shall be used in a manner which ensures that the measurement uncertainty is known and consistent with the required measurement capability.” That element, in conjunction with element 4.4, refers to instruments used on the production floor to confirm product quality.
The term “measurement uncertainty” can be thought of as the tolerance for measurement inaccuracy. The two components that create measurement inaccuracy are error and variation. It is important to note that there is no generic quantification of uncertainty that applies across a broad spectrum of measurement situations. Some auditors mistakenly believe that the errors found during calibration determine the total measurement uncertainty. The focus, however, should be on identifying and then reducing the uncertainty of the measurement in its real-world application.
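One way to see how error and variation combine is a GUM-style root-sum-of-squares budget, in which a known calibration error contributes a bias component and shop-floor scatter contributes a repeatability component. A minimal sketch in Python; the function name, the rectangular treatment of the bias, and the k = 2 coverage factor are illustrative assumptions, not part of QS 9000:

```python
import math

def expanded_uncertainty(bias, std_dev, n, k=2.0):
    """Combine a calibration bias and observed variation into one
    expanded uncertainty (GUM-style root-sum-of-squares sketch).

    bias    -- known systematic error from calibration, treated here as
               the half-width of a rectangular distribution (an assumption)
    std_dev -- standard deviation of repeated shop-floor readings
    n       -- number of readings averaged per reported measurement
    k       -- coverage factor (k = 2 roughly corresponds to 95% confidence)
    """
    u_bias = abs(bias) / math.sqrt(3)      # rectangular-distribution term
    u_rep = std_dev / math.sqrt(n)         # standard error of the mean
    u_combined = math.sqrt(u_bias**2 + u_rep**2)
    return k * u_combined

# Example: 0.0001" calibration error, 0.0002" repeatability, 3 readings
U = expanded_uncertainty(bias=0.0001, std_dev=0.0002, n=3)
print(f"Expanded uncertainty U = {U:.6f} in")
```

The point of the sketch is that the calibration error is only one term in the budget; the variation term can easily dominate it on the floor.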
To properly evaluate measurement and test equipment, an auditor must move past a philosophy that all error and variation are caused by the measurement gage. Every measurement system comprises six elements:
- Part (or part characteristic)
- Instrument (the gage itself)
- Operator (the person taking the measurement)
- Method (of measurement used)
- Environment (conditions at the point of measurement)
- Tolerance (size)
A change in any one of these elements creates a new measurement system. Each element introduces its own sources of error and variation into the measurement result or, in the case of Tolerance, affects the magnitude of the other elements. Consider Method: “When I hold the part and measure it I get one reading, but she rests the part on the granite plate and measures it and gets a different reading.” Different methods usually produce different results, because of phenomena such as Abbe’s error. Consider Part: “If I measure this outside diameter in several places (using a 0.00005-inch micrometer), I get all different readings.” At today’s real-world resolutions of microns and millionths of an inch, nothing is truly flat, round or parallel. Examples like these occur every day on the production floor and have very little to do with calibration and everything to do with measurement uncertainty. Calibration involves only one of the elements, the Instrument. The other five involve variables found within the production floor Environment, and they are removed from the equation in the controlled environment of calibration.
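The out-of-round diameter example can be put into numbers with basic statistics. A short sketch, using hypothetical readings and an assumed tolerance band:

```python
import statistics

# Hypothetical readings of one outside diameter at several positions,
# taken with a 0.00005-inch-resolution micrometer (values are examples)
readings = [0.49985, 0.49990, 0.50005, 0.50010, 0.49995]

spread = max(readings) - min(readings)   # observed range across positions
sigma = statistics.stdev(readings)       # sample standard deviation

tolerance = 0.001                        # assumed total tolerance band
print(f"range  = {spread:.5f} in ({100 * spread / tolerance:.0f}% of tolerance)")
print(f"st.dev = {sigma:.6f} in")
```

None of this spread is gage error in the calibration sense; it is Part-element variation that calibration in a controlled lab can never reveal.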
Understanding the variation within a given measurement on the shop floor is much like a game of euchre, only instead of four suits (hearts, clubs, spades and diamonds), there are the six elements above to deal with. More often than not, one element causes the majority of the variation, or acts as a catalyst and drives variation within the other elements. This element can be viewed as a sort of “trump.” To improve the measurement with the least time and effort, one of two things must be accomplished: 1) the trump element must be isolated and improved, or 2) the other five elements must be improved (in-line, so to speak) to hold the trump in check.
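Hunting for the trump element amounts to ranking variance contributions. A minimal sketch, with invented variance numbers for each element's contribution; element names follow the article, and treating Tolerance as contributing no variance of its own is an assumption:

```python
# Hypothetical variance contributions (in squared inches) attributed to
# each element of one measurement system -- the numbers are invented
variance = {
    "Part": 4e-9,
    "Instrument": 1e-9,
    "Operator": 9e-9,
    "Method": 2e-9,
    "Environment": 3e-9,
    "Tolerance": 0.0,  # Tolerance scales the others; no variance of its own
}

total = sum(variance.values())
trump = max(variance, key=variance.get)   # largest single contributor

for element, v in sorted(variance.items(), key=lambda kv: -kv[1]):
    print(f"{element:12s} {100 * v / total:5.1f}%")
print(f"Trump element: {trump}")
```

With these invented numbers the Operator dominates, so effort spent there pays off fastest; improving any of the minor contributors first would barely move the total.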
After identifying the six elements, it is usually easy to decide which system to analyze first. It helps to analyze data from the defect and rework categories already tracked in existing production processes; discussing these categories with the operators on the floor will help pinpoint sources of variation.
An understanding of proper gage R&R study techniques is needed to analyze and lower the amount of variation in each measurement system. As experience with the gage R&R process evolves, the concept of measurement uncertainty becomes easier to grasp and implement within an analysis process.
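As a concrete starting point, here is a short gage R&R sketch using the AIAG average-and-range (short) method. The study layout and readings are hypothetical, and the K1/K2 constants assume two trials and two appraisers:

```python
import math

def gage_rr(readings, tolerance, k1=0.8862, k2=0.7071):
    """Average-and-range gage R&R sketch (AIAG MSA short method).

    readings  -- {appraiser: [[trial readings] for each part]}
    tolerance -- total part tolerance
    k1, k2    -- AIAG constants for 2 trials / 2 appraisers (assumed setup)
    Returns %GRR expressed against the tolerance.
    """
    # Repeatability (EV): average within-part range across appraisers
    ranges = [max(t) - min(t)
              for trials in readings.values() for t in trials]
    r_bar = sum(ranges) / len(ranges)
    ev = r_bar * k1

    # Reproducibility (AV): spread between appraiser grand averages,
    # with the repeatability contribution subtracted out
    averages = [sum(sum(t) for t in trials) / sum(len(t) for t in trials)
                for trials in readings.values()]
    x_diff = max(averages) - min(averages)
    n_parts = len(next(iter(readings.values())))
    n_trials = len(next(iter(readings.values()))[0])
    av = math.sqrt(max(0.0, (x_diff * k2) ** 2 - ev ** 2 / (n_parts * n_trials)))

    grr = math.sqrt(ev ** 2 + av ** 2)
    return 100 * grr / tolerance

# Hypothetical study: 2 appraisers x 2 parts x 2 trials
data = {
    "A": [[0.500, 0.501], [0.502, 0.502]],
    "B": [[0.501, 0.502], [0.503, 0.502]],
}
print(f"%GRR = {gage_rr(data, tolerance=0.010):.1f}% of tolerance")
```

A sketch like this separates repeatability (the Instrument and Method at work) from reproducibility (largely the Operator), which is exactly the kind of element-by-element insight the trump-hunting approach needs.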