November 21, 2009 at 4:42 am #52953
I have a question:
Why does the P/T ratio for gauges have to be < 10%, or put another way, why does gauge R&R have to be less than 10%?
Does it have any correlation with the process specification? If so, what is the margin for not committing a Type I error?
November 23, 2009 at 2:34 am #186984
There are several reasons that gauge R&R should be less than 10% for most gauge applications. In some texts, I have seen P/T ratios between 10 and 30% classified as “marginal” gauges.
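To make the classification concrete, here is a minimal sketch of the common P/T calculation, P/T = k·σ_gauge / (USL − LSL). The numbers (σ_gauge, spec limits) are made-up illustration values, not from this thread; k = 6 (±3σ) is one common convention, with 5.15 used in some texts.

```python
def pt_ratio(sigma_gauge, lsl, usl, k=6.0):
    """Precision-to-tolerance ratio: k * sigma_gauge / (USL - LSL)."""
    return k * sigma_gauge / (usl - lsl)

def classify(pt):
    """Rule of thumb from the text above: <10% good, 10-30% marginal."""
    if pt < 0.10:
        return "acceptable"
    elif pt <= 0.30:
        return "marginal"
    return "unacceptable"

# Hypothetical gauge: sigma = 0.05 against a spec of 9.0 to 11.0
pt = pt_ratio(sigma_gauge=0.05, lsl=9.0, usl=11.0)
print(f"P/T = {pt:.1%} -> {classify(pt)}")  # P/T = 15.0% -> marginal
```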
First, the disposition of units is subject to incorrect classification as the ratio rises. This can lead to shipping product that does not meet your specifications. A unit measured just inside the spec limit might actually be bad, and similarly a unit measured just outside the limit might actually be good if re-measured with a more precise tool.
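The misclassification risk near a limit can be sketched with a quick Monte Carlo. All values here are assumptions for illustration (USL = 11.0, σ_gauge = 0.05, true values 0.02 inside/outside the limit): a truly good part near the limit is falsely rejected whenever gauge noise pushes its reading over the spec, and vice versa.

```python
import random

random.seed(1)
usl = 11.0
sigma_gauge = 0.05           # assumed gauge standard deviation
n = 100_000

true_inside = usl - 0.02     # truly good part, just inside the limit
true_outside = usl + 0.02    # truly bad part, just outside the limit

# Fraction of noisy readings that land on the wrong side of the limit
false_reject = sum(random.gauss(true_inside, sigma_gauge) > usl
                   for _ in range(n)) / n
false_accept = sum(random.gauss(true_outside, sigma_gauge) <= usl
                   for _ in range(n)) / n
print(f"false reject ~ {false_reject:.1%}, false accept ~ {false_accept:.1%}")
```

With these numbers the part sits only 0.4 gauge standard deviations from the limit, so roughly a third of readings land on the wrong side; shrinking σ_gauge (a better P/T ratio) drives both error rates toward zero.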
Second, efforts to monitor and improve the process, either through control charts, or designed experiments, can be more difficult due to the inherent variability of the measurement tool. This may lead to higher sampling costs (larger “n”) solely to get a better signal on the current state of the process or to develop an improved process.
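The sampling-cost point follows from variances adding: observed variance = process variance + gauge variance, and the sample size needed for a given standard error scales with observed variance. A small sketch with assumed values:

```python
# Observed variance is process variance plus gauge variance, so the
# sample size needed for the same standard error grows with gauge noise.
sigma_process = 1.0                      # assumed process std dev
for sigma_gauge in (0.1, 0.3, 0.5):     # assumed gauge std devs
    var_obs = sigma_process**2 + sigma_gauge**2
    inflation = var_obs / sigma_process**2   # factor by which n must grow
    print(f"sigma_gauge={sigma_gauge}: n must grow by {100*(inflation-1):.0f}%")
```

A gauge at half the process standard deviation already forces 25% more samples for the same signal on the process.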
Bottom line: use the best tool available from a P/T-ratio perspective unless other factors prevent its use.
The forum ‘General’ is closed to new topics and replies.