# Tight Tolerance with SPC


- This topic has 8 replies, 4 voices, and was last updated 11 months ago by Paul Keller.

August 8, 2019 at 8:32 am #240885

**carlislec** (@carlislec):

At my shop, we run SPC on every part fairly religiously, holding our parts to a 1.66/1.33 Cp/Cpk standard. We run into a major issue, however, when we try to run parts with, say, a +/-.00045 tolerance on a machine that is designed to hold within .0003. As the parts run on this machine, they develop a natural spread, and their Cp/Cpk values fall well below our standard.

At this point, we have decided that for such tight tolerances it would be best to lower our Cp/Cpk standards for that dimension, but we are unsure how to go about it. Is there any mathematical way of knowing beforehand what a machine will be able to hold, based on its natural capability, and from that, what a reasonable Cp/Cpk target would be?
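There is a simple back-of-the-envelope answer to this question (a sketch, not an official method): if you treat the machine's design spread of +/-.0003 as its natural +/-3-sigma range, the best-case Cp follows directly from the tolerance width. The figures below reuse the numbers from the thread:

```python
# Best-case capability a machine can achieve, assuming its design
# spread (e.g. +/-0.0003) corresponds to its natural +/-3-sigma range.
def best_case_cp(tol_half_width, machine_half_spread):
    sigma = machine_half_spread / 3.0          # assumed natural sigma
    return (2 * tol_half_width) / (6 * sigma)  # Cp = tolerance width / 6 sigma

# +/-0.00045 tolerance on a machine designed to hold +/-0.0003:
cp = best_case_cp(0.00045, 0.0003)
print(round(cp, 2))  # 1.5 -- the ceiling, before any drift or tool wear
```

Under that assumption the machine tops out around Cp = 1.5 even when perfectly centered, so a 1.66 Cp standard is mathematically unreachable on this dimension, and Cpk can only be lower still once the mean drifts.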

August 8, 2019 at 10:17 am #240887

**Chris Seider** (@cseider):

You hold parts to a Cp/Cpk standard? Not specs? I find this confusing.

I apologize if I’m having a brain freeze.

August 8, 2019 at 10:59 am #240888

**carlislec** (@carlislec):

Say we are trying to hold a tolerance of 1.000-1.010. We want to maintain a Cpk of 1.33 or higher and a Cp of 1.66 or higher on that spec. If we drop below either threshold, we start fixing possible issues with the machines/processes. That is what I mean by a Cp/Cpk standard.

August 8, 2019 at 11:01 am #240889

**Mike Carnell** (@Mike-Carnell):

@cseider @carlislec I am with Chris. I have no idea how you "run to a Cp/Cpk standard." First, if you can't hold the tolerance when you make the part, then you should not be able to get to 1.66/1.33. If those are legitimate numbers, then the process did run that well at the time the study was done.

At the end of the day, your variation is too large. It doesn't really mean much that the machine is designed to do 0.0003; it isn't holding what you need. That means you need to fix it.

This is a generalization, but if there is a mean issue, it is generally a knob variable. If it is a standard-deviation issue, it is a process issue, i.e. tool wear, material, tooling, etc. This is just a guess, but when you are running that tight, I would start with tool wear. You have virtually no room for it.

Just my opinion.

August 8, 2019 at 11:17 am #240890

**carlislec** (@carlislec):

@Mike-Carnell We have our operators run SPC checks at set intervals. If they are below 1.66/1.33, they need to take some form of action to correct the issue. The problem, though, is that if a machine that can only hold .001 is running a part with a tolerance of +/-.001, our Cp/Cpk are going to be inherently low.

Currently, we are having to run case studies on every part to decide the reasonable Cp/Cpk to hold the operators to, but we are researching a mathematical way of deciding that, given the inherent capability of our machines.

August 9, 2019 at 9:03 am #240915

**Mike Carnell** (@Mike-Carnell):

@carlislec You are mixing the language between SPC/control charts and capability studies. Assessing a control chart in terms of Cpk seems very odd: control charts by themselves will tell you when to adjust, without any conversation about Cpk. I may be missing something here.

Basically, you have a machine that is not capable of doing this operation. I am not sure why anyone would be surprised that you can't hold the capability. If you don't do something to change the machine, you are going to have to deal with the capability issue.

August 9, 2019 at 9:26 am #240917

**carlislec** (@carlislec):

@Mike-Carnell We have already concluded that the machine is not fully capable of holding our standard; that is why we are looking to lower it. Our issue is how much to lower it.

August 9, 2019 at 2:11 pm #240920

**Mike Carnell** (@Mike-Carnell):

@carlislec If the entire goal is to have a Cpk of 1.33, that means the nearest specification limit needs to sit 4x the standard deviation from the mean. You had better use long-term data to calculate this. I am not going to lecture you about a 1.5-sigma shift in the mean, but if you use short-term data, the mean will move. A total tolerance width could be established at 8x the standard deviation, and your process will have to stay dead center.
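That 4x/8x arithmetic can be checked directly from the definition Cpk = min(USL - mean, mean - LSL) / (3 * sigma); the numbers below are illustrative, not from the thread:

```python
def cpk(mean, sigma, lsl, usl):
    # Cpk uses the distance from the mean to the *nearer* spec limit.
    return min(usl - mean, mean - lsl) / (3 * sigma)

# Dead-center process with total tolerance width = 8 * sigma:
sigma = 0.00025
lsl, usl = 1.000, 1.000 + 8 * sigma   # tolerance width = 8 sigma
mean = (lsl + usl) / 2                # dead center: 4 sigma to each limit
print(round(cpk(mean, sigma, lsl, usl), 2))  # 1.33
```

Any off-center drift shrinks the distance to the nearer limit, which is why the 8-sigma width only works if the process holds dead center.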

August 12, 2019 at 7:43 pm #240992

**Paul Keller** (@[email protected]):

I don't see an issue with your company setting a standard for process capability of Cpk = 1.33. This is fairly standard practice, and 1.33 has been the suggested goal for Cpk since the 1980s, as it provides a bit of a margin of error in case the process shifts without detection.

So, your question is how to respond when the process cannot meet that capability requirement?

First off, I assume you're using a control chart to estimate the Cpk. The estimate is not valid unless the process is in control: capability is a prediction, and you can't predict without process stability. I would encourage you to always report Cpk alongside a control chart, so it is clear that the process is in control before the capability estimate is considered.

You asked about the natural process variation. When the process is in control, the process mean and standard deviation can be used to estimate the natural process variability. The natural process variability is defined by the control chart's control limits when the subgroup size is one, but can be calculated in all cases as ±3 process sigma units from the mean. Your control chart software should provide the estimate of process sigma based on the range, moving range, or sigma chart. This is inherent to the design of the chart: short-term process variation (estimated by the within-subgroup range or standard deviation) provides the estimate of the longer-term variation of the plotted statistic defined by the control limits. In this way, process control, or stability, is defined as the ability of short-term variation to predict longer-term variation.
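As a concrete illustration of that estimate (the readings below are hypothetical, not from the thread): on an individuals/moving-range chart, process sigma is conventionally estimated as the mean moving range divided by the bias-correction constant d2 = 1.128 for a moving range of span 2:

```python
# Estimate short-term process sigma from an individuals (X/mR) chart:
# sigma-hat = mean moving range / d2, where d2 = 1.128 for spans of 2.
def sigma_from_moving_range(measurements, d2=1.128):
    moving_ranges = [abs(b - a) for a, b in zip(measurements, measurements[1:])]
    return (sum(moving_ranges) / len(moving_ranges)) / d2

# Hypothetical in-control readings around a 1.0050 target:
xs = [1.0051, 1.0049, 1.0052, 1.0048, 1.0050, 1.0053, 1.0049]
sigma_hat = sigma_from_moving_range(xs)
mean = sum(xs) / len(xs)
lcl, ucl = mean - 3 * sigma_hat, mean + 3 * sigma_hat  # natural process limits
```

The same sigma-hat then feeds the capability estimate, which is why an out-of-control chart invalidates the Cpk: the moving-range estimate no longer predicts the longer-term spread.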

So, if the process is in control but not capable of meeting the Cpk = 1.33 requirement, then you need to reduce the common-cause variation in the process. It usually makes sense to start with a Gauge R&R (Repeatability and Reproducibility) study to see how much measurement error contributes to the common-cause variation. Especially for tight tolerances it can be significant, and you may find that an improved measurement procedure or better equipment is warranted.
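The reason measurement error matters so much at tight tolerances is that observed variance is the sum of true process variance and gauge variance, so the gauge inflates the apparent spread and depresses the computed Cp/Cpk. A hypothetical sketch (the sigma values are invented for illustration):

```python
import math

# Observed variation combines process and measurement variation:
# sigma_obs^2 = sigma_process^2 + sigma_meas^2
def true_process_sigma(sigma_observed, sigma_measurement):
    return math.sqrt(sigma_observed**2 - sigma_measurement**2)

# Hypothetical: observed sigma 0.00012; gauge sigma 0.00005 from an R&R study.
sigma_p = true_process_sigma(0.00012, 0.00005)

# Cp against a +/-0.00045 tolerance, observed vs. measurement-corrected:
cp_observed = 0.0009 / (6 * 0.00012)
cp_corrected = 0.0009 / (6 * sigma_p)
```

Here the corrected Cp comes out noticeably higher than the observed one, which is why a Gauge R&R is worth running before lowering the capability standard: part of the shortfall may be the gauge, not the machine.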

After that, continue to investigate the other potential sources of variation. The usual checklist is the 5M and E: Measurement (already covered above), Machines, Methods, Material, Manpower, and Environment. A cross-functional Six Sigma project team sponsored by local management is often the best way to proceed with process improvements of this nature.

Hope this helps! Please let me know if you have any questions.

