iSixSigma

Screening Tools Can Speed Up Six Sigma Projects

Lean Six Sigma has led to improvements for many of the companies that use it; a few have even achieved spectacular results. However, not every company is pleased with its progress with Six Sigma. One practice that may explain this disparity of results is the use of screening tools developed by Dorian Shainin. Although these tools are not part of the typical DMAIC (Define, Measure, Analyze, Improve, Control) approach, organizations that use them can decrease the time needed to find solutions. Their effect can be seen by examining problem-solving cycle time, comparing a typical approach with one that uses the screening tools.

Traditional Approach

In a typical Six Sigma deployment, the speed of problem solving is determined by constraints in the system. Sometimes the core problem is an inability to decide where to start when approaching a new problem. Because of Pareto’s Law (80 percent of effects come from 20 percent of the causes), project teams are usually aware that only a handful of factors are critical out of dozens to hundreds of possibilities; however, they do not know which factors those are. The team may also have difficulty identifying the critical factors quickly without wasting time or effort pursuing non-critical ones.

Often, the team begins problem solving by brainstorming an exhaustive list of possible factors to consider. Then, the team must decide which ones to pursue first. The probability of the team selecting even one factor correctly the first time, let alone all of them, is quite low. Typically, they pursue a first set of factors, discover they are non-critical, and try again.

Eventually, the team discovers a critical factor. Then, they must decide whether to keep searching for other root causes, or to stop work on this problem, declare victory and go on to another problem. The irony in this approach is that while the foundation of Six Sigma is data-based decision making, project teams may make their initial decision – which factors to pursue first – without the benefit of data.

An analogy using plain and peanut M&Ms might be helpful. Peanut M&Ms represent the critical factors, while plain M&Ms are the non-critical factors. When a team approaches a new problem, it is like having just one, two or three peanut M&Ms hiding in a candy dish full of plain M&Ms. Brainstorming possible root causes and choosing from the list is like blindly sampling the candies with a spoon. It usually takes several spoonfuls to find one peanut candy, and many more spoonfuls to find them all.
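The "quite low" odds of blind sampling can be made concrete with a quick hypergeometric calculation. This is only an illustrative sketch; the counts of candidate factors, critical factors and picks below are assumed numbers, not figures from the article:

```python
from math import comb

def p_at_least_one(n, c, k):
    """Probability that a random 'spoonful' of k factors, drawn from n
    candidates of which c are critical, contains at least one critical factor.
    Computed as 1 minus the chance of drawing only non-critical factors."""
    return 1 - comb(n - c, k) / comb(n, k)

# With 3 critical factors hidden among 50 candidates, a team that pursues
# 5 factors at random has only about a 28 percent chance of catching even one.
print(round(p_at_least_one(50, 3, 5), 2))  # 0.28
```

Even generous "spoonfuls" usually miss, which is why the trial-and-error loop described above typically takes several iterations.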


Introducing the Screening Tools

In the ideal situation, the team would immediately know which factors are critical and which are not, so they could ignore all the non-critical factors and simply pursue the critical ones.

In the M&Ms analogy, practitioners can do this with a candy dish that has a slotted opening in one side. The opening is taller than the plain M&Ms, but thinner than the peanut candies. By tipping the dish, all the plain candies slide out through the slot, leaving only the few peanut M&Ms behind. This example is analogous to adding Shainin’s screening tools to Six Sigma.

The screening tools eliminate brainstorming and hypothesizing at the beginning of a project. The project team first makes non-invasive observations of the operation. They ignore all the outputs in the middle of the performance range and compare only examples of the very best and very worst outputs, searching for consistent differences. The guidelines are simple:

  • Any factor that is consistently different when the best and worst outputs occur is deemed critical, and the team pursues it.
  • Any factor that is not consistently different is deemed non-critical, and the team ignores it.

With this screening process, practitioners non-invasively observe and review data from the existing operation, and are usually able to separate the critical and non-critical factors more quickly than with the traditional trial-and-error approach.
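The best-versus-worst comparison can be sketched in a few lines of Python. This is a minimal illustration of the guideline above, not Shainin's actual procedure; the function name, factor names and measurements are hypothetical:

```python
def screen_factors(best_units, worst_units, factors):
    """Flag as critical any factor whose measured values on the best outputs
    do not overlap with its values on the worst outputs.

    best_units / worst_units: lists of dicts mapping factor name -> value.
    """
    critical = []
    for f in factors:
        best_vals = [u[f] for u in best_units]
        worst_vals = [u[f] for u in worst_units]
        # Non-overlapping ranges mean the factor is consistently different.
        if max(best_vals) < min(worst_vals) or max(worst_vals) < min(best_vals):
            critical.append(f)
    return critical

# Hypothetical measurements of two good and two bad units:
best = [{"straightness": 0.1, "length": 500.2}, {"straightness": 0.2, "length": 499.8}]
worst = [{"straightness": 1.4, "length": 500.1}, {"straightness": 1.1, "length": 499.9}]
print(screen_factors(best, worst, ["straightness", "length"]))  # ['straightness']
```

Every factor whose values overlap between the two groups drops out immediately, so the team never spends effort pursuing it.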

How They Work

The screening tools follow four fundamental principles:

1. Pareto’s Law is universal. When teams start with a one-in-dozens or one-in-hundreds chance of choosing correctly from a brainstormed list of possible factors, they sometimes must go through several iterations before they find a root cause. Once they have found one, they may stop searching. If there are actually three critical factors at work and they fix one of them, the other two factors remain active and continue generating defects. When teams must go through many non-critical factors exhaustively to find one to three critical factors, Pareto’s Law is working against them. When teams start by using screening tools to eliminate the non-critical factors, they can discover the critical factors quickly.

2. If an operation usually works well, but occasionally generates defects, then the operation is fundamentally sound. When Six Sigma teams use the screening tools to approach a problem, they start with these correct conditions, and then discover which factors are critical and vary excessively whenever defects occur. The solution they develop is simply tighter control of these critical factors.


3. Effects have causes. Whenever an operation that usually works well generates a defect, something about the operation has changed. The optimum solution is to identify what has changed and then tighten control. When a team using the screening tools encounters additional defects, they assume there are other root causes that have yet to be identified, like the peanut M&Ms still lurking in the bowl of plain candies.

4. Comparing performance extremes can reveal consistent differences. The screening tools focus only on performance extremes – the very best and very worst outputs. Whichever critical factors are causing the defects, their values will show the widest differences whenever extreme performance occurs.

Case Study Examples

The following case studies help illustrate the benefits of using the screening tools.

Case #1: Out-of-Square Grills

An appliance manufacturer makes grills for barbecues out of heavy wire that comes on large spools. The plant workers straighten the wire, cut it to a specified length and send it to the assembly line. The line workers then put pieces of wire in fixtures and weld the grills together. Usually they make flat, square grills, but sometimes they make warped, out-of-square grills. Company engineers had studied the problem repeatedly, but they could not identify its cause.

The company introduced screening tools to the direct line workers in a one-day workshop. These workers used rulers and protractors to compare six flat, square grills to six warped, out-of-square grills, and they solved the problem in just one week.

The workers made 17 measurements on each grill, looking for consistent differences between the good and bad grills. Measurements included lengths, angles and the straightness of various pieces. They discovered that straightness was the only critical factor, which guided them to the root cause: the curvature of the wire on the spool.

At the beginning of the spool, the arc was mild, and the straightener made the wire straight in just one pass. At the tail end of the spool, the arc was much tighter and the wire retained a small amount of residual curvature after only one pass through the straightener. The workers had been putting all the same-length pieces – both straight and slightly curved – into a bundle that went to the welding line. The line workers randomly selected the pieces from the bundles, which is why the engineers were not able to discover any patterns. Everyone assumed the slight curvature was negligible, and that the fixtures provided the final straightening as they held the wire in position, but the opposite was happening; the residual curvature was bending the grills.


The solution was to check the straightness of the pieces coming out of the straightener and run them through again if there was any residual curve.

Case #2: Pigment Dispersion

A chemical manufacturer makes a pigment material that is a dispersion of solid particles. Usually, the manufacturer makes an excellent dispersion of fine particles, but occasionally a second material precipitates out with the pigment and the batch must be reworked.

In many situations, including the previous example, little or no data on the operation exists, so workers have to make non-invasive measurements to look for the critical factors. Today, most chemical companies have the opposite problem: data overload. Automated reactor controllers record large amounts of data on every batch, and problem solvers may not know which data is critical. In this case, the automated controller was tracking 105 factors on every batch.

To sort through this data, the team focused on eight good batches and eight bad batches. They assembled a spreadsheet of all 105 factors for these 16 batches and looked for consistent differences. In just one day, they determined that 104 of the factors were non-critical, and just one, the percent non-volatiles of the finished dispersion, was consistently different. Low-solids dispersions were always the worst of the worst outputs, and high-solids dispersions were always the best of the best outputs.

This guided the team to focus on how batches ended up at different solids levels, given that the controller feeds all the reactants into the reactor automatically. It turned out that the reaction must be run at temperatures colder than the reactor’s cooling jacket can achieve, so the operators add ice to the reactor manually. The amount of ice varies from batch to batch, and this determines the solids percentage of the product. The team learned that they had to limit the amount of ice added in order to achieve good dispersions every time. They could even estimate the maximum allowable amount of ice based solely on these initial observations: as long as the operator kept the non-volatiles level above 7.5 percent, the product would be good every time.
