Done properly, the five phases associated with Lean Six Sigma (LSS) projects can deliver significant, sustainable process improvements. These phases are Define, Measure, Analyze, Improve and Control – otherwise known as DMAIC.
The problem is that the “done properly” aspect is disappointingly rare, particularly in organizations run under traditional Western-style management. The result is countless projects in which practitioners wearing imaginary belts of various colors followed DMAIC, yet the outcomes were disappointing or short-lived.
A key aspect of the breakdown originates with how organizations define and track “following DMAIC.” Far too often, tracking amounts to a checklist of tasks, often referred to as tools or deliverables. While such tracking is expedient in the short run, it usually shortchanges the thought processes underlying effective problem-solving. The issue is further exacerbated when we review the ubiquitous PowerPoint decks and find that the individual tools/deliverables operate in virtual isolation from one another.
At the project level, some simple causes of a breakdown might include:
- Preference for the outward trappings – the deliverables and the tools – over understanding and applying the underlying wisdom
- Overlooking the cultural and organizational adaptations needed to enhance the likelihood of the project succeeding
- Measuring what is convenient to measure – checking off boxes rather than assessing how appropriately the tools were selected, applied and integrated
The table below lists the questions – two per phase. For each question, I list some (not all!) of the tools that might be useful in answering it. Bear in mind that few if any projects require every tool, and that part of being an LSS practitioner is having the skill to discern and select the right tools for the situation at hand.
Clearly, the list of potential tools is far from complete, and some tools can be adapted to answer multiple questions. Likewise, the questions cover most, but not all, of the issues that might come up. An important part of coaching is knowing when and where the commonly used tools do or don’t apply, and aligning each tool to the underlying question is part of that coaching task.
The Questions Driving the Five DMAIC Phases

| Phase | # | Question | Common/Potential Tools |
| --- | --- | --- | --- |
| Define | 1 | What is the performance gap? | Problem statement, data displays |
| | 2 | Why is it important? | Voice of the customer (VOC), business case, critical to quality (CTQ) |
| Measure | 3 | How is the work performed currently? | Flowcharts, value stream mapping (VSM), failure mode and effects analysis (FMEA), spaghetti maps, gemba |
| | 4 | What are the outcomes? | Response development, measurement system analysis (MSA), control charts, process capability |
| Analyze | 5 | What are the primary root causes for the gap? | Fishbone, Pareto, VSM starbursts, data analysis, hypothesis testing, 7 wastes |
| | 6 | How do we know? | Hypothesis testing |
| Improve | 7 | What process changes will alleviate the root causes? | Brainstorm, creativity tools |
| | 8 | To what extent did they work? | Hypothesis testing |
| Control | 9 | How will we sustain the gains? | Control plan, FMEA, standard work |
| | 10 | Is the organization ready to take the reins on sustaining the gains? | Training plan, standard operating procedures (SOPs), statistical process control (SPC) |
Let’s expand on each question a little bit.
1. What Is the Performance Gap?
The answer to this question describes the difference between the current and the desired states of the process. It summarizes the primary improvement objective for the project. Often the gap constitutes the first sentence of the charter’s problem statement.
2. Why Is It Important?
Here we begin to capture the pain points associated with the performance gap. The same gap can cause problems for customers, employees and business-related outcomes. Articulating why the gap matters helps us understand which pain points are of primary concern. This understanding often forms the second sentence of the problem statement.
3. How Is the Work Performed Currently?
We describe how the work is done at present. While reading documentation can be useful, this is most often best captured and conveyed graphically. Sometimes this activity reveals quick-win opportunities, or shows where temporary measures are needed quickly to avert severe outcomes.
4. What Are the Outcomes?
Here we use data to capture how well the process does (or does not) meet the needs of its customers, and of the business. Sometimes we need to invest effort deciding what to measure, how to measure it, and how to ensure that the measurements dependably reflect actual performance. Often we use control charts to understand how stable the performance is, as well as process capability to estimate the probability of meeting requirements.
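As a rough illustration of what this can look like in practice, here is a minimal sketch of an individuals control chart plus a capability estimate. The measurements, specification limits and random seed below are entirely hypothetical; a real project would use the team’s own data, and most practitioners would lean on a dedicated SPC package rather than hand-rolled code.

```python
import numpy as np

# Hypothetical "before" measurements (e.g., cycle time in minutes) and spec limits;
# replace with real process data and the customer's actual requirements.
rng = np.random.default_rng(42)
measurements = rng.normal(loc=52.0, scale=4.0, size=60)
lsl, usl = 40.0, 60.0  # assumed lower/upper specification limits

# Individuals (I-MR) control chart limits: mean +/- 2.66 * average moving range.
moving_range = np.abs(np.diff(measurements))
center = measurements.mean()
ucl = center + 2.66 * moving_range.mean()
lcl = center - 2.66 * moving_range.mean()
out_of_control = (measurements > ucl) | (measurements < lcl)

# Process capability (meaningful only if the process is stable and roughly normal).
sigma_within = moving_range.mean() / 1.128  # d2 constant for subgroups of size 2
cpk = min(usl - center, center - lsl) / (3 * sigma_within)

print(f"Center line: {center:.2f}, UCL: {ucl:.2f}, LCL: {lcl:.2f}")
print(f"Points beyond control limits: {out_of_control.sum()}")
print(f"Cpk estimate: {cpk:.2f}")
```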
5. What Are the Primary Root Causes for the Gap?
It’s time for a “Captain Obvious” moment: The best solutions attack (and prevent) the causes of the problems. This question seeks to make sure we know the primary causes. We can seek clues in process data, such as patterns in control charts, or in statistical and graphical analyses that expose cause-and-effect relationships. We can also query process maps and use non-data-based team techniques.
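One of the simplest data-based clues is a Pareto tally that surfaces the “vital few” causes. The sketch below assumes a hypothetical defect log; the cause labels and counts are invented purely for illustration.

```python
from collections import Counter

# Hypothetical defect log: one cause label per defect occurrence.
defect_log = (
    ["missing field"] * 42 + ["wrong code"] * 18 + ["late arrival"] * 9
    + ["illegible"] * 6 + ["duplicate"] * 3
)

# Tally causes from most to least frequent and report cumulative percentages,
# the same information a Pareto chart conveys graphically.
counts = Counter(defect_log).most_common()
total = sum(n for _, n in counts)

cumulative = 0
for cause, n in counts:
    cumulative += n
    print(f"{cause:<15} {n:>4}  {100 * cumulative / total:5.1f}% cumulative")
```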
6. How Do We Know?
Whenever possible, we seek to use objective techniques to verify whether we have identified the correct root causes. Sometimes this can be challenging, even infeasible, but we are wise to pursue every reasonable avenue.
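As one example of such verification, suppose we suspect that a particular condition drives a higher defect rate. A chi-square test on the counts is a common, objective check; the contingency table below is hypothetical and stands in for whatever split the team is actually investigating.

```python
from scipy.stats import chi2_contingency

# Hypothetical counts: defective vs. good units, split by whether the
# suspected causal condition was present.
#                 defective, not defective
contingency = [[34, 466],   # suspected condition present
               [12, 488]]   # suspected condition absent

chi2, p_value, dof, expected = chi2_contingency(contingency)
print(f"chi-square = {chi2:.2f}, p-value = {p_value:.4f}")

# A small p-value suggests the defect rate really does differ with the suspected
# factor; a large one means we have not yet verified that root cause.
```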
7. What Process Changes Will Alleviate the Root Causes?
Hopefully, this step helps teams avoid two frequent missteps: inadequate idea generation, and poor alignment between solutions and root causes. A host of creativity tools can help teams generate good ideas, and the facilitator’s obligation is to play devil’s advocate, asking how each solution addresses one or more of the root causes. The activity also involves the non-trivial work of planning a rollout to test the proposed solutions.
8. To What Extent Did They Work?
Changing how the process is run does not guarantee that the outcomes will improve to the desired extent – or at all. We need to gather “after” data, and compare it statistically to the “before” data that resulted from answering question 4. And if the improvement is insufficient, we need to loop back to question 7.
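A minimal sketch of that before/after comparison might look like the following, assuming roughly continuous cycle-time data. The samples here are simulated for illustration only; in practice the “before” data would come from question 4 and the “after” data from the pilot.

```python
import numpy as np
from scipy import stats

# Hypothetical cycle-time samples (minutes): "before" from question 4,
# "after" collected during the pilot of the improved process.
rng = np.random.default_rng(7)
before = rng.normal(loc=52.0, scale=4.0, size=60)
after = rng.normal(loc=47.0, scale=3.5, size=60)

# Welch's two-sample t-test: do the means differ, without assuming equal variances?
t_stat, p_value = stats.ttest_ind(before, after, equal_var=False)
print(f"before mean = {before.mean():.1f}, after mean = {after.mean():.1f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# If the data are clearly non-normal, a Mann-Whitney U test is a common fallback.
u_stat, u_p = stats.mannwhitneyu(before, after)
print(f"Mann-Whitney U = {u_stat:.0f}, p = {u_p:.4f}")
```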
9. How Will We Sustain the Gains?
Old habits are tough to break. Even if the answer to question 8 is a huge success, we need to put in place measures to ensure that behaviors don’t slide back to the bad old days. There are tools and systems that help make it happen, and they need to be selected, planned and executed.
10. Is the Organization Ready to Take the Reins on Sustaining the Gains?
The ultimate objective of a DMAIC project is for the organization to become self-sufficient at operating the process in its improved state. Question 9 describes some of the measures to get the organization ready, while this question focuses on making sure the actual execution is in place. It includes ways to detect and respond whenever the process hints at a return to the pre-project state: a blend of human and technical considerations.
Clearly, this list of questions is not exhaustive, but it is a useful way to bring new candidates up to speed. Try applying the questions to your own projects.