The theory of constraints is an important tool for operations managers to manage bottlenecks and improve process flows. Made famous by Eliyahu M. Goldratt in his book The Goal, the theory has far-reaching implications for understanding the bottlenecks in a process and managing them to create an efficient process flow. Simply put, the theory states: “the throughput of any system is determined by one constraint (bottleneck).” Thus, to increase throughput, one must focus on identifying and improving the bottleneck or constraint.
Goldratt, in another book, Theory of Constraints, outlines a five-step process for applying the theory:

1. Identify the system’s constraint.
2. Decide how to exploit the constraint.
3. Subordinate everything else to that decision.
4. Elevate the constraint.
5. If the constraint has been broken, return to step 1.
Teaching the theory of constraints can be aided by a simple simulation exercise with people in any training class: Take five small boxes, each partially filled with 15 small candies. The instructor hands the boxes one at a time to the first participant (Person A), who empties the contents and then hands the box and candies to the next participant. The second person (Person B) counts the number of candies, puts them in the box and hands it to the third person (Person C), who returns the box to the instructor. The instructor varies the rate at which the five boxes of candies are handed to the first person in the “value chain,” and times are recorded for each member’s part in the process.
As the boxes of candies are handed to the first person slowly, it is easy to observe that all participants have ample time to carry out their activities. At the slow pace, all resources are being starved of work and the bottleneck is external to the process, i.e., the instructor. Now consider what happens if the instructor speeds up and hands over the boxes at a much faster rate: all three participants start to work faster, with the person in the middle always busy while the other two participants wait for him or her. In essence, the bottleneck has shifted to Person B, and a process constraint or bottleneck has been identified.
Consider for the sake of argument that it takes the following average times for the participants to carry out their activities:
Person A: Time = 2 seconds (Throughput = 30 boxes/minute)
Person B: Time = 20 seconds (Throughput = 3 boxes/minute) <— Bottleneck
Person C: Time = 5 seconds (Throughput = 12 boxes/minute)
The process is constrained by its slowest resource: the output of the whole process is three boxes a minute, and Person A and Person C are not contributing to the overall throughput. If they were to slow down, they would not have an adverse impact on the process, since they will always have idle time. Such resources with extra capacity are non-bottleneck resources.
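The arithmetic above can be sketched in a few lines of Python (an illustrative helper, not from the original article): given each resource’s average step time, compute the stand-alone throughputs and pick out the bottleneck.

```python
def bottleneck(step_times):
    """Given seconds-per-box for each resource, return the bottleneck
    resource and the resulting process throughput in boxes/minute."""
    # Each resource's stand-alone rate in boxes per minute.
    rates = {name: 60.0 / seconds for name, seconds in step_times.items()}
    # The process can move no faster than its slowest resource.
    slowest = max(step_times, key=step_times.get)
    return slowest, rates[slowest]

name, rate = bottleneck({"A": 2, "B": 20, "C": 5})
print(name, rate)  # B is the constraint, limiting the line to 3.0 boxes/minute
```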
Now consider the impact of inventory on the process. If only one box is sent through the value chain, the total time taken by the box to be serviced is the sum of the activities of all three participants, i.e., 2 + 20 + 5 = 27 seconds. This is the minimum theoretical time that a box of candies can be processed in. From a customer perspective it would take 27 seconds for their box of candies to be delivered. However, if there are other boxes within this value chain, then these boxes will be serviced first. The number of boxes within the system is sometimes referred to as work in process (WIP) and the relationship is expressed through Little’s Law:
Inventory = Throughput x Flow Time
where Throughput is the throughput of the process (equal to that of the slowest resource, Person B) and Flow Time is the average time that a typical flow unit spends within the process boundaries.
Now suppose there are a total of four boxes ahead of a particular customer’s box; including the customer’s own box, Inventory = 5 and the formula becomes:
5 boxes = 3 boxes/minute x Flow Time
On average, the box spends 1.67 minutes within the value chain before the customer gets it back. While the value-added work on the box is just 27 seconds, it spends a total of 100 seconds within the system boundaries, a 27 percent efficiency ratio (value-add time / total flow time).
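Those numbers can be checked with a short Python calculation (illustrative only, using the figures from the example):

```python
value_add = 2 + 20 + 5    # seconds of actual work on one box
inventory = 5             # boxes in the system (4 ahead plus the customer's)
throughput = 3 / 60.0     # bottleneck rate: 3 boxes/minute, in boxes/second

# Little's Law rearranged: Flow Time = Inventory / Throughput
flow_time = inventory / throughput
# Fraction of the flow time that is value-added work
efficiency = value_add / flow_time

print(flow_time, efficiency)  # 100.0 seconds in the system, 0.27 efficiency ratio
```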
While this simple example has been used to demonstrate the theory of constraints, the implications are clear for an operations manager who deals with processes. Suppose the three persons in the example represent three business units working together in a horizontal value chain. If a non-bottleneck unit such as Business Unit C becomes more efficient (e.g., reducing its cycle time from 5 seconds to 2 seconds), overall throughput does not improve, and the local optimization may actually be detrimental to the overall process. Yet quite often, that unit’s improvements will be rewarded, since there is a tendency to measure results vertically, in silos. Additionally, sometimes the work an entity does is non-value-added: in the candies example, Person C was merely transporting the boxes and did not add value to the overall process. A good operations manager will think in depth about these issues.
Several other corollaries result:
Adding inventory to a system can greatly increase the lead time for a customer. Using the above example, the table below shows the effect of different inventories.
Inventory   Throughput     Flow Time      Efficiency Ratio
5           3 boxes/min.   1.67 minutes   0.27
10          3 boxes/min.   3.33 minutes   0.135
100         3 boxes/min.   33.3 minutes   0.0135
In the extreme scenario where there is an inventory of 100 units on average within the system boundaries, an individual box spends only 1.35 percent of its time having value-added work performed; the rest is all waiting time. This is an important result: wait time is an output (Y) determined by inputs (Xs) such as the inventory or throughput of the system. Some managers erroneously focus exclusively on reducing wait times without understanding why these wait times occur (i.e., without looking at the bottleneck resource or the amount of inventory that may be causing them).
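The table above can be reproduced with a small loop (a sketch using the article’s numbers; `littles_row` is a hypothetical helper name):

```python
def littles_row(inventory, throughput_per_min, value_add_s):
    """Flow time (minutes) and efficiency ratio for a given WIP level,
    via Little's Law: Flow Time = Inventory / Throughput."""
    flow_time_min = inventory / throughput_per_min
    efficiency = value_add_s / (flow_time_min * 60.0)  # value-add / total flow time
    return flow_time_min, efficiency

for wip in (5, 10, 100):
    ft, eff = littles_row(wip, 3, 27)   # 3 boxes/min bottleneck, 27 s value-add
    print(wip, round(ft, 2), round(eff, 4))
```

The printed rows match the table, with the 33.3-minute flow time shown to one more decimal place.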
Since inventory is usually a function of external demand, at some point it may be feasible to limit the number of units that can enter the system. However, this may result in lost revenue opportunities. If that is not viable, an alternative is to add resources at the bottleneck, which has a cost implication for the process but may speed up delivery.
In the candies example, Person B is the bottleneck, so all efforts should be focused on alleviating that bottleneck (the core of the theory of constraints). This can be done in a number of ways: adding resources to the bottleneck activity (e.g., two persons acting as B instead of one), lean methods, standardizing, mistake-proofing, flexible resourcing, job leveling (shifting some of Person B’s work to either Person A or Person C), etc. The idea is to alleviate the bottleneck as much as possible.
Suppose some work is shifted from Person B to Person C by asking Person C to fill the box with candies, and the new times are recorded as follows:
Person A: Time = 2 seconds
Person B: Time = 15 seconds
Person C: Time = 10 seconds
Note that the total value-add time is still the same, i.e., 27 seconds. However, flow times are reduced and efficiency ratios are improved, as the table below indicates.
Inventory   Throughput     Flow Time      Efficiency Ratio
5           4 boxes/min.   1.25 minutes   0.36
10          4 boxes/min.   2.5 minutes    0.18
100         4 boxes/min.   25 minutes     0.018
Person B is still the bottleneck; however, if work continues to be shifted from B to A or C, the bottleneck may change.
By simply shifting work, without any additional cost, the efficiency of the process has been significantly improved. This, however, may not be easy to do in a work setting with specialized jobs and roles.
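The before-and-after effect of the rebalancing can be shown with the same throughput logic (an illustrative sketch; `line_rate` is a hypothetical helper):

```python
def line_rate(step_times):
    """Process throughput in boxes/minute, limited by the slowest step."""
    return 60.0 / max(step_times.values())

before = {"A": 2, "B": 20, "C": 5}    # original split: B takes 20 s
after = {"A": 2, "B": 15, "C": 10}    # 5 s of B's work shifted to C

print(line_rate(before), line_rate(after))  # 3.0 -> 4.0 boxes/minute
```

Total work per box is unchanged (27 seconds in both cases), yet throughput rises from 3 to 4 boxes per minute simply because the slowest step shrank from 20 to 15 seconds.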
The theory of constraints can be applied in a wide range of work situations, in the service industry as well as the manufacturing sector. It helps to focus attention on the areas where improvements should be made, and enables other Six Sigma and Lean tools to be leveraged on that focus area.


© Copyright iSixSigma 2000–2018.
Comments
A clear elaboration of the TOC.
This is a good explanation, but to be honest I wasn’t and am not sure what Goldratt brings to the table that is not covered by the normal, standard Lean process of work flow balancing. Even his “story” of the scouts is just an example of balancing in action, but not really in a process, as they all walk together.
I see some Lean concepts (getting into VSM) as of more value than TOC.
From what I know, Goldratt has definitely brought a lot of focus to identifying process constraints. Like Six Sigma, it was all there before, just not focused the way it is now.
@ Kim J B.
Work flow balancing is only one tool to break a bottleneck. I find them wholly different myself.
Nice intro to TOC, and complements Goldratt’s “The Goal.”
Goldratt’s book was completely necessary in order to whack decision makers on the side of the head. No matter how many degrees or how many years of experience one has, we still tend to wake up every morning with a tendency to fly by the seat of our pants instead of doing the brainwork necessary to improve the health of the system.
That is because if we fly by the seat, it supposedly leaves us processor time to relish our state. Whereas if we are continuously working our brains to solve efficiencies, then we are left with very little time to watch and savor our success.
This might be named a theory of its own; aptly, it could carry some Freudian title about the human desire to savor in real time.
So yes, Six Sigma works, but only if applied by second nature all the time. Don’t let up: continuously train everyone all the time and expect results.
A good explanation of the Theory of Constraints. I think the Theory of Constraints can be used along with Value Stream Mapping and other RCA tools such as the Ishikawa diagram, and the findings can be further validated using Pareto analysis to focus the improvement efforts.
Now, thinking stochastically: if Process A has a throughput mean and standard deviation such that its limits are between 0 and 10, and Process B has a throughput mean and standard deviation such that its limits are between 20 and 30, it is evident that B is the bottleneck. If A goes from 0 to 30 and B from 20 to 40, then B is the limit 50% of the time. This is just a rough gut feeling. I believe you have to look at the throughput distributions to compare.
Remember the definition of throughput: the processing of the candies is not throughput until the instructor receives an empty box from the third person. A throughput distribution would mean stating that there are (for example) three types of boxes that need to be processed, one of which is a priority box. Then we would have to worry about the distribution of throughput in terms of the numbers of priority boxes and other boxes being turned into throughput. Good thought!
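The gut-feel estimate in the overlapping case above can be checked with a quick Monte Carlo sketch. This assumes the quoted limits (A: 0 to 30, B: 20 to 40) are per-cycle step times drawn from uniform distributions; both the interpretation and the distributions are assumptions made purely for illustration:

```python
import random

random.seed(1)          # fixed seed so the sketch is repeatable
trials = 100_000
b_limits = 0
for _ in range(trials):
    a = random.uniform(0, 30)    # Person A's step time this cycle (seconds)
    b = random.uniform(20, 40)   # Person B's step time this cycle (seconds)
    if b > a:                    # B is the slower step, hence the constraint
        b_limits += 1

print(b_limits / trials)  # fraction of cycles in which B limits the line
```

Under these assumptions, B turns out to be the limiting step in roughly 90 percent of cycles rather than half of them, which reinforces the commenter’s closing point: the full distributions have to be compared.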