Tool set for improving transactional processes

Six Sigma – iSixSigma Forums Old Forums General Tool set for improving transactional processes

Viewing 15 posts - 1 through 15 (of 15 total)
  • #33538


    Hi, questions for those of you who have plenty of down-to-earth, hands-on experience applying six sigma to transactional processes:
    Our experience as well as input from colleagues in other industries indicates that some of the statistical tools are not really applicable for achieving the desired improvements in transactional processes.
    The tools we find most useful are: Project Definition and Scoping, Process Mapping, C&E Matrix, FMEA and Control Plans.
    The tools we find, over and over again, are not useful are the statistical ones, i.e., hypothesis tests, regression, and design of experiments.
    We are curious about the experiences of other organizations. Is there a lot of value in teaching some of the statistical tools mentioned above for transactional processes?
    Are there other tools within each of the MAIC phases that some of you have found immensely helpful in identifying solutions? If so, which ones?
    In our classes, although we have taught hypothesis tests, DOE, and other advanced statistical tools, we find that our transactional Belts just do not apply them because they do not see the value in transactional processes.
    All insights are welcome. Thanks.



    I do believe that some of the other tools are useful. Time series analysis is a good tool for spotting seasonality and trends, and a test of variance is useful as well. For example, if you are looking at the time from completing a work order until the customer is invoiced, looking only at the average completion time could make the process appear better while it still has variation that could be reduced. A control chart is another good way to see out-of-control conditions. Finally, the JIT (just-in-time) method of training may be more effective than training with all of the material at once.
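    To make the control-chart point concrete, here is a minimal sketch (plain Python, with entirely hypothetical cycle-time data) that computes individuals-chart style limits at mean ± 3σ and checks a new observation against them:

```python
import statistics

# Hypothetical days from work-order completion to customer invoice
cycle_times = [4.1, 3.8, 5.2, 4.0, 4.4, 3.9, 4.2, 4.6, 4.3, 4.5]

mean = statistics.mean(cycle_times)
sigma = statistics.stdev(cycle_times)   # sample standard deviation

# Individuals-chart style limits at mean +/- 3 sigma
ucl = mean + 3 * sigma
lcl = max(0.0, mean - 3 * sigma)        # cycle time cannot be negative

# A new observation outside the limits signals an out-of-control condition
new_obs = 12.0
print(f"mean={mean:.2f}, UCL={ucl:.2f}, LCL={lcl:.2f}")
print("out of control:", not (lcl <= new_obs <= ucl))
```

    Note this is only the variance idea in miniature; a real individuals chart would typically estimate sigma from the moving range rather than the overall sample.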
    Hope this helps



    Having been an MBB in the transactional environment for over 5 years, I echo the opinions of your Belts.  It’s not that the statistical analysis tools do not apply; they simply apply less often in a transactional environment.
    The tools I have found most useful have to do with analyzing the process, once high-level data has been collected to point to the L2 process contributing most to poor performance.
    Key questions I’ve found useful to answer in Analyze include:

    what is the transaction volume?
    what are the flow rates at decisions?
    what is the flow prioritization (i.e., FIFO, LIFO, etc.)?
    what is the ‘normal’ flow process?  Exception?  How are they resourced?
    what are the sub-process cycle times?
    what is the total cycle time?  How much of that is actual touch time (processing time); how much is queue?
    where is the opportunity: to reduce touch time or to reduce queue time? (Almost always the latter, which should also yield fewer instances of jumping to an IT system solution, since IT solutions usually address touch time and not queue.)
    which steps are value-added?  Non-value added?  Value-enabling?
    where are the handoffs and what are the controls around them?  How is work moved from one area to another … push?  pull?  batch?
    what are the existing metrics/targets?
    who owns the process?
    who performs which tasks?
    what are the Moments of Truth?
    Hypothesis tests, DOEs, etc. do apply, just not as often.  Exponential improvement is achievable simply by ensuring sound prioritization of incoming units, paying attention to reducing queue, and ensuring existing resources are working on the right thing.
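    As a rough sketch of the touch-time vs. queue-time question above (all numbers hypothetical), the decomposition is simple arithmetic, but it usually reveals that queue dominates:

```python
# Hypothetical times for one transaction, in hours
touch_time = 1.5         # actual hands-on processing time
total_cycle_time = 72.0  # elapsed time from request to completion

queue_time = total_cycle_time - touch_time
queue_share = queue_time / total_cycle_time

print(f"queue time: {queue_time:.1f} h ({queue_share:.0%} of total cycle time)")
```

    When the queue share is this large, speeding up the touch step (e.g., with an IT solution) barely moves total cycle time; attacking queues and handoffs does.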



    I happen to agree with you, with one exception.  Regression can be very helpful in the services environment.  When you are analyzing possible associations between multiple explanatory (independent) variables and one response (dependent) variable, you may want to determine the strength of each variable’s impact on the response.  This is where regression can be very helpful. 
    What may also be of interest to you: if you are in a call center environment and a supervisor wants to understand the handle times of his or her team with respect to the center average, he or she can take a good-sized sample from the team and use statistical inference to determine how the team is performing.  Hence, hypothesis testing can be valuable. 
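    That handle-time comparison is a one-sample t-test; a minimal stdlib-only sketch with made-up numbers (the data, the center average, and the team size are all hypothetical):

```python
import math
import statistics

# Hypothetical handle times (minutes) for one supervisor's team
team = [6.2, 5.8, 7.1, 6.5, 6.9, 7.4, 6.0, 6.8, 7.2, 6.6]
center_average = 6.0   # assumed center-wide average handle time

n = len(team)
mean = statistics.mean(team)
s = statistics.stdev(team)

# One-sample t statistic: is the team's mean different from the center average?
t = (mean - center_average) / (s / math.sqrt(n))

# With n-1 = 9 degrees of freedom, |t| > 2.262 is significant at the 5% level
print(f"t = {t:.2f}, significantly different: {abs(t) > 2.262}")
```

    In practice a stats package would also report the p-value, but the mechanics are no more than this.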
    Otherwise, your statement is very true.  We have eliminated DOE, Taguchi experiments, and response surface methods from our training, and installed benchmarking for continuous improvement and process simulation tools for operations planning.
    Hope this helps…..


    Dan Feliciano

    Other tools that I have found extremely useful are:
    • Lean tools
      ◦ Value stream mapping
      ◦ Calculating takt time, given the lack of data that typically exists in transactional environments
      ◦ Batch size reduction
      ◦ 5S
    • Theory of constraints
    • Pareto charts
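    Takt time itself is a one-line calculation; a quick sketch with hypothetical numbers:

```python
# Takt time = available working time / customer demand over that period
available_minutes = 7.5 * 60   # hypothetical: one 7.5-hour shift per day
daily_demand = 90              # hypothetical: 90 transactions arriving per day

takt_time = available_minutes / daily_demand
print(f"takt time: {takt_time:.1f} minutes per transaction")
```

    The hard part in a transactional environment, as noted above, is getting trustworthy demand and available-time data, not the arithmetic.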


    Mark Almeter

    I’ve found the following tools to be extremely useful while improving transactional processes:
    Process mapping, Pareto charts, process FMEA, CTQ tree, COPQ, force field analysis, project management tools
    Data collection checklist, operational definition checklist, attribute sampling techniques, fishbone technique
    5-Why’s, chi-square analysis, multi-voting, correlation / regression analysis, value-added analysis
    Brainstorming, affinity diagramming, process FMEA, tree diagram, network diagram
    Control charts, force field analysis, various other measure tools to confirm the gains
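    Of the tools listed, chi-square analysis is the one most often hand-waved; here is a minimal sketch of a Pearson chi-square test on a hypothetical 2×2 table (two sites, error vs. no-error counts — all numbers invented):

```python
# Hypothetical 2x2 table: (errors, ok) counts for two processing sites
observed = {"site A": (30, 270),
            "site B": (55, 245)}

row_totals = {site: sum(counts) for site, counts in observed.items()}
col_totals = [sum(counts[i] for counts in observed.values()) for i in range(2)]
grand = sum(row_totals.values())

# Pearson chi-square statistic: sum of (observed - expected)^2 / expected
chi2 = 0.0
for site, counts in observed.items():
    for i, obs in enumerate(counts):
        expected = row_totals[site] * col_totals[i] / grand
        chi2 += (obs - expected) ** 2 / expected

# With 1 degree of freedom, chi2 > 3.84 is significant at the 5% level
print(f"chi-square = {chi2:.2f}, sites differ significantly: {chi2 > 3.84}")
```

    This is exactly the kind of attribute-data question (does defect rate differ by site, team, or product?) that transactional projects generate in volume.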
    Hope this helps….


    Xavier Dejaer

    Our main processes are transactional, but we teach up to DOE to the BBs.  We have examples where we used MSA, ANOVA, or regression.  For DOE it is more difficult, but we are working on it.  If your BBs are not using statistical tools, look also at their background: engineering or business administration. 
    All the best



    You are right on the money.  I have worked in a financial organization for three years and found that the statistical tools you discussed were rarely utilized in the transactional world. However, some of the hypothesis testing was useful to determine whether or not we made a difference in our results.
    Most gains come from the tools you spoke of.  I’d like to add that as I created the training modules, I found that more of the lean tools, such as value stream mapping, 5S, visual office, etc., were very useful.
    I do want to comment that while the statistical tools were less useful, I would not name someone a Green Belt or Black Belt without requiring full competency in those tools.



    I received my Six Sigma and lean training with Allied Signal. I am proud to say that it did not matter whether you were an engineer or an accountant everyone had to meet the same requirements.
    I am not an advocate of diluting the curriculum based on the whim of the individual company you may work for.


    Steven Hice

    If you are not using the referenced statistical tools, you might want to question whether you are:
    1. Getting enough applicable data. Most transactional processes do not have substantial data because, historically, these processes are not documented, much less measured.
    2. Sure your teams are aware that there are tools available which can make use of the more comprehensive analytical DMAIC tools, e.g., iGrafx Process for Six Sigma (DOE output to Minitab is available).
    3. Mining the data sufficiently to get the most out of it. (How comprehensive is your data collection plan?)
    The referenced non-statistical tools are of critical value, but discounting the more robust analytical tools alongside them impacts the quality of the project’s outcome. I have noted process design or improvement teams choosing not to use these tools primarily out of ignorance of the tools and their appropriate usage. These tools are extremely valuable within transactional process projects; they have the ability to separate truth from intuition and to deliver quantifiable results from completed projects. There are several good books out there in which the usage of statistical tools is detailed.



    I found that data was not the issue. We had millions of data points a month. Call centers, banking transactions, etc.
    However, the usefulness of many of the classical tools just is not there.
    Unfortunately, many people originally tried to force tool usage where it was not required.
    This is not to say that GBs and BBs should not know how to use the tools.


    Dean B

    In transactional processes I have found that the “technical or economic data” may point you in the right direction using a high-altitude metric like a simple Pareto or trend data.  However, getting to root causes is often more about value assessments than statistical analysis. For example, a Pareto may suggest that most of the arguments are occurring in a certain place; however, it will take human inputs, say from human interventions using FMEA or Situational Leadership, to reveal where the greater hassles, disruption, or pain is occurring, and why.
    Transactional processes are driven heavily by valuations made by the transactional agents, not just by the utility we as analysts might externally place on data. Consider value questions such as: Which duties take priority? What transactional ends are sought, and what intents and means are proper? At whose benefit and at whose expense ought this process be managed? These issues can all be more germane to the field of ethics than to statistics, yet they are immensely instrumental to the success of many transactional processes. In a transactional process, knowing its variation is meaningless (it can be good or bad) unless well-examined value judgments tell us what the variation means.
    Regarding the use of regression, I have found that behavioral studies can benefit from using curve fitting to find non-linear relationships between factors as a means of understanding dynamic behavior, which can contribute to the “value-finding” human interventions and root cause analysis.
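    As a minimal sketch of that curve-fitting idea (hypothetical data throughout): an exponential relationship y ≈ a·e^(b·x) can be fitted with nothing more than ordinary linear regression on ln(y), stdlib only:

```python
import math

# Hypothetical factor levels (x) and a behavioral response (y)
xs = [1, 2, 3, 4, 5]
ys = [2.2, 4.1, 7.9, 16.5, 31.8]   # roughly doubles each step: non-linear

# Fit y = a * exp(b * x) by least squares on ln(y) = ln(a) + b * x
log_ys = [math.log(y) for y in ys]
n = len(xs)
x_mean = sum(xs) / n
ly_mean = sum(log_ys) / n

b = sum((x - x_mean) * (ly - ly_mean) for x, ly in zip(xs, log_ys)) \
    / sum((x - x_mean) ** 2 for x in xs)
a = math.exp(ly_mean - b * x_mean)

print(f"fitted curve: y ~ {a:.2f} * exp({b:.2f} * x)")
```

    The same trick (transform, then fit linearly) covers power laws via log-log; a stats package would add confidence intervals, but the mechanics are this simple.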



    I have only recently joined a bank after working in more engineering-oriented environments.  One of my first observations when I started to apply statistical tools was the inadequacy of the data – not the volume, which is immense, but the lack of standardisation in what was being measured, i.e. the operational definitions were unclear.  As a result, the immediate opportunities for improvement lie more with the use of Lean tools (e.g., batch reduction, removing queues and achieving flow, identifying and removing waste, etc.).
    Long term, I will be working on developing standards and understanding the drivers of work, so that performance expectations can be set.  This will likely involve tools like DOE to identify the effect of individual drivers, and of various potential interactions (there are many probable ones), on the time it takes to complete an activity.  Once an operational definition based on the content of the work units is established, rather than the current definition, which revolves around a marketing definition, I hope to be able to group operationally similar work units and understand the causes of variation within these groups. At that point I will be in a better position to apply the more statistically based tools.
    My colleagues believe that the nature of the work performed in approving loans does not lend itself to standards, and therefore to statistical analysis – they may be right, but I need to prove it.


    Kim Niles

    Dear CSSBB:
    Good questions; here are some quick thoughts.  DOEs are best applied where continuous variables and non-obvious interactions likely confound a situation.  To that end, they never seem to be applied enough, in my opinion. 
    It’s an ever-present fight to get “process experts” to truly understand the power of DOE and to see where experiments should be run when those experts don’t think they are needed.  With that said, however, transactional processes don’t have as much potential to yield the same benefits as other types of processes, even though I’ve seen truly surprising and successful case studies performed on transactional processes.    
    I hope that helps.
    Kim Niles – http://www.kimniles.com


    Dwayne Gantz

    I would agree that in the vast majority of Six Sigma projects, the tools that provide the highest payback are those cited, such as process mapping, scoping, the project charter, etc.  Process mapping of the inputs, outputs, process, tools, and deliverables provides a good foundation when reviewing the AS-IS environment and looking at what the TO-BE could be.
    The development of metrics during the initial analysis phase of the project is also very important, for the simple fact of producing evidence that the project met the expected results. On several projects, the lack of good, solid metrics did not provide the benchmark or foundation to judge the project’s success.
    I do feel that one area that gets overlooked, but which is vital to a successful project, is communication and change management. Both of these areas can either positively or negatively impact a project. These are areas I concentrate on during a project to ensure that later on I do not need to spend critical resources in “spin control or management.” Effective change management needs to be on the list of items addressed during the project. It is also an area in which the Six Sigma program does not provide a tremendous amount of direction or guidance. However, do not underestimate the value or the time commitment necessary in this area.


The forum ‘General’ is closed to new topics and replies.