iSixSigma

Sea

Forum Replies Created

Viewing 28 posts - 1 through 28 (of 28 total)
  • Author
    Posts
  • #168677

    Sea
    Participant

    In manufacturing the benefits are widely known and accepted, but I have found that in the service and transactional areas there is far more skepticism, and the only way I can get buy-in is by demonstrating the benefits over and over.
    Contrary to an earlier post, I have found that data is much more difficult to get hold of, and one of the first things we have worked on is a set of metrics.

    0
    #61859

    Sea
    Participant

    Bob (5th Feb) provides great advice. In getting the process map together I would also suggest drawing a spaghetti chart of the movement of staff and patients, and then looking at the reasons for the movements.
    In addition, segment the patient types if possible and analyse the time required to manage each type. You could also segment the staff and analyse how much time each type spends with the patient types.
    With all this information (and having walked the floor) there may be very obvious lay-out and preparation / change-over issues that you can address.

    0
    #167834

    Sea
    Participant

    I wouldn’t get too disheartened by shooter’s negative comments; it is perfectly possible to do a VSM for what you are describing and get some great results.  Just one word of warning, though: map a process or group of processes rather than a department. To get the most out of the VSM you must consider what happens end to end, and that will more than likely involve more than just your department.
    First, identify your customer, and remember that this can be an internal customer as well as an external one.  For customer demand you say that you just work with what is received on a day-to-day basis, but have a look at what you have historically had: what is the average workload, the min and max, and is there a pattern based on the day of the week?
    Then identify your suppliers, if there are any; sometimes there are none.
    Then walk through the process and identify bottlenecks, waste, delays, etc.  The most common findings here are delays when handing off to other departments or between teams in the same department, rework loops, and organisational issues such as people searching for documentation.
    You can also get an idea of cycle times, inventory or WIP, right-first-time rates, delays, etc., and from this see whether you can meet your customer’s demand.
    For me the key thing is to accept that there will be variation in your process, because people perform slightly different tasks each time. You also need to think of your people as shared resources if they do not work in the mapped process all of the time; some workers may only have 2 hours per day to work on inspection, for example.  Having said this, don’t let it put you off; you can still get value from it.
    Finally, it is OK to have decision boxes, but try to use them only when the result of the decision triggers a different process stream; otherwise your map will become too complicated and you will lose the big picture.  If you want all the detail about every individual step in the process, plot a flow chart.  One of the biggest successes I have had with this type of map was figuring out that 80% of the work going down the manual process stream could go through the automated process if we made some simple changes to an earlier step in the value stream.
    If you want a reference book, try Value Stream Management for the Lean Office by Tapping and Shuker.

    0
    #167336

    Sea
    Participant

    Another metric I have used is % receivables current.  I like this metric because it is very easy to understand and is not influenced by factors like fluctuating sales.  It is also more controllable for the folks doing the collecting and lets you get some of the low-hanging fruit.  You can have a direct impact on this metric, whereas days sales outstanding needs to be a long-term target, as it relies on changing contracts with customers, not just collecting what is rightfully yours when it is owed.
    % receivables current is simply the percentage of total receivables that are not overdue on any particular day.  I would monitor this at least weekly to get a good picture.
    I work in manufacturing, not service, but we look for % receivables current of over 90%, and our best practice is 94-95%.
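    To make the metric concrete, here is a minimal sketch in Python of how % receivables current could be computed from a day's ledger. The invoice list shape and the figures are made up purely for illustration:

```python
from datetime import date

def pct_receivables_current(invoices, today):
    """Percent of total receivables not yet past their due date.

    `invoices` is a list of (amount, due_date) tuples -- a hypothetical
    shape; adapt to however your ledger exports the data.
    """
    total = sum(amount for amount, _ in invoices)
    current = sum(amount for amount, due in invoices if due >= today)
    return 100.0 * current / total if total else 100.0

invoices = [
    (1000.0, date(2024, 3, 15)),  # not yet due
    (500.0, date(2024, 2, 1)),    # overdue
    (2500.0, date(2024, 3, 1)),   # due today counts as current
]
print(pct_receivables_current(invoices, date(2024, 3, 1)))  # 87.5
```

    Run daily or weekly, this gives the trend line the post suggests monitoring.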

    0
    #57334

    Sea
    Participant

    We have 6-sigma MBB, BB, GB and YB, and for those trained in Lean tools, you guessed it, Lean MBB, BB and GB, or Lean Tool Expert for those using only a single tool, e.g. 5S.

    0
    #158369

    Sea
    Participant

    HRM,
    I have used many of these games and find the cup game particularly useful, especially with manufacturing teams.  However, when training transactional folks I find that they can get ‘switched off’ when faced with manufacturing examples, the common complaint being, “I know all this works in manufacturing, but how can it help me?”  To get around this I designed a game based on the order process, complete with batches, checking, duplication of effort, handovers, etc., and found it very powerful because it was something the guys could relate to.  Maybe you could set up a simulation based on one of your own processes, with additional steps, rework, etc. if required, to make it a bit easier for the group to make improvements.  A friend in the insurance industry uses claim processing, and in a bank I have seen new-customer sign-up used.  It might take a little more time to prepare, but I think it’s worth it.

    0
    #57286

    Sea
    Participant

    The process of reducing stock in a production process normally involves a few things, and I’ll try to give you the relevant terminology.
    1. Map the process using Value Stream Mapping (VSM).
    2. Create flow (smooth movement from one process to the next with no delay; to create flow, processes need to have similar cycle times).
    3. Use pull scheduling rather than push (pushing causes a build-up of inventory between process steps because process 1 will still produce even if process 2 is not running; pull means that process 1 only produces when process 2 has a requirement).
    4. Implement kanbans to manage the material flows between process steps (a kanban is a signal indicating when material inventory needs to be replenished, and by how much).
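    As a hedged sketch of the classic kanban-sizing rule often used in step 4 (demand during the replenishment lead time, plus a safety allowance, divided by the container quantity); the demand, lead time and bin size below are illustrative only:

```python
import math

def kanban_count(daily_demand, lead_time_days, container_size, safety_factor=0.1):
    # Classic sizing rule: number of kanbans = demand during the
    # replenishment lead time, plus a safety buffer, divided by the
    # container (bin) quantity, rounded up to a whole card.
    demand_during_lead_time = daily_demand * lead_time_days
    return math.ceil(demand_during_lead_time * (1 + safety_factor) / container_size)

# e.g. 200 parts/day, 2-day replenishment, bins of 50, 10% safety
print(kanban_count(200, 2, 50))  # 9
```

    The safety factor is a judgement call; higher demand variability or less reliable replenishment argues for a larger buffer.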
    Hope this helps

    0
    #151575

    Sea
    Participant

    If you only have limited resources (as most companies do) I would implement Lean first and, within 12 months, start with 6-sigma.  The reason is that when we started 6-sigma we found that around 50% of the projects identified did not require 6-sigma but Lean techniques.  My company had been doing Lean for a number of years before the 6-sigma rollout, but I think a lot of people forgot this and kept trying to force projects to fit the 6-sigma tools.  Now we have shifted the focus back and I think we are getting much better results.  Projects are now identified using VSM (Lean) and then the right tool is applied, be it Lean or 6-sigma.  If you do not have 6-sigma capabilities yet, you may have to park some of the improvement projects identified until you do, but I think this is better than wasting valuable resources making simple problems complicated and working on projects that ultimately do not add value.

    0
    #149923

    Sea
    Participant

    Whilst I agree that a Mann-Whitney test should be used in this case, I have one question: did you expect one set of data to be normal and the other non-normal?
    I often have trainee BBs and GBs asking similar questions, but when we look at the data we find that the non-normal data set has too few samples, and when more samples are collected it becomes normal; or that there are really two data sets combined in one, and if these are split out we have normal data as well.
    Basically, don’t jump to the conclusion that you need to use non-parametric analysis until you have validated and understood your data.
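    As a sketch of that validation step, assuming Python with SciPy is available (the data here are simulated; note how a small sample can fail a normality check purely by chance):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
group_a = rng.normal(10, 2, 60)   # hypothetical process data
group_b = rng.normal(10, 2, 12)   # small sample: may look non-normal by chance

# Validate normality first (Shapiro-Wilk); a low p-value on a tiny
# sample is weak evidence, so collect more data before concluding.
p_a = stats.shapiro(group_a).pvalue
p_b = stats.shapiro(group_b).pvalue
print(f"normality p-values: A={p_a:.3f}, B={p_b:.3f}")

# Only fall back to the non-parametric Mann-Whitney test if a larger,
# cleaned-up sample still fails the normality check.
p_mw = stats.mannwhitneyu(group_a, group_b, alternative="two-sided").pvalue
print(f"Mann-Whitney p-value: {p_mw:.3f}")
```

    Splitting a suspected mixture into its two underlying groups before testing, as the post suggests, often removes the apparent non-normality entirely.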

    0
    #148188

    Sea
    Participant

    For the film clip, go directly to youtube.com and search for “Lucky Louie why”; the clip will come right up.

    0
    #148132

    Sea
    Participant

    Brilliant, a great introduction. It’ll give the class a break from listening to me, too.

    0
    #147991

    Sea
    Participant

    Hi Eric,
    Unfortunately I can’t see the comic strip.  I did find my way to your site on Geocities, though. Really liked the mad scientist stuff.
    Jules

    0
    #147988

    Sea
    Participant

    Thanks Brit that would be great.
    julisum at netscape.net
    Jules

    0
    #146628

    Sea
    Participant

    You do not have to be an expert in the industry to be successful as a BB or GB.  What you need is to be a great project manager, be open-minded and follow the process.  There will be plenty of other experts in the business; it should be your job to guide them to improve their processes.
    In all industries, business is a series of processes. Some of them work well and are efficient; others contain waste, unnecessary complexity and cost.  Once these have been identified, you can map the process, identify the waste and improve the process.
    Strictly speaking you may not be applying 6-sigma techniques (in fact Lean tools are probably more appropriate), but don’t let that stop you.
    Try reading a book like Lean Six Sigma for Service.  The ideas are pretty generic and can be applied in any industry.

    0
    #144285

    Sea
    Participant

    Do you collect data on your supplier’s quality issues?
    If you do, then you could present this to the supplier and challenge them to improve on it.  If you don’t think they have the capability to improve on their own, then offer your services in the form of coaching on problem solving or other tools.  If you can show them a win-win situation, I don’t think you can lose.  Maybe you can get better quality and they can get reduced costs; start with the data and see what it shows you.

    0
    #144284

    Sea
    Participant

    George,
    No, no, no, no, no.
    Takt time is the rate of demand of the customer; it has nothing to do with your capability to produce at that rate.
    Read EdG’s response; I’d give him an A for it.
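    For the record, the calculation itself is just available working time divided by customer demand; a one-line sketch (the shift length and demand figures are made up):

```python
def takt_time(available_seconds_per_day, customer_demand_per_day):
    # Takt time is set purely by customer demand: available working
    # time divided by the units the customer requires in that time.
    return available_seconds_per_day / customer_demand_per_day

# A 7.5-hour shift (27,000 s) and a demand of 450 units/day
print(takt_time(27000, 450))  # 60.0 seconds per unit
```

    Whether your process can actually produce a unit every 60 seconds is a separate question: that is cycle time, not takt.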

    0
    #58890

    Sea
    Participant

    Purchasing is a great place to practice Lean and 6-sigma.  A colleague who mapped the purchasing function in his organisation found that there were 80 separate steps, with 6 different handoffs between departments and up to 6 levels of approval.  Now tell me there’s no waste or variation in that process.

    0
    #57089

    Sea
    Participant

    In fact it’s more than suitable; it’s perfect.  If your company is anything like the one I used to work for, you’ll already have loads of data being collected, not just on the outputs but on the inputs too. This is a luxury many of us in other industries don’t have: we know what comes out at the end, but not what pressure, temperature or flow rate we used to get it.

    0
    #57088

    Sea
    Participant

    I might be being a bit simple here, but:
    1. Have you control charted your data yet, old versus new, to see what it looks like (Xbar-S chart)? Does it even look like it might be different?
    2. A means test such as ANOVA, plus a test for equal variances, should tell you statistically whether you have changed your process or not.
    Whenever you can, plot your data in an appropriate form and just have a look at it. Try to understand what you have before applying statistics; it saves a lot of time in the end.
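    A minimal sketch of point 2, assuming Python with SciPy; the before/after data are simulated for illustration, and Levene's test is used here as a robust equal-variance check:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
old = rng.normal(100.0, 5.0, 50)   # hypothetical "before" measurements
new = rng.normal(103.0, 5.0, 50)   # hypothetical "after" measurements

# Test for equal variances first (Levene is robust to non-normality).
lev_stat, lev_p = stats.levene(old, new)

# Then a means test; with two groups, one-way ANOVA is equivalent to a t-test.
f_stat, anova_p = stats.f_oneway(old, new)

print(f"equal variances? p = {lev_p:.3f}")
print(f"means differ?    p = {anova_p:.3f}")
```

    As the post says, plot the data first; the tests should confirm what a chart already suggests, not replace looking at it.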

    0
    #143915

    Sea
    Participant

    Jeez, I wouldn’t mind if it were a difficult question. Are you sure you should be studying an MBA?

    0
    #143912

    Sea
    Participant

    A couple of thoughts:
    Have you tried control charting your data? (I would use a p-chart, I think: percent lost calls.)  You could do this for all the data on one chart and then stratify it (split it into individual days and plot all the Mondays on one chart, etc.) to see if there is a difference between days.
    You could also try some sort of regression to see if the number of lost calls is related to the total number of calls in a 15-minute period.  Both are discrete variables, so chi-squared analysis is probably most appropriate, although it may not cope well if you have lots of zeros, so you might have to group the data into 1- or 2-hour slots, for example.
    Finally, you might be able to try a multi-vari study with the inputs day and time slot and the output number of lost calls or total calls in queue; I would do both and then compare the results.
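    As a sketch of the p-chart idea, here is how the centre line and control limits could be computed per 15-minute slot in Python. The call counts are invented, and the limits vary from slot to slot because the number of calls (the subgroup size) differs:

```python
import numpy as np

def p_chart_limits(lost, total):
    """Centre line and 3-sigma limits for proportion of lost calls.

    `lost` and `total` are arrays of lost-call and total-call counts,
    one entry per 15-minute slot (a hypothetical data shape).
    """
    lost, total = np.asarray(lost, float), np.asarray(total, float)
    p_bar = lost.sum() / total.sum()                 # centre line
    sigma = np.sqrt(p_bar * (1 - p_bar) / total)     # varies with subgroup size
    ucl = p_bar + 3 * sigma
    lcl = np.clip(p_bar - 3 * sigma, 0, None)        # a proportion can't go below 0
    return p_bar, lcl, ucl

p_bar, lcl, ucl = p_chart_limits([3, 5, 2, 8], [120, 110, 95, 130])
print(round(p_bar, 4))  # 0.0396
```

    Plot the per-slot proportions against these limits, then stratify by day as suggested above.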

    0
    #57057

    Sea
    Participant

    Create a matrix similar to the cause-and-effect matrix, with your projects listed down the side and business requirements across the top, weighted for importance. Then score each project against these criteria.  Remember you will need a mixed group to score the projects to get a balanced view.  You will then have a clear idea of which projects are likely to give you the biggest business gains, and something to show senior management to demonstrate that you have gone through a robust process to select your projects.
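    A minimal sketch of such a weighted scoring matrix in Python; the criteria, weights, project names and 1-10 scores are invented purely to show the mechanics:

```python
# Hypothetical criteria weights and 1-10 scores from a mixed scoring group
criteria = {"cost saving": 5, "customer impact": 4, "ease": 2}

projects = {
    "Reduce invoice errors": {"cost saving": 7, "customer impact": 8, "ease": 6},
    "Cut changeover time":   {"cost saving": 9, "customer impact": 4, "ease": 3},
}

def weighted_score(scores):
    # Sum of (criterion weight x project score) across all criteria
    return sum(criteria[c] * scores[c] for c in criteria)

# Rank projects from highest to lowest weighted score
for name, scores in sorted(projects.items(), key=lambda kv: -weighted_score(kv[1])):
    print(f"{weighted_score(scores):4d}  {name}")
```

    The ranked list doubles as the evidence of a robust selection process to show senior management.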

    0
    #58856

    Sea
    Participant

    Focus on the Lean tools first, as there is so much low-hanging fruit out there that doesn’t really call for all of the statistical analysis you may be used to in manufacturing.  The main thing I use data for is to prioritise actions and to back up with data the feeling that there is a problem.  It’s amazing how many people on the transactional side of business don’t know what a Pareto chart is.  The other monitoring tool I find useful is control charts; whilst they are happy using spreadsheets, they rarely seem to graph anything, let alone put control limits on it to see what is noise and what is signal.
    From the projects I have done, the most useful tool has been mapping the process, and here look further than the traditional 6-sigma SIPOC map; I often use a swimlane flowchart, which shows complexity and handoffs between departments pretty well.  Then waste-walk the process following the map and highlight the 7 wastes on Post-it notes as you go along; it’s amazing what you will find.
    Hope this helps
    Jules

    0
    #57041

    Sea
    Participant

    The theory is all well and good, but you need to look at this more practically. Do you have an issue? Maybe, maybe not. If all 8 points were above the centre line you probably would, because the variation within your subgroups would have increased.  As all the data points are below the centre line, things may be looking up.  Have you made improvements or changes that would cause the variation within each subgroup to decrease? If yes, then you should expect this change, and it may be time to recalculate your control limits.  If not, are there other reasons this might have occurred: is the measuring equipment working properly, and in the same way as before, etc.?
    The chart is just telling you that something is unusual (roughly a 4 in 1,000 chance for 8 successive points on one side); it’s up to you to look at the practical problem and work out why, and whether it’s an issue or not.
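    For reference, the arithmetic behind that run rule: on a stable process each point independently has a 1-in-2 chance of falling below the centre line, so 8 successive points on one chosen side has probability (1/2)^8, close to the figure quoted above:

```python
# Probability that 8 successive points from a stable (in-control) process
# all fall on one chosen side of the centre line: (1/2) ** 8
p_one_side = 0.5 ** 8
print(f"{p_one_side:.4f}")  # 0.0039 -- about 4 in 1,000
```

    Rare enough to investigate, but as the post says, the chart only flags the oddity; the explanation has to come from the process.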

    0
    #139893

    Sea
    Participant

    Louis,
    You need to take parts from your whole measurement range so that your part-to-part variation is big enough to give you valid results.  This means selecting, say, 6 parts that range from the smallest gap possible to the biggest gap possible.
    Then you need to re-test each part 2 or 3 times (repeatability). Where the parts are cylindrical it is important to make sure they are aligned the same way in the machine.  Can you mark the parts in some way to make sure the orientation is the same each time the same part is measured?
    Sarah,
    You could carry out a Gage R&R using two or more calibrated weights; again, these should cover the range of your measuring equipment or process, and this would be a quick test.  Alternatively, you could carry out the R&R using two or more different products that have different mean weights and run the test as though it were a destructive test.  In this case you assume that each part is identical within its batch, but that you have a large variation between batches.
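    As a rough sketch of the variance split behind a single-operator repeatability study (not a full Gage R&R, which would also need multiple operators to estimate reproducibility); the 6-part, 3-trial measurements below are invented to span the measurement range as advised:

```python
import numpy as np

# Hypothetical study: 6 parts spanning the full gap range, 3 repeat
# measurements each, by a single operator.
measurements = np.array([
    [1.02, 1.01, 1.03],
    [2.11, 2.09, 2.12],
    [3.05, 3.07, 3.04],
    [4.20, 4.18, 4.21],
    [5.15, 5.16, 5.13],
    [6.08, 6.05, 6.07],
])

part_means = measurements.mean(axis=1)
repeat_var = measurements.var(axis=1, ddof=1).mean()        # equipment variation
part_var = max(part_means.var(ddof=1) - repeat_var / 3, 0)  # part-to-part
total_var = repeat_var + part_var

print(f"%R&R (of total variation): {100 * np.sqrt(repeat_var / total_var):.1f}%")
```

    With parts chosen across the whole range, part-to-part variation dominates and the repeatability percentage stays small, which is what a capable gage should show.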

    0
    #68865

    Sea
    Participant

    Hi Niraj –
    Thanks for the offer. My email address:
     
    [email protected]

    0
    #68822

    Sea
    Participant

    Thanks Evan –
    I was more interested in the data entry process with regard to prescriptions.
    Example:
    A DE clerk receives 250 scripts/day and enters them into a computer. Inherent in any process so dependent on humans will be errors in entering the pertinent information. Make sense?
    Jules

    0
    #66051

    Sea
    Participant

    Joe/Ken –

    Thanks for your replies. Based upon my findings and your responses, I believe DPMO is the best way to communicate our baseline metrics. Thanks – Jules

    0