iSixSigma

annon

Forum Replies Created

Viewing 100 posts - 1 through 100 (of 517 total)
  • Author
    Posts
  • #187322

    annon
    Participant

    An amazing grasp of the obvious. 

    0
    #187231

    annon
    Participant

    promote them

    0
    #187230

    annon
    Participant

    The S&S works on an 18 year old kid at boot…try it on an experienced officer or enlisted man and they will make sure you lose in the end…the term ‘fragged’ exists for a reason.
    Might I point out the oh-so-slight difference between a combat military environment and corporate America…A little SA, people. 

    0
    #184091

    annon
    Participant

    Can’t you use the Power and Sample Size function in MTB to accomplish this?
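    For anyone without Minitab handy, here is a minimal sketch of the same calculation in Python with statsmodels; the effect size, power, and alpha below are placeholder assumptions, not figures from this thread:

        # Sample size for a 2-sample t test, solved the way Minitab's
        # Power and Sample Size dialog does (all target values assumed).
        from statsmodels.stats.power import TTestIndPower

        n = TTestIndPower().solve_power(effect_size=0.5,  # Cohen's d (assumed)
                                        power=0.80,
                                        alpha=0.05)
        print(f"required n per group: {n:.1f}")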

    0
    #184028

    annon
    Participant

    As a practical aside, once you have your data collected, you can crank it through both the paired and 2-sample t and see if the two tests offer different “opinions”.  If you reject/fail to reject the null in both tests, you have what you need…and don’t forget to run your power and sample size test (especially if the test “fails”) to quantify its ability to detect a meaningful difference should it occur (ie power).  Good luck.
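    If it helps, a rough sketch of that cross-check in Python with SciPy (the before/after numbers are invented for illustration):

        from scipy import stats

        before = [12.1, 11.8, 12.5, 12.0, 11.9, 12.3, 12.2, 11.7]
        after  = [11.9, 11.6, 12.4, 11.8, 11.8, 12.1, 12.0, 11.5]

        # Paired t uses the within-unit differences; 2-sample t ignores the pairing.
        t_p, p_p = stats.ttest_rel(before, after)
        t_2, p_2 = stats.ttest_ind(before, after)
        print(f"paired t p-value:   {p_p:.4f}")
        print(f"2-sample t p-value: {p_2:.4f}")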

    0
    #182982

    annon
    Participant

    Stan,
    There appear to be days when the pot just doesn’t stir.

    0
    #182979

    annon
    Participant

    Chad Vader,
    Your post, whether it was brilliant, pointless, inspirational, useless, etc., generated four responses. That would be about the average daily traffic for all four adjacent sites to this one. That is the point of a discussion forum, in the business sense: not to sit on the sidelines and force-feed standards. You must do something correctly to attract the traffic.

    0
    #182977

    annon
    Participant

    Chad Vader,
    Pointless but entertaining plays much better than Toby_Mac’s whining.

    0
    #182976

    annon
    Participant

    Annon,
    The double n is me, not you. Get another name.

    0
    #182975

    annon
    Participant

    Chris,
    You appear to be confused. Try again.

    0
    #182952

    annon
    Participant

    Out of a job, just got certified, and you know I won the best blogger award in Miami. Something truly is fishy here.

    0
    #182907

    annon
    Participant

    Chad Vader,
    Friday has become the weirdest post day ever using what measurement device or process?
    In the average day the normal posts ask these ridiculous questions about statistics. Why does anyone with access to the internet need to ask a question like that? There has never been a statistics post on this site that was answered by someone with “Wow, nobody ever thought of that before. We will have to get back to you after we invent some new iteration of frequency observed and frequency expected.”
    There is always a rant. It seems to come from people who want to whine about professionalism because some person who visits this site and is not paid by this site doesn’t have the patience to feign obsequious behavior for the sake of somebody’s Faberge Egg self-esteem, and they have this delicate self-esteem because their Mommy and Daddy chose to negotiate with them about stepping in front of a car instead of something as direct as “Hey, the car will kill you.”
    As far as the college basketball, how long ago did you guys leave college? It is time to move on.
    If you spend any time looking at other Six Sigma sites there is no traffic and that does include the adjacent sites to this one and the blog site. This site was forged in the controversy that happens on this Discussion Forum. It is why people come here and that is why companies advertise here and that is where the money comes from that keeps it in existence.
    The variable monitor issues are too lengthy to discuss. It is vaguely reminiscent of some high level church person dictating rules about birth control. If you don’t play the game then at least take the time to figure out what the rules should be rather than use the rules to impose YOUR “standard” that may or may not be relevant to this little piece of the world, but they won’t know because they are not actually a part of this community and wouldn’t survive on the forum on their own.
    Your post wasn’t useless. It was just pointless.
    What is the difference between ignorance and apathy? I don’t know and I don’t care.

    0
    #182904

    annon
    Participant

    Chris,
    This is what is called checking for understanding, and just possibly a reality check for you. You are responding to a post you made 3 months ago, but you are not responding to it in the same string where you posted 3 months ago. Your original post was certainly Wurlitzer Prize winning writing. Then you list every feature of the program and pitch the statistics book as a great reference even for people who are too stupid to understand the course; as if the world needs one more statistics book or one more Six Sigma book, unless you want narcolepsy to become a worldwide epidemic. After all this non-sales pitch you just happen to mention the double hit rate on your CV, which could allow someone to believe you don’t have a job yet. Just as a side comment, the reason a person sends out a CV is to secure a job, not so you can do a time series analysis on the hit rate. We are assuming you did cover time series analysis in your 10 weeks of course work, reading 2-3 chapters per week, with a project that took 40+ hours and a project report that was 33 pages long. Please tell me you just read that and that even you understand how contrived it sounds.
    It is difficult to understand the common belief that a Six Sigma project must take six months, but it is almost as unbelievable that you did one in 40 hours unless it was a very large + or was in reality a just-do-it type project.
    It is truly difficult to understand why anyone would believe you are pitching these guys. There seem to be a lot of posts on this site where people do all the schizophrenic things that you are doing. You claim you answered your own post! They have meds for that type of behavior.
    You are a sales person, pure and simple. The worst part is you think people are too stupid to understand that. It also seems a little fortuitous that this post goes up after A_eta took so much bad publicity on here a couple weeks ago. It should actually cause people to be more cautious of A_eta, since they obviously feel they need to spin this after they took such a beating.
    What an odd coincidence.
     

    0
    #182903

    annon
    Participant

    Chris II,
    What an enlightened response. That has been such a valuable contribution to the Discussion Forum, I am sure you have touched people deeply and changed their lives forever. We are sitting on the edge of our seats waiting breathlessly for more of your wisdom.

    0
    #182822

    annon
    Participant

    Toby_Mac,
    No? No!!! You have refused my suggestion and refuse to call me by my new name. I just don’t think that is very professional, and I have struggled so hard to understand, well…. Actually it doesn’t bother me at all, and I do get a belly laugh every now and then, but I am just taking up for all those little people that aren’t as tough as I am, you know.
    Maybe you can send over Darth when he is done counseling you and he can help me repair my self esteem. Please ask him to get a new box of tissues, because after your session there can’t be much left in the box. You have just been so harsh.
     

    0
    #182821

    annon
    Participant

    Suresh B S,
    A pleasure.

    0
    #182820

    annon
    Participant

    Stan,
    That was my thinking as well. Under the cover of daylight.

    0
    #182811

    annon
    Participant

    Toby_Mac,
    I trust you will be attending English class as well. What is a “trully”?

    0
    #182807

    annon
    Participant

    Stan,
    What do you think of that one?

    0
    #180013

    annon
    Participant

    Hi Robert,
    So based on the included quotation, why go through the calculation for std dev…why not simply use the range when dealing with n=2?  What value is gained in dividing Rbar by the constant?  Thanks!

    0
    #179591

    annon
    Participant

    I would offer that perhaps the real benefit to this line of thinking would be to investigate how the LSS efforts have impacted the organization’s strategic planning.  A Hoshin approach would be a very effective way to answer your original question about LSS objectives and how they might be measured or assessed against critical business processes and daily management activities.  Once you have this framework in place, then it is a rather simple matter to determine which Objective > Metric > Goal > Strategy sequence will meet your needs.  Best of luck.

    0
    #179081

    annon
    Participant

    Too much free time…just use stable and capable and avoid the confusion.  The former deals with consistency and the latter with meeting an expectation.

    0
    #177383

    annon
    Participant

    A couple random thoughts for you:

    Running a basic Assessment and Selection Matrix on training candidates and project efforts should reveal such constraints / obstacles in the future (ie “Are all GB candidates committed and calendared?”)
    The need for high skill sets due to operational precision & complexity is holding your CI efforts hostage.  You can move away from the traditional project format and embrace short Kaizen / Work Out Events coupled with individual improvements at the point of use (ie a poka yoke program). 
    Regardless, efforts to reduce process sensitivity by eliminating the need for complexity and precision (and thus, high skill sets) would be good areas to begin your effort.

    0
    #177218

    annon
    Participant

    Good day
    Being relatively new to the Six Sigma world and this discussion forum, I am horrified by all your sarcastic responses to this query!  Remember that we all started at the bottom somewhere and if you can’t respond in earnest to assist someone, then what’s the point of this forum??

    0
    #177199

    annon
    Participant

    Just spit-ballin’ here, but here are some ideas:   “Super fit 18 sigma” instead of LSS….or perhaps “Buns….and Cpks….of Steel”…..
     

    0
    #176776

    annon
    Participant

    As you would expect, it is simply an optimization of the three project management elements:  time, resources, and scope.  You simply adjust this combination to properly bracket your Kaizen expectations. 
    And Kaizens are simply 1-5 day events used to carry out a particular objective:  they can be used for a variety of exercises, from a 5S event, to cell design, to material handling systems….or other non-LSS objectives.
    Don’t be fooled, there are mock-ups and debugging events that usually last 2-6 weeks, as well as 30-60-90 day action plans that are often required to bring large-scale changes to fruition.  Not to mention the planning – which can take up to 3 months or more – and the inventory build-ups necessary to offset the downtime (if manufacturing-based).
    Also, when the risk of failure is acceptable (ie a transactional process that is easily manipulated), a rapid timeline for the event is usually encouraged, with team members working 12-14 hour days so that maintenance can make the necessary physical layout changes in the A.M.

    0
    #176369

    annon
    Participant

    How about first defining what it is you are trying to accomplish?  How is the restaurant going to combine its own unique combination of assets to create a competitive edge in the local marketplace?  Now develop a relevant system of performance measures that allow you to deploy said strategy…..now you are in a position to start assessing and selecting project ideas….this is a strategy deployment issue on a small scale…look into hoshin planning and balanced scorecarding….LSS comes after that….my two cents.

    0
    #176368

    annon
    Participant

    “you are a tool in search of an application”….I get it….I will so be using that one in the future….very nice!

    0
    #176244

    annon
    Participant

    Look under the Tool and Template heading on the big blue bar to your left under control charts; it will give you good information. Points to note:

    Chart type depends on data type and sample size constancy
    Number of samples (k) should be around 25 or more
    There is no hard rule on sample size (n)…depends on several factors
    The key is to minimize the variation within samples in order to see the variation between them….this usually entails small samples using consecutive observations
    Choose the chart based on data type and what makes sense relative to the characteristics of your data set
    Good luck.

    0
    #176229

    annon
    Participant

    I would make the argument that your choice should absolutely be driven by the needs of the business – first and foremost.  The other posters are certainly correct that buy-in is essential (Q x A = E), but that is often the reason itself for choosing poor performers….it is here that you will often find the greatest support for your effort.
    And instability is to be expected in poor performers (by definition) and is often remedied through simple process management and standardization.  In fact, you will often make your biggest gains in the least amount of time within this environment, as there is no need to dive too deep into the LSS toolbox…..A simple application of the 7 Quality Tools can work wonders here.
    Find the plant that is causing management the greatest heartburn (eg by pareto-ing profitability).  Start there and prove the skill set. 
    Good luck.

    0
    #176178

    annon
    Participant

    Does it say anything in there about thin-skinned little pinheads inserting themselves into the conversations of others?  I didn’t see it…….

    0
    #176176

    annon
    Participant

    Professional?  How long you been tunin’ in?
     

    0
    #176172

    annon
    Participant

    Apology accepted, and I am glad I was able to aid in lowering the collective bar….”Setting low standards and failing to achieve them”….it’s what I do.

    0
    #176170

    annon
    Participant

    I beg to differ….I believe the reputations of Mr Plackett, Mr Burman, and myself were all equally besmirched, although I am pretty sure you brought my family tree into question at some point……

    0
    #176137

    annon
    Participant

    Start by following the money
    Actively choose the right champion, don’t allow the wrong one to be bestowed upon you
    Determine how they are bonused….focus there.
    Take ownership in teaching them how to write a decent charter
    Good luck.
     

    0
    #176111

    annon
    Participant

    Search under Project Assessment and Selection
    Initially, use a weighted decision matrix to rack and stack ideas
    The sponsor/champion is the one who should provide the project idea, at least in rough form.  If they have no presence in assessment and selection, be verrrrry afraid.
    You can always use multivoting, Affinity Diagrams, C&E Diagrams & Matrix, Brainstorming with the 100 dollar exercise, IRD, FMEA, etc to rack and stack the ideas (google what you don’t know).
    Good luck.

    0
    #176096

    annon
    Participant

    If you have gone through ANALYZE, have you not already identified the magnitude and direction of the effects that your causal inputs have on your project Y?  It sounds as if your analysis might be less than complete….
    For example, even if you have a potential KPIV that you can’t measure / control, can you not identify the usefulness of the rest of the model (ie regression model using Rsq, S, Mallows, etc) or block on the uncontrollable factor and run a simple DOE to determine its effect?  If the Rsq explains a significant portion of your variability or the blocking factor proves insignificant, you have your answer….Just my 2 cents.

    0
    #175463

    annon
    Participant

    The more capable your process (and I would call a process with a Cpk of 10 extremely capable), the less SPC is required.  What is a sufficient level of control will be a business decision:  what is the cost of SPC efforts vs the result of a shift in your process mean?  Go from there.

    0
    #175423

    annon
    Participant

    I don’t care.

    0
    #175417

    annon
    Participant

    Don’t know a thing about you, your organization, or your product…..but the fact that you would:

    Get goaded into a sophomoric rant
    In a public forum
    That caters specifically to your client base
    After you have revealed your professional affiliation with OSSS
    speaks volumes about your organization.  Yikes!

    0
    #175416

    annon
    Participant

    Inspection is useful only as a means to assess process capability and provide actionable data back to production for root cause analysis and solution.  As a means of controlling the quality of your process, it is always a bad idea in any form other than as a short-term containment strategy.

    0
    #175415

    annon
    Participant

    Jsev607 is correct:  your process will be in only one of four states – stable & capable, unstable & capable, stable & incapable, or unstable & incapable.  A capability assessment of unstable data means you are looking at data that is unpredictable….which means the data you are using in your capability assessment will change….hence, your capability measures will change.  That is why LSS teaches that you prove stability first, capability second.

    0
    #174754

    annon
    Participant

    Duly noted.  Thanks for the correction.

    0
    #174747

    annon
    Participant

    What would you suggest as an alternative method of achieving the stated goal?

    0
    #174743

    annon
    Participant

    Bob H is on point….Once you have your data, run a 4S (for continuous data) – Stability, Shape, Spread, Senter….this will tell you what you need to know when selecting the proper capability metric for baselining your process….So control chart/run chart, histogram/Anderson-Darling test, variance/std dev and mean (you can pull these off your xbar charts)…..
    Good luck.
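    A minimal sketch of those 4S checks in Python rather than Minitab (the data array is a placeholder; stability still needs the points plotted on a run/control chart in time order):

        import numpy as np
        from scipy import stats

        data = np.random.default_rng(1).normal(10.0, 0.5, 50)  # placeholder data

        # Shape: Anderson-Darling test against the normal distribution
        print("A-D statistic:", round(stats.anderson(data, dist='norm').statistic, 3))
        # Spread and center
        print("std dev:", round(data.std(ddof=1), 3), " mean:", round(data.mean(), 3))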

    0
    #174742

    annon
    Participant

    IBV,
    You are dealing with $$, a continuous measure, so you can use XbarR, XbarS, or IMR charts (among others).  My two cents…
    I would use an IMR chart, taking a daily sample Mon-Fri (n=1).  I would go back and test the historical data for normality and ensure I preserved the time order.  If approximate normality is not present, then subgroup your data (ie daily average vs. daily sum, weekly vs. daily) and see if that moves you closer to normality.  
    Or, you could simply manage to the chart knowing you have a non-normal parent population and use extra caution when investigating special causes….CC is not an exact science…it is meant to be practically interpreted in some cases…..Good luck.
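    A hedged sketch of those I-MR limits in Python (the daily dollar figures are invented; run the normality check on your own history first):

        import numpy as np
        from scipy import stats

        daily = np.array([410.0, 395.5, 420.1, 388.7, 402.3,
                          415.8, 399.2, 407.6, 393.4, 411.9])  # assumed $ values

        print("Shapiro-Wilk p =", round(stats.shapiro(daily)[1], 3))  # normality check

        # Individuals chart limits from the average moving range (d2 = 1.128 for n=2)
        mr = np.abs(np.diff(daily))
        sigma_hat = mr.mean() / 1.128
        print("I-chart limits:", daily.mean() - 3*sigma_hat, daily.mean() + 3*sigma_hat)
        print("MR-chart UCL:", 3.267 * mr.mean())  # D4 = 3.267 for n=2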

    0
    #174739

    annon
    Participant

    Having done work with USPS and Canada Post, I would say that your deployment challenges are no different than you would find in any large governmental bureaucracy. 
    I would recommend spending a great deal of time on Strategy Deployment (ie Hoshin Planning) prior to the LSS effort to ensure leverageability and organizational alignment.  LSS is a process-level skill set that requires a system-level framework if you are going to leverage the results effectively.  Plan for that.
    Infrastructure is everything.  Tools and techniques are easy.  Focus on establishing cascading Steering Committees for assessment/selection of project ideas and training needs – focus these efforts around the critical value streams of the key SBU / processing facility….talk the language of money first, projects later.
    You have to find a way to get the HQ thoughts implemented at the plant level….you will find HQ owns the product (eg Express Mail) but not the process (eg plant ops).  You have to fill the local / regional project pipelines through a robust assessment and selection process which focuses on:  Can we affect change at the plant level?  Can we implement it?  Can we leverage it?  Does it hit the “P&L”?….these should be your #1 questions on the A&S
    Good champions are critical…they have to be trained and really buy in…again, determine the business needs (ie pareto out the product families and their key result areas and run through a Balanced Scorecard with leadership) and bring HR into the discussion regarding the reward and recognition systems of the champions and area VPs…if you don’t get them invested somehow, you’re done…insist project charters come from the champions themselves…it’s tough…train them well here…LSS cannot overcome bad project management, which starts with:  an actionable charter, a Gantt Chart (ie Work Breakdown Structure), and timeline management…..Make them do it….bad charters from uninspired / uninitiated / incompetent project champions will kill you.
    Etc, etc, etc…..the cultural change is a total pain in the ass….give them what they want right now…and what they need as soon as you can…..good luck.

    0
    #173653

    annon
    Participant

    Cost certainly, but not necessarily measured in financial terms.  There are many instances where financial metrics won’t drive the decision-making process of management.  But certainly everything has a cost consideration and should be baselined and considered in your assessment and selection process, if not a financial price.

    0
    #173648

    annon
    Participant

    There are many ways to set SMART goals….

    VOC / HOQ
    Kano
    Rules of Thumb
        Lean:  OEE – 90/95/99
        6S:  percent improvement based on current capability (eg a 2.0 sigma process v. a 3.5 sigma process)
    Subject Matter Expert opinions
    Data
        Lines of clear demarcation:  where does customer satisfaction, sales, performance, etc really fall off?
        Financial:  what is the min difference required for success? (eg 2:1 ROI)
    Etc

    Of course, the customer is always king…determine the process, its output, and who consumes it – this is your customer, and here is the source for goal determination – Charter > SIPOC > VOC/HOQ/KANO/Data > Goal Statement
    Good luck.

    0
    #173647

    annon
    Participant

    SSG,
    I wouldn’t do it.  It sets a bad precedent for your organization, waters down the training experience for the GB, and could undermine the reputation of your accreditations.
    Then again, if what you are talking about are large cross-functional projects (ie BB or MBB level) that justify having multiple GBs assigned, then obviously, there is nothing wrong with that approach.  
    Why not grant them recognition for completing training while simultaneously working with your leadership to develop the infrastructure that will fill your project pipeline?  You know the drill here – Project Assessment and Selection, HR involvement to include LSS elements in annual employee reviews, robust Champion Training, Leveraging Committees, etc.
    Good luck.

    0
    #173275

    annon
    Participant

    What is the question?

    0
    #173206

    annon
    Participant

    Copy that.  Thanks for the info.

    0
    #173202

    annon
    Participant

    Thanks, but do I really need to be reprimanded here?  Why not save it for the usual stat student looking for answers to their homework questions….How anal retentive can one Forum be….

    0
    #173163

    annon
    Participant

    Just think of it as the actual risk of being wrong, or the probability of saying something is when it isn’t (ie normal, significant, different, etc).  Good luck

    0
    #173162

    annon
    Participant

    It appears as if you should look again at your Product Families, redefining your Product Family Matrix based not only on the use of similar resources, but on similar amounts and with similar variability.  Product groups with highly variable demand and/or cycle times should be broken out and addressed accordingly.  Start here.  Then you can look at Takt Time issues and how to adjust based on the nature of the product:  Can you build to ship or to inventory?  If the former, then you flex the workforce or use OT…If the latter, then you can use pull systems (ie supermarkets) and reduced batch sizes to dampen the effects of variability…etc.  Good luck.

    0
    #172718

    annon
    Participant

    Any loss of availability that was not planned

    0
    #172695

    annon
    Participant

    Identifying your VSs using your priority matrix and spaghetti diagrams and then prioritizing them based on financial data (ie customers, volume, revenue, profits, cost structure, etc) is not a bad place to start.  You can do this individually, quietly, and work with management in understanding the COQ issues in their own language ($$) before you break out the LSS voodoo.
    Good luck.

    0
    #172268

    annon
    Participant

    Dude,
    Summarize for crying out loud, geez, you might as well write a book…a big one…real thick….oh wait…..Breyfogle, Breyfogle…..oh, that’s right…..you did….I read it…nice job.

    0
    #172267

    annon
    Participant

    V,

    PCE = Value Added Time / Total Lead Time
    Value Added Time is not Cycle Time

    VAT is the sum of all value-adding activities.  These are elapsed measures…you can add them
    Cycle Time is a rate measure (ie a ratio) which you cannot add.  It is equivalent to Processing Time when you have a single operator; otherwise, they will have different values.
    Total Lead Time is the sum of your processing time (again, an elapsed measure which you can add) and your inventory levels (see Little’s Law)
    So you will want to include all work elements and their times, determine which work elements are VA, and then calculate your PCE.
    Good luck.
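    A worked toy example of that arithmetic (all times below are invented):

        # Three VA work elements plus one NVA element, in minutes
        va_times = [4.0, 2.5, 6.0]
        processing_time = 4.0 + 2.5 + 6.0 + 3.5       # every work element, VA and NVA
        wip_wait = 104.0                              # queue time via Little's Law

        total_lead_time = processing_time + wip_wait  # 120 minutes
        pce = sum(va_times) / total_lead_time
        print(f"PCE = {pce:.1%}")                     # 12.5 / 120, about 10%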

    0
    #172206

    annon
    Participant

    Y,

    What are you trying to accomplish?  I am confused on the tolerances…you are meeting the one-sided spec for the shaft but not the spec for the housing?
    Your sample size depends on a variety of factors, and destructive testing is a field unto itself, which I would encourage you to research.  But speaking from 30K feet, if you are simply looking at binary data (did the piece survive the test or not?) then you should look at MTB and the elements that should be considered for the type of analysis you are trying to run and the data type you are using….The sample size will depend on multiple factors, including type of test, level of confidence, delta, etc…..good luck.  

    0
    #172170

    annon
    Participant

    You can treat it as discrete, but six discrete data points are not going to tell you much…what are you trying to accomplish with the data?

    0
    #172169

    annon
    Participant

    James,
    Why?  What is the reason?  What do you want to do with your data after transforming them?
     

    0
    #172115

    annon
    Participant

    Your numbers appear to be inverted….unless I am just having one of those days….it is unusual to have Cp < Cpk. 
    Your process target is determined using a variety of considerations.  Particularly, what is the nature of the problem based on what the capability study has shown? Do you need to center the process?  Reduce variation?  Both?  What does the process ask of you in this regard?  Where is the sweet spot for capability and cost? etc. 
    The capability study is a good place to start but will not alone tell you where you should target your process. 
    Good luck.

    0
    #57560

    annon
    Participant

    2 cents here….begin with strategy deployment, then process management (ie stabilization), then lean (VSM in particular), and then six sigma or lean six sigma (LSS) if properly done.  Good luck.

    0
    #171904

    annon
    Participant

    K,

    What do you want the data to tell you?  You need to know that first.
    You may treat your data as attribute (ie proportion / rate) since you are dealing with binary responses (ie you only have two responses: child/not child). 
    You are using attribute data, which among other things tells you that you are going to need larger sample sizes for a decent level of precision…use the Power and Sample Size function to illuminate your options there
    Non-parametrics are used when your data set does not approximate normal but that assumption is required for your test…again, what are you trying to determine?  This will give you your analytical options, and then you can determine if non-parametrics are applicable….in your case, with so few data points, your binary data will likely plot non-normal
    Summary:  Determine what you want the data to tell you, choose an analytical/graphical method, run a power and sample size determination, and run the analysis. 
    Truth:  In all likelihood, you should start at the beginning.  Put together your MSA and Data Collection Plan to include such things as sample plan, power, method of analysis, critical questions to answer, graphical displays/assessments, etc.  You have small, binary, happenstance data sets….not your best option.  Good luck.
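    For the attribute route, a minimal sketch of that Power and Sample Size step in Python with statsmodels (the two proportions and power target are assumptions for illustration):

        from statsmodels.stats.power import NormalIndPower
        from statsmodels.stats.proportion import proportion_effectsize

        # n per group to detect a shift from 10% to 20% "child" responses
        es = proportion_effectsize(0.20, 0.10)
        n = NormalIndPower().solve_power(effect_size=es, power=0.80, alpha=0.05)
        print(f"~{n:.0f} observations per group")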

    0
    #171784

    annon
    Participant

    Cpk and Ppk are both estimates of process capability.  Capability is essentially the proportion between the variability allowed by the customer (spec limits) and the variability in your process (std dev), and since the spec limits are given to us by our customer, the only thing we can change in this context is the type of std dev used in the calculation.
    Cpk uses short term estimates, those that are said to be devoid of special cause.  Since our process is rarely devoid of special cause, it is often useful to use the long term std dev, which is gathered over a longer period of time and includes both between and within variability estimates…it relates more to actual process performance.
    Cpk is something of an idealized state, while Ppk is more representative of what your customer is experiencing; it is their reality.
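    A hedged sketch of the two calculations (the subgrouped data and spec limits are invented for illustration):

        import numpy as np

        usl, lsl = 10.6, 9.4
        x = np.random.default_rng(7).normal(10.0, 0.15, (25, 5))  # 25 subgroups of 5

        mean = x.mean()
        sigma_st = (x.max(axis=1) - x.min(axis=1)).mean() / 2.326  # Rbar/d2, d2(n=5)
        sigma_lt = x.std(ddof=1)                                   # overall std dev

        cpk = min(usl - mean, mean - lsl) / (3 * sigma_st)  # short-term capability
        ppk = min(usl - mean, mean - lsl) / (3 * sigma_lt)  # long-term performance
        print(f"Cpk = {cpk:.2f}, Ppk = {ppk:.2f}")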

    0
    #61971

    annon
    Participant

    I would suggest mapping your processes first…go fast to go slow….focus on basic process management before you jump into heavier skill sets that might not be required.
    This is simple to do and will identify each of your core processes and their key elements which will provide you with not only the key process inputs and their specifications (your leading indicators) but the key process outputs (your lagging indicators). 
    If these lagging indicators do not directly link to the organizational goals and objectives, consider revising them using a Balanced Scorecard approach.
    Now you can work to optimize and control the former in order to ensure the quality of the latter.  Using any basic mapping techniques should suffice, although a Value Stream Map might be more useful when looking at time and cost elements (patient processing times, overall lead times, wasteful practices, etc).  
    Good luck.

    0
    #61970

    annon
    Participant

    Hire an MBB who knows what they are doing to help you plan your rollout.  PowerPoint presentations and on-line research will not provide you with the level of knowledge required to be successful with your deployment.
    Good luck.

    0
    #61969

    annon
    Participant

    I would start by reviewing the control plan in the original CI effort and determine if it has been followed and the appropriate metrics monitored and controlled.  If none is in place, I would go back to root cause analysis before implementing someone’s action/control plan that is not specific to your investigative efforts.  Good luck.

    0
    #61968

    annon
    Participant

    You can come out of pocket ($2,500-5,000) for green belt or black belt training, or you can begin a new job search and target those organizations within your industry that have internal programs and/or have embraced the skill set.
    Good luck.

    0
    #171501

    annon
    Participant

    Tests of means are largely robust to this assumption of normality.  Don’t be overly concerned with it.  Plot the data, and as long as the distribution is not overtly skewed or bimodal, you should be fine running either a one-sample Z (for larger sample sizes or when the historical std dev is known) or a one-sample t (generally used when dealing with smaller data sets or where there is no historical estimate for std dev). 
    Good luck.
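    A minimal sketch of both tests in Python (the sample, the target mean of 50, and the historical sigma of 0.6 are all assumptions; SciPy has no built-in one-sample Z, so that branch is computed by hand):

        import numpy as np
        from scipy import stats

        x = np.array([49.2, 50.8, 50.1, 49.7, 51.0, 50.4, 49.9, 50.6])

        # One-sample t: no historical estimate of std dev
        t, p_t = stats.ttest_1samp(x, popmean=50.0)
        print(f"one-sample t: p = {p_t:.3f}")

        # One-sample Z: historical std dev assumed known (0.6)
        z = (x.mean() - 50.0) / (0.6 / np.sqrt(len(x)))
        print(f"one-sample Z: p = {2 * stats.norm.sf(abs(z)):.3f}")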

    0
    #171237

    annon
    Participant

    RB,
    Just wanted to thank you for continuing to post.  I continue to learn a great deal from your insights.  Thanks for the knowledge sharing.

    0
    #171234

    annon
    Participant

    Paul,
    This appears to be a service level issue and might become clearer if you tighten up your VOC data.  For example, who is your customer and how are you defining “consistency, resourcefulness, timeliness, etc”?
    If your services are meeting expectations (customer opinion) but are accomplished inefficiently (management opinion), then you will have to balance maintaining current service levels while improving productivities.  In either case, it appears that you will be dealing with variability and time issues.  Once you gain clarity on your VOC/operational definitions, you should be able to better pinpoint your metrics.  Good luck.

    0
    #168758

    annon
    Participant

    Run a search on OEE here….there are several good posts for calculations…if not, reply again and I will send you something.

    0
    #168649

    annon
    Participant

    I would advise that DMAIC is a tool kit used to make changes to an existing process within its existing boundaries.  When you have a problem from a repeatable process that is not critically urgent and where incremental change is acceptable, then DMAIC would be a good fit. 
    DFSS is useful when the process/product is nonexistent or so badly flawed that it must be redesigned or simply created for the first time. 
    They both use many of the same tools and techniques and both rely on the VOC as the driving force behind the effort. 

    0
    #168628

    annon
    Participant

    M,
    As you move through MEASURE, you should be tightening the scope of your effort, and thus moving from a high-level SIPOC to a more detailed VSM or PMAP, where such details as decision diamonds, rework loops, fall out, bottlenecks, non-value/value adding steps, etc are identified and included.  If using a LSS approach, you might then take this document forward to aid in either a root cause analysis or a process flow analysis, or perhaps both.  Either way, sticking with the SIPOC at the DEFINE phase is appropriate, but as always, use the tool or technique that delivers the phase objectives while satisfying the group’s wants and opinions.  Balancing your technical strategy (ie how you generate a quality solution) with your cultural strategy (ie the group’s acceptance of the process and said solution) is your challenge.

    0
    #168331

    annon
    Participant

    Jen,
    I am tracking now, thanks. 

    0
    #168315

    annon
    Participant

    Jen,
    Thanks!  So if I understand, this lack-of-fit entry in the ANOVA is simply determining if the predictive model can adequately approximate the true value for the predicted Y, and uses a null of  ‘good fit’.
    I would assume this ANOVA entry is used in conjunction with the residual analysis (in particular the residual v. fitted value plot) to determine how the model is lacking (assuming p-value < alpha).  For example, requiring additional 1st order terms, 2nd order or quadratic terms, transformation, etc.
    Is that a correct statement?

    0
    #168207

    annon
    Participant

    Amisha,
    If I understand your question, you are somehow trying to derive customer requirements based on prior yield rates.  There is no relationship between what the customer wants (USL-LSL) and what the process is capable of producing (% defects). 
    Your challenge is, through some means – passive, active, or both – to determine the requirements for each market segment (if each consumes a distinct product line) and then translate this into operational specs (USL-LSL). 
    Probably not a bad idea to pareto out your markets based on some financial criteria (stability, volume, margin, cost/unit, ROI, etc) and then cross-reference it with your paretoed % defects.  Start with the big bars.  Good luck.

    0
    #168064

    annon
    Participant

    It does..Thanks again.

    0
    #168063

    annon
    Participant

    Thanks Rome & Robert….gotta go get my floaties on….I’ll let you guys swim in the deep end.  Thanks again!

    0
    #168056

    annon
    Participant

    Can you use any form of parametric hypothesis testing (eg paired t, 2-sample t/z, ANOVA, etc) when dealing with discrete data?  Thought this required continuous data formats……

    0
    #168052

    annon
    Participant

    Not sure what tool you are referring to, but in the case of using ANOVA, your null would be one of ‘there is not enough evidence to say X has a statistically significant effect on Y’ or something to that effect.  Note, statistically is the operative word here…something can be stat. sig. and practically nonsignificant. 
    As to your “the R calculated value would exceed the R critical value” comment, that is news to me….Your R and R sq are not associated with p-values; R sq is simply the squared Pearson correlation coefficient (if memory serves).  It should be used in conjunction with proper graphical analysis to determine how best to interpret your model.
    Good luck and verify.
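    A quick numerical check of that point (for simple linear regression, R-sq is just the squared Pearson r; toy data assumed):

        import numpy as np

        x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
        y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

        r = np.corrcoef(x, y)[0, 1]          # Pearson correlation coefficient
        print(f"r = {r:.4f}, R-sq = {r**2:.4f}")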

    0
    #167951

    annon
    Participant

    Again, ‘Learning to See’ should walk you through it in a fairly straightforward manner….I would only suggest that you approach this VSM event systematically, with an understanding of WHY what you do today (ie reduce cycle times via a VSM event) should be effective tomorrow (ie why do you think reducing cycle time and reducing non-value adding steps in this process is going to improve a customer-driven performance metric?).  I have been a part of these events that achieve local results with little global effect….Good luck.  

    0
    #167878

    annon
    Participant

    What are you trying to accomplish? Your uncertainty comes from not knowing the answer to this question.  The VSM or PMAP or any other systemic diagramming tool is just that…a tool…you first have to know what success is going to look like when your efforts are completed.
    Ask first for the charter, then choose your tools from there. 

    0
    #167785

    annon
    Participant

    My experience would lead me to say that not creating demand for the skill set is the primary manifestation of no ‘top down support’.  When process owners provide robust, actionable project charters and a BQC assesses and assigns a solid change agent with appropriate resources, that is the bulk of what I would require of management.
    Giving responsibility without authority is an indication of poor leadership and not an uncommon mistake.  

    0
    #167779

    annon
    Participant

    The point is to capture both, using both internal failures (those occurring before shipment/sending) and external failures (those occurring after shipment) to assess the quality of the process output.  In this case, use DPMO (or better still, DPU) to baseline and assess internal quality efforts, as well as a yield measure for external assessments.
    As to process capability measures, using a DPMO or DPU assessment avoids the requirement of normality and would be appropriate here, noting DPU is often easier to calculate and explain, as well as being more sensitive to change….it is also a snap to calculate RTY (ie RTY = e^-DPU) from it if desired.
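    A worked toy example of the DPU-to-RTY relationship (the defect and unit counts are invented):

        import math

        defects, units = 120, 1000
        dpu = defects / units        # 0.12 defects per unit
        rty = math.exp(-dpu)         # rolled throughput yield = e^-DPU
        print(f"DPU = {dpu:.2f}, RTY = {rty:.1%}")   # about 88.7%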

    0
    #167777

    annon
    Participant

    This has all the makings of a poorly led implementation, and no, it is not your fault.  I applaud your efforts, but from your post it appears that you have multiple barriers to success.  Here are a couple of ideas:

    Regroup, simplify, and put 6S on the back burner.  6S requires infrastructure, practices, and tools and techniques…focusing on the tools and techniques without first having the other two principles in place will not sustain long term success.
    Insist on having a strong Strategic Planning Process in place from the C level.  Your project efforts depend on this process being robust and market focused.  If your strategic business planning is poor, so will your quality effort…uh…be.  
    Project ideas come from two directions, top down and bottom up, with the optimal situation being a Business Quality Council of some kind driving the process, with process owners being incented to provide valid project charters (bottom up) to be assessed and selected by the BQC to ensure proper strategic alignment and prevent suboptimizing locally (top down).
    Use the Malcolm Baldrige “Are we making progress?” self assessment along with COQ audits as agents of change for upper management. 
    Focus on simple skill sets that are easy to teach and easy to implement.  You can do a ton with good project management techniques, solid process management, and Deming’s broad base of teachings (I especially find his Theory of Profound Knowledge useful)….wait on the sigma until you have 12-24 months of CI traction…no skill set can overcome poor project and process management.
    Always, always, always….talk to management in their language…MONEY…hence the COQ, and especially the COPQ, should be a useful mechanism for change….include ACCT in your calculations.
    And note, this is not an exclusive list….I would recommend getting a good change agent experienced in roll outs.  This is probably your only real shot at success.
    Good luck. 

    0
    #167323

    annon
    Participant

    Any predictive or inferential method that you wish to use implies that stability is first present in your process.  This stability simply means that the process generates a known output….that it is predictable…it is this predictability that allows you to have ‘confidence’ in your descriptors that will be used in your analytics (cap studies, regression, DOE, Hypothesis testing, etc).
    So yes, an unstable process would first need to be ‘debugged’ of special cause before baselining the capability.  Have you tried using the appropriate control chart over time to test for stability?

    0
    #167166

    annon
    Participant

    No skill set survives bad process or project management……and anytime you can gather sufficient data for a metric and its std dev, you can consider using a 6S approach….

    0
    #166209

    annon
    Participant

    Are you using happenstance or experimentally derived data?
     

    0
    #166178

    annon
    Participant

    The only reason you should consider normality as a critical characteristic for your data set is if it is a required assumption for your intended analysis, and even then you should consider how sensitive your test is to a violation….detectability and sensitivity are two different things.
    What critical question(s) are you trying to answer and what type of analysis are you considering that would provide those answers?

    0
    #166164

    annon
    Participant

    With over 14000 surveys (or was it questions?) to go through, I would give some thought to using a team-based approach to help reduce the workload and begin building consensus. 
    For example, segmenting the surveys according to areas of control (ie product line, dept, function, region, etc) and applying a WorkOut format in conjunction with, say, an IRD exercise might be a useful way to start.
    Good luck.

    0
    #166155

    annon
    Participant

    If possible, collecting your data in a continuous format will allow you much smaller sample sizes and provide a straightforward analysis. Good luck.

    0
    #166153

    annon
    Participant

    You will have better luck if you ask specific questions.  Good luck.
     

    0
    #166032

    annon
    Participant

    Adam, what are you doing up so late?  I thought only us single guys stayed up this late…

    0
    #166031

    annon
    Participant

    I find it difficult to understand your question.  Perhaps you can tell us what you are trying to do in a single sentence, or you can tell us in Spanish.  Good luck.

    0
    #165994

    annon
    Participant

    Monts,
    Although I appreciate your contribution, I don’t believe it is accurate.

    0
    #165968

    annon
    Participant

    Spec limits or tolerance (USL-LSL) are always determined by the customer.  The stability of the process is required before you engage in any predictive analysis….if you are saying something today about how the process is going to behave tomorrow, you need to have established stability.  If instability is present (ie special cause), then you will need to look to process management to instill the necessary discipline….if looking for specific guidelines for such a thing, lean and/or ISO can be helpful in this regard.

    0
    #165962

    annon
    Participant

    There is no reason why you can’t use an attribute form of your data, as long as you take the limitations into account.  Pay attention to the number of distinct categories that your MS is capable of discerning, as this will determine what type of attribute data you can convert to…for example, 2 distinct categories will lend itself to a binary format, etc.
    Good luck.

    0