iSixSigma

Sorour

Forum Replies Created


#57789

    Sorour
    Participant

I look after three things for our shop floor staff:
Utilisation
Performance
Efficiency
Utilisation is made up of Clocked Hours / Good Work Produced (not rework, jigs, etc.)
Performance is made up of Actual Machining Hours / Estimated Machining Hours
Efficiency is Utilisation x Performance, shown as a %
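A quick sketch of that arithmetic in Python, taking the ratios exactly as written above (the figures are made up):

    # Made-up weekly figures for one operator; ratios follow the post as written.
    clocked_hours = 40.0          # total clocked hours
    good_work_hours = 38.0        # good work produced (no rework, jigs, etc.)
    est_machining_hours = 30.0    # estimated machining hours
    act_machining_hours = 27.0    # actual machining hours

    utilisation = clocked_hours / good_work_hours            # ~1.05
    performance = act_machining_hours / est_machining_hours  # 0.90
    efficiency = utilisation * performance
    print(f"Efficiency: {efficiency:.1%}")                   # shown as a %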
    Paul

    0
    #62452

    Sorour
    Participant

    I know you specified Six Sigma, but I thought I would share that Henry Ford Hospital’s pathology lab has done a lot with Lean. There is some detail, including publications, in the menu on the right.
    http://www.henryford.com/body.cfm?id=47242

    0
    #184952

    Sorour
    Participant

I worked construction for many years and eventually got my black belt and worked with a quality group.
 
The most notable improvement opportunity was generally waste (defects). There is always additional money wasted on projects, which ends up as additional handling and excess inventory. That is, if you get the job. Adding additional $$$ to a bid increases the likelihood that someone leaner will be the winner.

    0
    #62349

    Sorour
    Participant

    Not really.  I’m the first person to have this type of job at my hospital.  In the past, they would hire consultants.

    0
    #62347

    Sorour
    Participant

I went over the definition of the times as pt in (wheels in) and pt out (wheels out) when I got here; that definition had already been established at the hospital and was understood by all. Charting pt in/out times has been fine.
I noticed the error when times for cases in the same room were overlapping. Then I thought there was a possible room change that was not charted. THAT has been the issue: RNs don't document room changes.

    0
    #62336

    Sorour
    Participant

Of course the RNs are more concerned with patient care than inputting data. However, the standard of care has changed now, and it involves charting accurately to ensure the same level of care is consistent and at the level it needs to be.
I agree that some data are measured that do not improve care. I'm curious as to what you would consider not worth gathering?
    Thanks for the ideas.

    0
    #57731

    Sorour
    Participant

One option is to Value Stream Map the process. This will allow you to assign times to the various tasks within the process and also to identify the lag time between tasks. If your improvements reduce lag time only, there is no quantifiable $ benefit, but there is a customer impact. If your improvements include reductions to the task times, then you can identify the total minutes or hours saved in a year and, working with your finance department, find the employee cost per hour. This will be a soft saving if staff numbers remain unchanged.
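For example, a rough annualised version of that calculation in Python (all figures hypothetical):

    task_minutes_saved = 12        # minutes trimmed from task time per cycle
    cycles_per_year = 5_000        # times the process runs in a year
    cost_per_hour = 35.0           # employee cost per hour from finance

    hours_saved = task_minutes_saved * cycles_per_year / 60
    soft_saving = hours_saved * cost_per_hour
    print(f"{hours_saved:.0f} hours/year, about ${soft_saving:,.0f}")
    # A soft saving if staff numbers remain unchanged.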

    0
    #180438

    Sorour
    Participant

    How about some Quid Pro Quo Trade – you buy some of ours and we will buy some of yours :-)

    0
    #57677

    Sorour
    Participant

    We use 1) room turnover time, 2) the number / percentage of late first case starts, 3) OR room utilization, and 4) block scheduling utilization.

    0
    #57642

    Sorour
    Participant

    Hello Geoff,
     
Would you please send me a copy of the models? I'm involved in a change organization and need to justify a new organization that will have to grow from a small to a large, enterprise-wide organization.
     
     
    Thank you for your time and assistance.
     
    Paul
    [email protected]

    0
    #176276

    Sorour
    Participant

    Hi Chris- thanks for the reply.
The question I have, however, is not based around the quality of the ratings system which has been developed (we acknowledge that this may need refinement, and I have built in several manual touch points to accommodate a review of calculated ratings).
It is more about which statistical methods should be used to best mine the constantly changing ranking data, and how the skills required to complete some or all of these work items relate to the skills profile of the existing headcount.
    Kind regards,
    Paul

    0
    #175662

    Sorour
    Participant

    Ken,
I would like to try KTA at my place of work as well; I would be interested in your storyboard and would maintain confidentiality.
I would also be interested to know if I can get any materials for this technique online.
    Thanks,
    Paul

    0
    #59427

    Sorour
    Participant

    Hello all.
     
If this is still doing the rounds, I would like to see a copy. Does anyone have it, and could they send it to [email protected]?
     
    Many thanks,
     
    Paul

    0
    #174973

    Sorour
    Participant

Some organisations use aptitude tests in such subjects as verbal reasoning, maths and logic, although I have found that this does not necessarily highlight the best candidates.
    Ideally it should be someone who has a high aptitude for the above, has good project management skills, who has experience dealing with people and influencing, experience in managing change and who is seen by the executive of the function as being a potential future leader within the organisation.

    0
    #174843

    Sorour
    Participant

    Hi Guys.
I was talking to a work colleague. He said you can determine Ppk from your tolerance interval's k value: using the 95/99 tolerance formula, lower bound = mean - k x sd, and since Ppk = (mean - LSL)/(3 x sd), if the LSL sits at that bound you can work out that Ppk is 1/3 of the k value.
Don't know if anybody has heard this before or if there is some data behind it. I would be interested to hear your comments.
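For what it's worth, a quick numeric check of that identity in Python (values made up): if the LSL sits exactly at the tolerance bound, Ppk comes out to k/3.

    mean, sd, k = 50.0, 2.0, 3.0    # hypothetical process and tolerance k factor
    lsl = mean - k * sd             # put the LSL exactly at the tolerance bound
    ppk = (mean - lsl) / (3 * sd)   # one-sided Ppk toward the LSL
    print(ppk, k / 3)               # 1.0 1.0 -> Ppk = k/3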
     
    Secondly, typically what would be the smallest sample size you would use to work out Cpk/Ppk? I am recommending a minimum of 15 but ideally 30.
    Thanks again
    Very much appreciated
    Paul

    0
    #174831

    Sorour
    Participant

    Hi
One more question (actually 2). Is it acceptable to make product for the Gage R&R which is outside specification? E.g., the specification is 21±1 mm and our product would be 20±1 mm. It is the same tolerance, and I would be able to make product covering the entire range: 10 samples at different lengths across the 2 mm.
Question 2: Is there any danger in taking a sample of product from the normal manufacturing process? This would not cover the full range of the tolerance.
     
    Thanks again, very helpful.
    Paul

    0
    #174813

    Sorour
    Participant

    Hi
Thank you both for the response; it has been most helpful. I am assuming the AIAG standard can be carried across to the manufacture of medical devices.
Thank you again.

    0
    #173849

    Sorour
    Participant

    Hi,
Thanks for the responses. I am new to the company and, to be honest, I am in favour of doing the gage R&R studies, but risk assessments have been completed previously and I am meeting resistance.
Would you guys recommend choosing the smallest component (hardest to hold and measure) as the component for the gage R&R on the snap gauge? This would capture the most difficult interaction.
Thanks,
Paul

    0
    #173655

    Sorour
    Participant

Greetings Michael.
    I’m afraid I don’t understand that answer.
    Could you explain?
    Regards,
    P

    0
    #173602

    Sorour
    Participant

    Hi Michael.
    Not much of a time difference between us, as I am in Oz.
    Thanks again. I will plough on with the tests, happy I’m on the right road.
    Cheers,
    Paul

    0
    #173540

    Sorour
    Participant

    Thanks again Michael.  Getting a little confused, however.
    As some of my data is not normally distributed at a factor level (month), does that not invalidate a comparison of means?
    Hmmmmmmm…
    Paul

    0
    #173537

    Sorour
    Participant

    Thanks Michael.
    Can I ask- what is your definition of “different” when talking about variance?
    As for the suggested alternative tests, a bit of background might help.
I am looking at 4 months' worth of data where the output metric is report sign-off time. I want to investigate whether there is a difference in this Y between months.
Some months' data is normally distributed and some is not.
    Could you therefore suggest an appropriate test to run (or steps to determine an appropriate test)?
    Thanks,
    Paul

    0
    #173456

    Sorour
    Participant

Stevo, I like your sarcasm. Not everyone can watch every new post on the forum and contribute something so funny.

    0
    #173432

    Sorour
    Participant

You guys are pathetic. You bought yourselves a nice book and are trying to read it without any basic critical thinking. Can't you even count from 1 to 6?! I really doubt you ever went (or will go) beyond reading the book and have seen what manufacturing actually is in practice.

    0
    #172528

    Sorour
    Participant

    Thank you guys!! I appreciate your help
     
    Paul

    0
    #172458

    Sorour
    Participant

I am just wondering about the role played by ASQ in the field of Quality. Has it helped American corporations gain competitive advantage through better quality? Any comments?

    0
    #171336

    Sorour
    Participant

    Hi,
Please send me this file. Thanks, Paul

    0
    #169747

    Sorour
    Participant

Dear Dr. Ravi,
I meant to imply you scale the raw data. Clearly, if you increase the magnitude of the data, the standard deviation as a percentage of the magnitude becomes less.
I believe we share the same approach: don't assume anything you can check yourself.
Regards,
    Paul

    0
    #169728

    Sorour
    Participant

Dear Dr. Pandey,
I used Minitab some time ago to study this problem. My approach was to scale all values to a higher magnitude to eliminate the negative CLs, test stability, and then scale back.
Perhaps I'm misguided, but I've never found a case where the assumption of setting a negative CL = 0 posed a problem with the assessment of stability.
As for your view that constrained values have a non-normal distribution, I'm not sure, because I thought multiplying a normal distribution by a scalar would not change the shape of the distribution.
Perhaps someone else can clarify?
Cheers,
    Paul

    0
    #169346

    Sorour
    Participant

Let's try my trusted and proven email addy.
    Thanks
    [email protected]

    0
    #169344

    Sorour
    Participant

    I would greatly appreciate a copy of the PPT.
    My email is (a3echo9 @ yahoo dot com )
    Thanks

    0
    #166369

    Sorour
    Participant

I think you need to root-cause from two aspects:
1. The original root cause, i.e. why did the customer have a problem in the first place? Why did they originally call?
2. The trigger for the complaint, i.e. when they called, why was their issue not resolved such that they felt the need to complain?
The key is to discover opportunities to eliminate the source problem, but where this is not possible you will have a second channel to reduce the reasons their calls turn into eventual complaints.

    0
    #164796

    Sorour
    Participant

    Dr. Scott,
    I appreciate your response to my question.  
    Paul
     
     
     

    0
    #163264

    Sorour
    Participant

    He is writing a paper (I am) and he did the research on his one page paper on Juran.  I ran across the quote about 2 metrics and was curious what they were.  It’s not part of my paper and I assumed this would be a good place to find out what they were…my mistake.   After all, that’s what process improvement is all about…keeping information to yourself.  At this point, I’m not curious enough to read a 450 page book to find the answer.

    0
    #162977

    Sorour
    Participant

    Invite him to leave the team – do not take no for an answer.

    0
    #156178

    Sorour
    Participant

    Old & Weary
I understand your point in terms of people who are hired into Black Belt or Green Belt roles (like myself), but in my organisation we also try to get people in regular functional roles to do Six Sigma projects as extra workload. They should be given some sort of reward on successful completion.

    0
    #154143

    Sorour
    Participant

What are you guys talking about????
     

    0
    #154142

    Sorour
    Participant

Well, let me draw it out for you. I am in the Air Force, nearing my 20-year point, and I thought this would be great to pick up before I get out. I like the concept and thought this could very well help me in my future endeavors. Yeah, I think I bit off a bit more than I could chew here. Anyway, I am taking the course through Gatline, all online, and paid about $1,475.00 for it. I guess I am not doing too well because the structure of the course is a bit kooky and the online slides are terrible. Not to mention it is a fairly new concept for me, so to speak.
To answer your question: out of the forum, maybe some assistance in how the heck I can pass this, some formulas for success with it, and how to avoid having to pay the Air Force back the $1,475.00 they gave me to complete it!!! Haha. No, really, I am interested as heck; I just can't seem to grasp all the charts and some of the applications. I am halfway through all the quizzes and I swear I must be the dumbest guy in the world! I am just getting panicky because I have 6 more quizzes until the online exam. And of course the Air Force guys haven't even heard of Six Sigma, so I have no support there. See what I mean? Anyway, I appreciate any help.

    0
    #154111

    Sorour
    Participant

Both, really.
Implementing a Quality Assurance and Continuous Improvement program into our Ops teams, i.e. improving operational processes [streamlining, automating, understanding customers' requirements, analysing reports, etc.] and ongoing monitoring and controlling [tracking changes and testing of these, reviewing new projects and implementing QA and CI measures around these too].
Looking forward to hearing from you.
     

    0
    #153771

    Sorour
    Participant

    It is written as:
    Cpl = (Mean – LSL)/3*Std.dev
    Cpu=(USL-Mean)/3*Std.dev
    But shouldn’t it be
    Cpl = (Mean – LSL)/(3*Std.dev)
    Cpu=(USL-Mean)/(3*Std.dev)
    Notice the final parenthesis.
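A quick check in Python shows why the final parentheses matter (numbers are illustrative):

    mean, usl, sd = 10.0, 13.0, 0.5
    wrong = (usl - mean) / 3 * sd    # parses as ((USL - Mean)/3) * sd = 0.5
    right = (usl - mean) / (3 * sd)  # Cpu = (USL - Mean)/(3 * sd) = 2.0
    print(wrong, right)              # 0.5 2.0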

    0
    #153212

    Sorour
    Participant

Thanks for the clarification. Could you give me an example of process migration from a BPO environment?
    Regards
    Paul

    0
    #64606

    Sorour
    Participant

    gvrk,
    Thank you for your reply.
Being a Planner/Buyer, I deal with the forecast at a SKU/unit level. I have already looked at Cpk using the percentages below, setting the UCL to 120% and the LCL to 80%, and calculating the sigma level. It is the zero forecast and the zero actual that tend to create large variances.

SKU   FC_Jan   Actual_Jan   Unit Var.   Percent
a        200          600         400      300%
b          0          100         100    10000%
c        200            0        -200        0%
d        300          268         -32       89%
     
    Is there a better way?
Thanks, Paul

    0
    #147904

    Sorour
    Participant

    Neeraj:
    You certainly need to start with identification of business performance drivers. These drivers are key measures of the business performance. Then look into reducing waste using Six Sigma or Lean principles. This will focus on the bottom line. In addition to reducing waste, you may look into expanding services innovatively.
    With these three things, one can start optimizing the business performance, and sustain it through BPM.
    Paul

    0
    #147534

    Sorour
    Participant

    Fred:
    You sound like a very smart guy. Are you suggesting something to improve Six Sigma?
    paul

    0
    #145267

    Sorour
    Participant

    Hi Neal – yes this is helpful thanks.
    I guess the main point that I am arguing for is that if a business is executing FMEAs as part of its Quality Mgt system, then it should be documented.
    Thanks
    Paul

    0
    #143027

    Sorour
    Participant

    The Six Sigma Business Scorecard includes a process for calculating corporate sigma level by calculating an overall performance index, used as the yield for the corporation. – Paul

    0
    #143026

    Sorour
    Participant

I believe the August '05 issue of the Six Sigma Forum magazine was devoted to innovation.

    0
    #141941

    Sorour
    Participant

EdG: Thanks for sharing these three points of TPS, the Thinking Production System. But when you remove the thinking from TPS, what is left? Lean Thinking, or the Lean System: one without thinking, just doing.
That's why TPS discourages layoffs, while with Lean Thinking, layoffs are the first expectation.
    Paul

    0
    #141610

    Sorour
    Participant

A negative Cpk in terms of goal posts means that you missed the extra point (provided that you had previously scored a touchdown) or that you missed the field goal (you'll have to kick from a shorter range).
    Paul

    0
    #141468

    Sorour
    Participant

There is a book titled Six Sigma Business Scorecard. You may look into it.

    0
    #141427

    Sorour
    Participant

Mr. Bender wrote an SAE paper in the 1960s about doing a statistical stack calculation to determine the expected overall dimension for sets of individual dimensional tolerances occurring in a linear path. His recommendation, basically, was to multiply the square root of the sum of the squared tolerance values by 1.5. Hence, when a linear tolerance stack outcome is Benderized, the RSS (root sum-of-squares) tolerance has been multiplied by 1.5.
     
    Basically if you took your tolerances and generated a large set of random normally distributed values for each one within its limits,
     
    then you randomly picked one from each of the tolerance value groups, and then added them together and repeated the process until the individual groups were gone,
     
you would end up with a distribution of stacked tolerance sums. It would be normally distributed, and the spread between its +/- 3 sigma limits would equal the RSS value of the tolerances stacked. That process of randomizing and adding the random parts is called a Monte Carlo analysis. If RSS = +/- 3 sigma limits, Benderizing makes it +/- 4.5 sigma limits.
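A minimal Monte Carlo sketch of that stacking procedure in Python (the tolerances and counts are made up):

    import numpy as np

    rng = np.random.default_rng(0)
    tols = [0.10, 0.20, 0.15]   # hypothetical +/- tolerances in a linear stack
    n = 100_000                 # simulated builds

    # Treat each +/- tolerance as +/- 3 sigma of a normal distribution,
    # draw one value per dimension per build, and sum along the stack.
    stack = sum(rng.normal(0.0, t / 3.0, n) for t in tols)

    rss = np.sqrt(sum(t**2 for t in tols))
    print(3 * stack.std())   # ~= RSS: the +/- 3 sigma spread of the stack
    print(1.5 * rss)         # the "Benderized" (1.5 x RSS) value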
     
    Paul

    0
    #141393

    Sorour
    Participant

No, you cannot figure an MMC bonus tolerance with runout.
Runout tolerances are always constant values limiting the variation that a surface revolving about an axis causes a test indicator to show.

    0
    #140450

    Sorour
    Participant

    Bill:
A p-value alone does not determine the strength of the correlation. What is the r-squared value, or the correlation coefficient? If that is small, the relationship explains little of the variation even when the p-value is significant.
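For example, with a large sample the p-value can be tiny while r-squared stays near zero; a quick sketch with simulated data:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n = 10_000                          # large n makes p small even for weak effects
    x = rng.normal(size=n)
    y = 0.05 * x + rng.normal(size=n)   # very weak true relationship

    res = stats.linregress(x, y)
    print(f"p = {res.pvalue:.1e}, r^2 = {res.rvalue**2:.4f}")
    # Typically p << 0.05 while r^2 is only ~0.003: significant but very weak.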
    Paul

    0
    #140417

    Sorour
    Participant

    You may checkout Th Six Sigma Performance Handbook for reasons of failures. It lists various types of failures.
    Good luck!
    Paul

    0
    #137820

    Sorour
    Participant

    Geometric tolerances i.e. Ø9.4-8.9 |⊕|Ø0.36Ⓜ|A|B|C| have a variable upper specification limit. That limit can be visualized by making a histogram with both distributions on the same graph, the one for the geometric tolerance and the one for feature size.  The scale of the graph begins at 0 and at the geometric tolerance’s specified USL the size limit tolerance corresponding to the MMC condition would begin. When the two distributions are plotted on this graph the histogram reveals the extent to which the distributions intersect.
     
The geometric tolerance distribution will often appear skewed toward the zero boundary (this happens because the computed deviation is always a positive number that reflects the size of the diameter zone needed to contain the deviation). If a scatter diagram shows that the means of the X,Y position deviation coordinates are roughly centered on target, the histogram will appear more skewed; conversely, the more they are off target, the more normal the histogram will appear.
     
To figure the Ppk of a variable geometric tolerance you have to estimate the intersecting area of the two distributions, in contrast to the area between their means. When one is non-normal this is a difficult problem, but not impossible; you can estimate that area differential somewhat less accurately with the classic equation for stress vs. strength if you treat both distributions "as normal." If we assign letters to the mean and standard deviation values for size (Ms = mean size, Ss = stdev size) and position (Mp = mean position, Sp = stdev position), the equation for Ppk looks like the formula given below.
     
One more thing: the MEAN value for size, Ms, has to be converted to its corresponding value for the variable position tolerance, Mt. Subtract the mean size value from its MMC limit and add that to the lower constant value for the geometric tolerance, and you will find the mean variable tolerance from the mean size.
     
    Variable Tolerance Ppk = (Mt-Mp)/(3*sqrt(Ss^2+Sp^2))
     
    To figure the Pp process potential of a geometric tolerance you must examine the scatter plots of the measured coordinates and determine whether the coordinates can be adjusted to target or not. If they can be improved refigure the geometric tolerance deviations as if they had been adjusted (understanding that the distribution shape will change). Pp = Ppu (coordinate means adjusted to target)
     
    To figure the Pp process potential of a VARIABLE geometric tolerance you must first adjust the means to target (if possible) and refigure the geometric tolerance distribution as described above and then you must find the optimum mean value for size that will make the Ppu for the variable geometric tolerance and the Ppu for size equivalent. This minimizes PPM defective for both size and variable position simultaneously. By setting the equation above equal to the equation for Ppu Size and solving for the optimum target size we have:
     
    One more thing, the USL value for size has to be converted to its equivalent maximum variable value for position USLpmax. Add the difference between USL and LSL size to the specified minimum USL value of position.
     
    Optimum variable tolerance Mt[optimum] = (Ss*Mp + sqrt(Ss^2+Sp^2)*USLpmax)/(Ss+ sqrt(Ss^2+Sp^2))
     
    Convert Mt(optimum) back to Ms[optimum] and we have:
     
    (USLs-Ms[optimum])/(3*Ss)=Ppu[size]=Pp[variable tolerance]=Ppu[variable pos]=(Mt-Mp)/(3*sqrt(Ss^2+Sp^2)).
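A small numeric sketch of those two formulas in Python (the statistics are placeholders, and the MMC limit is assumed here to be the upper size limit):

    import math

    Ms, Ss = 9.15, 0.05     # mean, stdev of feature size (placeholder values)
    Mp, Sp = 0.10, 0.04     # mean, stdev of position deviation
    USLs, LSLs = 9.4, 8.9   # size limits; MMC assumed at 9.4 here
    tol_min = 0.36          # position tolerance specified at MMC

    Mt = tol_min + (USLs - Ms)          # mean variable tolerance from mean size
    s_comb = math.sqrt(Ss**2 + Sp**2)   # combined standard deviation
    ppk = (Mt - Mp) / (3 * s_comb)      # variable tolerance Ppk

    USLpmax = tol_min + (USLs - LSLs)   # maximum variable position value
    Mt_opt = (Ss * Mp + s_comb * USLpmax) / (Ss + s_comb)
    Ms_opt = USLs - (Mt_opt - tol_min)  # convert back to the optimum mean size
    print(ppk, Mt_opt, Ms_opt)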
     
    This method slightly underestimates the capability and potential capability of a variable tolerance because the predictions are made by assuming both distributions for size and position are normal. If capability analysis software was written to figure the intersecting area of dissimilar distributions then the estimation would improve somewhat.
     
Other methods have been touted as a solution to this variable tolerance capability analysis problem, but I have found them to be lacking. Most methods combine the individual variable bonus tolerance with the individual position deviation and then compare the resulting surrogate variable to a constant limit. These methods (Adjusted TP, Residual Tolerance, Percent-of-Tolerance, and effective size compared to virtual condition) can mask or amplify the variation in the surrogate relative to the variation inherent in the contributing sources, and therefore I have found their predictions to be untrustworthy.
     
    Some will say that the capability should be determined on the coordinates separately. I disagree! The specifications are often given as cylindrical zones where the maximum coordinate displacements are a function of one another. To limit that variation to something other than the design tolerance is to give a false capability. The variation is always different in each coordinate distribution.
     
    There are also methods to compare the elliptical boundary of the scatter plot to the circular boundary of position tolerance but the circular boundary is regarded as a constant value in those analysis methods so even those methods fall short of variable tolerance capability analysis.
     
    I hope this explanation helps,
     
    Paul F. Jackson   
     

    0
    #137637

    Sorour
    Participant

    Mike:
I agree with you about the inclusion of the seven and many other so-called TQM tools in the Six Sigma toolbox. By saying 'beyond TQM' I did not mean to exclude TQM; I meant Six Sigma was built upon previously known tools. I hope that clarifies it.
    Thanks for your comments,
    Paul

    0
    #137628

    Sorour
    Participant

Yes, the improvement initiative must become a culture. It requires passionate hard work in applying proven tools creatively, and a very direct and disciplined leader. One can either copy somebody else's successful practice, or learn from it and adapt it to one's own organization. We are too much tuned into reproductive thinking rather than productive thinking.
However, I still believe that TQM was an ill-defined TLA (three-letter acronym), first defined through the ISO system, which has its own problems due to the adulterated role of accreditation and registrar bodies. That's a different topic for discussion.
    Paul
     

    0
    #137608

    Sorour
    Participant

    We lost track of “Six Sigma/TQM- Which is Better?”
The challenge with TQM was that it was poorly defined, meant different things to different people, and results were promised after three years, which was about the average stay of a Quality Manager on the job. People got tired of trying TQM, which eventually became a Total Quality Mess.
Six Sigma is an accelerated method for achieving improvement in the short term, equal to the project duration of 4-6 months; when institutionalized throughout, Six Sigma can lead to long-term improvement.
Six Sigma evolved beyond methodologies such as JIT, TQM, and Benchmarking; however, it does incorporate principles and tools of these methodologies.

    0
    #136750

    Sorour
    Participant

    Use Minitab –  good product!

    0
    #136127

    Sorour
    Participant

    Yes, Trev honey, it’s me.

    0
    #136090

    Sorour
    Participant

I believe everyone involved in Six Sigma is responsible for its downfall, from CEOs to trainers, consultants, Champions, and practitioners. We all jump onto something like Six Sigma and start applying it in a rote manner. We need to look at each project, decide what the efficient approach is, and apply the necessary tools to get the desired results.
The most important aspect of Six Sigma that has been misunderstood is its intent, i.e., to realize a lot of improvement very fast, which is not possible without being creative. Thus rote application of the Six Sigma methodology without its intent is not gonna work.
    Paul

    0
    #134607

    Sorour
    Participant

    Stan:
You are good at writing 'nonsense'! Anything better?
    Paul

    0
    #134561

    Sorour
    Participant

If you are measuring performance and claiming to be at 8/9 sigma, something is wrong! At an 8/9 sigma level, one can pretty much stop measuring.

    0
    #64221

    Sorour
    Participant

    Hi Rekha:
    No one can make you understand Lean in S/W. Sorry!
    Paul

    0
    #133689

    Sorour
    Participant

Six Sigma was first launched on Jan 1, 1987. Before then, various statistical methods, pre-control, Dorian Shainin techniques, or DOEs were used. My question is to find the first successful published case study. Mario's work is more about improvement activities and getting the CEO award in 1990. Was it published in any magazine?
     

    0
    #60747

    Sorour
    Participant

Sheri, I would be grateful if you could forward a copy of the value stream to [email protected]. We are working on a project in healthcare and, like you, the only experience I have is in manufacturing/industry. I look forward to hearing from you.

    0
    #130976

    Sorour
    Participant

Actually, six corresponded to the level of defects determined to be necessary to stay competitive 15 years later, in 1985.
However, there were six steps to Six Sigma. Were you thinking of those?
     

    0
    #127783

    Sorour
    Participant

No! Consider multiple conic sections perpendicular to the cone's axis that are perfectly round. Now consider that the cone's surface along the axis is perfectly straight. Its cylindricity will be half of the difference between the cone's largest and smallest diameters.

    0
    #127066

    Sorour
    Participant

Somebody missed a good acronym:
The GRPI model is broken down as follows: Goals, Roles, Process(es), Interpersonal.
Call it GRIP: Goals, Roles, Interpersonal, and Process.
    Paul

    0
    #125505

    Sorour
    Participant

Maybe I tune in to learn something... ever think of that? I am new to Six Sigma (so spare me the hyperlink from this site) and am trying to learn. Instead I see useless banter between you and your pals. Not to mention that you seem to have a huge ego and are downright rude and mean to people. When and if I learn anything from this site, I will have something of value to post.
     
    Good Day.

    0
    #125500

    Sorour
    Participant

    Zzt – as a oft reader but seldom poster, I commend you.  My thoughts exactly.

    0
    #64012

    Sorour
    Participant

Suneeth, thanks. If you can share more details that would be great; some docs on the process, what is to be tracked, how you started, and what you do would be of great help. If you can share your email, I can send a private message and take it from there.
    Thanks,

    0
    #122298

    Sorour
    Participant

I posted this message yesterday but wrote mixed effects; it was meant to say main effects:
     
I am having real problems with analysing a basic data set in Minitab. The data are catch rates of fish, separated into 3 groups by year. I had a look at the mean catch rate for each year in Excel, and the catch rates go up from year 1 to 2 and then decrease sharply in year 3.
However, when running an ANOVA (GLM) model in Minitab on the data, the main effects plot (which states it shows fitted mean values) shows that the catch rates increase from year 1 to 2 and then again in year 3! How can this be? I am using exactly the same data, and yet the main effects plot shows a different result.
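One possibility is an unbalanced design: with unequal gear mixes per year, raw means and the model's fitted means can disagree. A toy illustration (made-up numbers, using statsmodels in place of Minitab):

    import pandas as pd
    import statsmodels.formula.api as smf

    # Year 3 is sampled mostly with the low-catch gear, so its raw mean drops
    # even though the gear-adjusted (fitted) mean keeps rising.
    df = pd.DataFrame({
        "year": ["1"]*4 + ["2"]*4 + ["3"]*4,
        "gear": ["A", "A", "A", "B"]*2 + ["B", "B", "B", "A"],
        "catch": [5, 6, 5, 2, 6, 7, 6, 3, 3, 4, 3, 9],
    })
    print(df.groupby("year")["catch"].mean())   # raw means: 4.50, 5.50, 4.75

    model = smf.ols("catch ~ C(year) + C(gear)", data=df).fit()
    grid = pd.DataFrame({"year": ["1", "2", "3"]*2, "gear": ["A"]*3 + ["B"]*3})
    grid["fit"] = model.predict(grid)
    print(grid.groupby("year")["fit"].mean())   # adjusted means rise each year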
    any help is as always very much appreciated
    Paul

    0
    #122185

    Sorour
    Participant

Since posting my earlier messages, I have tried to run basic ANOVA (GLM) models on this data set, but cannot do it, as the distribution of residuals from these model runs is not normal no matter how many times I log or square the original length values. So now I'm really stuck!

    0
    #122180

    Sorour
    Participant

Disregard the word 'female' in the column headings; it just means that this data set is for females and does not refer to any of the data below. I can't attach the Excel spreadsheet with the data; most people just copy and paste the data from here into Excel and then work on it from there!
    regards
    Paul

    0
    #121031

    Sorour
    Participant

Yeah... the sigma is academic if you don't satisfy the customer needs at all. But to improve internally, you need to define clearly for yourselves what constitutes a defect and find the number of opportunities for a defect. Therein lies the possibility of making yourself look as good or as bad as you like. It's up to you to use the tool to your business' benefit.
     

    0
    #120832

    Sorour
    Participant

Thanks for those comments, BTDT; they are all very relevant and I will get back to my supervisor with these points.
    cheers
    Paul

    0
    #120786

    Sorour
    Participant

OK, now I am confused. In some ways I agree with BTDT that ANOVA may not be the most proper way to analyse the data. I first mentioned this to my supervisors, but they said it could be used. BTDT is right in that year and month are continuous variables associated with time, whilst fishing gear is more discrete and depends on what research vessels sampled the fish. I take it by processing plant you mean fishery: a fleet of deep-water fishing vessels, which is what I am trying to analyse.
I'm not sure what you mean about catch rates changing on their own; yes, there are natural fluctuations, but the cpue dramatically drops in the final year group and this is not easily explained by natural phenomena.
So to conclude: if ANOVA is not the appropriate tool to analyse which variables are affecting the cpue the most, then what statistical analysis can be applied to these data?
(I would just like to briefly thank everyone who has contributed so far; your comments are very much appreciated.)
cheers Paul (still the stats retard)

    0
    #120785

    Sorour
    Participant

Yes, I was fearing this type of reply, and I think this is the main problem. The data I have collected for some species (these data are only from one species, by the way) are not balanced enough for a proper ANOVA analysis. However, if I "pool" values together more, then I will ultimately lose accuracy, won't I?
A catch-22 situation.

    0
    #120783

    Sorour
    Participant

The data is fine the way it is: 1 refers to 1970-1979, 2 to 1980-1989, and 3 to 1990-2000 as far as the years go.
The rest of the model is OK the way it is; I have discussed this with my supervisors. My problem lies with the model not running. Depths are also grouped by depth category.

    0
    #120720

    Sorour
    Participant

Just curious, but what is the mistake in data point 63??? I cannot see it.

    0
    #120719

    Sorour
    Participant

The summary of my data is correct. To be brief, the data are catch rates from research vessel sampling using different fishing gears over a long period (between 1970 and 2000) on deep-water fish stocks to the west of the UK.
What I want to try to show is that there has been a gradual decline in catch rates over the years, and that this could be due to the fishery that has been operating there since the early 1980s.
My supervisors have suggested that I run these models on the data to try to decipher which of the variables has the greatest effect on cpue, and therefore then try to explain why.
     

    0
    #120715

    Sorour
    Participant

N = 241, yes, and thank you for correcting the error. The fixed factors were indeed month, year and gear, and depth was used as a covariate. The models were fitted with these variables to see if month, year, gear or depth had the strongest effect on the cpue (catch rates of fish, basically). The values in the month column where there are no numbers are for one month where no other data were recorded (and as there are no values for these, I guess they can be omitted from the model). A basic logarithm of the cpue was finally calculated (this is the final column), and this is why some values in this column are negative.
Did the model actually run?

    0
    #120699

    Sorour
    Participant

OK, here it goes:
month  year  gear  depth  cpue
4      3     4     750    -2.19192
11     1     2     1000   -1.99464
4      1     1     750    -1.97895
4      1     1     1000   -1.94998
4      3     4     1250   -1.93721
3      1     3     1000   -1.90749
4      3     4     1000   -1.90519
4      3     4     500    -1.89089
4      3     4     1000   -1.88091
5      1     2     1250   -1.86970
[the remaining rows of the 241-line paste ran together and could not be recovered]

    0
    #120697

    Sorour
    Participant

I have tried to copy the data onto this page and post it, but it won't let me do this (the data set consists of nearly 400 lines; is this why?).
Is there any way I can get the data posted for people to look at the problem?
cheers Paul (the stats retard)

    0
    #120680

    Sorour
    Participant

The sample size for these data is large (n = 241), but after having looked at a basic normality test (using probability graphs and histograms), the data contain a large amount of very low values, which I think affects the overall distribution. The data were chosen randomly and have not been sorted or filtered in any way.
I am confused as to how to use Mood's median test: it only lets me put in one factor (I have one input and three variables that I want to look at in this test). Also, how will running these tests help the overall objective of looking at the data using ANOVA? I can supply the Minitab spreadsheet if needed, should that help?
    Paul 

    0
    #117086

    Sorour
    Participant

    Please email a copy to [email protected]
    Thanks.

    0
    #117085

    Sorour
    Participant

    Bob,
    Please send a copy of your presentation to [email protected] as well. 
    Thanks.

    0
    #116797

    Sorour
    Participant

    Bob,
    Thanks so much for your insightful feedback…  I will take your advice and expand the survey beyond current customers and employees.
    Thanks!
    Paul

    0
    #114693

    Sorour
    Participant

I have used a software package called "Qualiware"; it allows mapping of the process not just in one dimension but in many.
It is a process modelling rather than a process mapping tool. The resulting output is a process model that gives greater understanding of the process in regard to direct and indirect improvement.
After you have remodelled to take into account your changes or improvements, you are left with an excellent training tool, which can also be your quality document system.
    See http://www.qualiwareinc.com/
     
     

    0
    #113785

    Sorour
    Participant

    Give me a call at work and we can discuss this in depth:  817-352-1436

    0
    #113538

    Sorour
    Participant

    Darth,
The easiest way to think about this is to ask yourself how you are going to control an ABC interaction: by using the A knob, B knob and C knob. Therefore they have to be in the model so that you can use them. That's the only reason they have to stay in the model; it's a practical one. Leaving the 2-way terms in has no practical use, although it might please the mathematicians amongst us. Hope this helps.
    Paul

    0
    #112748

    Sorour
    Participant

Oli,
You can use Graph > Time Series Plot and select the Time/Scale option to specify your time stamp column. Tick mark selection will probably not be to your liking, but it will get the job done provided there aren't too many observations.

    0
    #112068

    Sorour
    Participant

    = Gay.

    0
    #110868

    Sorour
    Participant

The readings are of air pressure, recorded once a day over a thirty-day period; one reading is recorded each day.
    thanks for your help
    Paul

    0