iSixSigma

Michael Clayton

Forum Replies Created

Viewing 10 posts - 1 through 10 (of 10 total)
#201584

    Michael Clayton
    Participant

Great discussion and important topic. But I agree with using MANY graphical tools for EXPLORATORY DATA ANALYSIS, then showing the graphs to domain knowledge folks for “ah-hah” reactions. Time series graphs are fine, but it’s the CONTEXT data that is critical.
Example: which TYPE of crime has the most variance? The most far outliers? Does the database have other FACTORS that can serve as a HISTOGRAM CATEGORY or as input to a Component of Variance analysis? Get stat software, or use online tools, that allow many nominal factors for quick REPLOT by new selections of Histograms or Time Series Charts (like SPC, which is a subset used for factory machine control and other issues Wheeler knows well), and also the Multi-Vari or similar graphs common in JMP or STATGRAPHICS or college online JAVA graphical tools.

I would start with a Histogram and a list of all factors in the database that could help you look at the impact of one or many factors at the same time.

That means your dataset should have all data types in columns (for factors and variables), with each row simply a date or case number. Just my approach. Software often dictates a spreadsheet structure that works well for each major type of analysis.

Excel is generic but can get messy if you mix NOMINAL and NUMERIC data in the same column.
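As a concrete illustration of that row/column layout, here is a minimal Python sketch using pandas; the "crime_type" factor column and "count" variable column, and the values in them, are hypothetical placeholders. Grouping by the nominal factor answers the "which TYPE has the most variance?" question directly:

```python
# Minimal EDA sketch: each row is a case, each column is a factor or variable.
# Column names and values are made up for illustration.
import pandas as pd

df = pd.DataFrame({
    "crime_type": ["theft", "theft", "assault", "assault", "fraud", "fraud"],
    "count":      [12, 30, 5, 7, 2, 40],
})

# Group by the nominal factor and compare spread across categories.
spread = df.groupby("crime_type")["count"].agg(["mean", "var", "max"])
print(spread.sort_values("var", ascending=False))
```

In JMP or STATGRAPHICS the same replot-by-factor idea is a point-and-click operation; this sketch just shows roughly what such software computes underneath.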

Many court studies are available online, on city, state, or government web sites, and on many justice college home pages. Ask a student or professor.

    #195806

    Michael Clayton
    Participant

    @Darth: Sorry if my tardiness offends you.
I only view this site a few times a year, since manufacturing discussions seem to be few, and since they are kept online for a long time, they can be useful to those with similar problems…IF the responses stay on the subject. So since this one went way off topic in typical social-media-type rants, I thought I would try to conclude it with some perspective on the process of flattening and its very confusing engineering measurement limits, a common problem in manufacturing that requires careful definitions (with diagrams). At least @MBBinWI was as confused as I was until the comments about the process were clearer.

Thanks for keeping these few manufacturing discussions online. I also go back through the archives for learning examples. Sampling plan questions are always problematic, as the economic costs of sampling FREQUENCY, as well as sample size per SPC subgroup, are more important than the textbook rules suggest.

    #195784

    Michael Clayton
    Participant

@javiermora85: MBBinWI has a good idea, as does the prior comment.

So please think about the rough cost of the product, the volume, and the historical cost of field problems that might have suggested tracking. That would tell you if you can afford laser marking after polishing, IF each casting is TRACKED in a DB to that point, so you really know the input serial number or vendor batch date.

Risk management is based on the history of failure modes and customer experience, as well as internal scrap issues with each vendor, or the same vendor over a long time. And lately, with litigation being used simply to harass and get a lower price next round, you have to discuss these costs with the finance, marketing, sales, and management teams, as remedies are not free. So, first question: do you have a WIP tracking system in place? Eyelit, for example. Or Camstar. Or any home-brew database and tool data entry terminals.

    #195783

    Michael Clayton
    Participant

Thanks for explaining the flattening process: nominal is not nominal, but rather (roughly) the starting diameter, which gets flattened to a slightly larger diameter. So you simply want to trend the data and set up SPC charts using a sampling plan based on your preliminary study data, right? I think if you had explained that in your first post, Darth would have been less offensive…or wrong. But Mike Carnell is right that stat software on sampling is abundant, and even free on the web.

However, the cost of sampling and the cost of NOT sampling should be added to the discussion; economics, rather than statistics, often determines the optimal sampling plan, in my experience. If the gaging is automated, simply capture the readings and plot the data. If it’s manual data entry, typos by operators can be 2% of alarms, or operators can “flinch,” giving no alarms but a strange clustering of data just inside the spec limits. So using real process control limits, rather than spec limits, to alert machine keepers of drift or shift or high variability is useful. But if the only question is a homework assignment asking for the best sampling plan…sorry…but there is no economic answer possible, and statistics can be very misleading…in my experience.
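For trending with real process control limits, one common Wheeler-style approach is the XmR (individuals and moving range) chart, whose limits come straight from the preliminary study data. A minimal Python sketch, with made-up readings, just to show the arithmetic:

```python
# Sketch: XmR (individuals) control limits from preliminary study data,
# so drift/shift is judged against process behavior, not spec limits.
# The readings below are made-up illustration values.

def xmr_limits(readings):
    """Return (lcl, center, ucl) for an individuals chart."""
    n = len(readings)
    center = sum(readings) / n
    # Average moving range between consecutive readings.
    mr_bar = sum(abs(readings[i] - readings[i - 1]) for i in range(1, n)) / (n - 1)
    # 2.66 = 3 / d2 for moving ranges of size 2 (d2 = 1.128).
    half_width = 2.66 * mr_bar
    return center - half_width, center, center + half_width

readings = [10.1, 10.3, 9.9, 10.2, 10.0, 10.4, 9.8, 10.1]
lcl, center, ucl = xmr_limits(readings)
print(f"LCL={lcl:.3f}  CL={center:.3f}  UCL={ucl:.3f}")
```

Any SPC package will compute the same limits; the point is that they come from the process data itself, not from the spec sheet.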

    #195782

    Michael Clayton
    Participant

Connecting machines or gages to a DB should be simpler nowadays, since few tools are built without some kind of data port that supports some kind of data transfer protocol. In the bad old days, we had to “wire behind the light bulbs” to see when a machine was ON vs OFF, but even that paid off over time with many tools.

Tool data integration is a specialty not found in many IT organizations, who do not work on “real-time” operating systems, but rather get data from tracking systems after operators manually enter it. If OEE is a hot topic at your shop, perhaps rudimentary data capture might be better than the usual mistake-prone typed data. Each industry has tool vendors who understand tool data capture issues, so at least you might ask them, regardless of what your IT people say “cannot be done.” In the semiconductor industry, we evolved communications protocol standards for all our tool vendors to follow (SECS I and II using the RS-232 port, and later high-speed messaging using ethernet ports).

    Texas Instruments famously “wired behind the light bulbs” on a lithography tool that was used in many of their fabs worldwide, to see “when is the lamp flashing,” as that meant the tool was at least exposing wafers to circuit pattern reticles. They found that one fab was 2x better at OEE using that simple input, so they then studied it and were able to clone setup and scheduling ideas across all fabs, saving the cost of an entire new fab!

    At the other extreme, machine sensor data can be streamed to multivariate analysis tools to detect FAULT counts (fault detection and classification schemes are common for some industrial tools). The idea is that if the tool is still doing the same things, the fault checks can compare sensor means, medians, or inter-quartile-range spreads of time, power, voltage, current, whatever, by process step, vs history during known-good product periods. For plasma etching, we found that the slope of the end-point-detector signal correlated to variation in line widths from center to edge of a wafer.
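A toy sketch of that fault-counting idea in Python: compare per-step sensor summaries (mean and inter-quartile range) against a known-good baseline. The step names, traces, and tolerances here are hypothetical; real FDC systems are far more elaborate:

```python
# Sketch of a simple fault-detection check: flag process steps whose
# sensor mean or spread has drifted from a known-good baseline period.
# Step names, traces, and tolerances are made up for illustration.
import statistics

def summarize(trace):
    """Mean and inter-quartile range of one sensor trace for one step."""
    q = statistics.quantiles(trace, n=4)  # quartile cut points
    return statistics.mean(trace), q[2] - q[0]

def fault_count(baseline, current, mean_tol, iqr_tol):
    """Count steps whose mean or spread moved beyond tolerance."""
    faults = 0
    for step, good_trace in baseline.items():
        good_mean, good_iqr = summarize(good_trace)
        cur_mean, cur_iqr = summarize(current[step])
        if abs(cur_mean - good_mean) > mean_tol or abs(cur_iqr - good_iqr) > iqr_tol:
            faults += 1
    return faults

baseline = {"etch": [5.0, 5.1, 4.9, 5.0, 5.2, 4.8]}   # known-good period
current  = {"etch": [5.6, 5.8, 5.5, 5.9, 5.7, 5.6]}   # shifted mean
print(fault_count(baseline, current, mean_tol=0.3, iqr_tol=0.3))
```

The same comparison extends naturally to medians or to multivariate schemes; the point is simply “same recipe, same sensor fingerprints.”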

    Even in low tech factories, simple data capture by a tablet attached to a machine data port, or to the on-off signal from operator start button, can give valuable information to maintenance and engineering folks.

    #195781

    Michael Clayton
    Participant

Realizing that factory floor teams, middle management, finance, and upper management all need to think about the problems, various kinds of Process or Flow Maps have been used (often style over substance, but if done simply for the intended audience mix, they can help get improvements funded). In my opinion, process and flow maps are less complex, as they avoid some of the “Lean metrics” used in VSMs; many levels of decision makers often have no idea what TAKT time means, but Process Time or Cycle Time they tend to interpret as waste, so that does get attention. If you ever have to do many maps to get teams to understand complex issues, Visio has templates, and there are also Excel templates on the web, some free. With high turnover of management (bungee CEOs, etc.) one cannot assume Lean training, let alone Six Sigma training, but they all seem to love 5S, and the entire factory is often seen as “waste” by design managers, thus the many foundries. Life in the 21st century.

    #192261

    Michael Clayton
    Participant

    “We can make any process better as long as you are not trying to violate the laws of physics, which processes are your most important?” from Gary Cone is memorable. Should be part of our new elevator speech.

You had better make sure the CEO or VP or Director really understands what a PROCESS really is. But you will find that out when he/she answers your prioritizing question. So be prepared to expand his/her thinking a little if the only reply is “cost reduction.” Gently. Humbly. But persistently. We all have trouble with listening (I certainly do), so let them do most of the talking, and listen, after your initial elevator speech. I screwed that up many times, personally, when I jumped in on top of their own elevator speech.

    #192258

    Michael Clayton
    Participant

Seven years ago I used this site for a while, under the username SemiMike.
    After I got a lot of really useful things from it, I got busy working and stopped using it. Now at age 76 I am only working part time, so I set up a new account, not remembering the old password and username. However, after a few new logins, I remembered the old account name and password (slow response in the brain at this age?) and sure enough it’s still there. There were just as many jokers back then. But to explain my experience pertinent to LSS “elevator speeches”: I get more jobs if I only let my Black Belt certification come out in extended interviews, not in first contact. In first contact, I try to listen while explaining that I am a contract engineer, working to help factories get stabilized, and then perhaps suggest and lead improvement efforts. If they start talking about cost reduction (normal behavior for VPs and Directors), I mention that the Lean methods are somewhat useful, particularly a gap assessment visit that I could make at half-fee, after which we can see if there is a match between my capabilities and the factory’s cost problems.

    Does that make sense for an elevator speech?

The reason I learned NOT to invoke certifications or LSS terminology was that it seems to invoke memories of huge wasted training costs with no results (which later turn out to be from a guru who did not know the industry, giving a generic series of black belt classes, which the engineers loved, but never dealing with the real CEO business objectives). To get around those sad memories, I use stealth methods, and only provide help that is really needed for the current business objective. They can get all the other stuff on Google or in local colleges. But I do donate a copy of the Lean Six Sigma Pocket Toolbook, by George/Rowlands/Price/Maxey, to help with the vocabulary of LSS, just in case they want to focus on a few of those tools later. First, the one-day gap assessment.

    #192243

    Michael Clayton
    Participant

    Lean = Jargon
    Six Sigma = Jargon
    Acronyms = Jargon
I have to avoid those terms to get managers to listen without pre-judging the reputed high training costs of LSS, etc. Most managers understand the words STABILIZE and OPERATIONS and IMPROVE and PROCESS CAPABILITY and OPTIMIZE and SUPPLY CHAIN or FLOW. Those get them to start listening and asking questions.
    Unless they think they are already LSS gurus, I avoid any of the LSS “jargon” until we focus on the factory or office process. Value Stream Mapping is even jargon, but that comes up early, after the first very, very simple opening comments. Cost reduction is what’s on their minds. So I don’t have to say that.

    #192235

    Michael Clayton
    Participant

Call us if you need to stabilize your operations in order to then improve their underlying process capability, while optimizing supply chain or factory flow.
