# Noise factors in Taguchi designs


Viewing 31 posts - 1 through 31 (of 31 total)
#40231

RikdC
Member

I am rather new to the DOE environment, so please don't hit me too hard.
Noise factors might change the signal-response ratio in uncontrolled ways (in the field).
Do I have to control these factors during my experiments if I want to understand how they change the responses?
Assuming the above is true, do I have to treat noise factors as discontinuous (categorical) variables? Something that is either on or off?
Generally speaking, any information on designing Taguchi experiments with more than one noise factor is the subject of this post.
Thanks

#124328

HF Chris
Participant

Is the noise preventing or interfering with detection? Is the noise enough to sidetrack criterion rewards? Speaking from a signal detection theory background, the key factor is not the noise per se but the motivation to be more liberal or conservative in selection. Please elaborate if this is the direction you are leaning towards.
Chris

#124337

RikdC
Member

Chris,
I'll try to go into detail to explain a bit more of what I am doing.
The material we are manufacturing is tested by a small-scale simulation of the process that our customers will be running in their plants.
I regard the amount of material I am reacting (the scaling of the reaction), the mixing energy and the temperature of the chemicals as control factors.
The signal is the chemical composition of the tested material (a change in the recipe should be reflected in a change in the responses).
The response is the time it takes for the reaction to reach a certain point, measured as the viscosity of the reacting mixture.
The noise factors are the temperature of the lab where the measurement is performed and the operator who is doing the test.
The noise factors were separately proved to be statistically significant for the result of the test (standard ANOVA).
Given that in real life I can control neither the temperature of the lab nor the operator carrying out the test, the aim of my experiment is to find the setup that is least sensitive to the noise factors.
The question now goes straight back to the initial point: should I repeat all my experimental runs under all the possible noise conditions (and combinations of noise conditions), which turns into another factorial experiment (with noise and control factors all together), or should I simply repeat the experiment under uncontrolled noisy conditions?
Actually, it looks like putting everything together in one big factorial experiment could be a solution. In that case, finding the levels of the control factors where the interactions with as many of the noise levels as possible are small seems to answer my question.
Isn't it the case that Taguchi designs should help reduce the number of experiments needed for this sort of problem?
Thanks for your attention

#124339

HF Chris
Participant

According to your post, you indicated that noise (lab temp) was a significant factor. To me, this makes it more than noise. Have you run your data for the "all-way" interaction? By looking at the multi-level interactions, you should be able to break down the major players that need to come together to make the difference. In other words, level 1 of factor A x level 2 of factor B x lab temp creates a significant difference versus level 2 of factor A x level 2 of factor B x lab temp.
Chris

#124340

RikdC
Member

Chris,
Maybe I misunderstood the idea of noise.
Taking it from the beginning, I would say that everything influences everything, so that the response = f(a, b, c, d, ... infinity).
Some of these variables are under my control in the field, some can be manipulated only under limited (expensive) conditions, and some are not under control at all.
My objective is to find the setup of control variables (those under control in the field) that maximizes the signal-to-noise ratio, where the noise comes from two big families (manageable under special conditions, and not manageable at all).
I understand your point of view when you suggest analyzing high-order interactions between control factors and other sources of variation, but this sounds to me just like a standard ANOVA (and actually a rather expensive one, given the resolution needed to avoid confounding of the high-order interactions).
Please be patient (I told you I am rather new to this subject), but if noise is not manageable during the experiment (at least part of it), then I don't see the difference between Taguchi designs and fully replicated factorial designs.
Thanks

#124341

HF Chris
Participant

I have put Vinny's name on this to pull him in if possible. I believe he has more background on temperature and chemicals. The first question I have is: do you know the lab temperatures in the tests that you have already performed? If you do, then by looking at low-level interactions you can weed out those that are not as sensitive to temperature changes.
An example from my area is adhesives. We were applying composite material with certain adhesives in the plant, work that was normally done in a clean room. Temperature and humidity became a major problem until we experimented with different worklife levels and weep holes. If lab temperature is a factor in your tests, how can you not test different ranges of temperature?
Chris

#124344

Robert Butler
Participant

Based on your posts, you seem to have the following situation:

You have variables you can control: the amount of material(s), that is, the recipe, the mixing energy, and the reactor (?) vessel temperature.
You have variables you can't control: lab temperature and operator.
Your response is time to percent reacted, which is measured by the viscosity of the reacted mixture.
An independent study showed the lab temperature and the operator to be significant; you ran an ANOVA to determine this.

If this is a reasonable description of your situation, then the question you are really asking is: with the variables I can control, can I impact the process, and can this impact outweigh the effects of the things I can't control?

The quickest way to check this would be a simple main-effects factorial design. A design of this type will not only tell you the significance of the effects, it will also (assuming you analyze the design using regression methods) tell you the magnitude and direction of the effects. This means you will be able to see whether the effects of lab temp and operator overwhelm the effects of the other variables. If you use Box-Meyer methods with the design, you will also get a measure of the factors impacting the variability of your process. If lab temp and operator are, in fact, the variables with the largest effect, then what this quick check will tell you is that you are wasting your time trying to compensate for them by changing other things in the process. If they aren't, then you can start thinking about ways to minimize their effect on your final product properties.
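Robert's quick check can be sketched in code. The short Python sketch below is only illustrative: the 2^3 design uses the control-factor names from the thread (recipe, mixing energy, vessel temperature), but the response values are invented, and the regression is reduced to the shortcut that holds for a balanced orthogonal two-level design.

```python
from itertools import product

# Hypothetical 2^3 main-effects design in coded units (-1/+1).
# Factor names follow the thread; the responses (e.g. time to target
# viscosity) are made up purely for illustration.
design = list(product((-1, 1), repeat=3))
response = [42.0, 40.5, 38.2, 37.0, 45.1, 43.8, 41.0, 39.9]

# In a balanced two-level design the columns are orthogonal, so the
# regression coefficient of each factor (in coded units) is simply
# sum(x_i * y_i) / n; the low-to-high effect is twice that coefficient.
n = len(design)
names = ["recipe", "mixing", "temp"]
coefs = {name: sum(x[j] * y for x, y in zip(design, response)) / n
         for j, name in enumerate(names)}
intercept = sum(response) / n

print(f"intercept: {intercept:.4f}")
for name in names:
    print(f"{name}: coef={coefs[name]:+.4f}  effect={2 * coefs[name]:+.4f}")
```

Comparing the magnitudes (and signs) of these coefficients against those of the noise factors is exactly the "does my control impact outweigh the noise" check described above.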

#124345

FTSBB
Participant

Taguchi methods focus on the separation of inner and outer arrays: the inner array holds the factors you want to and can control, while the outer array holds the "noise". That doesn't mean you can't get tricky with the way you set up the sampling, but this is the theory in a nutshell.
For a good example of Taguchi arrays, try this link:
http://www.itl.nist.gov/div898/handbook/pri/section5/pri56.htm
However, from your other posts, it looks like you want temperature to be a factor in the design. If you can completely randomize temperature, great. If you leave temperature set at each level and run your other treatments (a.k.a. blocking), beware: you might venture into the world of whole plots and split plots...

#124346

RikdC
Member

Chris,
The impact of lab temperature was assessed by analyzing data from many months of testing.
During those runs, only the lab temperature and the operators were "controlled" (I know who did what and what the lab temperature was), and of course I know that the runs were carried out following the stated procedures, which cover some of the levels of the control factors that I would like to investigate. The problem with these data is that the level of the signal factor is unknown (I don't know what was in the blend). Given this, anyone might argue about why I decided to analyze my data as if the composition was correct (the signal).
I did so because the deviation is only in one direction (for the period studied) and nothing was changed in the recipes in the plant (as far as I know).
The only information I can extract is that the current setup (the one used for the measurements in the first study) is sensitive to temperature and operators, but I don't know whether any of the control variables can be set so as to compensate for the uncontrollable factors.
I hope I found the right words.
Thanks

#124349

RikdC
Member

OK,
Thanks for the link, FTSBB (never have Taguchi designs been made so clear to me!).
I think I will try to run the experiment as a standard FULL factorial design, keeping operators and temperature together with all the other easily controllable factors.
It finally becomes clear to me that there is nothing I can do other than venture into the tricky (and expensive) world of high-order interactions.
Currently, in my position, I can't randomize lab temperature that easily, but I am working on it.
Thanks Chris and thanks Robert.
Robert, any link for the Box-Meyer methods?
Rgds

#124351

Dayton
Member

In reading RikdC's, Chris's, and Robert Butler's thoughts, I believe a great deal of right thinking is being pulled together. I was reading all of this and thinking about RikdC's givens, e.g., "given I can't control lab temperature" and "given I can't control operator variance", and wanting first to challenge the basis of that thought.

Laboratory temperature, humidity and other support systems are controlled well in many industries and companies. And operators are trained, tested/certified, procedure-driven, and equipped in many industries and companies to reduce the variability you term noise (and uncontrollable noise at that, in your assessment). Now, it might well be that your particular industry and company are not in a highly regulated, health-outcome-critical business like mine (GiantMedicalProducts), and revenue, profit margin and cost-benefit analysis do not allow you the capacity for finite controls. But you can systematically reduce the uncontrolled impact of these variables, and I would not just assume they are givens.

Also, I have some concerns that your chemical mixing and testing processes appear to both be prone to exothermic and endothermic reactions, which brings into question the adequacy of the process validations equating your in-house laboratory simulation to the scale-up done by customers in their plants.

And how did you determine that viscosity was the chemical characteristic critical to assuring optimized subsequent operating parameters? Did you rule out pH, opaqueness, homogeneity, or other potential variables or characteristics? It's also certainly possible to get the same viscosity results from very dissimilar chemical formularies. But all of those (and more) are just things I'd be asking if I were in your lab looking over your shoulder; you know what you are doing and your business, I don't.

For where you are now, and for potentially making things better, I'd go back to what Chris and Robert were telling you and make a bit of a change to your DOE. Develop procedures for your lab operators, train your lab operators, audit your lab operation, use calibrated laboratory equipment, make the basis of your testing one acceptable way only, and describe your test results and product performance against specifications as viable only if your procedures are followed to the letter. Essentially, take as much operator noise out of the equation as possible, and then, as part of your DOE, control for the upper and lower ranges of temperature. I believe your challenge is first to eliminate and control as much of what you are calling noise as possible, and then run your DOE around the edges of what you can't control but understand to be variable and impactful: temperature ranges along with the other factors in your DOE.

This is one of those times when it helps to have multidisciplinary and varied industry collaboration. I and others would like to be there to help, and I hope some of the collective thoughts will at least give you some alternatives to try.
Vinny

#124352

FTSBB
Participant

I didn't mean to scare you off the whole-plot/split-plot thing. If temperature is the only non-randomized factor you will have, I'd suggest this link, which is almost identical to your situation:
http://www.minitab.com/support/docs/OneHardtoChangeFactor.pdf
If you use Minitab, this is a step-by-step guide for a single restriction on randomization, complete with a data set and analysis steps. If you have additional restrictions on randomization (a.k.a. other "block" variables), I'd suggest hands-on guidance from a Master Black Belt. The statistics get pretty ridiculous with more restrictions (in my opinion).

#124356

RikdC
Member

Thanks Vinny,
My givens are just the state of the art in my lab.
Yes, you are right: operators can be trained and lab conditions can be kept under control. Cost/benefit.
Just for the sake of completeness, my reaction is an exothermic reaction between two blended systems.
When I say viscosity is being monitored, I am simplifying the issue; actually there is a needle being pulled in and out of the reacting mixture, which is polymerizing. The polymerization rate is controlled by a catalyst in the blend. The amount of catalyst will be my signal.
All of the other chemicals in the blend are monitored by NIR spectroscopy.
The question that could come up is why I am not just measuring the amount of catalyst with any of the many available separation techniques (GLC, MS and so on). Well... cost/benefit.
It is my task to find a way to improve how we reproduce our customers' processes at small scale, minimizing costs, the skills required, equipment and, not least, waste.
Thanks
RiK

#124357

HF Chris
Participant

Vinny,
Thanks for the support, and for breaking the chemical process down to include other possible variables. I still believe that by looking into the interactions between all the levels he will be able to identify where he needs to spend the money to control for temperature.
Chris

#124372

Dayton
Member

RikdC,

I now understand better what you are doing and facing, both from the technical and the resource-availability perspectives.

You are faced with working a macro solution because you don't yet have the resources for a micro solution, given the technologies and precision required to provide an intertwined micro and macro solution. Not the best place to be.

But better controlling some of the human elements can at least reduce their negative impacts, and fractional factorial DOEs can minimize the number of runs. It does appear, to me anyway, that with an exothermic reaction between two blended systems, with a polymerization rate controlled by a catalyst in the blend, the mixing rate and vessel size are also of increasing importance, giving additional concerns about scale-up correlation to your customers' sites. Occasionally scale-up issues defy logic. I once had a burst-strength issue with arterial balloons used on medical devices, and the variable that showed the most statistical impact turned out to be the ratio of the surface area of the exposed latex in the dip tank to the mass of latex in the dip tank. Oxygen-exposed surface latex relative to the total latex mass, along with the frequency of aeration mix movement, was the scale-up killer. Go figure. Find that in your chemistry or physics texts.

Your cost constraints are impediments but not show stoppers. Sometimes it's enough to put a few stakes in the ground that you might otherwise not want to place and say, "With these assumptions, these constants, these uncontrollable variables, and these controllable variables, these are reasonable expectations. If we were to additionally do this _____, the cost would be _____, and the results could be better predicted by this much _______." And factor your operational/experimental constraints into the report of analytical results. This makes the cost/benefit analysis you are wrestling with visible to all, and makes everyone part of both the problem and the solution. You provide the analytics. Others provide the resources, and they get what they pay for. Be professional, matter-of-fact and dispassionate about it.

Good luck. I've been there, and you just have to keep working smart, as it appears that you have been.
Vinny

#124476

RikdC
Member

Thanks Vinny,
I think I have learned at least two major points:
- scaling up/down is a serious issue, especially when dealing with chemistry (a funny point here is that scaling down molecules is impossible!)
- a well-designed experiment should go along with well-presented results (to the management, those who pay the bill)
Thanks all.
PS: From a technical point of view, the example given on hard-to-change factors (within Minitab) seems to fit my purposes.

#124820

Manee
Participant

One good link for the Taguchi method on this site is https://www.isixsigma.com/library/content/c020311d.asp. Good software for Taguchi designs is rdExpert from Phadke, where you can have three levels in the inner and outer arrays and still save on experimentation.
Manee

#124846

Jayen466
Participant

RikdC wrote:
> Actually it looks like putting everything together in a big factorial exp could be a solution. In this case finding the levels for the control factors where the interactions with as many as possible of the noise level are small seems to answer my question.
Just a couple of comments.
In the outlined scenario, there were 3 control factors and 2 noise factors.
Assuming two levels (high, low) for each factor:
Putting everything into a single, big full factorial will require 2^5 = 32 runs.
Taguchi method: a full factorial inner array for the 3 control factors requires 2^3 = 8 runs. The outer array of 2 noise factors has 2^2 = 4 runs. The result is again 8 x 4 = 32 runs.
The runs here are in fact exactly the same; it is just two different ways of writing them down.
As for interactions, the idea is to find LARGE interactions between control and noise factors. A large interaction between a control factor and a noise factor means that the effect of the noise factor strongly varies in size depending on what level the control factor is set to.
So in a Taguchi experiment, large interactions between control and noise factors are GOOD, they are the Holy Grail, since they offer the possibility of adjusting control factors in such a way that the effects of noise are minimised.
Where more control factors are involved, the inner array used in a Taguchi design can be a fractional factorial, to save runs and costs. The split between inner and outer array is then useful in ensuring that all noise factor interactions remain unconfounded with interactions between the control factors – something that could easily happen if all the factors were put into a single array.
Hope that has added something not already covered.
J.
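The run-count arithmetic above can be checked with a short sketch (Python here, with generic -1/+1 level coding; no specific DOE software is assumed):

```python
from itertools import product

# Inner array: full factorial over the 3 control factors at 2 levels
inner = list(product((-1, 1), repeat=3))   # 2^3 = 8 runs

# Outer array: full factorial over the 2 noise factors at 2 levels
outer = list(product((-1, 1), repeat=2))   # 2^2 = 4 runs

# Crossed (inner x outer) design: every control setting is tested
# under every noise condition, giving the same 32 runs as the single
# big 2^5 full factorial, just written down differently.
crossed = [c + n for c in inner for n in outer]

print(len(inner), len(outer), len(crossed))  # 8 4 32
```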

#124901

RikdC
Member

Thanks
Jayen466
It seems that the advantage of a Taguchi design over a standard design is that the inner array can be highly fractionated while the outer array can be a full factorial.
As we are looking for interactions between the inner and the outer array, we assume we are not too interested in interactions within the inner array.
If this is true, then I understand.
I am using Minitab; how would you append a full factorial for the noise factors to a highly fractionated design for the control factors?
I think this cannot be done automatically.
Would you generate a separate full factorial for the noise in the general full factorial module, and repeat the fractionated design generated by the Taguchi module for each row of the full factorial?
Does this smell like a split plot?
Thanks
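I can't speak for Minitab's menus, but the manual crossing described above ("repeat the fractional design for each row of the full factorial") is easy to write down. A minimal sketch, assuming a half-fraction 2^(3-1) inner array with generator C = AB and a full factorial 2^2 outer array:

```python
from itertools import product

# Half-fraction inner array for control factors A, B, C with C = A*B
inner = [(a, b, a * b) for a, b in product((-1, 1), repeat=2)]  # 4 runs

# Full factorial outer array for the two noise factors N1, N2
outer = list(product((-1, 1), repeat=2))  # 4 runs

# Repeat the fractional inner design once per outer-array row:
# every control-factor run is performed under every noise condition.
runs = [controls + noise for noise in outer for controls in inner]

print(len(runs))  # 4 x 4 = 16 runs instead of 2^5 = 32
```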

#124902

Michael Schlueter
Participant

Hello RikdC,

Noise factors are intended to change your output response. The stronger, the better.
Yes, this implies that you do control the Noise factor settings during experimentation.
I do not understand your question about discontinuous Noise factors.
In a Taguchi design, your objective is:

- to make your process less sensitive to Noise,
- at the least cost,
- in the shortest period of time.

Hence, a good Noise strategy is key. Effective Noise strategies are treated as a business secret by some companies, for good reason.

Hope this helps, Michael Schlueter

#124903

Michael Schlueter
Participant

Hello RikdC,
Sorry, I fear you are heading in the wrong direction with the full factorial on the Noise factors (though I understand it a little).
Why do you think this is specifically necessary in your situation?
The problem I see is that you increase cost with this strategy. You run more experiments (a full factorial on the Noise), which costs time, money, resources, material, etc. Time-to-results increases considerably. You do have competitors in this market, don't you?
A better approach would be to re-use your previous analysis of the Noise factors. Find two combinations of the Noise factors:

N+ = (N1+, N2+, ...) driving your output (viscosity) to values that are too high
N- = (N1-, N2-, ...) driving your output (viscosity) to values that are too low

Evaluate the inner array (design choices) under these two conditions, N+ and N-, only. (As a side effect you don't have to replicate. You can, but with strong N+ and N- in place you probably do not need to.)
Optimize the relative variation first (Taguchi's signal-to-noise ratio, SNR). Next, adjust on-target. Verify (2 runs: 1) initial design, 2) optimized design).
Done. Next improvement, please.
Hope this helps, Michael Schlueter
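The "optimize SNR first, then adjust on-target" step can be illustrated with a small sketch. The run labels and response values below are invented; the SNR form used is Taguchi's nominal-the-best ratio, 10 * log10(mean^2 / variance), computed from each run's pair of N- and N+ measurements:

```python
import math

# Invented data: the output (e.g. reaction time) for each inner-array
# run, measured once under the compounded N- condition and once under N+
responses = {
    "run 1": (38.0, 44.0),
    "run 2": (40.5, 41.5),
    "run 3": (36.0, 46.0),
}

def snr_nominal_the_best(ys):
    """Taguchi nominal-the-best SNR = 10 * log10(mean^2 / variance)."""
    n = len(ys)
    mean = sum(ys) / n
    var = sum((y - mean) ** 2 for y in ys) / (n - 1)
    return 10 * math.log10(mean ** 2 / var)

snr = {run: snr_nominal_the_best(ys) for run, ys in responses.items()}
best = max(snr, key=snr.get)
print(best)  # "run 2": smallest N-/N+ swing, hence the highest SNR
```

The run with the highest SNR is the least sensitive to the noise swing; a separate adjustment factor is then used to put its mean on target.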

#124905

RikdC
Member

Thanks Michael,
I see your point and appreciate your attitude, but I see a few problems.
I can't change the control factors for already existing data.
The signal (the product) is supposed to have been stable during the already observed data, but I don't want to base a full project of revisions on flying suppositions. I can't afford cheap solutions.
Am I wrong?

#124907

Michael Schlueter
Participant

Hi RikdC,
Thank you for your reply. Sorry, I’m afraid I don’t get your message. Can you please try to say it in different words?
Thank you, Michael

#124910

RikdC
Member

Michael,
In your previous message, I understood you were suggesting to use the data that, as I said, confirmed an influence of the factors identified as noise factors. Is this correct?
If this was your suggestion, then I am afraid I can't do that, for the following reasons:
- those results were obtained using always the same setup for the control factors;
- those results were obtained on production samples, and hence I can't be 100% sure that the variation in the response is coming only from the noise and control factors (that is, I don't know whether the signal for my process stayed the same);
- the only thing I can say is that no (major) changes in the process were introduced, and so I expect that the changes in the responses were due to noise;
- I understand that the information based on this assumption is weak, and as a matter of fact I only used it to identify potential noise factors.
I hope this makes it clearer.
Rgds

#124911

Michael Schlueter
Participant

RikdC,
Thank you, now I understand your points. Yes, that's what I suggested. It is a common approach to identify N+ and N- conditions (strong deviations from the target situation) for one set of control factors only (i.e. for one design or one prototype).
The assumption made is: when the design/process/system is sensitive for one prototype, it will be sensitive for other prototypes, too (i.e. for the other variations given by the inner array). For this purpose it is not necessary to know all the Noise factors. We are interested in the effect of the Noise factors, i.e. in their ability to drive the output off-target. So sometimes you can replace cumbersome combinations of Noise settings with a single, dominant one which 'simulates' (or represents) the effect.
Your assumption about the Noise factors is OK: it reflects the best knowledge you can have today. So I can only encourage you to find a (maybe preliminary) set of N+ and N- settings. N+ and N- are for your lab (where you perform the Taguchi experiment). They are not for production, as you most likely cannot realize them on the factory floor.
Remember: it is more important to drive your process off-target during experimentation than to scientifically replicate the manufacturing environment. If you have included the relevant Noise parameters, then as your process becomes more stable you will see improvement in your fab, too: reduced sensitivity to things you cannot (or do not want to) control.
I assume you can (and do) perform process optimization in a separate (lab) environment. Most of the time this is more efficient.
Hope this helps, Michael Schlueter

#124916

Michael Schlueter
Participant

RikdC,
You asked: "Given the above true, do I have to treat noise factors as discontinuous variables (categorical)? Something which is on or off?"
You vary the levels of the Noise factors, which can sometimes mean switching something on or off.
Examples (Noise factor and its levels, i.e. its alternatives):

- temperature (low [20 degC], high [100 degC])
- speed (slow, fast)
- supplier (highQ Inc, cheaper.com)
- motor (on, off)

Compounded Noise factors:

N+ (driving off-target towards values that are too high):
high temp. + fast speed + cheaper.com + motor on

N- (driving off-target towards values that are too low):
low temp. + slow speed + highQ Inc. + motor off

Build the prototype (or create the process situation) according to your inner array (control factors). Subject each prototype to the N+ and the N- situation. Take data. Do the Taguchi transforms (SNR and mean).
(I omit the closing steps of the Taguchi cycle for simplicity.)

Hope this helps, Michael Schlueter
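The compounding step above can be written down as a tiny sketch. The factor names and levels are taken from Michael's example; the assignment of each level to an off-target direction is an assumption that would come from a prior noise study:

```python
# Each Noise factor's levels are tagged with the direction they are
# assumed to push the output (known from an earlier noise investigation)
noise_factors = {
    "temperature": {"up": "100 degC",     "down": "20 degC"},
    "speed":       {"up": "fast",         "down": "slow"},
    "supplier":    {"up": "cheaper.com",  "down": "highQ Inc"},
    "motor":       {"up": "on",           "down": "off"},
}

# N+ combines every 'up' level and N- every 'down' level, so the whole
# outer array collapses to just two compounded noise conditions.
n_plus = {f: levels["up"] for f, levels in noise_factors.items()}
n_minus = {f: levels["down"] for f, levels in noise_factors.items()}

print("N+:", n_plus)
print("N-:", n_minus)
```

Each inner-array prototype is then run only under N+ and N-, instead of under all 2^4 = 16 combinations of the four noise factors.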

#124919

Jayen466
Participant

Combining or "compounding" noise factors in the way Michael describes is a great idea, IF you know which type of operator behaviour has the same effect as a high or low temperature.
When you say "discontinuous noise factors", I reckon you mean that they are discrete characteristics rather than parameters measured on a continuous scale. Noise factors can be either. Temperature is continuous; operator A/B is discrete. But if you use just two extreme temperature settings, then each factor simply has a + and a - level (high/low temperature, operator behaviour A/B), so they look and behave just the same in the experimental design.
I imagine modelling the possible operator behaviours in the experiment could be quite a challenge. It is always difficult to think of what someone else might do differently. The best thing might be to actually use a number of different people, but that would increase your run count...
I have no experience with doing this sort of thing in Minitab (and generally only know a bit of the theory), perhaps someone else here can help. Failing that, a google search for
Minitab Taguchi array
should get you something.
Best wishes
J.

#124921

Jayen466
Participant

You wrote:
> It seems that the advantage of a Taguchi design over a standard design is that the inner array can be highly fractionated while the outer array can be a full factorial.
> As we are looking for interactions between the inner and the outer array, we assume we are not too interested in interactions within the inner array.
> If this is true, then I understand.
Yes, that is what I meant. And Michael has indicated a way of reducing the size of the outer array as well: not by using a fractional factorial, but by compounding noises, i.e. combining noise extremes that are known to have a similar effect, so that two or more noise factors are allocated to one and the same column of the outer array.
J.

#124923

RikdC
Member

If I understand correctly, a setup that reduces variance at extreme noise-factor settings will most probably also reduce variance when the noise settings are not that extreme. Brilliant!
Some gains and some losses.
Gain: fewer experiments.
Loss: no discrimination within the noise factors. But if I assume that I can't change the noise (and then Vinny is back), this is not really a loss.
Thanks guys, I should consider inviting you all for dinner. Does anyone dare travel to Italy for a dinner?
Give me a call!
Rgds

#124929

Deep
Participant

Jayen466: yesterday you answered one of my questions regarding capability indices. Thanks once again for that. I have some more doubts. Can I have your email address, please? RikdC: really sorry for the off-topic.
Deep

#124932

Michael Schlueter
Participant

Hello Jayen466,
Thank you for your hint about the operators. Yes, they can have a big impact. It's the purpose of a preceding Noise investigation to find out. Depending on the result, you decide on the Noise strategy.

