ANOVA question.

Six Sigma – iSixSigma Forums Old Forums General ANOVA question.

Viewing 8 posts - 1 through 8 (of 8 total)


    I am working in Minitab 14, and while using ANOVA something happened that I am unfamiliar with (not surprising).  Why am I getting R-squared values from my ANOVA test? 
    Help in understanding this is appreciated. 



    Good question, wish I had an answer to give you. 



    Have you posted to Minitab?  They’re really good at responding.



    I am not sure what you are asking. Minitab Release 14 calculates this R-squared value for you. It is the square of the correlation coefficient and measures the proportion of variation that is explained by the model.
    Ideally R-squared would equal one, which would indicate zero error.
    Hope this answers your question.



    I guess that is what is confusing me.  I understand R-squared for my regression and correlation models, but I am confused about ANOVA, where I am interested in the difference in means.  Is it telling me basically the same thing as the p-value and F-statistic – that there is a difference in means and how certain I can be of it?  Having an R-squared value for ANOVA just feels like something new, and I do not want to miss out on why it is calculated for you and what it shows beyond the p-value or F-value.



    ANOVA will validate your model and display the results in terms of an F-test with an associated p-value, as you mentioned. Your R-squared, as I mentioned before, should be as close to 1 as possible, and R-squared(adj) should be maximized and as close to R-squared as possible. (All of this I'm sure you know.)
    You still use your p-values for statistical significance, but your R-squared values tell you how well your model fits and how much variation remains unexplained. Check p-values first, then your R-squared values for fit.
    Hope this helps.
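    To make that concrete, here is a minimal pure-Python sketch (not Minitab output – the data and the function name are my own, hypothetical choices) of how one-way ANOVA's R-squared falls out of the sums of squares: R-squared = SS(between) / SS(total), the proportion of total variation explained by group membership.

    ```python
    # Hypothetical sketch: R-squared for a one-way ANOVA computed by hand.
    # R^2 = SS(between) / SS(total) -- the share of the total variation
    # that is explained by which group an observation belongs to.

    def anova_r_squared(groups):
        """Return (ss_between, ss_within, r_squared) for a list of samples."""
        all_obs = [y for g in groups for y in g]
        grand_mean = sum(all_obs) / len(all_obs)
        ss_total = sum((y - grand_mean) ** 2 for y in all_obs)
        ss_between = sum(
            len(g) * ((sum(g) / len(g)) - grand_mean) ** 2 for g in groups
        )
        ss_within = ss_total - ss_between
        return ss_between, ss_within, ss_between / ss_total

    groups = [[3, 4, 5], [3, 6, 7], [3, 4, 5]]   # made-up measurements
    ss_b, ss_w, r2 = anova_r_squared(groups)
    print(round(r2, 4))  # → 0.2192  (about 22% of variation explained)
    ```

    A small R-squared like this can go hand in hand with a significant p-value: the means may genuinely differ while group membership still explains only a modest share of the total variation.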



       Although most statistics packages have separate modules for ANOVA and general linear models, the fact is that ANOVA and regression are two sides of the same coin. 
      In both you are entertaining a model, and the sums of squares are computed in exactly the same way.  The key difference is that in ANOVA the model is overparameterized – it contains more parameters than are needed to represent the desired effects.  An easy way to check this is to run the following:
    IDVAR    RESPONSE
      1         3
      1         4
      1         5
      2         3
      2         6
      2         7
      3         3
      3         4
      3         5
    Set up your ANOVA with IDVAR as the group variable and run it in the usual fashion – note the sums of squares for within and between.
    Next recode the IDVAR into two variables such that
    if IDVAR = 1 then V1 = 0, V2 = 1
    if IDVAR = 2 then V1 = 1, V2 = 0
    if IDVAR = 3 then V1 = 0, V2 = 0
    and run the general linear model RESPONSE = V1 V2.
    You will note that the sums of squares for model and error are exactly equal to the sums of squares for between and within.
      A full explanation of the connection can be found in any good book on regression or ANOVA. R-squared is one attempt to describe the amount of variation explained by the model you are entertaining, and it means the same thing in both instances.
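    The equivalence described above can be checked numerically. The sketch below (plain Python, standard library only; the helper names are my own) runs the nine observations both ways – once as a one-way ANOVA and once as the regression RESPONSE = V1 V2 using the recoding from the post – and the two decompositions come out identical.

    ```python
    # Sketch: one-way ANOVA and a dummy-coded regression give the same
    # sums-of-squares decomposition.  The regression is fit by solving the
    # normal equations (X'X)b = X'y with plain Gauss-Jordan elimination.

    def solve(a, b):
        """Solve the linear system a @ x = b by Gauss-Jordan elimination."""
        n = len(a)
        m = [row[:] + [b[i]] for i, row in enumerate(a)]
        for col in range(n):
            pivot = max(range(col, n), key=lambda r: abs(m[r][col]))
            m[col], m[pivot] = m[pivot], m[col]
            for r in range(n):
                if r != col:
                    f = m[r][col] / m[col][col]
                    m[r] = [m[r][k] - f * m[col][k] for k in range(n + 1)]
        return [m[i][n] / m[i][i] for i in range(n)]

    idvar    = [1, 1, 1, 2, 2, 2, 3, 3, 3]
    response = [3, 4, 5, 3, 6, 7, 3, 4, 5]

    # --- one-way ANOVA sums of squares ---
    grand = sum(response) / len(response)
    ss_total = sum((y - grand) ** 2 for y in response)
    means = {g: sum(y for i, y in zip(idvar, response) if i == g) / 3
             for g in (1, 2, 3)}
    ss_between = sum(3 * (means[g] - grand) ** 2 for g in (1, 2, 3))
    ss_within = ss_total - ss_between

    # --- same data as a regression: RESPONSE = b0 + b1*V1 + b2*V2 ---
    v1 = [1 if g == 2 else 0 for g in idvar]   # recoding from the post
    v2 = [1 if g == 1 else 0 for g in idvar]
    x = [[1, a, b] for a, b in zip(v1, v2)]    # design matrix with intercept
    xtx = [[sum(r[i] * r[j] for r in x) for j in range(3)] for i in range(3)]
    xty = [sum(r[i] * y for r, y in zip(x, response)) for i in range(3)]
    b0, b1, b2 = solve(xtx, xty)
    fitted = [b0 + b1 * a + b2 * c for a, c in zip(v1, v2)]
    ss_model = sum((f - grand) ** 2 for f in fitted)
    ss_error = sum((y - f) ** 2 for y, f in zip(response, fitted))

    print(ss_between, ss_model)  # model SS equals between-groups SS
    print(ss_within, ss_error)   # error SS equals within-groups SS
    ```

    The fitted values from the dummy regression are exactly the group means, which is why the model/error split reproduces the between/within split – and why R-squared means the same thing in both analyses.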



    I have the same question. What I am troubled by is: what is the model that the R-squared is referring to?  In ANOVA, I am looking for the difference between means.  I have several sets of measurements that I am comparing – where is the model?  In regression I have a dependent variable and am trying to explain where its variation comes from.  In ANOVA, I am looking at many variables, but none of them is a dependent variable to all the others.


The forum ‘General’ is closed to new topics and replies.