iSixSigma

LoF vs. curvature

  • #44970

    Anderson
    Participant

    Does anyone know the difference between lack of fit and curvature?
    I am confused. Seemingly, both indicate that the regression model is not linear?
    If that is right, what is the difference?
     
    regards, 

    #145369

    Eric Maass
    Participant

    Anderson,
    Lack of fit is a statistical test of whether there is a problem with how the mathematical model fits the data. For example, you can have significant lack of fit if you try to use a linear model, y = a + bx1 + cx2, when there should be curvature in the model in order to fit the centerpoints better, y = a + bx1 + cx2 + dx1^2.
    However, you can also have significant lack of fit for other reasons – for example, if you removed a meaningful term from the equation: y = a + bx1, when cx2 should also be in the equation.
    On the other hand, while lack of fit can be one indicator that your model should include curvature, another indication may come from the residual analysis: perhaps there is curvature in the residuals plotted against the actual or predicted values, or against one of the factors (what I refer to as a “smiley face” or “frowny face”).
    So, in summary – the concepts of lack of fit and curvature overlap, but are not equivalent. Lack of fit could mean you need curvature in your model, or it could mean you left out a key term. And lack of fit is not the only test that can indicate that you need curvature in your model – residual diagnostics are also an indicator.
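    A minimal sketch of both signals in Python (NumPy/SciPy, with made-up replicated data – the levels, coefficients, and noise here are assumptions for illustration, not anything from the post): a straight line is fit to a truly curved response, the lack-of-fit F test flags the problem, and the level-by-level mean residuals trace out the “smiley face” pattern.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    # replicated observations at 5 x levels; replicates provide "pure error"
    x_levels = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
    reps = 4
    x = np.repeat(x_levels, reps)
    y = 2 + 3 * x + 1.5 * x**2 + rng.normal(0, 0.5, size=x.size)  # true model is curved

    b, a = np.polyfit(x, y, deg=1)            # slope, intercept of the straight-line fit
    resid = y - (a + b * x)

    sse = np.sum(resid**2)                    # residual SS from the straight-line model
    group_means = np.array([y[x == lv].mean() for lv in x_levels])
    sspe = np.sum((y - np.repeat(group_means, reps)) ** 2)    # pure error from replicates
    sslf = sse - sspe                                         # lack-of-fit sum of squares
    df_lf, df_pe = len(x_levels) - 2, x.size - len(x_levels)  # c - 2 and n - c
    F = (sslf / df_lf) / (sspe / df_pe)
    p = stats.f.sf(F, df_lf, df_pe)
    print(f"lack-of-fit F = {F:.2f}, p = {p:.4f}")

    # residual diagnostic: mean residual per x level traces out the missed curvature
    for lv in x_levels:
        print(f"x = {lv:+.0f}: mean residual = {resid[x == lv].mean():+.2f}")

    The replicated runs are what make the split into pure error and lack of fit possible; without replicates you would rely mainly on the residual plots.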

    #145370

    Fit
    Participant

    Linearity is used in two ways in regression. If a regression is linear in the parameters, that does not mean it is a straight line (more technically, it does not refer to the shape of the response surface). A regression of the form y = beta0*10^(beta1*x) + error is not linear in the parameters. However, a quadratic curve, although curved in shape, is linear in the parameters because it can be written in a linear fashion. Linear models themselves can be very complex. Thus, “linear” has two meanings, and you have to know which one applies in which situation.
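    As a rough illustration of “linear in the parameters” (a sketch with invented data; NumPy assumed): the quadratic model has a curved shape, yet a single ordinary least-squares solve fits it, because the coefficients enter only as multipliers of the columns 1, x, x^2.

    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(-2, 2, 25)
    y = 1.0 + 2.0 * x - 0.8 * x**2 + rng.normal(0, 0.2, x.size)

    # quadratic model: curved shape, but linear in beta0, beta1, beta2
    X = np.column_stack([np.ones_like(x), x, x**2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    print("quadratic coefficients from one linear solve:", beta.round(2))

    # by contrast, y = beta0 * 10**(beta1 * x) is nonlinear in the parameters:
    # beta1 sits in an exponent, so it needs a transformation or an iterative fit.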
    The sum of squares for lack of fit (SSLF) can be used to test against various types of regression functions, and the degrees of freedom will differ depending on the function you want to test for. For a simple linear regression the lack-of-fit degrees of freedom are c – 2 (the 2 stands for beta0 and beta1, and c is the number of distinct x levels, i.e. the number of group means). The alternative hypothesis is that some other function, such as a quadratic or a logarithmic one, is a better fit. Quadratic functions represent curvature but are still linear in the parameters; the straight-line regression is linear both in the parameters and in its form. I hope this helps.
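    A rough numerical illustration of the degrees-of-freedom point (hypothetical replicated data; NumPy/SciPy assumed): the same pure-error term is reused for both fits, while the lack-of-fit degrees of freedom drop from c – 2 for the straight line to c – 3 once a quadratic term is added, and the significant lack of fit disappears with the quadratic model.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    x_levels = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])     # c = 6 distinct x values
    x = np.repeat(x_levels, 3)
    y = 4 + 1.2 * x + 0.6 * x**2 + rng.normal(0, 0.4, x.size)

    means = np.array([y[x == lv].mean() for lv in x_levels])
    sspe = np.sum((y - np.repeat(means, 3)) ** 2)            # pure error, df = n - c
    df_pe = x.size - len(x_levels)

    for degree in (1, 2):                                    # straight line, then quadratic
        coefs = np.polyfit(x, y, deg=degree)
        sse = np.sum((y - np.polyval(coefs, x)) ** 2)
        df_lf = len(x_levels) - (degree + 1)                 # c - 2, then c - 3
        F = ((sse - sspe) / df_lf) / (sspe / df_pe)
        p = stats.f.sf(F, df_lf, df_pe)
        print(f"degree {degree}: lack-of-fit F = {F:.2f}, p = {p:.4f}")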

