
Understanding results of Gage R&R - Minitab


This topic contains 0 replies, has 1 voice, and was last updated by Learn 10 years ago.


    Dear all,
    I have difficulty understanding the results from Minitab for an attribute Gage R&R.
    I intend to find the R&R % for 5 appraisers. What is the one value I should look at for this?
    What are repeatability (%) and reproducibility (%) in this result sheet?
    Can someone please explain the following questions:
    1. How do I find the overall Gage R&R from these results?
    2. How do I find out which question has the most variation among appraisers?
    3. Who is the poorest appraiser based on these results?
    4. What do Residual and P indicate? Should the P value be high or low?
    Attribute Agreement Analysis for RAM1, RAM2, MARK1, MARK2, JERRI1, JERRI2, …

    Within Appraisers
    Assessment Agreement
    Appraiser # Inspected # Matched Percent 95 % CI
    1 10 6 60.00 (26.24, 87.84)
    2 10 10 100.00 (74.11, 100.00)
    3 10 9 90.00 (55.50, 99.75)
    4 10 2 20.00 ( 2.52, 55.61)
    5 10 1 10.00 ( 0.25, 44.50)
    # Matched: Appraiser agrees with him/herself across trials.
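     
    For reference, the Percent column is simply # Matched / # Inspected, and the 95% CI is an exact (Clopper-Pearson) binomial interval. A minimal Python sketch, assuming scipy is available, that reproduces appraiser 1's row above:

    from scipy.stats import beta

    def exact_ci(matched, inspected, conf=0.95):
        # Exact (Clopper-Pearson) interval for a binomial proportion, in percent.
        alpha = 1 - conf
        lo = 0.0 if matched == 0 else beta.ppf(alpha / 2, matched, inspected - matched + 1)
        hi = 1.0 if matched == inspected else beta.ppf(1 - alpha / 2, matched + 1, inspected - matched)
        return 100 * lo, 100 * hi

    print(exact_ci(6, 10))   # appraiser 1: 6 of 10 matched -> approx (26.24, 87.84)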
     
    Fleiss’ Kappa Statistics
    Appraiser Response Kappa SE Kappa Z P(vs > 0)
    1 0 -0.250000 0.316228 -0.79057 0.7854
    1 -0.250000 0.316228 -0.79057 0.7854
    2 0 * * * *
    1 * * * *
    3 0 0.780220 0.316228 2.46727 0.0068
    1 0.780220 0.316228 2.46727 0.0068
    4 0 -0.600000 0.316228 -1.89737 0.9711
    1 -0.600000 0.316228 -1.89737 0.9711
    5 0 -0.818182 0.316228 -2.58732 0.9952
    1 -0.818182 0.316228 -2.58732 0.9952
    * When no or all responses across trials equal the value, kappa cannot be computed.
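     
    On the P column: Minitab tests the null hypothesis kappa = 0 against kappa > 0, so Z = Kappa / SE Kappa and P(vs > 0) is the upper-tail normal probability. Good agreement shows up as a large positive kappa and a small P value. A quick check against appraiser 3's row above, assuming scipy (a sketch, not Minitab's own code):

    from scipy.stats import norm

    kappa, se = 0.780220, 0.316228   # appraiser 3, within-appraiser row
    z = kappa / se                   # approx 2.467
    p = norm.sf(z)                   # one-sided P(vs > 0), approx 0.0068
    print(z, p)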

    Each Appraiser vs Standard
    Assessment Agreement
    Appraiser # Inspected # Matched Percent 95 % CI
    1 10 5 50.00 (18.71, 81.29)
    2 10 7 70.00 (34.75, 93.33)
    3 10 8 80.00 (44.39, 97.48)
    4 10 1 10.00 ( 0.25, 44.50)
    5 10 0 0.00 ( 0.00, 25.89)
    # Matched: Appraiser’s assessment across trials agrees with the known standard.
     
    Assessment Disagreement
    Appraiser # 1 / 0 Percent # 0 / 1 Percent # Mixed Percent
    1 1 33.33 0 0.00 4 40.00
    2 3 100.00 0 0.00 0 0.00
    3 0 0.00 1 14.29 1 10.00
    4 0 0.00 1 14.29 8 80.00
    5 1 33.33 0 0.00 9 90.00
     
    # 1 / 0: Assessments across trials = 1 / standard = 0.
    # 0 / 1: Assessments across trials = 0 / standard = 1.
    # Mixed: Assessments across trials are not identical.
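     
    The three disagreement columns are tallied per part: if an appraiser's two trials differ, the part counts as Mixed; if the trials agree with each other but contradict the standard, it counts as # 1 / 0 or # 0 / 1. From the percentages above, # 1 / 0 appears to be taken over the 3 parts whose standard is 0, # 0 / 1 over the 7 parts whose standard is 1, and Mixed over all 10 parts. A hedged Python sketch of the tally (the data layout here is assumed for illustration, not taken from the study):

    def disagreement(trials, standard):
        # trials: part -> list of one appraiser's ratings; standard: part -> known value (0/1)
        one_zero = zero_one = mixed = 0
        for part, ratings in trials.items():
            if len(set(ratings)) > 1:
                mixed += 1                       # trials disagree with each other
            elif ratings[0] == 1 and standard[part] == 0:
                one_zero += 1                    # consistently 1 where the standard is 0
            elif ratings[0] == 0 and standard[part] == 1:
                zero_one += 1                    # consistently 0 where the standard is 1
        return one_zero, zero_one, mixed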
     
    Fleiss’ Kappa Statistics
    Appraiser Response Kappa SE Kappa Z P(vs > 0)
    1 0 0.082094 0.223607 0.36714 0.3568
    1 0.082094 0.223607 0.36714 0.3568
    2 0 -0.176471 0.223607 -0.78920 0.7850
    1 -0.176471 0.223607 -0.78920 0.7850
    3 0 0.652015 0.223607 2.91590 0.0018
    1 0.652015 0.223607 2.91590 0.0018
    4 0 -0.018981 0.223607 -0.08489 0.5338
    1 -0.018981 0.223607 -0.08489 0.5338
    5 0 -0.133333 0.223607 -0.59628 0.7245
    1 -0.133333 0.223607 -0.59628 0.7245

    Between Appraisers
    Assessment Agreement
    # Inspected # Matched Percent 95 % CI
    10 0 0.00 (0.00, 25.89)
    # Matched: All appraisers’ assessments agree with each other.
     
    Fleiss’ Kappa Statistics
    Response Kappa SE Kappa Z P(vs > 0)
    0 0.0264550 0.0471405 0.561196 0.2873
    1 0.0264550 0.0471405 0.561196 0.2873

    All Appraisers vs Standard
    Assessment Agreement
    # Inspected # Matched Percent 95 % CI
    10 0 0.00 (0.00, 25.89)
    # Matched: All appraisers’ assessments agree with the known standard.
     
    Fleiss’ Kappa Statistics
    Response Kappa SE Kappa Z P(vs > 0)
    0 0.0810648 0.1 0.810648 0.2088
    1 0.0810648 0.1 0.810648 0.2088

    (Attribute Agreement Analysis graph not shown.)
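     
    On questions 1-3: this Attribute Agreement Analysis output does not include a single R&R % figure; the overall summary is the Between Appraisers and All Appraisers vs Standard sections (here 0 of 10 parts were rated identically by everyone, with an overall kappa near zero), and the per-appraiser tables suggest appraisers 4 and 5 have the lowest agreement. If you want to recompute the overall Fleiss' kappa outside Minitab, a hedged sketch using statsmodels follows; the ratings array is a placeholder layout (one row per part, one column per appraiser-trial), not the actual study data, and for binary data the result should be close to the per-response kappas Minitab prints.

    import numpy as np
    from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

    # 10 parts x 10 assessments (5 appraisers x 2 trials), binary ratings.
    # Placeholder data: substitute the real assessments from the study.
    ratings = np.random.randint(0, 2, size=(10, 10))

    table, _ = aggregate_raters(ratings)     # counts per part per response category
    print(fleiss_kappa(table, method='fleiss'))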
     
