
Attribute Agreement Analysis


This topic contains 6 replies, has 4 voices, and was last updated by Strayer 4 months, 1 week ago.

    #665158

    In our organization we conduct a monthly knowledge assessment test to gauge employees' level of understanding. We need to translate the strength of agreement among all the employees into a metric (such as a kappa value). Please suggest a method for expressing agreement among all employees as a statistical metric.
    Responses from 82 employees to 20 questions are attached.

    Thanks in advance.

    #665318

    No attachment seen.

    #671375

    Sorry for the inconvenience; our organization's network may have been blocking the upload.

    Attachments:
    1. Raw-data.xlsx
    #671413

    I think your first effort should just be a simple bean count of correct vs. incorrect answers by question across employees. If you do this you can see some very clear patterns in the frequency of correct vs. incorrect responses.

    The following questions have no incorrect answers: Q4, Q5, Q18.

    The following questions have the largest counts of incorrect answers: Q1 (50/82), Q3 (25/82), Q6 (20/82), Q8 (17/82), and Q17 (11/82).

    The rest of the questions have incorrect counts of 2, 2, 7, 5, 2, 1, 3, 3, 3, 7, 1, and 5.

    A simple Pareto chart of these results indicates Q1 is the single biggest hitter with respect to incorrect answers, followed by Q3 and Q6. Everything else is of secondary or tertiary consideration. Therefore, rather than worrying about employee agreement, it would seem that the main concern should be uncovering the reasons for the high incorrect counts for Q1, Q3, and Q6 and identifying ways to reduce those counts.
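
    For reference, a minimal Python sketch of this bean count and Pareto chart. It assumes the attached Raw-data.xlsx has one row per employee and columns Q1 through Q20 holding each True/False response; the answer key in the script is a hypothetical placeholder, not something given in the thread.

    import pandas as pd
    import matplotlib.pyplot as plt

    # Assumed layout: one row per employee, columns Q1..Q20 with True/False responses.
    responses = pd.read_excel("Raw-data.xlsx")
    questions = [f"Q{i}" for i in range(1, 21)]

    # Hypothetical answer key -- replace with the real correct answer for each question.
    answer_key = {q: True for q in questions}

    # Count incorrect answers per question and sort descending for the Pareto chart.
    incorrect = pd.Series({q: int((responses[q] != answer_key[q]).sum()) for q in questions})
    incorrect = incorrect.sort_values(ascending=False)

    ax = incorrect.plot(kind="bar")
    ax.set_ylabel("Incorrect answers (out of 82)")

    # Cumulative percentage line on a secondary axis completes the Pareto chart.
    cumulative = incorrect.cumsum() / incorrect.sum() * 100
    ax2 = ax.twinx()
    ax2.plot(range(len(incorrect)), cumulative.values, marker="o", color="black")
    ax2.set_ylabel("Cumulative %")
    plt.tight_layout()
    plt.show()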

    #671680

    Thanks so much for your quick reply. Yes, a simple bean count can give us a good inference, but my main challenge is that our manager has already set a deliverable in the form of a metric like a kappa value, specifying that the team's kappa value should be > 0.80. My question is: is it even possible to get a kappa value for this kind of data? If not, I need a strong enough argument to convince my manager to drop that metric and accept an alternative.

    #671815

    The kappa statistic is a measure of agreement between the responses of two entities assessing some aspect of the same test data set. For example, say you have two different groups of physicians and you want to see how well their Likert-scale ratings of something such as a stress fracture, as presented in a 2D X-ray, agree.

    All you have is a series of true/false questions and, for each participant, a count of correct answers. This is not an issue of agreement; it is an issue of the percentage of correct answers across questions for each participant, and the takeaway is as I mentioned in the previous post: which questions have the most incorrect answers, and why is that so?
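
    To make the two-rater idea concrete, here is a minimal sketch of Cohen's kappa computed by hand: observed agreement corrected for the agreement expected by chance. The two rating lists are made-up examples, not the data from this thread.

    from collections import Counter

    # Made-up ratings from two raters on the same ten items.
    rater_a = ["T", "F", "T", "T", "F", "T", "F", "T", "T", "F"]
    rater_b = ["T", "F", "T", "F", "F", "T", "T", "T", "T", "F"]

    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)

    # Observed agreement: fraction of items on which the two raters give the same answer.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Chance agreement: product of each rater's marginal proportions, summed over categories.
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    p_expected = sum((count_a[c] / n) * (count_b[c] / n) for c in categories)

    kappa = (p_observed - p_expected) / (1 - p_expected)
    print(f"observed = {p_observed:.2f}, chance = {p_expected:.2f}, kappa = {kappa:.2f}")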

    #672290

    I suspect that management wants to learn something about agreement on the correct or preferred answers. I'm reminded of a question an IT group was once asked: "Should our next CIO be a tech person or a business person?" Of course, a CIO needs to be strong in both. The question was asked to learn about the current IT staff's orientation, not for any sort of advice. I suspect that these true/false questions have a similar purpose and, from what you say, your management is looking for a single measure of agreement across all questions rather than a prioritization of them as in a Pareto. If that's what they're looking for, I'd ask them for the preferred answer to each question and then simply average the matches.
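
    A minimal sketch of that "average the matches" idea, reusing the same assumed Raw-data.xlsx layout as the earlier sketch; the preferred-answer key is again a hypothetical placeholder to be replaced with management's actual preferred answers.

    import pandas as pd

    # Assumed layout: one row per employee, columns Q1..Q20 with True/False responses.
    responses = pd.read_excel("Raw-data.xlsx")
    questions = [f"Q{i}" for i in range(1, 21)]

    # Hypothetical preferred answers -- substitute the real preferred answer per question.
    preferred = {q: True for q in questions}

    # Fraction of the 82 employees whose answer matches the preferred answer, per question.
    match_rate = pd.Series({q: (responses[q] == preferred[q]).mean() for q in questions})

    print(match_rate.sort_values())                      # weakest agreement first
    print(f"Overall agreement: {match_rate.mean():.1%}")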

