Sampling

It is often neither possible nor practical to collect all the data from your process: doing so can be too costly or too time-consuming, or the data may simply be inaccessible. In that case, you will need to sample your data. Overview: What is sampling? There […]

The Importance of Sample Size

What do Goldilocks and statistical analysis sample size have in common? A razor-sharp focus on “just right.” A sample size that is too big or too small leads to inaccurate data and wasted resources (although hopefully not three bears).

Random Sampling

Random sampling is a statistical method of selecting a sample of data from a larger set in such a way that each data point has an equal chance of being selected, so that the sample represents the population from which it was drawn. Overview: What is random sampling?  Random sampling is a method of choosing […]
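The equal-chance property is exactly what simple random sampling guarantees. A minimal sketch using Python's standard library (the measurement values below are stand-ins, not from the article):

```python
import random

def draw_random_sample(population, n, seed=None):
    """Simple random sample: every subset of size n is equally likely."""
    rng = random.Random(seed)
    return rng.sample(population, n)

measurements = list(range(1, 101))  # hypothetical process data
sample = draw_random_sample(measurements, 10, seed=42)
print(sorted(sample))
```

Because `random.sample` draws without replacement, no data point can appear twice in the sample.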

A Study of Estimates of Sigma in Small Sample Sizes

This paper looks at some of the methods of estimating standard deviation (which I will usually refer to as ‘sigma’). Additionally, I propose a new formula for estimating sigma for small sample sizes and also present a means to mathematically evaluate these competing estimates of sigma. The question was posed to me: “I have five […]
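The paper's own proposed formula is not reproduced here, but the classical small-sample approach it compares against, dividing the n-1 sample standard deviation by the bias-correction constant c4, can be sketched as follows (the five readings are made up):

```python
import math
import statistics

def c4(n):
    """Bias-correction constant for the sample standard deviation."""
    return math.sqrt(2.0 / (n - 1)) * math.gamma(n / 2) / math.gamma((n - 1) / 2)

def unbiased_sigma(data):
    """Estimate sigma from a small sample: stdev (n-1 divisor) divided by c4."""
    return statistics.stdev(data) / c4(len(data))

data = [9.8, 10.1, 10.3, 9.9, 10.2]  # five hypothetical readings
print(round(unbiased_sigma(data), 4))
```

For n = 5, c4 is about 0.94, so the corrected estimate runs roughly 6% higher than the raw sample standard deviation.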

Use a Classification and Regression Tree (CART) for Quick Data Insights

In the Analyze phase of a DMAIC (Define, Measure, Analyze, Improve, Control) Six Sigma project, potential root causes of variations and defects are identified and validated. Various data analysis tools are used for exploratory and confirmatory studies. Descriptive and graphical techniques help with understanding the nature of data and visualizing potential relationships. Statistical analysis techniques, […]
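In practice a statistics package builds the tree, but the core move of CART, repeatedly finding the split that most reduces impurity, can be sketched in plain Python (the data is invented for illustration):

```python
def gini(labels):
    """Gini impurity of a set of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(values, labels):
    """Find the threshold on one factor x that most reduces impurity of Y."""
    best = (None, gini(labels))
    for t in sorted(set(values)):
        left = [y for x, y in zip(values, labels) if x <= t]
        right = [y for x, y in zip(values, labels) if x > t]
        if not left or not right:
            continue
        w = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        if w < best[1]:
            best = (t, w)
    return best

x = [1, 2, 3, 10, 11, 12]  # hypothetical process factor
y = ["ok", "ok", "ok", "defect", "defect", "defect"]
print(best_split(x, y))  # threshold 3 separates the classes perfectly
```

A full CART implementation applies this search to every candidate factor and recurses on the two halves.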

Using Censored Data in Transactional Processes

Censored data is commonly used in reliability studies to determine the mean time to failure in order to establish warranty and maintenance periods for products. A large number of samples are subjected to either normal-use or accelerated-use conditions. Failure modes and occurrences are logged. Plotting the distribution of the sample failures over time allows the […]
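One common way to use right-censored data, assuming an exponential life model (an assumption, not the article's stated method), is the maximum-likelihood estimate MTTF = total time on test / number of failures: units still running when the test stops contribute their accumulated run time but no failure count. A sketch with invented test data:

```python
def mttf_exponential(records):
    """MLE of mean time to failure for right-censored exponential data.

    records: list of (time, failed) pairs; failed=False means the unit
    was still running when the test stopped (right-censored).
    """
    total_time = sum(t for t, _ in records)
    failures = sum(1 for _, failed in records if failed)
    if failures == 0:
        raise ValueError("no failures observed; MTTF is unbounded")
    return total_time / failures

# hypothetical test: three failures, two units still running at 1000 h
data = [(420, True), (615, True), (800, True), (1000, False), (1000, False)]
print(mttf_exponential(data))  # (420+615+800+1000+1000)/3
```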

TaaG Analysis – Fast and Easy for Comparing Trends in Large Data Sets

TaaG (trends at a glance) analysis is a fast way to compare trends of subsets of data across large data sets. It is an ideal tool to use in the Measure and Control phases of DMAIC (Define, Measure, Analyze, Improve, Control) projects. The value of TaaG analysis is best understood by way of example. Suppose […]

How to Avoid The Evils Within Customer Satisfaction Surveys

When the Ritz-Carlton Hotel Company won the Malcolm Baldrige National Quality Award for the second time in 1999, companies across many industries began trying to achieve the same level of outstanding customer satisfaction. This was a good thing, of course, as CEOs and executives began incorporating customer satisfaction into their company goals while also communicating […]

VOC: Comparing Reactive Data and Proactive Data

Collecting data – be it voice of the customer or otherwise – requires a plan. Details of the plan should include what data to collect, how to get the information, where the information will come from and so on. Before any of these details are defined, however, the first step is to identify what a […]

Mind Mapping: A Simpler Way to Capture Information

Reducing wait times is a perennial challenge for the service industry, particularly if you haven’t rooted out all the causes for the delays. The mind mapping visualization tool can help you make sense of large volumes of interconnected data and fast-track implementation of Lean Six Sigma.

Process Data Mining: Partitioning Variance

Manufacturing facilities can face major challenges when it comes to process improvement, largely because practitioners don’t always know enough about which underlying process factors (x’s) drive the improvement metric (Y). Practitioners might hold a brainstorming session to tap into the collective experience of experts involved in the process, and design experiments […]

Reducing Sampling Costs: Implementing a Variable Sampling Interval Strategy

Most manufacturing processes are controlled by sampling a product at some regular interval. Often, when a process is running normally, this interval is once every shift. It is not too surprising that in today’s economic climate, where cutting cost is of paramount importance, reducing sampling to save money is inviting, especially at large manufacturing facilities, […]
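The basic variable-sampling-interval idea can be sketched as a simple rule: sample again quickly when the last point sits in the control chart's warning zone, and relax the interval when it sits near the center line. The interval lengths below are hypothetical, not taken from the article:

```python
def next_interval(z, short=2.0, long=8.0, warning=1.0):
    """Variable sampling interval (VSI) rule of thumb.

    z: (x - center line) / sigma for the most recent sample.
    Returns hours until the next sample (values are illustrative).
    """
    return short if abs(z) > warning else long

print(next_interval(0.3))  # near center: relax to the long interval
print(next_interval(1.7))  # in the warning zone: sample again soon
```

The payoff is that a stable process earns the long interval most of the time, cutting sampling cost, while a drifting process is checked more often.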

Is There Bias In Your Random Sample?

By definition, a sample of size n is random if the probability of selecting the sample is the same as the probability of selecting every other sample of size n. If the sample is not random, a bias is introduced that causes a statistical sampling or testing error by systematically favoring some outcomes over others. […]
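The definition suggests a simple empirical check: draw many samples and verify that every item is selected at close to the same rate, roughly n/N. A sketch with made-up data:

```python
import random
from collections import Counter

def selection_frequencies(population, n, trials, seed=0):
    """Empirically check that every item is equally likely to be sampled."""
    rng = random.Random(seed)
    counts = Counter()
    for _ in range(trials):
        counts.update(rng.sample(population, n))
    return {item: counts[item] / trials for item in population}

freqs = selection_frequencies(list(range(10)), n=3, trials=20000)
# each item should appear in about n/N = 30% of samples
print(min(freqs.values()), max(freqs.values()))
```

A biased scheme (say, one that favors items at the front of a list) shows up immediately as frequencies drifting away from n/N.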

Rounding and Round-off Rules

When performing statistical data analyses, quality professionals are always challenged to maintain data integrity. When should you round up the answer; when should you round down? How many significant figures are appropriate for the data set that has been taken? Below are a set of simple rules that should help you traverse the perils of […]
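Two of these perils are easy to demonstrate in Python: rounding to a fixed number of significant figures, and the round-half-to-even ("banker's rounding") convention that Python's built-in round() follows. The helper function is a sketch, not from the article:

```python
import math

def round_sig(x, figures):
    """Round x to a given number of significant figures."""
    if x == 0:
        return 0.0
    magnitude = math.floor(math.log10(abs(x)))
    return round(x, figures - 1 - magnitude)

print(round_sig(0.012345, 3))  # 0.0123
print(round_sig(98765, 3))     # 98800
print(round(0.5), round(1.5), round(2.5))  # banker's rounding: 0 2 2
```

Round-half-to-even avoids the systematic upward bias that always rounding 5 up would introduce into long series of measurements.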

GE’s Six Sigma Focus On Span

We have heard about GE being one of the biggest proponents of Six Sigma, both for their own processes and for their customers. We’ve also heard how much GE has saved by implementing Six Sigma. This article is not a regurgitation of the existing rhetoric. Instead, I’d like to focus on an aspect of how […]

Actionable Information from Soft Data

Engineers, Six Sigma practitioners and other researchers often work with “hard” data – discrete data that can be counted and legitimately expressed as ratios. But what of “soft” data, things like opinions, attitudes and satisfaction? Can statistical process control (SPC) be applied here? Can process variation in customer satisfaction, for example, be measured and then reported to […]

How To Turn Process Data Into Information

A repeated series of actions and variables is a process. A collection of processes is a system. Virtually perfect Six Sigma quality results from an optimal interaction of all the variables in a given system. Process and system questions we all face at work include: Which variables are the most important to the customer? Am […]

Digging for Data: Insurance Companies Strive to Improve

Experience with property and casualty (P&C) insurers shows that one of the biggest profitability drivers is the expense incurred in staffing and settling claims. Many P&C insurers struggle with timely and efficient claims processing. In working with P&C companies, consulting groups have found three common challenges that project teams must address to improve the overall claims settlement […]

Building a Sound Data Collection Plan

Black Belts and Six Sigma practitioners who are leading DMAIC (Define, Measure, Analyze, Improve, Control) projects should develop a sound data collection plan in order to gather data in the Measure phase. There are several crucial steps that need to be addressed to ensure that the data collection process and measurement systems are stable and […]

Enlist Process Owners to Survive an Absence of Data

The beauty of Six Sigma, compared with other decision-making strategies, is that it is, by nature, data driven – it involves making decisions backed by evidence. In the absence of data, then, what good is Six Sigma? A Black Belt without data is like a navigator without a compass; finding north becomes complicated, but not impossible. […]

Eliminating the Fear About Using Confidence Intervals

One of the pleasures of teaching Green Belts is helping to eliminate the fear of statistical analysis. One technique is to place an emphasis on not only when and why a tool or methodology is used but also what the data says in “plain English.” Memorizing complex formulas may be the goal of many Master […]
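In plain English, a confidence interval says "the true mean is probably between these two numbers." A rough z-based sketch (for very small samples a t critical value would be more appropriate; the cycle times are invented):

```python
import math
import statistics

def mean_ci(data, z=1.96):
    """Approximate 95% confidence interval for the mean (z-based)."""
    m = statistics.mean(data)
    half = z * statistics.stdev(data) / math.sqrt(len(data))
    return m - half, m + half

cycle_times = [12.1, 11.8, 12.4, 12.0, 11.9, 12.3, 12.2, 11.7]  # hypothetical
low, high = mean_ci(cycle_times)
print(f"mean is between {low:.2f} and {high:.2f} with ~95% confidence")
```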

How to Determine Sample Size

In order to prove that a process has been improved, you must measure the process capability before and after improvements are implemented. This allows you to quantify the process improvement (e.g., defect reduction or productivity increase) and translate the effects into an estimated financial result – something business leaders can understand and appreciate. If data […]
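Both the before and after measurements need enough data to be trusted. Two textbook sizing formulas (standard results, not necessarily the exact ones this article walks through) can be sketched as:

```python
import math

def sample_size_for_mean(sigma, margin, z=1.96):
    """Minimum n so a z-based CI for the mean has half-width <= margin."""
    return math.ceil((z * sigma / margin) ** 2)

def sample_size_for_proportion(margin, p=0.5, z=1.96):
    """Minimum n to estimate a proportion within +/- margin (p=0.5 is worst case)."""
    return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

print(sample_size_for_mean(sigma=4.0, margin=1.0))  # (1.96*4)^2 = 61.47 -> 62
print(sample_size_for_proportion(margin=0.05))      # 384.16 -> 385
```

Note how halving the margin of error quadruples the required sample size, which is why "just right" matters for cost.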

Statistics Do Three Things – Describe, Compare and Relate

Fear of statistics is often a barrier to learning and applying Six Sigma methods. One way to minimize this fear is to remember that only three things can be done with statistics – describe, compare and relate. Many people are skeptical when they first hear this statement. “It couldn’t be that simple,” they think. However, […]

Improving Staff Scheduling at Providence Health System

As with most hospitals, labor is the largest budget expense at the Providence Alaska Medical Center (PAMC) in Anchorage. But benchmarking indicated that staff utilization at PAMC, a part of the Providence Health System, was above the 75th percentile nationally. To remedy this, in October 2003, a multidisciplinary team (nursing, leadership, finance […]
