
A Little Data’ll Do Ya

An introduction to basic statistics and data analysis for auditors

So what is a process approach anyway? Wait a minute. Hold up! This article is about data analysis for auditors, isn’t it? Well, yes, but before we can talk about how auditors can analyze data, we need to understand the processes from which this data comes.

A process can be thought of as an activity that transforms inputs into outputs. In manufacturing, the 6Ms—man, machine, material, measurement, method, and mother nature—are often identified as process inputs, with the understanding that problems with process outputs typically come from problems with process inputs. This is shown in figure 1.

In other words, the root causes of nonconforming outputs tie directly back to the process inputs. You don’t always need to use the 6Ms, though; it’s only important to identify inputs that make sense for your organizational processes. Data from most process outputs fall into a normal distribution, or bell-shaped curve. The bell-shaped curve is a graphical representation of process variation, of which there are two kinds. Common cause variation is the variation normal to any process. Special cause variation is variation caused by an external factor, and it falls outside the +/- three sigma control limits. Special cause variation should be investigated to determine the root cause and apply corrective action.
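To make the plus-or-minus three sigma idea concrete, here is a minimal, illustrative sketch of an individuals control chart (the measurements are hypothetical, not from the article; in practice SPC software does this). Sigma is estimated from the average moving range divided by the constant d2 = 1.128, the usual estimator for individuals charts, and points beyond the limits are flagged as possible special cause variation:

```python
import statistics

def i_chart_limits(data):
    """3-sigma limits for an individuals (I) control chart.

    Sigma is estimated from the average moving range divided by
    d2 = 1.128, the usual estimator for individuals charts.
    """
    center = statistics.mean(data)
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    sigma_hat = statistics.mean(moving_ranges) / 1.128
    return center - 3 * sigma_hat, center, center + 3 * sigma_hat

def special_cause_points(data):
    """Indices of points beyond the control limits (possible special cause)."""
    lcl, _, ucl = i_chart_limits(data)
    return [i for i, x in enumerate(data) if x < lcl or x > ucl]

# Hypothetical measurements: common cause noise plus one suspect point.
measurements = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 9.9, 10.0, 13.5, 10.1]
print(special_cause_points(measurements))  # -> [8]
```

The flagged index is a candidate for root cause investigation, not proof of a problem; that is exactly the judgment call the article describes.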

Having a normal distribution is important because bell curves allow for a deeper understanding of process behavior through the use of statistical tools and methods. For example, they can show whether a given process is producing only normal variation or whether some special cause is adversely affecting it. They can also show whether there’s a statistically significant difference between two events. Additionally, we can make certain predictions about a population based on the bell curve, such as how likely something is to be true or whether it will fall within a certain range.
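The idea of a statistically significant difference between two events can be sketched with a simple two-sample z-test. This is an illustrative, stdlib-only sketch under a normal approximation (adequate for reasonably large samples); the numbers are hypothetical:

```python
import math

def two_sample_z(mean1, sd1, n1, mean2, sd2, n2):
    """Two-sided p-value for a difference in means, using the normal
    approximation (adequate for reasonably large sample sizes)."""
    se = math.sqrt(sd1**2 / n1 + sd2**2 / n2)
    z = (mean1 - mean2) / se
    # p = 2 * (1 - Phi(|z|)), where Phi is the standard normal CDF
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

# Hypothetical example: did the process mean shift between two periods?
z, p = two_sample_z(10.00, 0.10, 30, 10.20, 0.10, 30)
print(z < 0, p < 0.05)  # a large |z| and a small p suggest a real shift
```

A small p-value says the observed difference is unlikely to be common cause noise alone, which is when an auditor should start asking what changed.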

These benefits are so important that when process data doesn’t fall into a normal distribution pattern, the data is often transformed. Transformation is a kind of “statistical hocus-pocus” that seeks to answer the question: What would this data look like if it were normally distributed, and which statistical tools can we apply based on this theoretical model? Data transformation is thankfully outside the scope of this article.

The process width of six standard deviations (+/- 3 standard deviations from the process center) is considered the voice of the process. This is the portion underneath the bell-shaped curve where 99.7 percent of the data falls, as shown in figure 2.

[Figure 2]

Someone way smarter than me decided years ago that 99.7 percent was a high enough percentage of the population to draw conclusions about the entire population. It’s at this plus or minus three standard deviations from the process center that control chart control limits are set. Note that out of control doesn’t necessarily mean out of specification. You want control limits to be within specification limits so that if a control limit is crossed, you have time to either troubleshoot the process or make adjustments before parts begin to consistently go out of specification. The two most important things to understand about a process are: Is it stable and in control? Is it capable?
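The 99.7 percent figure is the classic 68-95-99.7 empirical rule for the normal distribution, and it can be checked directly from the error function (a quick stdlib verification, not anything specific to the article):

```python
import math

def fraction_within(k):
    """Fraction of a normal population within +/- k standard deviations."""
    return math.erf(k / math.sqrt(2))

for k in (1, 2, 3):
    print(f"+/-{k} sigma: {fraction_within(k) * 100:.1f}%")
# +/-1 sigma: 68.3%, +/-2 sigma: 95.4%, +/-3 sigma: 99.7%
```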


Process capability is simply the ability of a process to consistently make parts to specification. Process capability indices compare the process width to the difference between the upper and lower specification limits. The most commonly used capability indices are Cp, Cpk, Pp, and Ppk. When the voice of the process equals the voice of the customer, the capability index is one; less than one, and the process is not capable. An industry rule of thumb when a capability requirement is called out in a specification is Cp/Cpk greater than or equal to 1.33 and Pp/Ppk greater than or equal to 1.67.

Cp/Cpk shows short-term variation. Pp/Ppk shows long-term variation, which captures more special cause variation. For this reason, Pp/Ppk will generally be lower than Cp/Cpk. Cpk and Ppk are the preferred methods for evaluating process capability, as they both account for process location and width.
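As a hedged sketch, the formulas are Cp = (USL − LSL)/6σ and Cpk = min(USL − μ, μ − LSL)/3σ. For simplicity the code below uses the overall sample standard deviation, so strictly speaking it is closer to Pp/Ppk; Cp/Cpk proper use a within-subgroup sigma estimate. The data and spec limits are hypothetical:

```python
import statistics

def cp(data, lsl, usl):
    """Cp: specification width over process width; ignores centering."""
    return (usl - lsl) / (6 * statistics.stdev(data))

def cpk(data, lsl, usl):
    """Cpk: like Cp, but penalizes a process that is off center."""
    mu = statistics.mean(data)
    sigma = statistics.stdev(data)
    return min(usl - mu, mu - lsl) / (3 * sigma)

# Hypothetical data centered at 10.0 against off-center spec limits.
data = [10.0, 10.1, 9.9, 10.05, 9.95, 10.0, 10.1, 9.9]
print(round(cp(data, 9.8, 10.4), 2))   # -> 1.25 (width alone looks capable)
print(round(cpk(data, 9.8, 10.4), 2))  # -> 0.83 (but the process is off center)
```

This pair of numbers shows why Cpk is preferred: a process can have a comfortably narrow spread and still make bad parts if it isn’t centered in the specification.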

Though statistical software often shows capability indices as a part of the control chart graphic, it’s important to understand that capability indices are not an element of control charts. Control charts (graphical) and capability indices (analytical), although complementary to one another, are two separate and distinct tools.


Now that we have covered the basics, let’s talk about data review. When reviewing an organization’s data analysis program, you want to ask what information is reviewed, by whom, and what the data is used for. When looking at data, you want to check not just whether it’s within specification or even in control; you also want to understand trends in the data. Understanding how to respond to out-of-control conditions or negative trends is an often-overlooked part of the less mature QMS.

“Decisions based on the analysis and evaluation of data and information are more likely to produce desired results” –ISO 9000:2015 2.3.6.1

What does this mean to us as auditors? As auditors, we should be looking to confirm the maxims below.

  • Don’t collect data if you aren’t going to plot it.
  • Don’t plot data if you aren’t going to analyze it.
  • Don’t analyze data if you aren’t going to do anything with the results.

Ideally, you would want to see process data evenly distributed around the process average, as shown in figure 5.

[Figure 5]

Some examples of commonly seen trends that would warrant further investigation are shown in the following figures. You will note that some of these changes can be subtle, so you will need to be on the lookout for them.

[Trend example figures]

There’s a set of rules developed at Western Electric in the 1950s, commonly known as the “Western Electric rules,” that gives additional examples of plotted data that might require investigation. However, you don’t need to be a statistical expert and remember every guideline for when to launch an investigation based on data. You should, however, be able to recognize when plotted data indicates the possibility of special cause variation and know what questions to ask.
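One classic run rule, eight successive points on the same side of the centerline (the exact count varies by source; some references use nine), can be sketched as follows with hypothetical data:

```python
def same_side_run_signals(data, center, run_length=8):
    """Indices where `run_length` consecutive points fall on the same side
    of the centerline, a classic run-rule signal of possible special cause."""
    signals = []
    side, run = 0, 0
    for i, x in enumerate(data):
        s = 1 if x > center else -1 if x < center else 0
        if s != 0 and s == side:
            run += 1
        else:
            side, run = s, (1 if s != 0 else 0)
        if run == run_length:
            signals.append(i)
            run = 0  # restart the count after signaling
    return signals

# Hypothetical data: eight points in a row above the centerline of 10.0.
data = [10.1, 9.9] + [10.2] * 8
print(same_side_run_signals(data, 10.0))  # -> [9]
```

Note that every point here may be well inside the control limits; the run itself is the signal, which is why trends deserve attention even when nothing is technically out of control.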

Some of the questions that an auditor might ask when reviewing the process monitoring program are:

  • What do you do when an adverse trend is encountered?
  • What do you do when an out of control condition is encountered?
  • How do you know what to do when an out of control condition or adverse trend is encountered?
  • If Cp/Cpk or Pp/Ppk data is captured, is there a minimum requirement?
  • How do yields or other process data compare across shifts or between similar lines?
  • Is there a structured program in place for review of and response to adverse trends/conditions in data?
  • How do you identify positive trends that may point to an opportunity to transfer a best practice from one process or work center to another?
  • What training is provided in SPC and data analysis?
  • Are the daily metrics captured aligned with organizational goals and objectives?
  • How were the control or specification limits selected? Don’t assume that a statistical or even logical method was used.

In our data-driven society, more data is available than ever before. It’s important to not just understand the data that we are looking at but to also know which data to review.

When looking at an organization’s data analysis, it’s important to understand how the flow down of strategic goals and objectives called out in ISO 9001:2015 is accomplished. It should be clear how operational targets support tactical objectives, which in turn support strategic goals. Each goal and objective should have an associated metric that will indicate when the goal or objective has been met. When there is not a clear link between goals, objectives, and metrics, it may be that unnecessary metrics are being tracked. Let’s look at the following example of proper flow down of corporate vision.

Vision: Become marketplace leader within the next five years.

Strategic goal: Increased market share

  • Metric: Industry ranking

Tactical objective: Improved quality

  • Metrics: Reduced customer return rate and increased customer satisfaction survey scores

Operational target: Reduce process variation.

  • Metric: Lower scrap rate


We seek to derive insights from the review of data. Through those insights, data is transformed into information on which decisions can be based.

Reviewing data doesn’t occur only by reviewing charts on the manufacturing floor. Often as part of an audit we are called upon to review validation reports. This may appear to be a daunting task; however, you don’t have to have a degree in statistics to provide a thorough review. Here are some basic tips:

  • Is the data that was specified in the protocol included in the report?
  • Have all of the required signatories signed off?
  • Have all the success criteria been met?
  • If all of the success criteria have not been met, were appropriate procedures followed?
  • Were any red lines crossed on the graphs?

 

Auditors play an important role in their assessment of an organization’s data analysis program. Understanding basic statistical tools and techniques will allow an experienced auditor to provide a thorough review, regardless of their background. I will close with this quote attributed to W. Edwards Deming: “In God we trust. All others please bring data.”

 

About the author

Lance B. Coleman has more than 20 years of leadership experience in the areas of quality engineering, Lean implementation, quality, and risk management in the medical device, aerospace, and other regulated industries. He has a degree in electrical engineering technology from the Southern Polytechnical University in Marietta, Georgia and is an American Society for Quality Senior Member, Certified Quality Engineer, Six Sigma Green Belt, Certified Quality Auditor, and Biomedical Auditor. He is also an Exemplar Global Principal QMS Auditor. Coleman is chair of U.S. TAG 302 and a voting member of U.S. TAG 176.

He is the author of Advanced Quality Auditing: An Auditor’s Review of Risk Management, Lean Improvement and Data Analysis (Quality Press, 2015), which has been nominated for an ASQ Crosby Award. Additionally, Coleman is an instructor for the ASQ Certified Quality Auditor Exam Preparatory and FMEA courses. As principal consultant of Full Moon Consulting, he has presented, trained, and consulted throughout the United States and abroad.

The post A Little Data’ll Do Ya appeared first on The Auditor.