by Duke Okes
Most organizations have more data than they will ever use. Some of it is maintained to have a record of what has occurred, other data is used to monitor and/or control business processes, and still more is used to predict future outcomes.
It’s the latter that has been gaining attention over the past decade or so. Data mining, big data, analytics, and business intelligence are all terms used to describe attempts to leverage large amounts of data to improve organizational performance.
For example, rather than spending large sums of money marketing to every potential customer, analytics can help identify and target the customers most likely to respond, allowing the campaign to achieve a much higher success rate with a smaller investment, in effect reducing waste. Such analyses can draw on distributions, ratios, clustering, and correlation, and often involve sophisticated modeling techniques such as multiple regression. Even text can be mined and turned into data that can be analyzed.
So what does this have to do with auditors? Financial auditors have been using analytics for a long time. For example, Benford’s Law states that in many naturally occurring sets of numbers, values beginning with the digit one occur far more often than those beginning with a nine, with the digits in between following a logarithmic pattern. This can be used to analyze the distribution of payments made by accounts payable to detect unusual deviations, which might signal unusual purchasing policies, fraud, or simply errors.
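A minimal sketch of such a check, written here in Python, tallies the leading digits of a set of payment amounts and compares the observed frequencies with the Benford expectation. The payment amounts shown are hypothetical, not drawn from any real ledger.

```python
import math
from collections import Counter

def benford_check(amounts):
    """Compare observed leading-digit frequencies with Benford's Law."""
    digits = [int(str(abs(a)).lstrip("0.")[0]) for a in amounts if a]
    counts = Counter(digits)
    total = len(digits)
    for d in range(1, 10):
        expected = math.log10(1 + 1 / d)              # Benford expectation for digit d
        observed = counts.get(d, 0) / total if total else 0.0
        print(f"digit {d}: observed {observed:.3f}, expected {expected:.3f}")

# Hypothetical accounts payable amounts
benford_check([1240.50, 187.20, 932.00, 1105.75, 45.10, 1310.00, 220.40, 96.30])
```

Large or persistent gaps between the observed and expected columns are the deviations worth asking about.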
Management system auditors (e.g., quality, environmental, safety) can also use analytics. An analysis conducted before an audit could help identify areas of the organization where an assessment might be more beneficial. For example, what does a graph of the number of instruments calibrated each day, week, or month look like from one time period to another (see Example 1 below)? Might deviations indicate that calibrations are falling behind, are being rushed at the last minute before an audit, or are merely database entries rather than actual calibrations?
During an audit, analytics might detect variances in process management. For example, an analysis of the ratio of production output to preventive or breakdown maintenance hours (see Example 2 below) might indicate when maintenance activities are falling behind relative to equipment usage. The audit could then explore why the ratio has changed, as well as how the change might have affected equipment reliability and/or product quality.
Note that using analytics does not mean one has to invoke statistics, although doing so can obviously provide significant value in the right situations. Simple visual presentations of data, looking for patterns, clusters, shifts, etc., can often be just as useful. For example, what does the distribution of supplier ratings look like? Do they perhaps hint that the ratings are being skewed to avoid having to deal with low performers? Would a scatter diagram of supplier audit scores and subsequent supplier performance indicate that the audit is a useful predictor of success? If not, then perhaps the supplier audit criteria should be modified.
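A minimal sketch of that last question, assuming Python 3.10 or later for the statistics.correlation function, could compute the correlation between audit scores and later delivery performance. The scores and delivery figures below are hypothetical.

```python
from statistics import correlation  # available in Python 3.10+

# Hypothetical supplier audit scores and subsequent on-time-delivery rates (%)
audit_scores = [72, 85, 90, 65, 78, 95, 70, 88]
on_time_delivery = [91, 96, 97, 88, 93, 98, 90, 95]

r = correlation(audit_scores, on_time_delivery)
print(f"Pearson r between audit score and delivery performance: {r:.2f}")
# A value near zero would suggest the audit is not predicting supplier success,
# prompting a review of the audit criteria.
```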
Obviously, the same concepts can be applied to safety, environmental, information security, and other management system audits. For example, looking for clusters of safety incidents by location, time of day, type of process activity, etc. can help identify areas where audits of safety controls might be more useful. The relationship between safety training and safety outcomes could also be explored; if there is no correlation, it might indicate either that the training is not effective or that other factors are more critical.
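A minimal sketch of the clustering idea, using entirely hypothetical incident records, might do nothing more than count incidents by location and shift and look for concentrations.

```python
from collections import Counter

# Hypothetical safety incident records: (location, hour_of_day, activity)
incidents = [
    ("Dock", 6, "forklift"), ("Dock", 7, "forklift"), ("Line 2", 14, "changeover"),
    ("Dock", 6, "forklift"), ("Warehouse", 22, "picking"), ("Dock", 7, "loading"),
]

by_location = Counter(loc for loc, _, _ in incidents)
by_shift = Counter("night" if h < 6 or h >= 18 else "day" for _, h, _ in incidents)

print("Incidents by location:", by_location.most_common())
print("Incidents by shift:", by_shift.most_common())
# A concentration (here, incidents cluster at the dock) points to where an audit
# of safety controls may be most useful.
```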
Ideally, the data used in analytics would consist of KPIs and other process metrics already in use by the organization to manage internal processes. In reality, some of the detailed information might require involvement of IT to extract data from various sources. Sample size and accuracy of the data are of course highly important.
An interesting application of analytics after the audit might be to predict the future time frame during which product/customer issues might become of concern, based on the lead time between system nonconformities found and when the impact is likely to be known. Another might be to try to predict customer satisfaction and other business outcomes based on combinations of process metrics, something that process owners should be doing themselves.
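A rough sketch of the second idea, fitting an ordinary least-squares model of customer satisfaction against a few process metrics, is shown below. The metrics, their values, and the satisfaction scores are hypothetical and far too few for a credible model; the point is only the mechanics.

```python
import numpy as np

# Hypothetical monthly process metrics: on-time delivery %, first-pass yield %,
# and complaint response time (days), with a satisfaction score for each month.
X = np.array([
    [95, 98, 2], [92, 97, 4], [97, 99, 1],
    [90, 95, 6], [94, 96, 3], [96, 98, 2],
], dtype=float)
y = np.array([8.7, 8.1, 9.2, 7.4, 8.3, 8.9])

# Ordinary least squares with an intercept term
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("intercept and coefficients:", np.round(coef, 3))
# The sign and relative size of each coefficient hint at which process metrics
# move with satisfaction, i.e., which controls may matter most.
```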
It’s these more complex relationships where analytics is currently of greatest interest, driving the need for more data scientists. For example, sophisticated models can be built based on the relationships between business processes, and sensitivity analysis used to discern which processes (and by implication, failure of which controls) are more critical to overall performance. And neural network software and social network analysis can help identify critical patterns or relationships that one might not ever envision.
It’s important to understand that analytics usually does not provide answers; rather, it raises questions that can lead to greater understanding of relationships. For auditors trying to assess the soundness of numerous process interrelationships and their controls, analytics is a tool worth adding to the tool belt. Eventually, many audits are likely to be automated, with computers continuously performing such analyses and reporting any anomalies.
Example 1 – Analysis of Calibration
Situation: An organization has 300 devices that are to be calibrated quarterly. To level the workload, they typically calibrate approximately one-third of them each month. An analysis of several past quarters reveals that they have typically done no fewer than 80 and no more than 120 in any given month.
An analysis of the number calibrated each month during the past quarter reveals that the second month fell below normal, and the final month was significantly higher than normal (see Table 1). What does this indicate? Some possibilities might be:
- The facility was shut down for a week during the second month
- Several calibration technicians took vacation during the second month
- Some of the calibrations in the third month were not actually performed, but data was entered into the database to indicate they had been done
Based on this analysis, the auditor has some additional questions to ask that go above and beyond what a typical audit would cover.
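A minimal sketch of this kind of check, using hypothetical counts rather than the actual Table 1 values, might simply flag any month that falls outside the organization's usual 80-120 range.

```python
# Hypothetical monthly calibration counts for the most recent quarter,
# checked against the organization's typical range of 80-120 per month.
monthly_counts = {"Month 1": 105, "Month 2": 62, "Month 3": 133}

LOW, HIGH = 80, 120
for month, count in monthly_counts.items():
    if count < LOW:
        print(f"{month}: {count} calibrations, below the usual range; ask why")
    elif count > HIGH:
        print(f"{month}: {count} calibrations, above the usual range; verify the work was actually done")
    else:
        print(f"{month}: {count} calibrations, within the usual range")
```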
Example 2 – Analysis of Maintenance
Situation: The auditor has access to both production output and preventive maintenance hours for a particular machine where quality problems have been an issue. He or she decides to analyze the ratio of output to maintenance hours; Table 2 is the result.
The analysis shows that less maintenance is being performed relative to the output of the machine. Questions that it might raise are:
- Is the process owner (of the machine) aware of this change? Was it intentional (e.g., are they trying to make up for lost product by not giving up the machine for maintenance)?
- Is the difference significant in that the preventive maintenance not being performed is related to aspects of the machine that could contribute to the quality issues?
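A minimal sketch of this ratio analysis, using hypothetical figures rather than the actual Table 2 values, computes the units produced per maintenance hour for each period and flags a sustained upward trend.

```python
# Hypothetical production output (units) and preventive maintenance hours
# for one machine over four quarters.
periods = ["Q1", "Q2", "Q3", "Q4"]
output_units = [12000, 12500, 13100, 13400]
pm_hours = [40, 36, 30, 24]

ratios = [o / m for o, m in zip(output_units, pm_hours)]
for p, r in zip(periods, ratios):
    print(f"{p}: {r:.0f} units per maintenance hour")

# A steadily rising ratio means maintenance is not keeping pace with machine
# usage, which is the trigger for the auditor's follow-up questions above.
if all(a < b for a, b in zip(ratios, ratios[1:])):
    print("Ratio is trending upward: maintenance effort is falling behind output.")
```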
References:
- For an in-depth discussion of Benford’s Law see the article available at: https://www.agacgfm.org/AGA/FraudToolkit/documents/BenfordsLaw.pdf
- For more on audit analytics see the AICPA publication “Audit Analytics and Continuous Audit: Looking Toward the Future,” available for download at: https://www.aicpa.org/InterestAreas/FRC/AssuranceAdvisoryServices/DownloadableDocuments/AuditAnalytics_LookingTowardFuture.pdf
About the author
Duke Okes is a knowledge architect who has trained thousands of quality management professionals in techniques for planning, controlling, and improving organizational processes. He has published numerous articles on how to advance the state of quality auditing and is the author of two books, Root Cause Analysis: The Core of Problem Solving and Corrective Action and Performance Metrics: The Levers for Process Management.