Six Reasons to Leave Excel Behind for Quality Analytics
In this second of three guest posts on actionable quality data, Mike Roberts, Marketing Analyst at LNS Research, shares his insights on using Excel for quality analytics.
There are many reasons spreadsheets have stuck around plant floors and production lines for so long, and they are not likely to be permanently relegated to the digital dust bin any time soon. One of the biggest is trust and familiarity: Excel has been around for decades and, in capable hands, is a proven, reliable tool for recording and analyzing data.
However, newer software applications now perform the job of Excel far better in every category, from speed, accuracy, and reliability to visibility and accessibility. Today’s leading manufacturing and quality analytics software automates the data analysis traditionally performed in spreadsheets.
More importantly, as the Big Data revolution takes hold, the old way of doing things (manually plugging numbers into boxes on a screen or relying on purpose-built macros) is simply not sustainable. In this post we’ll look at the main drawbacks of relying on Excel, and at how advanced analytics applications are poised to handle the coming onslaught of Big Data.
Six Main Drawbacks of Excel
- Prone to human error: By putting a human hand into the data collection and analysis process, Excel carries all the risks of manual operations: a false keystroke, a fatigued employee, a misinterpretation. Data reliability ends up tied to individual competency.
- Macros lack agility: The macros used to crunch numbers in Excel must be routinely updated as processes and operations evolve, requiring additional manual manipulation of the analysis tool and introducing yet another opportunity for human error.
- Not real-time/validated: Typically, when a heap of data is dumped into Excel, it can very quickly become stale and irrelevant to its intended use. Analyses in Excel are almost always based on historical data, which lacks the degree of timeliness many manufacturing professionals require.
- Adds to the challenges of disparate information: As mentioned in my colleague Matthew Littlefield’s post on the Data Heads’ blog, 47% of quality professionals in the LNS Research Quality Survey feel they have too many disparate and disconnected systems for measuring quality. By re-entering this disparate data into spreadsheets, workers act as an extension of that disconnected architecture and spread the data out even further.
- User access limited to one at a time: If multiple people need to edit or manage quality data, only one can edit the spreadsheet at a time, severely limiting efficiency and accessibility.
- Not scalable: Related to all the other points, the bottom line is that manual data dumps into Excel workbooks simply are not a scalable or sustainable way to operate.
The Widening Utility Gap between Excel and Quality Analytics
Although professionals may find Excel useful for ad-hoc analyses, and purpose-built macros may even provide deeper algorithmic support, the six points above show that this strategy does not deliver the insights needed to make actionable improvements at the speed of manufacturing.
As more manufacturers deploy quality management solutions like statistical process control (SPC) to gain real-time visibility into process performance, the drawbacks of relying on spreadsheets are becoming increasingly apparent. Market-leading manufacturers are leaving Excel behind for quality analytics, a set of next-generation analysis and visualization capabilities that compound the impact of these solutions.
Quality analytics solutions address every challenge noted above. They connect directly to quality data sources, handle real-time as well as historical quality data, and support simultaneous users. They generally come with pre-configured algorithms for monitoring, controlling, and modeling data, and companies leveraging this technology are able to take a more proactive approach to quality.
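To make that concrete, here is a minimal, illustrative sketch of the kind of control-limit check such platforms automate in place of a hand-maintained spreadsheet macro. It is not any vendor’s API: the function names and sample data are hypothetical, and real SPC implementations use more refined methods (moving-range estimates, run rules, and so on).

```python
# Illustrative sketch only: a simplified SPC control-limit check of the sort
# quality analytics platforms run automatically as new measurements arrive.
from statistics import mean, stdev

def control_limits(samples, sigma_level=3):
    """Compute the center line and lower/upper control limits from baseline data."""
    center = mean(samples)
    spread = stdev(samples)
    return center - sigma_level * spread, center, center + sigma_level * spread

def out_of_control(samples, new_value, sigma_level=3):
    """Flag a new measurement that falls outside the control limits."""
    lcl, _, ucl = control_limits(samples, sigma_level)
    return new_value < lcl or new_value > ucl

# Hypothetical baseline measurements from a process, then a new reading to check.
baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 10.0, 9.9, 10.3, 10.0]
print(control_limits(baseline))        # (LCL, center line, UCL)
print(out_of_control(baseline, 10.9))  # True -> worth investigating
```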
Further, Big Data is an increasingly common topic in the quality management arena, especially around finding correlations in real-time performance data. Many of today’s quality analytics solutions are designed to support Big Data analytics, so manufacturers can collect and analyze the massive amounts of real-time data generated by SPC and other quality process data sources.
Maximizing Your Data Visibility and Insights
The rampant use of spreadsheets and other ad-hoc solutions for managing quality is putting many organizations at risk of losing competitive ground. Market leaders have already adopted these next-generation capabilities and are now working to identify ways to leverage Big Data analytics for more strategic performance improvements.