Rethinking Six Sigma: Learning from practice in a digital age
Suzanne de Treville, Tyson R. Browning, Matthias Holweg, Rachna Shah
Journal of Operations Management, 69(8), 1371-1376. Published 2023-12-18. DOI: 10.1002/joom.1284
Abstract
As scholars in the field of operations management (OM), we would like to suggest that our field fell short in terms of due diligence when transitioning from statistical process control (SPC) to Six Sigma—accepting its heuristics and algorithms without scrutiny, building theory around them, and teaching them without recognizing their underlying statistical inaccuracies. It is our view that these incorrect heuristics and algorithms have introduced bias and inefficiencies in process improvement throughout the OM field, contributing to a disconnect between OM and knowledge development in data science more generally. We call for a return to first principles and the establishment of formal conceptual definitions for the theory and methods underlying Six Sigma. We urge the OM academic community to embrace the lessons from SPC and Six Sigma so that we prioritize our due-diligence role, beginning with a requirement that all algorithms and tools be vetted before entering our curricula and case-study repertoires, especially as we move forward into an age of big data and potentially further opaque algorithms and tools. We propose that our top journals be open to research that scrutinizes methods developed in practice, so that OM will continue to be the focal field for quality assurance—even when the “product” of a process is data.
The application of statistical methods to quality management has been a central theme in OM since Shewhart's seminal work at Western Electric's Hawthorne Works nearly a century ago (Shewhart, 1925; Shewhart, 1926). He studied process variation to determine “how and under what conditions observations may contribute to a rational decision to change or not a process to accomplish improvements” (W. Edwards Deming, p.i, in the foreword of the 1986 edition of Shewhart, 1931). His work fostered methods and tools to monitor, diagnose, measure, reduce, and control variation in the output of a process to increase its consistency and capability.
The capability of a process with respect to a process parameter hence can be defined as the number of standard deviations (“sigmas”) of the parameter that fit between the mean for that parameter and its specification limits. If the process is not centered between the specification limits, then the process capability is set using whichever specification limit is closer to the process center. Six Sigma moves the concept of process capability from descriptive to prescriptive. At the time that Motorola's process-improvement ideas were proposed by Bill Smith in 1986 (Harry, 1994), process capability was defined in terms of specification limits set three standard deviations from the process mean (a “3σ process”). Following Shewhart's logic, a centered and normally distributed 3σ process is expected to produce 2700 pieces per million (ppm) that fall more than three standard deviations from the process mean.
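The capability definition above can be sketched in a few lines of code. This is a minimal illustration, not a standard library routine; the function and variable names are our own:

```python
def sigma_capability(mean: float, std: float, lsl: float, usl: float) -> float:
    """Process capability expressed as the number of standard deviations
    that fit between the process mean and the *nearer* specification
    limit (lsl = lower, usl = upper), per the definition in the text."""
    return min(usl - mean, mean - lsl) / std

# A process centered at 10 with std 1 and limits at 7 and 13 is a 3-sigma process:
print(sigma_capability(10.0, 1.0, 7.0, 13.0))  # 3.0
# An off-center process is judged by the nearer limit:
print(sigma_capability(11.0, 1.0, 7.0, 13.0))  # 2.0
```

Note how shifting the mean toward one limit reduces the capability even though the spread is unchanged, which is why the definition uses the nearer specification limit.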
Motorola quantified the “zero defect” philosophy at the heart of the “quality is free” argument proposed by popular and influential authors like Crosby (1979) by translating the number of standard deviations that fit between the process center and the tightest specification limit into ppm defects. Historically it was assumed that a 3σ process capability was good enough. The 2700 ppm out-of-specification rate of a 3σ process, however, is too high in many contexts. Harry described Bill Smith as demanding a higher standard: “Bill's proposition was eloquently simple. He suggested that Motorola should require a 50% design margin for all its key product performance characteristics” (Harry, 2003, p. 1). This buffer, which adds another three standard deviations between either side of the mean and the specification limits (as illustrated in Figure 1), marked the inception of the “Six Sigma” concept.
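The ppm figures above follow directly from the normal distribution. A short sketch (function name ours) reproduces the 2700 ppm figure for a centered 3σ process and shows how small the corresponding rate becomes for a centered 6σ process:

```python
import math

def out_of_spec_ppm(sigma_level: float) -> float:
    """Expected out-of-specification parts per million for a centered,
    normally distributed process whose specification limits sit
    sigma_level standard deviations on either side of the mean.
    P(|Z| > k) = erfc(k / sqrt(2)) for a standard normal Z."""
    return math.erfc(sigma_level / math.sqrt(2.0)) * 1e6

print(round(out_of_spec_ppm(3.0)))     # 2700 ppm for a centered 3-sigma process
print(round(out_of_spec_ppm(6.0), 3))  # 0.002 ppm for a centered 6-sigma process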
Up to this point, Motorola's approach can be seen as a protocol that adds intuition and goal setting to a simple application of the theory of process capability. Had the Motorola project stopped there, a refined and stricter protocol might have seen the light of day, resulting in an incremental but eminently sensible extension to SPC as applied to high volume, repetitive-manufacturing contexts. Unfortunately, this is not what happened. Several approaches to the application of these core concepts—artifacts of the limitations of practical application at the time—are difficult to justify today. In the following subsections, we will address a few of the more notable of these legacy issues, and how they might be re-evaluated and addressed in current practice, teaching, and research.
Our intent is not to deny that implementing Six Sigma has had a positive impact in many companies over the past decades, nor do we wish to discredit the tried-and-tested heuristics that underpin SPC. We seek to draw attention to and rectify the persistent failure of our field to use the peer-review process to clarify Six Sigma's statistical claims, and to update process capability and SPC to allow decision-makers to use these tools with full comprehension. Peer-reviewed research is intended to eliminate this combination of misunderstanding and mystique, facilitating the transformation of interesting ideas emerging from the world of practice into a solid increase in knowledge. Before peer review can function, it is necessary for top journals to be open to submissions whose contribution is this type of due diligence. Back in the mid-1990s when Six Sigma emerged to great excitement, top OM journals were not perceived as being open to this kind of submission. Suri and de Treville (1986)—a conceptual article published in JOM—gives an idea of the kind of research contribution that could open the way to the fact checking that we are calling for. In the early 1980s, the idea of “rocks and stream” became popular: The claim was that as inventory was removed from a system, the resulting line stoppages would cause learning. Suri and de Treville explored in detail what happens between two workstations as intermediate inventory is reduced. Learning can occur when one workstation blocks or starves another, but such blockage and starvation can also result in a failure to learn. This academic exploration contributed to a more nuanced understanding of the relationship between inventory reduction and learning. Along similar lines, we suggest that JOM should be open to relatively technical, conceptual submissions that permit an in-depth exploration of a phenomenon that has emerged from practice to great enthusiasm.
Allowing these SPC and Six-Sigma methods to stand unchallenged and unchanged has caused confusion and kept OM theory and tools from evolving to be able to address today's data-rich environment. Rather than analyzing small samples of manually collected data from a production line, modern statistical quality monitoring systems produce high frequency, real-time data that is automatically captured in digital form. By linking SPC and process capability to sound statistical principles, we become able to evaluate systems like internet of things (IoT) and real-time location and sensing using OM thinking and tools. Effective use of advanced tools like machine learning requires a solid statistical foundation in a framework that has undergone peer review, not opaque algorithms and arcane heuristics.
We call for the OM community to insist that process capability and SPC be returned to sound statistical roots and formal conceptual definitions, recognizing Six Sigma as a protocol from the world of practice that prompted important discussions, but whose underlying assumptions are fundamentally flawed. This return is essential if we are to remain relevant to what is now happening in the world of quality assurance and data science more generally. We further call for JOM to be a journal that welcomes the manuscripts that bring ideas from practice to peer review, thus replacing blind faith in magic sauces with solid increases in knowledge.
Suzanne de Treville, Tyson R. Browning, Matthias Holweg, and Rachna Shah.
期刊介绍:
The Journal of Operations Management (JOM) is a leading academic publication dedicated to advancing the field of operations management (OM) through rigorous and original research. The journal's primary audience is the academic community, although it also values contributions that attract the interest of practitioners. However, it does not publish articles that are primarily aimed at practitioners, as academic relevance is a fundamental requirement.
JOM focuses on the management aspects of various types of operations, including manufacturing, service, and supply chain operations. The journal's scope is broad, covering both profit-oriented and non-profit organizations. The core criterion for publication is that the research question must be centered around operations management, rather than merely using operations as a context. For instance, a study on charismatic leadership in a manufacturing setting would only be within JOM's scope if it directly relates to the management of operations; the mere setting of the study is not enough.
Published papers in JOM are expected to address real-world operational questions and challenges. While not all research must be driven by practical concerns, there must be a credible link to practice that is considered from the outset of the research, not as an afterthought. Authors are cautioned against assuming that academic knowledge can be easily translated into practical applications without proper justification.
JOM's articles are abstracted and indexed by several prestigious databases and services, including Engineering Information, Inc.; Executive Sciences Institute; INSPEC; International Abstracts in Operations Research; Cambridge Scientific Abstracts; SciSearch/Science Citation Index; CompuMath Citation Index; Current Contents/Engineering, Computing & Technology; Information Access Company; and Social Sciences Citation Index. This ensures that the journal's research is widely accessible and recognized within the academic and professional communities.