Pub Date: 2022-02-02 | DOI: 10.1108/ijqrm-09-2021-0337
Hasan Uvet, H. Celik, Sedat Cevikparmak, Saban Adana, Yavuz Idug
Purpose: In the last 20 years, e-waste has become a serious problem, driven by overwhelming electronics consumption; however, there has been limited research on how to reduce such waste in a structured manner. The aim of this study was to use a simulation methodology to investigate the dynamics of upfront investment in reliability enhancement promoted by performance-based contracting (PBC), as a function of the number of spare parts and the duration of the contract.
Design/methodology/approach: The research first details the relevant mathematical equations and uses game theory to demonstrate the utility of the approach for supplier-buyer relationships. Next, the effects of reliability enhancement, spare parts and PBC are analyzed using a BlockSim simulation model.
Findings: The results indicate strong relationships among system design cost, reliability, availability and service cost. Investment in reliability increases system availability while reducing total service costs. Furthermore, increasing the spare parts inventory was found to have little influence on the readiness of highly reliable systems. The findings support the notion that PBC reduces e-waste by incentivizing upfront investment in reliability growth, which increases system availability.
Research limitations/implications: Recognizing these findings in the context of buyer-supplier relationships will help managers better understand the value of upfront reliability investment, which reduces maintenance, repair and overhaul requirements, avoids the need to plan for extra spare parts and minimizes inventory volume and the resulting e-waste.
Practical implications: This study also clarifies the uncertainty associated with upfront investment and identifies potential incentives for suppliers.
Originality/value: The main contribution of this study is its use of PBC for e-waste reduction, highlighting the effects of upfront investment in reliability enhancement. The authors apply a game theory model to illustrate the relationship between incentives and upfront investment, and demonstrate how increased spare parts levels can be counterproductive to achieving readiness, reducing inventory and, consequently, e-waste.
Title: "Decreasing e-waste through reliability enhancement encouraged by performance-based contracting" (International Journal of Quality & Reliability Management)
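The trade-off the abstract describes, upfront reliability investment versus spare-parts consumption over a contract, can be sketched with a back-of-the-envelope availability model. This is not the authors' BlockSim model; the MTBF, MTTR and contract figures below are hypothetical.

```python
# Illustrative sketch (not the paper's BlockSim model): how upfront
# reliability investment trades off against spare-parts consumption
# over a performance-based contract. All numbers are hypothetical.

def inherent_availability(mtbf_hours, mttr_hours):
    """Steady-state availability of a repairable item."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def expected_spares(contract_hours, mtbf_hours):
    """Expected failures (hence spares consumed) during the contract."""
    return contract_hours / mtbf_hours

baseline = {"mtbf": 2_000, "mttr": 24}   # no reliability investment
improved = {"mtbf": 5_000, "mttr": 24}   # after upfront investment

contract = 50_000  # contract duration in operating hours

for name, d in (("baseline", baseline), ("improved", improved)):
    a = inherent_availability(d["mtbf"], d["mttr"])
    n = expected_spares(contract, d["mtbf"])
    print(f"{name}: availability={a:.4f}, expected spares={n:.1f}")
```

The higher-MTBF design is both more available and consumes fewer spares over the contract, which is the mechanism by which PBC incentives would reduce e-waste.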
Pub Date: 2022-02-01 | DOI: 10.1108/ijqrm-07-2021-0206
Ramadas Thekkoote
Purpose: Quality 4.0 (Q4.0) concerns quality management in the era of Industry 4.0 (I4.0). In particular, it concentrates on the digital techniques used to improve organizational capabilities and ensure delivery of the best-quality products and services to customers. The aim of this research is to examine the vital elements of Q4.0 implementation.
Design/methodology/approach: A literature review was carried out to analyze past studies in this emerging research field.
Findings: This research identified ten factors that contribute to the successful implementation of Q4.0: (1) data, (2) analytics, (3) connectivity, (4) collaboration, (5) app development, (6) scalability, (7) compliance, (8) organizational culture, (9) leadership and (10) training for Q4.0.
Originality/value: The resulting understanding of the factors behind successful Q4.0 implementation in the digital transformation era can assist firms in developing new ways to implement Q4.0.
Title: "Enabler toward successful implementation of Quality 4.0 in digital transformation era: a comprehensive review and future research agenda"
Pub Date: 2022-02-01 | DOI: 10.1108/ijqrm-07-2021-0216
S. Patil, A. Bewoor
Purpose: This study focuses on the application of reliability-centered maintenance (RCM) to a steam boiler used in the textile industry, aiming to demonstrate the development and application of RCM to such a system.
Design/methodology/approach: RCM is a structured process that develops the maintenance activities needed on physical resources in their operational environment to realize their inherent reliability, by logically incorporating an appropriate mixture of reactive, preventive, condition-based and proactive maintenance methods. A detailed analysis of the RCM approach is presented to develop a preventive maintenance (PM) program and improve the reliability and availability of the steam boiler system.
Findings: The research reveals that the identification of PM tasks is a good indicator of a PM program's efficiency and can expose important sources of maintenance-related downtime. It also finds that the majority of maintenance programs that claim to be proactive are, in fact, reactive. The article further shows how RCM may be successfully applied to any system, resulting in increased system reliability.
Research limitations/implications: The paper is a pilot study of the development and implementation of the RCM technique for a textile industry steam boiler. It is suggested that the developed RCM model can be applied to the entire plant.
Originality/value: The paper presents a comprehensive RCM model framework as well as an RCM decision framework, providing maintenance managers and engineers with a step-by-step approach to RCM implementation. The proposed framework is notable in that it can be used for qualitative and quantitative analysis at the same time.
Title: "Optimization of maintenance strategies for steam boiler system using reliability-centered maintenance (RCM) model – A case study from Indian textile industries"
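An RCM decision framework of the kind described selects among reactive, preventive, condition-based and proactive strategies from the characteristics of each failure mode. A toy decision rule in that spirit (not the paper's actual framework; the attribute names are invented for illustration) might look like:

```python
# Hypothetical sketch of an RCM-style decision rule, in the spirit of
# an RCM decision diagram: map a few failure-mode attributes to a
# maintenance strategy. Attribute names are invented, not the paper's.

def rcm_strategy(hidden_failure: bool, safety_impact: bool,
                 wear_out: bool, condition_detectable: bool) -> str:
    if condition_detectable:
        # Degradation is observable before failure: monitor it.
        return "condition-based maintenance"
    if wear_out:
        # Age-related failure pattern: replace on a schedule.
        return "preventive (time-based) replacement"
    if hidden_failure or safety_impact:
        # Consequences unacceptable and no warning: test or redesign.
        return "failure-finding / redesign"
    # Economically tolerable, random failures: let it run.
    return "run-to-failure (reactive)"
```

The point of such a rule is the one the findings make: without an explicit decision step, programs default to the last branch and become reactive in practice.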
Pub Date: 2022-02-01 | DOI: 10.1108/ijqrm-08-2021-0283
M. Saini, Drishty Goyal, Ashish Kumar, R. Patil
Purpose: Demand for sewage treatment plants is increasing rapidly, especially in countries like India. The biological and chemical units of such plants are critical and need to be designed and developed to achieve the desired levels of reliability, maintainability and availability.
Design/methodology/approach: This paper investigates and optimizes the availability of the biological and chemical unit of a sewage treatment plant. A novel mathematical model for this unit is developed using the Markovian birth-death process. A set of Chapman–Kolmogorov differential equations is derived for the model, and a generalized solution is found using soft computing techniques, namely the genetic algorithm (GA) and particle swarm optimization (PSO).
Findings: The results for the availability function show that PSO outperforms GA. The optimum availability of the biological and chemical processing unit achieved using GA is 0.9324, corresponding to a population size of 100, 300 evolutions, a mutation rate of 0.6 and a crossover rate of 0.85, while PSO reaches an optimum availability of 0.936240 after 45 iterations.
Research limitations/implications: This paper investigates and optimizes the availability of the biological and chemical units of a sewage treatment plant through a novel mathematical model based on the Markovian birth-death process.
Originality/value: The availability model of the biological and chemical units of a sewage treatment plant is developed using field failure data and judgments collected from experts. Furthermore, the availability of the system has been optimized to achieve the desired levels of reliability and maintainability.
Title: "Availability optimization of biological and chemical processing unit using genetic algorithm and particle swarm optimization"
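As a minimal illustration of the nature-inspired search used here, the sketch below runs a basic particle swarm optimization on a one-dimensional objective: steady-state availability r/(r + λ) minus a linear cost on the repair rate r. The objective, parameters and bounds are hypothetical stand-ins, not the paper's Chapman–Kolmogorov model; this toy objective has a known analytic optimum at r* = sqrt(λ/c) − λ ≈ 3.06, which makes the search easy to check.

```python
# Toy PSO maximizing availability minus repair-capacity cost.
# Hypothetical objective, not the paper's sewage-plant model.
import random

random.seed(42)

LAMBDA, COST = 0.1, 0.01  # failure rate; cost per unit repair rate

def objective(r):
    """Availability r/(r + LAMBDA) minus a linear cost on r."""
    return r / (r + LAMBDA) - COST * r

def pso(n_particles=30, iters=200, lo=0.0, hi=10.0, w=0.7, c1=1.5, c2=1.5):
    xs = [random.uniform(lo, hi) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest, pval = xs[:], [objective(x) for x in xs]
    g = max(range(n_particles), key=lambda i: pval[i])
    gbest, gval = pbest[g], pval[g]
    for _ in range(iters):
        for i in range(n_particles):
            # Velocity: inertia + pull toward personal and global bests.
            vs[i] = (w * vs[i]
                     + c1 * random.random() * (pbest[i] - xs[i])
                     + c2 * random.random() * (gbest - xs[i]))
            xs[i] = min(hi, max(lo, xs[i] + vs[i]))
            v = objective(xs[i])
            if v > pval[i]:
                pbest[i], pval[i] = xs[i], v
                if v > gval:
                    gbest, gval = xs[i], v
    return gbest, gval

best_r, best_val = pso()
print(f"optimal repair rate ~ {best_r:.3f}, objective ~ {best_val:.4f}")
```

A GA would attack the same objective with selection, crossover and mutation instead of velocity updates; comparing the two on one objective is exactly the benchmarking exercise the abstract reports.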
Purpose: Process monitoring is a way to manage the quality characteristics of products in manufacturing processes. Several process-monitoring approaches based on machine learning algorithms have been proposed in the literature and have gained the attention of many researchers. In this paper, the authors develop machine learning-based control charts for monitoring the fraction of nonconforming products in smart manufacturing, proposing a relevance vector machine with a Bayesian sparse kernel, optimized by a differential evolution algorithm, for efficient monitoring.
Design/methodology/approach: A new approach to data analysis, modelling and monitoring in the manufacturing industry was developed. The study uses the Bayesian sparse kernel technique of the relevance vector machine to improve on the support vector machine, which is used for both regression and classification problems. The authors compared the performance of the proposed relevance vector machine with other machine learning algorithms, namely the support vector machine, an artificial neural network and a beta regression model. The proposed approach was evaluated under different shift scenarios of the average run length using Monte Carlo simulation.
Findings: The authors analyse a real case study in a manufacturing company using the best-performing machine learning algorithms. The results indicate that the proposed relevance vector machine-based control charts are excellent quality tools for monitoring defective products in a manufacturing process. A comparative analysis with four machine learning models is used to evaluate the performance of the proposed approach; the relevance vector machine performs slightly better than the support vector machine, artificial neural network and beta models.
Originality/value: This research differs from others by providing approaches for monitoring defective products: machine learning-based control charts are used to monitor product failures in a smart manufacturing process. The key contribution of this study is the development of different models for fault detection and for identifying change points in the manufacturing process. Moreover, the research indicates that machine learning models are adequate tools for modelling and monitoring the fraction of nonconforming products in industrial processes.
Title: "Machine learning-based control charts for monitoring fraction nonconforming product in smart manufacturing"
Authors: Simone Massulini Acosta, Angelo Marcio Oliveira Sant'Anna
Pub Date: 2022-01-31 | DOI: 10.1108/ijqrm-07-2021-0210
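Average run length (ARL) evaluation by Monte Carlo, the yardstick used in the abstract, can be illustrated with a plain p-chart in place of the authors' relevance vector machine: simulate inspection samples until the observed fraction nonconforming crosses a control limit, and average the run lengths over many replications. All parameters below (p0 = 0.05, n = 100, the shift to p = 0.10) are illustrative.

```python
# Monte Carlo ARL estimation for a Shewhart p-chart (illustrative
# stand-in for the paper's machine-learning-based chart).
import math
import random

random.seed(1)

def p_chart_limits(p0, n, k=3.0):
    """Classical 3-sigma limits for the fraction nonconforming."""
    sd = math.sqrt(p0 * (1 - p0) / n)
    return max(0.0, p0 - k * sd), p0 + k * sd

def run_length(p_true, p0, n, max_steps=10_000):
    """Samples drawn until the chart signals (limit exceeded)."""
    lcl, ucl = p_chart_limits(p0, n)
    for t in range(1, max_steps + 1):
        defects = sum(random.random() < p_true for _ in range(n))
        phat = defects / n
        if phat > ucl or phat < lcl:
            return t
    return max_steps

def average_run_length(p_true, p0=0.05, n=100, reps=200):
    return sum(run_length(p_true, p0, n) for _ in range(reps)) / reps

arl_shift = average_run_length(0.10)            # out-of-control ARL
arl_ic = average_run_length(0.05, reps=100)     # in-control ARL
print(f"ARL at p=0.10: {arl_shift:.1f}, ARL at p=0.05: {arl_ic:.1f}")
```

A good chart has a large in-control ARL (few false alarms) and a small out-of-control ARL (fast detection); the paper's comparison of RVM, SVM, ANN and beta-regression charts is over exactly these two quantities across shift scenarios.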
Pub Date: 2022-01-26 | DOI: 10.1108/ijqrm-09-2021-0334
V. Tambe, G. Bansod, S. Khurana, Shardul Khandekar
Purpose: The purpose of this study is to test Internet of Things (IoT) devices with respect to reliability and quality.
Design/methodology/approach: The authors present an analysis of design metrics across the perception, communication and computation layers for a constrained environment. Based on their literature survey, the authors also present a study showing that multipath routing is more efficient than single-path routing and that a retransmission mechanism is not preferable in an IoT environment.
Findings: This paper discusses the reliability of the various IoT layers with respect to the methodologies used in those layers. The authors ran performance tests on an Arduino Nano and a Raspberry Pi using the AES-128 algorithm. It was empirically determined that the time required to process a message increases exponentially, exceeding benchmark estimates, as message size increases. From these results, the authors can determine the optimal message size that can be processed by an IoT system employing controllers running 8-bit or 64-bit architectures.
Originality/value: The authors test the performance of standard security algorithms on different computational architectures and discuss the implications of the results. Empirical results demonstrate that encryption and decryption times increase nonlinearly rather than linearly as message size increases.
Title: "Reliability and availability of IoT devices in resource constrained environments"
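A message-size timing experiment of the kind described can be sketched as below. The Python standard library has no AES implementation, so SHA-256 hashing stands in as the per-message workload here; on the Arduino Nano or Raspberry Pi one would time the AES-128 routine itself with the same harness shape.

```python
# Hypothetical timing harness in the spirit of the experiment:
# measure per-message processing time as message size grows.
# SHA-256 is a stand-in workload (stdlib has no AES).
import hashlib
import time

def process(message: bytes) -> bytes:
    """Stand-in for the cipher under test."""
    return hashlib.sha256(message).digest()

def time_workload(size_bytes, reps=200):
    """Average seconds per operation for a message of given size."""
    msg = b"\x00" * size_bytes
    t0 = time.perf_counter()
    for _ in range(reps):
        process(msg)
    return (time.perf_counter() - t0) / reps

for size in (64, 256, 1024, 4096):
    print(f"{size:5d} B -> {time_workload(size) * 1e6:.1f} us/op")
```

Plotting these averages against message size is what reveals whether the growth is linear or worse, which is the basis for choosing an optimal message size on a given architecture.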
Pub Date: 2022-01-25 | DOI: 10.1108/ijqrm-07-2021-0245
Tobias Mueller, Alexander Segin, Christoph Weigand, R. H. Schmitt
Purpose: In the determination of measurement uncertainty, the GUM procedure requires building a measurement model that establishes a functional relationship between the measurand and all influencing quantities. Since the effort of modelling, as well as of quantifying the measurement uncertainties, depends on the number of influencing quantities considered, the aim of this study is to determine the relevant influencing quantities and remove irrelevant ones from the dataset.
Design/methodology/approach: This work investigated whether the modelling effort for the determination of measurement uncertainty can be reduced by using feature selection (FS) methods. For this purpose, nine different FS methods were tested on 16 artificial test datasets whose properties (number of data points, number of features, complexity, features with low influence and redundant features) were varied via a design of experiments.
Findings: Based on a success metric and on the stability, universality and complexity of each method, two FS methods were identified that reliably distinguish relevant from irrelevant influencing quantities for a measurement model.
Originality/value: For the first time, FS methods were applied to datasets with the properties of classical measurement processes. The simulation-based results serve as a basis for further research on FS for measurement models. The identified algorithms will be applied to real measurement processes in the future.
Title: "Feature selection for measurement models"
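One simple member of the FS family such a benchmark would include is a filter method: rank candidate influencing quantities by absolute correlation with the measurand and keep the top-ranked ones. The synthetic data below (two relevant features, one irrelevant) is purely illustrative, not one of the paper's 16 test datasets.

```python
# Filter-style feature selection sketch: rank candidate influencing
# quantities by |Pearson correlation| with the measurand y.
# Synthetic illustrative data, not the paper's datasets.
import math
import random

random.seed(0)

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

n = 500
x1 = [random.gauss(0, 1) for _ in range(n)]  # relevant quantity
x2 = [random.gauss(0, 1) for _ in range(n)]  # relevant quantity
x3 = [random.gauss(0, 1) for _ in range(n)]  # irrelevant quantity
y = [2 * a - 3 * b + random.gauss(0, 0.5) for a, b in zip(x1, x2)]

features = {"x1": x1, "x2": x2, "x3": x3}
scores = {name: abs(pearson(f, y)) for name, f in features.items()}
ranked = sorted(scores, key=scores.get, reverse=True)
print("ranking:", ranked, {k: round(v, 3) for k, v in scores.items()})
```

Filter methods like this are cheap but miss interactions; the study's point is precisely to test which of nine methods, across dataset properties, separate relevant from irrelevant quantities reliably.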
Pub Date: 2022-01-13 | DOI: 10.1108/ijqrm-07-2021-0214
E. Verna, D. Maisano
Purpose: Nowadays, companies are increasingly adopting additive manufacturing (AM) technologies because of their flexibility and product customization, combined with modest increases in per-unit cost. Moreover, many companies deploy several distributed AM centers to enhance flexibility and customer proximity. Although AM centers are characterized by similar equipment and working methods, their production mix and volumes may vary. The purpose of this paper is to propose a novel methodology to (1) monitor the quality of the production of individual AM centers and (2) benchmark different AM centers.
Design/methodology/approach: This paper analyzes the quality of the production output of AM centers in terms of compliance with specifications. Quality is assessed through a multivariate statistical analysis of measurement data concerning several geometric quality characteristics. A novel operational methodology is suggested to estimate the fraction nonconforming of each AM center at three different levels: (1) overall production, (2) individual product typologies in the production mix and (3) individual quality characteristics.
Findings: The proposed methodology allows benchmarking the quality performance of distributed AM centers during regular production, without requiring any ad hoc experimental tests.
Originality/value: This research assesses the capability of distributed AM centers to meet crucial quality requirements. The results can guide production managers in improving the quality of production at AM centers, in order to meet customer expectations and enhance business performance.
{"title":"A benchmark analysis of the quality of distributed additive manufacturing centers","authors":"E. Verna, D. Maisano","doi":"10.1108/ijqrm-07-2021-0214","DOIUrl":"https://doi.org/10.1108/ijqrm-07-2021-0214","url":null,"abstract":"PurposeNowadays, companies are increasingly adopting additive manufacturing (AM) technologies due to their flexibility and product customization, combined with non-dramatic increases in per unit cost. Moreover, many companies deploy a plurality of distributed AM centers to enhance flexibility and customer proximity. Although AM centers are characterized by similar equipment and working methods, their production mix and volumes may be variable. The purpose of this paper is to propose a novel methodology to (1) monitor the quality of the production of individual AM centers and (2) perform a benchmarking of different AM centers.Design/methodology/approachThis paper analyzes the quality of the production output of AM centers in terms of compliance with specifications. Quality is assessed through a multivariate statistical analysis of measurement data concerning several geometric quality characteristics. A novel operational methodology is suggested to estimate the fraction nonconforming of each AM center at three different levels: (1) overall production, (2) individual product typologies in the production mix and (3) individual quality characteristics.FindingsThe proposed methodology allows performing a benchmark analysis on the quality performance of distributed AM centers during regular production, without requiring any ad hoc experimental test.Originality/valueThis research assesses the capability of distributed AM centers to meet crucial quality requirements. 
The results can guide production managers toward improving the quality of the production of AM centers, in order to meet customer expectations and enhance business performance.","PeriodicalId":14193,"journal":{"name":"International Journal of Quality & Reliability Management","volume":" ","pages":""},"PeriodicalIF":2.5,"publicationDate":"2022-01-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"42169864","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
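The fraction-nonconforming estimate at the overall-production level could be sketched as follows. Everything here is illustrative: the measurement data, the spec limits and the multivariate-normal assumption are stand-ins, since the paper's actual measurement model is not given in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 200 parts x 3 geometric quality characteristics
# (values and spec limits are illustrative, not from the paper).
data = rng.normal(loc=[10.0, 5.0, 2.0], scale=[0.05, 0.03, 0.02], size=(200, 3))
lsl = np.array([9.90, 4.92, 1.95])   # lower spec limits (assumed)
usl = np.array([10.10, 5.08, 2.05])  # upper spec limits (assumed)

def fraction_nonconforming(x, lsl, usl, n_mc=100_000, seed=1):
    """Estimate the overall fraction nonconforming by fitting a
    multivariate normal to the measurements and sampling from it."""
    mu = x.mean(axis=0)
    cov = np.cov(x, rowvar=False)
    sim = np.random.default_rng(seed).multivariate_normal(mu, cov, size=n_mc)
    # A simulated part conforms only if every characteristic is in spec.
    conforming = np.all((sim >= lsl) & (sim <= usl), axis=1)
    return 1.0 - conforming.mean()

p_nc = fraction_nonconforming(data, lsl, usl)
print(f"estimated fraction nonconforming: {p_nc:.4f}")
```

Per-characteristic estimates (level 3 in the abstract) would follow the same pattern with each column checked separately.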
Pub Date : 2022-01-11  DOI: 10.1108/ijqrm-07-2021-0227
Daniel Ashagrie Tegegne, D. Azene, Eshetie Berhan Atanaw
"Design multivariate statistical process control procedure in the case of Ethio cement"
Purpose: This study aims to design a multivariate control chart that improves the applicability of the traditional Hotelling T2 chart. This new type of multivariate control chart displays richer information about the states and relationships of the variables in the production process, supporting better quality control decisions during production.
Design/methodology/approach: Multivariate data are collected at equal time intervals and represented by the nodes of a graph. The edges connecting the nodes represent the sequence of operations. Each node is plotted on the control chart according to its Hotelling T2 statistical distance. The changing behavior of each pair of input and output nodes is studied by a neural network. A case study from the cement industry is conducted to validate the control chart.
Findings: The points and lines of the classic Hotelling T2 chart are effectively replaced by the nodes and edges of the graph, respectively. Nodes and edges have size and color, representing several attributes, so this control chart displays much more information than the traditional Hotelling T2 chart. The pattern of the plot indicates whether the process is in control. The effect of the sequence of operations is visible in the chart, and the frequency of occurrence of each node is reflected in its size. Decisions to change product features are assisted by finding the shortest path between nodes. Moreover, consecutive nodes exhibit different behaviors, and such behavior changes are detected by the neural network.
Originality/value: Modifying the classical Hotelling T2 control chart by integrating it with concepts from graph theory and neural networks is, to the authors' knowledge, the first work of its kind.
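The Hotelling T2 statistical distance at which each observation (node) is plotted can be sketched as below. The process data, their covariance structure and the F-distribution form of the control limit for individual observations are illustrative assumptions, not the authors' Ethio cement data.

```python
import numpy as np
from scipy.stats import f

rng = np.random.default_rng(0)

# Synthetic in-control data: m individual observations on p correlated
# process variables (values are illustrative only).
m, p = 50, 3
X = rng.multivariate_normal([0.0, 0.0, 0.0],
                            [[1.0, 0.5, 0.2],
                             [0.5, 1.0, 0.3],
                             [0.2, 0.3, 1.0]], size=m)

mu = X.mean(axis=0)
S_inv = np.linalg.inv(np.cov(X, rowvar=False))

# Hotelling T^2 statistic for each observation: the squared statistical
# distance from the process mean, accounting for correlations.
t2 = np.einsum('ij,jk,ik->i', X - mu, S_inv, X - mu)

# A common F-distribution upper control limit for individual observations.
alpha = 0.0027
ucl = (p * (m + 1) * (m - 1) / (m * (m - p))) * f.ppf(1 - alpha, p, m - p)

print("out-of-control points:", np.where(t2 > ucl)[0])
```

In the graph-based chart described above, each t2 value would become a node, with edges added between consecutive observations to show the sequence of operations.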
Pub Date : 2022-01-11  DOI: 10.1108/ijqrm-09-2021-0344
Angelo Marcio Oliveira Sant’Anna
"Statistical process monitoring for e-waste based on beta regression and particle swarm optimization"
Purpose: E-waste management can reduce the environmental impact of business activity without affecting reliability, quality or performance. Statistical process monitoring is an effective way of managing the reliability and quality of devices in manufacturing processes. This paper proposes an approach for monitoring the proportion of e-waste devices based on a beta regression model and particle swarm optimization. A statistical process monitoring scheme integrating residual useful life techniques for efficient monitoring of e-waste components or equipment was developed.
Design/methodology/approach: An approach integrating a regression method and a particle swarm optimization algorithm was developed to increase the accuracy of the regression model estimates. Control chart tools were used to monitor the proportion of e-waste devices based on fault detection of electronic devices in the manufacturing process.
Findings: The results showed that the proposed statistical process monitoring provides an excellent reliability and quality scheme for monitoring the proportion of e-waste devices in a toner manufacturing process. The optimized regression model estimates showed a significant influence of the process variables injection rate and toner treads individually, as well as of the interactions between injection rate, toner treads, viscosity and density.
Originality/value: This research differs from others by providing an approach for modeling and monitoring the proportion of e-waste devices; statistical process monitoring can thus be used to track waste products in manufacturing. A further key contribution is the development of different models for fault detection and the identification of change points in the manufacturing process. The optimized model can be replicated in other electronics industries, supporting satisfactory e-waste management.
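The core idea of fitting a beta regression for the proportion of nonconforming devices via particle swarm optimization could be sketched as follows. The synthetic data, the single covariate (a stand-in for, e.g., injection rate), the logit link and all PSO hyperparameters are assumptions for illustration, not the paper's actual model.

```python
import numpy as np
from scipy.stats import beta as beta_dist
from scipy.special import expit

rng = np.random.default_rng(0)

# Synthetic data: observed proportions of nonconforming devices as a
# function of one process variable (illustrative stand-in).
x = rng.uniform(-1.0, 1.0, 200)
mu_true = expit(-2.0 + 1.5 * x)   # mean proportion via logit link
phi_true = 30.0                   # precision parameter
y = rng.beta(mu_true * phi_true, (1.0 - mu_true) * phi_true)

def neg_loglik(theta):
    """Negative log-likelihood of a beta regression with logit link."""
    b0, b1, log_phi = theta
    mu = expit(b0 + b1 * x)
    phi = np.exp(log_phi)
    return -beta_dist.logpdf(y, mu * phi, (1.0 - mu) * phi).sum()

def pso(f, dim, n=30, iters=200, seed=1):
    """Minimal global-best particle swarm optimizer (assumed settings)."""
    r = np.random.default_rng(seed)
    pos = r.uniform(-3.0, 3.0, (n, dim))
    vel = np.zeros((n, dim))
    pbest, pbest_val = pos.copy(), np.array([f(p) for p in pos])
    g = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = r.random((n, dim)), r.random((n, dim))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (g - pos)
        pos = pos + vel
        vals = np.array([f(p) for p in pos])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        g = pbest[pbest_val.argmin()].copy()
    return g

theta_hat = pso(neg_loglik, dim=3)
print("estimated coefficients:", theta_hat[:2], "precision:", np.exp(theta_hat[2]))
```

The fitted mean proportion could then feed a control chart for the fraction of e-waste devices, with limits derived from the beta distribution rather than the usual normal approximation.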