Title: A consensus-based Fermatean fuzzy WASPAS methodology for healthcare waste treatment technology selection
Authors: Chandana Narasimha Rao, Matta Sujatha
DOI: 10.31181/dmame622023621 (Decision Making Applications in Management and Engineering, published 2023-10-15)
Abstract: Healthcare waste (HCW) management is a complex issue influenced by many factors, including technological, economic, environmental, and social ones. Evaluating the best treatment technique for HCW management can be regarded as a challenging multi-criteria decision-making (MCDM) problem, in which various alternatives and evaluation criteria must be considered. Representing and handling uncertain data is crucial when choosing an HCW treatment technology. To address MCDM problems with Fermatean fuzzy (FF) data, this study first builds a consensus-based WASPAS approach. In the proposed integrated methodology, the attribute weights are estimated with an entropy measure, and the alternatives are ranked with the WASPAS method in an FF environment. An HCW treatment technology assessment problem is then considered to demonstrate the applicability of the proposed framework. Four HCW treatment methods (chemical disinfection, microwave disinfection, incineration, and autoclaving) are considered as alternatives. According to the findings, autoclaving is the most effective HCW treatment method. A sensitivity assessment using several criteria weight sets tests the stability of the proposed approach, and the approach is also compared with existing decision-making methods.
Title: Application of the R method in solving material handling equipment selection problems
Authors: S. Chatterjee, S. Chakraborty
DOI: 10.31181/dmame622023391 (Decision Making Applications in Management and Engineering, published 2023-10-15)
Abstract: In manufacturing industries, material handling equipment plays a vital role and is considered one of the key pillars for increasing production efficiency. The importance of selecting appropriate material handling equipment for a specific task is therefore well acknowledged, but the complexity of this selection process increases drastically with the number of alternative equipment types available in the market and the set of conflicting evaluation criteria. To resolve this problem, past researchers have proposed several multi-criteria decision-making (MCDM) techniques. This paper explores the application potential of a newly developed MCDM technique, the R method, in solving five material handling equipment selection problems: conveyor, automated guided vehicle (AGV), stacker, wheel loader, and excavator. The derived rankings are contrasted with those of other popular MCDM techniques to validate the method's ability to shortlist candidate alternatives from best to worst, which would ultimately help improve the overall efficiency of manufacturing processes.
Title: Identification and assessment of man-made threats to cities using an integrated Grey BWM-Grey MARCOS method
Authors: M. Bitarafan, K. Hosseini, S. Zolfani
DOI: 10.31181/dmame622023747 (Decision Making Applications in Management and Engineering, published 2023-10-15)
Abstract: Identifying and evaluating any threat against critical infrastructures, including its history, methods, capabilities, and motivations, is essential for crisis management and cities' passive defense. Threats, both natural and man-made, are directed at cities' critical assets and infrastructures. Essential assets are valuable components, so even the slightest malfunction or damage to them harms the entire system. This study uses Tehran, the capital of Iran, as a case study to identify and assess man-made threats to cities and their vital resources. It develops an innovative integrated MCDM approach that can handle information ambiguity in crisis management. Man-made threats were identified through desk research and expert interviews, and multi-criteria decision-making techniques were then applied. The grey Best-Worst Method (BWM) is used to weight the research criteria, and the grey Measurement of Alternatives and Ranking according to COmpromise Solution (MARCOS) is used to rank the threats. The findings indicate that the three main threats to Tehran are cyber, military, and terrorist attacks. Finally, a sensitivity analysis based on two practical experiments is performed to verify the results.
Title: Characteristics of segments according to the preference system for job selection, opportunities for effective incentives in each employee group
Authors: Mónika Garai-Fodor, Laszlo Vasa, K. Jäckel
DOI: 10.31181/dmame622023761 (Decision Making Applications in Management and Engineering, published 2023-10-15)
Abstract: In addition to economic challenges, employers are facing a growing generation gap. Generations that differ significantly in values, mindsets, and preferences need to be managed effectively in the workplace, which requires complex solutions. In this paper we present partial results of our primary research. In a quantitative procedure, we conducted a pre-tested, standardized online questionnaire survey using a random sampling method, which resulted in 1,146 evaluable questionnaires. Descriptive statistics as well as bivariate and multivariate analyses were applied to process the quantitative results and test the hypotheses put forward. The main focus of our research is the factors that influence employees' choice of jobs, and we analyzed the structure of their preference system. As a result, we were able to identify distinct clusters according to the preference system for job selection. For the resulting segments, we also analyzed which motivational tools could be most effective in encouraging higher performance. We believe that our research has useful practical implications, highlighting how to differentiate the pool of employees in terms of job choice and how to apply effective incentives to a specific segment.
Title: Ensemble learning algorithm - research analysis on the management of financial fraud and violation in listed companies
Authors: Weihong Li, Xiujuan Xu
DOI: 10.31181/dmame622023785 (Decision Making Applications in Management and Engineering, published 2023-10-15)
Abstract: In recent years, despite the strict "zero tolerance" crackdown on financial fraud and violations by listed companies, cases of financial fraud, overstatement of revenue and profit, and suspected fraud have continued to be exposed. This study first established a financial fraud index system and used the XGBoost algorithm to construct a prediction model for financial fraud and violations of listed companies. The indicators were selected and input into the model, and a dataset was obtained for the experiments. The XGBoost algorithm was compared with two other algorithms. The receiver operating characteristic (ROC) curves showed that the XGBoost algorithm had the best prediction performance among the three. Its precision was 93.17%, its recall was 92.23%, its F1-score was 0.9270, and its area under the curve was 0.90, indicating better performance than the prediction models based on the Gradient Boosted Decision Tree (GBDT) algorithm and logistic regression. Considering all evaluation indicators, the financial fraud and violation prediction model built with the XGBoost algorithm performs best.
Title: Measuring the competitiveness of commodity markets using price signals and information theory
Authors: Anis Hoayek, Hassan Hamie
DOI: 10.31181/dmame622023548 (Decision Making Applications in Management and Engineering, published 2023-10-15)
Abstract: Technological advancements, abrupt changes in market conditions, and political reforms, among other things, necessitate strong regulatory oversight and accurate measurement of performance-related indicators. The more accurate, information-rich, and transparent these measurements and signals are, the lower the uncertainty felt by value chain participants, who can then recognize whether the market is in an efficient state. Their absence may lead to indecisiveness and false interpretations that could result in wrong policy directions. This paper provides an ex-post evaluation tool intended to deliver additional insights and quality information to help the regulator assess the state of the market. The tool is applied to the UK wholesale natural gas market for the period between 2011 and 2020, assessing and testing the market's weak-form efficiency. Weak-form efficiency holds that today's gas prices reflect a specific type of information, primarily past gas prices, and that only new information can help predict future prices. Based solely on a limited and previously untapped dataset (the day-ahead price time series), and under the assumption that gas prices are the result of market processes, a variety of information metrics (gas price randomness, the distribution of extreme prices, and the ability to predict prices from historical data) is extracted using suitable statistical models. A weighted entropy index is then computed to measure the state of the commodity market. The results indicate that the analysis helped gain information, reducing uncertainty (relative to a pre-analysis) by 86.5%. Additionally, there is sufficient evidence that UK natural gas prices are weak-form efficient.
Title: The development of a new SOFI model for performance management of medium-sized and large companies
Authors: Biljana Kovačević
DOI: 10.31181/dmame622023754 (Decision Making Applications in Management and Engineering, published 2023-10-15)
Abstract: Measurement and adequate performance management are imperative for attaining competitive advantage in the market. This need is especially pronounced in medium-sized and large companies, which are complex organizations and require the application of scientific principles to solve complex practical issues. The performance management process in medium-sized and large companies is therefore the subject of this research, with the paper focusing on the macro aspect of that process. The primary goal is to develop a new performance management model that considers all important parameters of the business operations of such companies in the territory of the Republic of Srpska. The research was conducted using combined methods: a multiple case study covering four companies and a survey questionnaire. Through an analysis of many models in the literature, seven were singled out for further study; their main elements and conceptual bases served as the foundation for the research and for creating the new model. The result is the "SOFI" model (strategic, organizational, financial, and information-technological aspects), whose application supports easier management and correct management decisions under conditions of uncertainty.
Title: Artificial rat optimization with decision-making: A bio-inspired metaheuristic algorithm for solving the traveling salesman problem
Authors: Toufik Mzili, I. Mzili, M. E. Riffi
DOI: 10.31181/dmame622023644 (Decision Making Applications in Management and Engineering, published 2023-10-15)
Abstract: In this paper, we present Rat Swarm Optimization with Decision Making (HDRSO), a hybrid metaheuristic algorithm inspired by the hunting behavior of rats, for solving the traveling salesman problem (TSP). The TSP is a well-known NP-hard combinatorial optimization problem with important applications in transportation, logistics, and manufacturing systems. To improve the search process and avoid getting stuck in local minima, we incorporate crossover and selection operators into HDRSO, and we apply 2-opt and 3-opt heuristics to the best solution it finds. The performance of HDRSO was evaluated on a set of symmetric instances from the TSPLIB library, and the results demonstrate that HDRSO is a competitive and robust method for solving the TSP, in some cases achieving better results than the best-known solutions.
Title: An efficient stopping rule for mitigating risk factors: Applications in pharmaceutical and generalized green supply chains
Authors: Avi Herbon, Dmitry Tsadikovich
DOI: 10.31181/dmame622023677 (Decision Making Applications in Management and Engineering, published 2023-10-06)
Abstract: Risks in supply chains are first identified and then prioritized based on their probability of occurrence and their impact. Attempts to mitigate risks in the absence of complete and accurate information about their likelihood and impact may waste significant resources. Since the resources available for risk management are usually limited, firms need to know how to allocate these funds appropriately; that is, a strategy is required to determine which risks should be prioritized for acquiring complete and accurate information. We develop a model that incorporates two conflicting terms to address this issue. The first, captured by entropy, measures the resources wasted due to risk factors for which information about the probability of occurrence and impact is inaccurate. The second is the cost of the effort expended in collecting accurate information about risk factors. To solve the model, we propose a stopping-rule algorithm. Its efficiency is verified using data gathered from real-world pharmaceutical and generalized green supply chains. Numerous computational experiments show that the stopping-rule algorithm outperforms the widely used risk-management Pareto rule and achieves the optimal solution in 94% of the investigated cases.
Title: Characteristics of consumer segments based on perceptions of the impact of digitalisation
Authors: Mónika Garai-Fodor, László Vasa, Katalin Jäckel
DOI: 10.31181/dmame622023940 (Decision Making Applications in Management and Engineering, published 2023-09-27)
Abstract: The focus of our research was to examine consumer attitudes towards digitalisation. The perception of digitalisation was also analysed from a generation-specific perspective, given that differences in values between generations are reflected in the perception of consumer trends, including the trend towards digitalisation. The primary data presented in this study come from a quantitative data collection carried out among Hungarian consumers using an arbitrary sampling procedure. A pre-tested, standardised online questionnaire survey was used, resulting in 3,515 evaluable questionnaires. Descriptive statistics and bivariate and multivariate analyses were used to process the quantitative results and test the hypotheses. Using a K-means clustering procedure, we were able to characterise three significantly distinguishable target groups: "consumers sceptical about digitalisation", "accepting consumers who feel the differentiating effects of digitalisation", and "positive digital consumers". We demonstrate that the perception of digitalisation can be used as a segmentation criterion, and we show statistically that the segments based on the perception of digitalisation carry generation-specific elements. In our opinion, the results may help to increase consumer acceptance of digitalisation processes and related technologies. A limitation of the research is that the results are valid only for the population under consideration and cannot be considered representative. We believe that characterising the individual segments can help to differentiate the education process according to the awareness and attitudes of each consumer group.