Bi-objective ranking and selection using stochastic kriging
Pub Date : 2024-11-15 DOI: 10.1016/j.ejor.2024.11.008
Sebastian Rojas Gonzalez, Juergen Branke, Inneke Van Nieuwenhuyse
We consider bi-objective ranking and selection problems, where the goal is to correctly identify the Pareto-optimal solutions among a finite set of candidates whose objective function values must be estimated from noisy evaluations. When identifying these solutions, the noise perturbing the observed performance may lead to two types of errors: solutions that are truly Pareto-optimal may appear to be dominated, and solutions that are truly dominated may appear to be Pareto-optimal. We propose a novel Bayesian bi-objective ranking and selection method that sequentially allocates extra samples to competitive solutions, with the aim of reducing the misclassification errors when identifying the solutions with the best expected performance. The approach uses stochastic kriging to build reliable predictive distributions of the objectives, and exploits this information to decide how to resample. Experiments on several artificial and practical test problems show that the proposed approach consistently outperforms its competitors (a well-known state-of-the-art algorithm and the standard equal allocation method), both of which may themselves benefit from the use of stochastic kriging information.
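A minimal sketch of the generic idea behind such a sequential allocation (not the paper's exact rule; the function names and the simple uncertainty criterion are illustrative assumptions): draw from the Gaussian posteriors implied by the current means and standard errors, estimate each solution's probability of being Pareto-optimal, and resample where that membership is most uncertain.

```python
import numpy as np

def pareto_mask(F):
    """Boolean mask of non-dominated rows of F (minimization, 2 objectives)."""
    n = F.shape[0]
    mask = np.ones(n, dtype=bool)
    for i in range(n):
        # i is dominated if some row is <= in both objectives and < in one
        dominated = np.any(np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1))
        mask[i] = not dominated
    return mask

def allocate_next_sample(mu, se, n_draws=2000, rng=None):
    """Pick the solution whose Pareto membership is most uncertain.

    mu, se: (n, 2) arrays of posterior means and standard errors.
    Returns the index of the solution to resample next.
    """
    rng = np.random.default_rng(rng)
    member_counts = np.zeros(mu.shape[0])
    for _ in range(n_draws):
        sample = rng.normal(mu, se)          # one draw from each posterior
        member_counts += pareto_mask(sample)
    p_member = member_counts / n_draws       # P(solution is Pareto-optimal)
    # Membership is most uncertain where this probability is near 0.5
    return int(np.argmax(p_member * (1.0 - p_member)))
```

Resampling the chosen index shrinks its standard error, so over iterations both misclassification error types are driven down.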
{"title":"Bi-objective ranking and selection using stochastic kriging","authors":"Sebastian Rojas Gonzalez, Juergen Branke, Inneke Van Nieuwenhuyse","doi":"10.1016/j.ejor.2024.11.008","DOIUrl":"https://doi.org/10.1016/j.ejor.2024.11.008","url":null,"abstract":"We consider bi-objective ranking and selection problems, where the goal is to correctly identify the Pareto-optimal solutions among a finite set of candidates for which the objective function values have to be estimated from noisy evaluations. When identifying these solutions, the noise perturbing the observed performance may lead to two types of errors: solutions that are truly Pareto-optimal may appear to be dominated, and solutions that are truly dominated may appear to be Pareto-optimal. We propose a novel Bayesian bi-objective ranking and selection method that sequentially allocates extra samples to competitive solutions, in view of reducing the misclassification errors when identifying the solutions with the best expected performance. The approach uses stochastic kriging to build reliable predictive distributions of the objectives, and exploits this information to decide how to resample. The experiments are designed to evaluate the algorithm on several artificial and practical test problems. The proposed approach is observed to consistently outperform its competitors (a well-known state-of-the-art algorithm and the standard equal allocation method), which may also benefit from the use of stochastic kriging information.","PeriodicalId":55161,"journal":{"name":"European Journal of Operational Research","volume":"99 1","pages":""},"PeriodicalIF":6.4,"publicationDate":"2024-11-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142670539","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"管理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Single-machine preemptive scheduling with assignable due dates or assignable weights to minimize total weighted late work
Pub Date : 2024-11-13 DOI: 10.1016/j.ejor.2024.11.010
Rubing Chen, Xinyu Dong, Jinjiang Yuan, C.T. Ng, T.C.E. Cheng
In this paper we study single-machine preemptive scheduling to minimize the total weighted late work with assignable due dates or assignable weights. For the problem with assignable due dates, we show that it is binary NP-hard, solvable in pseudo-polynomial time, and solvable in polynomial time when all the jobs have agreeable processing times and weights. For the problem with assignable weights, we show that it is solvable in polynomial time. For the problem with both assignable due dates and assignable weights, we show that it is binary NP-hard, solvable in pseudo-polynomial time, and solvable in polynomial time when all the jobs have the same processing times.
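For readers unfamiliar with the criterion: the late work of a job is the portion of its processing performed after its due date. A small sketch (schedule data hypothetical) evaluates the objective for a given preemptive schedule:

```python
def total_weighted_late_work(intervals, due, weight):
    """Total weighted late work of a preemptive single-machine schedule.

    intervals: {job: [(start, end), ...]}  pieces of each job on the machine
    due, weight: {job: d_j}, {job: w_j}
    Late work of job j = amount of processing done after d_j.
    """
    total = 0.0
    for j, pieces in intervals.items():
        late = sum(max(0.0, end - max(start, due[j])) for start, end in pieces)
        total += weight[j] * late
    return total

# Two jobs; job 2 preempts job 1 at time 2.
intervals = {1: [(0, 2), (5, 8)], 2: [(2, 5)]}
due = {1: 6, 2: 7}
weight = {1: 2.0, 2: 1.0}
# Job 1: piece (5, 8) has 2 units after d=6 -> late work 2, weighted 4.
# Job 2: finishes at 5, before d=7          -> late work 0.
print(total_weighted_late_work(intervals, due, weight))  # 4.0
```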
{"title":"Single-machine preemptive scheduling with assignable due dates or assignable weights to minimize total weighted late work","authors":"Rubing Chen, Xinyu Dong, Jinjiang Yuan, C.T. Ng, T.C.E. Cheng","doi":"10.1016/j.ejor.2024.11.010","DOIUrl":"https://doi.org/10.1016/j.ejor.2024.11.010","url":null,"abstract":"In this paper we study single-machine preemptive scheduling to minimize the total weighted late work with assignable due dates or assignable weights. For the problem with assignable due dates, we show that it is binary <mml:math altimg=\"si170.svg\" display=\"inline\"><mml:mrow><mml:mi>N</mml:mi><mml:mi>P</mml:mi></mml:mrow></mml:math>-hard, solvable in pseudo-polynomial time, and solvable in polynomial time when all the jobs have agreeable processing times and weights. For the problem with assignable weights, we show that it is solvable in polynomial time. For the problem with assignable due dates and assignable weights, we show that it is binary <mml:math altimg=\"si170.svg\" display=\"inline\"><mml:mrow><mml:mi>N</mml:mi><mml:mi>P</mml:mi></mml:mrow></mml:math>-hard, solvable in pseudo-polynomial time, and solvable in polynomial time when all the jobs have the same processing times.","PeriodicalId":55161,"journal":{"name":"European Journal of Operational Research","volume":"18 1","pages":""},"PeriodicalIF":6.4,"publicationDate":"2024-11-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142670533","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"管理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A dedicated branch-price-and-cut algorithm for advance patient planning and surgeon scheduling
Pub Date : 2024-11-12 DOI: 10.1016/j.ejor.2024.10.042
Babak Akbarzadeh, Broos Maenhout
In this paper, we study patient planning and surgeon scheduling in the operating theatre. The problem simultaneously plans patients and assigns time blocks to surgeons in which to perform their patients' surgeries. The timing and length of the allotted time blocks depend on the characteristics of the patients on the surgeons' waiting lists. Solving this problem exactly is challenging due to the large number of rooms, surgeons, and patients involved. To overcome this challenge, we propose an efficient branch-price-and-cut algorithm that finds an optimal solution in an acceptable time span, incorporating several dedicated mechanisms to accelerate the search: the branch-price-and-cut tree is set up using an intelligent branching scheme, nodes are explored in order of the lowest number of fractional variables, and improved bounds are computed to prune nodes earlier. To tighten the convex hull of the linear programming relaxation in each node, the algorithm relies on a row generation mechanism for adding valid inequalities. We conducted various computational experiments to demonstrate the performance of our algorithm and to validate the contribution of each implemented optimisation principle. Additionally, we show the superior performance of the proposed algorithm over alternative optimisation procedures.
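As a structural illustration only (not the paper's formulation; `solve_rmp`, `price_out`, and `separate_cuts` are hypothetical problem-specific callbacks), the node-level interplay of column generation and row generation in a branch-price-and-cut scheme looks roughly like this:

```python
def solve_node(columns, cuts, solve_rmp, price_out, separate_cuts):
    """Column-and-row generation loop at one branch-and-bound node.

    columns: current set of master-problem columns (e.g. surgeon-block plans)
    cuts: current set of valid inequalities
    """
    while True:
        duals, obj = solve_rmp(columns, cuts)      # restricted master LP
        new_cols = price_out(duals)                # pricing subproblem
        if new_cols:                               # negative reduced cost found
            columns.extend(new_cols)
            continue
        new_cuts = separate_cuts(columns, duals)   # separate valid inequalities
        if new_cuts:                               # tighten the LP relaxation
            cuts.extend(new_cuts)
            continue
        return obj, columns, cuts                  # node LP bound is optimal
```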
{"title":"A dedicated branch-price-and-cut algorithm for advance patient planning and surgeon scheduling","authors":"Babak Akbarzadeh, Broos Maenhout","doi":"10.1016/j.ejor.2024.10.042","DOIUrl":"https://doi.org/10.1016/j.ejor.2024.10.042","url":null,"abstract":"In this paper, we study the patient planning and surgeon scheduling in the operating room theatre. The problem considers the simultaneous planning of patients and the assignment of time blocks to surgeons so that they can perform the surgery of their patients. The timing and length of the allotted time blocks depend on the patient characteristics on the surgeons’ waiting lists. Solving this problem in an exact manner is challenging due to the large number of rooms, surgeons, and patients involved. To overcome this challenge, we propose an efficient branch-price-and-cut algorithm to find an optimal solution in an acceptable time span. For that purpose, we include different dedicated mechanisms to accelerate the solution-finding process. In this regard, the branch-price-and-cut tree is set up using an intelligent branching scheme, the nodes are searched in order of the lowest number of fractional variables, and improved bounds are computed to prune nodes earlier. To tighten the convex hull of the linear programming relaxation in each node, the algorithm relies on a row generation mechanism for adding valid inequalities. We conducted various computational experiments to demonstrate the performance of our algorithm and validate for each component the contribution of the implemented optimisation principles. Additionally, we show the superior performance of the proposed algorithm to alternative optimisation procedures.","PeriodicalId":55161,"journal":{"name":"European Journal of Operational Research","volume":"13 1","pages":""},"PeriodicalIF":6.4,"publicationDate":"2024-11-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142670538","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"管理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A general valuation framework for rough stochastic local volatility models and applications
Pub Date : 2024-11-12 DOI: 10.1016/j.ejor.2024.11.002
Wensheng Yang, Jingtang Ma, Zhenyu Cui
Rough volatility models are a new class of stochastic volatility models that have been shown to provide a consistently good fit to implied volatility smiles of SPX options. They are continuous-time stochastic volatility models whose volatility process is driven by a fractional Brownian motion with Hurst parameter less than one half. Despite this empirical success, the valuation of derivative securities under rough volatility models is challenging, because the volatility process is neither a semi-martingale nor a Markov process. This paper proposes a novel valuation framework for rough stochastic local volatility (RSLV) models. In particular, we introduce the perturbed stochastic local volatility (PSLV) model as the semi-martingale approximation of the RSLV model and establish its existence, uniqueness, Markovian representation and convergence. We then propose a fast continuous-time Markov chain (CTMC) approximation algorithm for the PSLV model and establish its convergence. Numerical experiments demonstrate the convergence of our approximation method to the true prices, as well as the remarkable accuracy and efficiency of the method in pricing European, barrier and American options. Compared with existing methods in the literature, a significant reduction in CPU time to reach the same level of accuracy is observed.
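To illustrate the flavour of CTMC-based pricing, here is a minimal one-dimensional sketch that prices a European call under plain geometric Brownian motion by matching the chain's local drift and variance to the diffusion; the paper's PSLV scheme is two-dimensional and substantially more involved, so treat this only as the basic mechanism.

```python
import numpy as np
from scipy.linalg import expm

def ctmc_euro_call(s0, K, r, sigma, T, smin=1.0, smax=400.0, m=400):
    """Price a European call under GBM via a CTMC approximation.

    The chain lives on a uniform grid of m states; its jump rates match
    the diffusion's local drift r*S and local variance (sigma*S)^2.
    """
    grid = np.linspace(smin, smax, m)
    h = grid[1] - grid[0]
    drift, var = r * grid, (sigma * grid) ** 2
    Q = np.zeros((m, m))                     # generator matrix
    for i in range(1, m - 1):
        up = var[i] / (2 * h * h) + max(drift[i], 0.0) / h
        dn = var[i] / (2 * h * h) + max(-drift[i], 0.0) / h
        Q[i, i + 1], Q[i, i - 1], Q[i, i] = up, dn, -(up + dn)
    payoff = np.maximum(grid - K, 0.0)
    value = np.exp(-r * T) * (expm(Q * T) @ payoff)   # discounted expectation
    return value[int(np.argmin(np.abs(grid - s0)))]

# Close to the Black-Scholes value of about 10.45
print(ctmc_euro_call(s0=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0))
```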
{"title":"A general valuation framework for rough stochastic local volatility models and applications","authors":"Wensheng Yang, Jingtang Ma, Zhenyu Cui","doi":"10.1016/j.ejor.2024.11.002","DOIUrl":"https://doi.org/10.1016/j.ejor.2024.11.002","url":null,"abstract":"Rough volatility models are a new class of stochastic volatility models that have been shown to provide a consistently good fit to implied volatility smiles of SPX options. They are continuous-time stochastic volatility models, whose volatility process is driven by a fractional Brownian motion with the corresponding Hurst parameter less than a half. Albeit the empirical success, the valuation of derivative securities under rough volatility models is challenging. The reason is that it is neither a semi-martingale nor a Markov process. This paper proposes a novel valuation framework for rough stochastic local volatility (RSLV) models. In particular, we introduce the perturbed stochastic local volatility (PSLV) model as the semi-martingale approximation for the RSLV model and establish its existence, uniqueness, Markovian representation and convergence. Then we propose a fast continuous-time Markov chain (CTMC) approximation algorithm to the PSLV model and establish its convergence. Numerical experiments demonstrate the convergence of our approximation method to the true prices, and also the remarkable accuracy and efficiency of the method in pricing European, barrier and American options. Comparing with existing literature, a significant reduction in the CPU time to arrive at the same level of accuracy is observed.","PeriodicalId":55161,"journal":{"name":"European Journal of Operational Research","volume":"74 1","pages":""},"PeriodicalIF":6.4,"publicationDate":"2024-11-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142670536","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"管理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Measuring carbon emission performance in China's energy market: Evidence from improved non-radial directional distance function data envelopment analysis
Pub Date : 2024-11-12 DOI: 10.1016/j.ejor.2024.11.019
Yinghao Pan, Jie Wu, Chao-Chao Zhang, Muhammad Ali Nasir
The most complex challenge facing the energy market is identifying effective solutions to reduce CO2 emissions (CEs) and enhance environmental performance (EP). Coal-fired generation within the power sector is the primary source of these emissions. In this study, we develop a novel linear programming model that accounts for undesirable outputs to assess the EP of 15 power enterprises in eastern China from 2016 to 2020. In addition, we employ a global non-radial Malmquist-Luenberger productivity index (GNML) to analyse the mechanisms driving changes in efficiency among these enterprises. Our findings indicate that, while the EP of the power industry in eastern China improved, it remains at a relatively low level and exhibits instability. Moreover, technological efficiency (TE) and scale efficiency (SE) play a significant role in determining production efficiency within the sector. It is therefore essential for industry managers to implement standardized production management regulations, enhance technological development and scale investments, and strengthen control over unintended emissions, thereby facilitating the energy transition.
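A minimal sketch of a non-radial directional distance function LP with one input, one desirable and one undesirable output (the data, direction vector, and disposability assumptions are illustrative, not the paper's exact model): the score is the average feasible contraction of input and bad output and expansion of good output.

```python
import numpy as np
from scipy.optimize import linprog

def nddf_inefficiency(X, Y, B, o):
    """Non-radial directional distance inefficiency of unit o.

    X, Y, B: (n,) arrays of a single input, desirable output and
    undesirable output for n units. Direction g = (x_o, y_o, b_o);
    a larger score means more inefficient.
    """
    n = len(X)
    # variables: lam_1..lam_n, bx, by, bb ; maximize (bx + by + bb) / 3
    c = np.r_[np.zeros(n), -np.ones(3) / 3.0]
    A_ub = [np.r_[X, X[o], 0.0, 0.0],        # sum lam*X <= x_o - bx*x_o
            np.r_[-Y, 0.0, Y[o], 0.0]]       # sum lam*Y >= y_o + by*y_o
    b_ub = [X[o], -Y[o]]
    A_eq = [np.r_[B, 0.0, 0.0, B[o]]]        # sum lam*B  = b_o - bb*b_o
    b_eq = [B[o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (n + 3))
    return -res.fun

X = np.array([10.0, 12.0, 9.0])    # e.g. coal input
Y = np.array([100.0, 90.0, 80.0])  # electricity generated
B = np.array([50.0, 70.0, 40.0])   # CO2 emitted
print([round(nddf_inefficiency(X, Y, B, o), 3) for o in range(3)])
```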
{"title":"Measuring carbon emission performance in China's energy market: Evidence from improved non-radial directional distance function data envelopment analysis","authors":"Yinghao Pan, Jie Wu, Chao-Chao Zhang, Muhammad Ali Nasir","doi":"10.1016/j.ejor.2024.11.019","DOIUrl":"https://doi.org/10.1016/j.ejor.2024.11.019","url":null,"abstract":"The most complex challenge facing the energy market is identifying effective solutions to reduce CO<ce:inf loc=\"post\">2</ce:inf> emissions (CEs) and enhance environmental performance (EP). Coal production within the power sector is the primary source of these emissions. In this study, we developed a novel linear programming model that accounts for undesirable outputs to assess the EP of 15 power enterprises in eastern China from 2016 to 2020. In addition, we employed a global non-radial Malmquist-Luenberger productivity index (GNML) to analyse the mechanisms influencing changes in efficiency among these enterprises. Our findings indicate that, while the EP of the power industry in eastern China improved, it remains at a relatively low level and exhibits instability. Moreover, technological efficiency (TE) and scale efficiency (SE) play a significant role in determining production efficiency within the sector. Therefore, it is essential for industry managers to implement standardized production management regulations, enhance technological development and scale investments, and strengthen control over unintended emissions that could facilitate energy transition.","PeriodicalId":55161,"journal":{"name":"European Journal of Operational Research","volume":"176 1","pages":""},"PeriodicalIF":6.4,"publicationDate":"2024-11-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142670535","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"管理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Where to plan shared streets: Development and application of a multicriteria spatial decision support tool
Pub Date : 2024-11-10 DOI: 10.1016/j.ejor.2024.11.012
Alexandre Cailhier, Irène Abi-Zeid, Roxane Lavoie, Francis Marleau-Donais, Jérôme Cerutti
In response to the growing recognition of the vital role that streets play as public spaces in enhancing the vibrancy of urban life, various concepts aimed at creating greener and more inclusive streets have gained popularity in recent years, especially in North America. Shared streets are one such concept that has attracted the attention of citizens and of urban and transportation planning professionals alike. This was the case in the city of Sherbrooke (Quebec, Canada), where, in response to numerous citizens' requests, a need was identified for decision aid tools to help evaluate and rank street segments based on their potential to become shared streets. To this end, an action-research project was initiated in which we conducted a socio-technical process based on MACBETH, a multicriteria evaluation method. The project led to the development of a spatial decision support tool that is operationally used today by the city's professionals. This tool ensures a more informed and transparent decision-making process and supports shared-streets planning policy. The methods developed are generalizable and can be adapted to other cities facing similar planning problems.
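Once a MACBETH process has produced criteria weights and value scores, the final ranking step is an additive aggregation. A toy sketch, with all weights, scores, criteria, and segment names hypothetical:

```python
# Hypothetical MACBETH outputs: criterion weights (sum to 1) and
# 0-100 value scores per street segment on each criterion.
weights = {"pedestrian_flow": 0.40, "traffic_speed": 0.35, "greenery": 0.25}

segments = {
    "Rue King Ouest": {"pedestrian_flow": 80, "traffic_speed": 60, "greenery": 40},
    "Rue Wellington": {"pedestrian_flow": 70, "traffic_speed": 90, "greenery": 55},
    "Rue Belvedere":  {"pedestrian_flow": 50, "traffic_speed": 40, "greenery": 85},
}

def overall(scores):
    """Weighted additive value of one segment."""
    return sum(weights[c] * v for c, v in scores.items())

# Rank segments by their potential to become shared streets
for name, scores in sorted(segments.items(), key=lambda kv: -overall(kv[1])):
    print(f"{name}: {overall(scores):.1f}")
```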
{"title":"Where to plan shared streets: Development and application of a multicriteria spatial decision support tool","authors":"Alexandre Cailhier, Irène Abi-Zeid, Roxane Lavoie, Francis Marleau-Donais, Jérôme Cerutti","doi":"10.1016/j.ejor.2024.11.012","DOIUrl":"https://doi.org/10.1016/j.ejor.2024.11.012","url":null,"abstract":"In response to the growing recognition of the vital role played by streets as public spaces in enhancing the vibrancy of urban life, various concepts aiming at creating greener and more inclusive streets have gained popularity in recent years, especially in North America. Shared streets are one example of such concepts that have attracted the attention of citizens and of urban and transportation planning professionals alike. This was the case in the city of Sherbrooke (Quebec, Canada) where, in response to numerous citizens’ requests, a need was identified to develop decision aid tools to help evaluate and rank street segments based on their potential to become shared streets. To achieve this, an action-research project was initiated in which we conducted a socio-technical process based on MACBETH, a multicriteria evaluation method. The project led to the development of a spatial decision support tool, operationally used today by the city professionals. This tool ensures a more informed and transparent decision-making process and supports shared streets planning policy. The methods developed are generalizable and can be adapted to other cities facing similar planning problems.","PeriodicalId":55161,"journal":{"name":"European Journal of Operational Research","volume":"5 1","pages":""},"PeriodicalIF":6.4,"publicationDate":"2024-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142670542","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"管理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Cap-and-trade under a dual-channel setting in the presence of information asymmetry
Pub Date : 2024-11-10 DOI: 10.1016/j.ejor.2024.11.014
Hubert Pun, Salar Ghamat
Cap-and-trade, a widely used carbon regulation policy, encourages firms to adopt carbon abatement technologies to reduce emissions. The traditional supply-chain literature on this policy assumes symmetric information, overlooking the fact that carbon abatement efforts and costs are often private and vary significantly across geographies, industries, and pollutants. In this paper we explore a dual-channel setting involving a manufacturer and a retailer, where the manufacturer, subject to cap-and-trade regulation, has undisclosed information about its carbon abatement costs. Our findings reveal that high abatement costs can paradoxically benefit the manufacturer, the environment, consumers, and overall social welfare. Our results also caution that a higher carbon trading price (e.g., due to more ambitious emission reduction targets) can disincentivize the manufacturer from investing in carbon abatement. Moreover, a higher production cost, while resulting in lower market output, can increase pollution generation. We contribute the following to the practitioner debate about the impact of carbon policies: for an industry with a large market size, our findings support governments implementing a cap-and-trade policy, because the manufacturer, customers and social welfare can all be better off under cap-and-trade than under a tax policy or no carbon policy. Additionally, we suggest that in such industries, governments need not enforce information transparency within the supply chain.
{"title":"Cap-and-trade under a dual-channel setting in the presence of information asymmetry","authors":"Hubert Pun, Salar Ghamat","doi":"10.1016/j.ejor.2024.11.014","DOIUrl":"https://doi.org/10.1016/j.ejor.2024.11.014","url":null,"abstract":"Cap-and-trade, a widely used carbon regulation policy, encourages firms to adopt carbon abatement technologies to reduce emissions. Traditional supply-chain literature on this policy assumes symmetrical information, overlooking the fact that carbon abatement efforts and costs are often private and vary significantly across geographies, industries, and pollutants. In this paper we explore a dual-channel setting involving a manufacturer and a retailer, where the manufacturer, subject to cap-and-trade regulations, has undisclosed information about its carbon abatement costs. Our findings reveal that high abatement costs can paradoxically benefit the manufacturer, the environment, consumers, and overall social welfare. Our result also cautions that a higher carbon trading price (e.g., due to more ambitious emission reduction targets) can disincentivize the manufacturer from investing in carbon abatement. Moreover, a higher production cost, while resulting in lower market output, can increase pollution generation. We contribute the following to the practitioner debate about the impact of carbon policies: for an industry with a large market size, our findings lend support to governments to implement a cap-and-trade policy, because the manufacturer, customers and social welfare can be better off under a cap-and-trade policy than under a tax policy or no carbon policy. Additionally, we suggest that in such industries, governments need not enforce information transparency within the supply chain.","PeriodicalId":55161,"journal":{"name":"European Journal of Operational Research","volume":"54 1","pages":""},"PeriodicalIF":6.4,"publicationDate":"2024-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142670540","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"管理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Risk-averse algorithmic support and inventory management
Pub Date : 2024-11-10 DOI: 10.1016/j.ejor.2024.11.013
Pranadharthiharan Narayanan, Jeeva Somasundaram, Matthias Seifert
We study how managers allocate resources in response to algorithmic recommendations that are programmed with specific levels of risk aversion. Using the anchoring and adjustment heuristic, we derive our predictions and test them in a series of multi-item newsvendor experiments. We find that highly risk-averse algorithmic recommendations have a strong and persistent influence on order decisions, even after the recommendations are no longer available. Furthermore, we show that these effects are similar regardless of factors such as source of advice (i.e., human vs. algorithm) and decision autonomy (i.e., whether the algorithm is externally assigned or chosen by the subjects themselves). Finally, we disentangle the effect of risk attitude from that of anchor distance and find that subjects selectively adjust their order decisions by relying more on algorithmic advice that contrasts with their inherent risk preferences. Our findings suggest that organizations can strategically utilize risk-averse algorithmic tools to improve inventory decisions while preserving managerial autonomy.
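A small simulation sketch (demand distribution and all parameters hypothetical) of how a risk-averse recommendation differs from the risk-neutral newsvendor anchor: maximizing the average of the worst profit outcomes (a CVaR-style criterion) pushes the recommended order below the expected-profit optimum.

```python
import numpy as np

rng = np.random.default_rng(7)
demand = rng.gamma(shape=4.0, scale=25.0, size=50_000)  # hypothetical demand, mean 100
price, cost = 10.0, 6.0                                 # critical ratio 0.4

def profit(q):
    """Vector of simulated profits at order quantity q."""
    sales = np.minimum(q, demand)
    return price * sales - cost * q

def cvar(x, alpha=0.2):
    """Mean of the worst alpha-fraction of outcomes (risk-averse criterion)."""
    cut = np.quantile(x, alpha)
    return x[x <= cut].mean()

qs = np.arange(40, 200)
q_neutral = qs[np.argmax([profit(q).mean() for q in qs])]  # expected-profit anchor
q_averse = qs[np.argmax([cvar(profit(q)) for q in qs])]    # risk-averse recommendation
print(q_neutral, q_averse)  # the risk-averse algorithm recommends a smaller order
```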
{"title":"Risk-averse algorithmic support and inventory management","authors":"Pranadharthiharan Narayanan, Jeeva Somasundaram, Matthias Seifert","doi":"10.1016/j.ejor.2024.11.013","DOIUrl":"https://doi.org/10.1016/j.ejor.2024.11.013","url":null,"abstract":"We study how managers allocate resources in response to algorithmic recommendations that are programmed with specific levels of risk aversion. Using the anchoring and adjustment heuristic, we derive our predictions and test them in a series of multi-item newsvendor experiments. We find that highly risk-averse algorithmic recommendations have a strong and persistent influence on order decisions, even after the recommendations are no longer available. Furthermore, we show that these effects are similar regardless of factors such as source of advice (i.e., human vs. algorithm) and decision autonomy (i.e., whether the algorithm is externally assigned or chosen by the subjects themselves). Finally, we disentangle the effect of risk attitude from that of anchor distance and find that subjects selectively adjust their order decisions by relying more on algorithmic advice that contrasts with their inherent risk preferences. Our findings suggest that organizations can strategically utilize risk-averse algorithmic tools to improve inventory decisions while preserving managerial autonomy.","PeriodicalId":55161,"journal":{"name":"European Journal of Operational Research","volume":"32 1","pages":""},"PeriodicalIF":6.4,"publicationDate":"2024-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142670541","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"管理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Inherently interpretable machine learning for credit scoring: Optimal classification tree with hyperplane splits
Pub Date : 2024-11-09 DOI: 10.1016/j.ejor.2024.10.046
Jiancheng Tu, Zhibin Wu
An accurate and interpretable credit scoring model plays a crucial role in helping financial institutions reduce losses by promptly detecting, containing, and preventing defaulters. However, existing models often face a trade-off between interpretability and predictive accuracy. Traditional models like Logistic Regression (LR) offer high interpretability but may have limited predictive performance, while more complex models may improve accuracy at the expense of interpretability. In this paper, we tackle the credit scoring problem with imbalanced data by proposing two new classification models based on the optimal classification tree with hyperplane splits (OCT-H). OCT-H provides transparency and easy interpretation through 'if-then' decision tree rules. The first model is the cost-sensitive optimal classification tree with hyperplane splits (CSOCT-H), which incorporates misclassification costs into the objective. The second model, the optimal classification tree with hyperplane splits based on maximizing F1-score (OCT-H-F1), directly maximizes the F1-score. To enhance model scalability, we introduce a data sample reduction method using data binning and feature selection. We then propose two solution methods: a heuristic approach and a method utilizing warm-start techniques to accelerate the solving process. We evaluated the proposed models on four public datasets. The results show that OCT-H significantly outperforms traditional interpretable models, such as Decision Trees (DT) and Logistic Regression (LR), in both predictive performance and interpretability. On certain datasets, OCT-H performs as well as or better than advanced ensemble tree models, effectively narrowing the gap between interpretable models and black-box models.
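The OCT-H models themselves are solved exactly as mixed-integer programs; purely to illustrate what a "hyperplane split" is, the sketch below fits a single class-weighted logistic hyperplane on a synthetic imbalanced dataset and uses F1 to pick the cost weight (the dataset, the weight grid, and the greedy selection are all illustrative assumptions, not the paper's method).

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score

# Synthetic imbalanced credit-style data: 10% "defaulters" (class 1)
X, y = make_classification(n_samples=2000, weights=[0.9, 0.1], random_state=0)

best_f1, best_clf = -1.0, None
for w in [1, 3, 5, 10]:                      # cost of missing a defaulter
    clf = LogisticRegression(class_weight={0: 1, 1: w}, max_iter=1000)
    clf.fit(X, y)                            # fits a hyperplane w.x + b = 0
    s = f1_score(y, clf.predict(X))
    if s > best_f1:
        best_f1, best_clf = s, clf

print(f"best in-sample F1 with a single hyperplane split: {best_f1:.3f}")
```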
Tournament design: A review from an operational research perspective
Pub Date : 2024-11-09 DOI: 10.1016/j.ejor.2024.10.044
Karel Devriesere, László Csató, Dries Goossens
Every sport needs rules. Tournament design refers to the rules that determine how a tournament, a series of games between a number of competitors, is organized. This study aims to provide an overview of the tournament design literature from the perspective of operational research. Three important design criteria are discussed: efficacy, fairness, and attractiveness. Our survey classifies the papers discussing these properties according to the main components of tournament design: format, seeding, draw, scheduling, and ranking. We also outline several open questions and promising directions for future research.
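As one concrete example from the scheduling component of this taxonomy, a single round-robin schedule can be drawn up with the classic circle method:

```python
def circle_method(teams):
    """Single round-robin schedule via the classic circle method.

    Returns a list of rounds; each round is a list of (home, away) pairs.
    With an odd number of teams, a dummy opponent marks a bye.
    """
    ts = list(teams)
    if len(ts) % 2:
        ts.append(None)                      # bye slot
    n = len(ts)
    rounds = []
    for _ in range(n - 1):
        pairs = [(ts[i], ts[n - 1 - i]) for i in range(n // 2)]
        rounds.append([p for p in pairs if None not in p])
        ts = [ts[0]] + [ts[-1]] + ts[1:-1]   # rotate all but the first team
    return rounds

for r, games in enumerate(circle_method(["A", "B", "C", "D"]), 1):
    print(f"Round {r}: {games}")
```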
{"title":"Tournament design: A review from an operational research perspective","authors":"Karel Devriesere, László Csató, Dries Goossens","doi":"10.1016/j.ejor.2024.10.044","DOIUrl":"https://doi.org/10.1016/j.ejor.2024.10.044","url":null,"abstract":"Every sport needs rules. Tournament design refers to the rules that determine how a tournament, a series of games between a number of competitors, is organized. This study aims to provide an overview of the tournament design literature from the perspective of operational research. Three important design criteria are discussed: efficacy, fairness, and attractiveness. Our survey classifies the papers discussing these properties according to the main components of tournament design: format, seeding, draw, scheduling, and ranking. We also outline several open questions and promising directions for future research.","PeriodicalId":55161,"journal":{"name":"European Journal of Operational Research","volume":"74 1","pages":""},"PeriodicalIF":6.4,"publicationDate":"2024-11-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142670544","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"管理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}