Pub Date: 2023-04-03 | DOI: 10.15837/ijccc.2023.2.5286
M. Visan, Firicel Mone
Answering climate change challenges, this paper proposes an intelligent decision support system (DSS) for the management of green-blue infrastructure (GBI). Addressing gaps identified in other studies, the designed DSS incorporates four key elements: (1) interdisciplinary collaboration among all stakeholders; (2) inclusion of practical operation and maintenance activities; (3) the main components of a distributed DSS, with practical examples of use; and (4) consideration of conditions specific to the location. The multi-layered DSS architecture can be implemented as a unified platform that provides a comprehensive, customizable, and flexible framework based on AI tools, big data and analytics, edge, cloud, and mobile computing, IIoT, and biometric systems. The use of cobots and digital clones alongside humans results in hybrid human-machine units. A DSS for GBI increases decision-making capacity and can serve as a foundation for the implementation of similar systems by governments and local communities to build sustainable and resilient communities.
Title: "Computer-Supported Smart Green-Blue Infrastructure Management" (Int. J. Comput. Commun. Control).
Pub Date: 2023-04-03 | DOI: 10.15837/ijccc.2023.2.5049
Bin Yang, Huilai Li, Ying Xing, F. Zeng, Chen Qian, Youzhi Shen, Jiong Wang
With the advent of the information age, software release cycles are steadily accelerating, which leaves software development severely constrained by software testing. Test case prioritization is an effective way to accelerate testing progress, and the introduction of heuristic algorithms to this task has greatly improved the processing efficiency of test cases. However, to overcome the shortcomings of slow convergence and a tendency to fall into local optima, an improved whale optimization algorithm is proposed for test case prioritization. Firstly, a model called the n-dimensional directed search space is established for the swarm intelligence algorithm. Secondly, the enhanced whale optimization algorithm is applied to test case prioritization, with backtracking behavior conducted for individuals that hit the wall. In addition, a separate storage space for a second Pareto optimization is designed to filter the optimal solutions of multi-objective tasks. Finally, single-objective and multi-objective optimization experiments are carried out on open-source projects and real-world projects, respectively. The results show that the improved whale optimization algorithm using the n-dimensional directed search space better supports test case prioritization decisions, with fast convergence.
Title: "Directed Search Based on Improved Whale Optimization Algorithm for Test Case Prioritization" (Int. J. Comput. Commun. Control).
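The abstract does not spell out the paper's improved variant (the n-dimensional directed search space and the backtracking behavior), but the standard whale optimization core it builds on can be sketched as follows. `woa_minimize`, the sphere objective, and all parameter values are illustrative assumptions, not the paper's implementation:

```python
import math
import random

def woa_minimize(f, dim, bounds, n_whales=20, iters=200, seed=0):
    """Minimize f over a box with the standard Whale Optimization Algorithm."""
    rng = random.Random(seed)
    lo, hi = bounds
    whales = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_whales)]
    best = min(whales, key=f)[:]
    for t in range(iters):
        a = 2.0 - 2.0 * t / iters                     # a decreases linearly 2 -> 0
        for i, x in enumerate(whales):
            r1, r2 = rng.random(), rng.random()
            A, C = 2 * a * r1 - a, 2 * r2
            if rng.random() < 0.5:
                # encircle the best whale when |A| < 1, else explore a random one
                ref = best if abs(A) < 1 else whales[rng.randrange(n_whales)]
                new = [ref[j] - A * abs(C * ref[j] - x[j]) for j in range(dim)]
            else:
                # spiral (bubble-net) update around the current best
                l = rng.uniform(-1, 1)
                new = [abs(best[j] - x[j]) * math.exp(l) * math.cos(2 * math.pi * l)
                       + best[j] for j in range(dim)]
            whales[i] = [min(hi, max(lo, v)) for v in new]   # clamp to bounds
            if f(whales[i]) < f(best):
                best = whales[i][:]
    return best

sphere = lambda x: sum(v * v for v in x)
sol = woa_minimize(sphere, dim=5, bounds=(-10, 10))
```

For test case prioritization, the continuous whale positions would additionally have to be decoded into test orderings, which is what the paper's directed search space addresses.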
Pub Date: 2023-04-03 | DOI: 10.15837/ijccc.2023.2.5299
Xing-yuan Chen, Yong Deng
The measurement of uncertainty has long been an important research topic. In the Dempster-Shafer framework, Deng entropy serves as a reliable tool for such measurements. However, it fails to consider more comprehensive information, resulting in the loss of critical data. This paper proposes an improved belief entropy that preserves all the merits of Deng entropy. When a mass function contains only single elements, it degenerates to Shannon entropy. When dealing with multiple elements, the partitioning method employed for mass functions makes it more responsive and efficient than alternative uncertainty measures. Numerical examples further illustrate the effectiveness and applicability of the proposed entropy measure. Additionally, a case study on software risk analysis demonstrates the practical value and relevance of the proposed method in real-world scenarios.
Title: "A new belief entropy and its application in software risk analysis" (Int. J. Comput. Commun. Control).
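For context on the baseline the abstract improves upon: classical Deng entropy assigns each focal element A the term -m(A) log2(m(A) / (2^|A| - 1)), and reduces to Shannon entropy when every focal element is a singleton. A minimal sketch (the paper's improved partitioning method is not reproduced here):

```python
import math

def deng_entropy(m):
    """Deng entropy of a mass function m: dict mapping frozenset -> mass.
    Each focal element A contributes -m(A) * log2(m(A) / (2^|A| - 1))."""
    e = 0.0
    for focal, mass in m.items():
        if mass > 0:
            e -= mass * math.log2(mass / (2 ** len(focal) - 1))
    return e

# With only singleton focal elements, Deng entropy degenerates to Shannon entropy:
m_single = {frozenset({'a'}): 0.5, frozenset({'b'}): 0.5}
print(deng_entropy(m_single))   # 1.0 bit, the Shannon entropy of a fair coin

# A multi-element focal set adds non-specificity on top of discord:
m_multi = {frozenset({'a', 'b'}): 1.0}
print(deng_entropy(m_multi))    # log2(3) ~ 1.585 bits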
Pub Date: 2023-04-03 | DOI: 10.15837/ijccc.2023.2.5320
B. Stanojević, M. Stanojevic
Business Analytics, which unites descriptive, predictive, and prescriptive analytics, is an important component of the Big Data framework. It aims to transform data into information, enabling better decisions. Within Big Data, optimization is mostly associated with prescriptive analysis, but this paper presents one of its applications to predictive analysis, based on regression in a fuzzy environment. The tools of regression analysis can be used either to identify the correlation of a dependency between the observed inputs and outputs, or to provide a convenient approximation of the output data set, enabling its simplified manipulation. This paper introduces a new approach to predicting the outputs of a fuzzy-in fuzzy-out system through a fuzzy regression analysis developed in full accordance with the extension principle. In this approach, a pair of mathematical optimization problems is solved for each desired α-level. The optimization models derive the left and right endpoints of the α-cut of the predicted fuzzy output as the minimum and maximum of all crisp values obtainable as predicted outputs of at least one regression problem with observed crisp data in the α-cut ranges of the corresponding fuzzy observed data. Relevant examples from the literature are recalled and used to illustrate the theoretical findings.
Title: "Optimization-Based Fuzzy Regression in Full Compliance with the Extension Principle" (Int. J. Comput. Commun. Control).
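The min/max endpoint idea described above can be illustrated with a brute-force sketch: scan crisp data sets drawn from the α-cut intervals of the fuzzy observations, fit an ordinary regression to each, and record the extreme predictions. This grid scan only approximates the optima of the paper's optimization models; the function names and toy data are assumptions:

```python
import itertools

def linreg_predict(xs, ys, x0):
    """Ordinary least-squares fit y = a + b*x, evaluated at x0."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return (my - b * mx) + b * x0

def alpha_cut_prediction(x_ivals, y_ivals, x0, grid=3):
    """Approximate the endpoints of the predicted output's α-cut by scanning
    crisp data sets taken from the α-cut intervals of the observations."""
    def points(iv):
        lo, hi = iv
        return [lo + (hi - lo) * k / (grid - 1) for k in range(grid)]
    lo, hi = float('inf'), float('-inf')
    for xs in itertools.product(*(points(iv) for iv in x_ivals)):
        for ys in itertools.product(*(points(iv) for iv in y_ivals)):
            p = linreg_predict(xs, ys, x0)
            lo, hi = min(lo, p), max(hi, p)
    return lo, hi

# Crisp inputs, interval-valued outputs at some α-level (toy data near y = 2x):
x_ivals = [(1.0, 1.0), (2.0, 2.0), (3.0, 3.0)]
y_ivals = [(1.8, 2.2), (3.8, 4.2), (5.8, 6.2)]
lo, hi = alpha_cut_prediction(x_ivals, y_ivals, x0=4.0)
```

Repeating this for a sequence of α-levels stacks the resulting intervals into the predicted fuzzy output.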
Pub Date: 2023-04-03 | DOI: 10.15837/ijccc.2023.2.4998
Mohammed Sanad Alshammari, Aadil Alshammari
Today's internet contains such an abundance of information that recommendation engines struggle to produce satisfying outputs. This huge stream of unrelated data increases sparsity, which makes the recommender system's job more challenging. Facebook's main recommendation task is to suggest friendship connections based on the idea that a friend of a friend is also a friend; however, the majority of recommendations made this way lead to little or no interaction. We propose a matrix factorization model that leverages interactions between Facebook users and generates a list of friendship connections that are likely to be interactive. We tested our model on a real dataset with over 33 million interactions between users. The accuracy of the proposed algorithm is measured by the error rate of the predicted number of interactions between possible friends against the actual values.
Title: "Friend Recommendation Engine for Facebook Users via Collaborative Filtering" (Int. J. Comput. Commun. Control).
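A minimal sketch of the matrix factorization idea the abstract describes: learn a latent vector per user by stochastic gradient descent so that dot products approximate observed interaction counts, then score unseen pairs. The hyperparameters and toy data are illustrative assumptions, not the paper's setup:

```python
import random

def train_mf(interactions, n_users, k=8, epochs=500, lr=0.05, reg=0.02, seed=0):
    """Factorize the user-user interaction matrix as P @ Q^T with plain SGD."""
    rng = random.Random(seed)
    P = [[rng.gauss(0, 0.1) for _ in range(k)] for _ in range(n_users)]
    Q = [[rng.gauss(0, 0.1) for _ in range(k)] for _ in range(n_users)]
    for _ in range(epochs):
        for u, v, r in interactions:
            err = r - sum(P[u][f] * Q[v][f] for f in range(k))
            for f in range(k):
                pu, qv = P[u][f], Q[v][f]
                P[u][f] += lr * (err * qv - reg * pu)   # regularized SGD step
                Q[v][f] += lr * (err * pu - reg * qv)
    return P, Q

def predict(P, Q, u, v):
    """Predicted interaction count between users u and v."""
    return sum(pf * qf for pf, qf in zip(P[u], Q[v]))

# Toy interaction counts (user u, user v, observed number of interactions):
data = [(0, 1, 5.0), (0, 2, 1.0), (1, 2, 4.0), (2, 3, 5.0), (1, 3, 2.0)]
P, Q = train_mf(data, n_users=4)
```

Candidate friends-of-friends would then be ranked by their predicted interaction count rather than by graph distance alone.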
Pub Date: 2023-04-03 | DOI: 10.15837/ijccc.2023.2.5241
Qianqian Zhang, Guining Geng, Qun Tu
Technological innovation is vital for the survival and development of enterprises. In the era of intelligent information interconnection and the knowledge-driven economy, there is growing interest in how to manage high-volume data, unlock its potential value, and provide intelligent analysis and decision-making support for enterprises' technological innovation. This paper proposes an improved knowledge association analysis method based on a semantic concept model. The approach enables the discovery of potential correlations and interaction modes between the factors influencing an enterprise's technological innovation, and provides a useful reference for decision-making by combining the analysis with the enterprise's own situation.
Title: "Association mining-based method for enterprise's technological innovation intelligent decision making under big data" (Int. J. Comput. Commun. Control).
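The association analysis the abstract describes can be illustrated with a brute-force support/confidence sketch over hypothetical innovation-factor records; the paper's semantic concept model is not reproduced, and all factor names are assumptions:

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Count the support of every itemset in the data (brute force, small data)."""
    n = len(transactions)
    counts = {}
    for t in transactions:
        for r in range(1, len(t) + 1):
            for combo in combinations(sorted(t), r):
                counts[combo] = counts.get(combo, 0) + 1
    return {s: c / n for s, c in counts.items() if c / n >= min_support}

def rules(freq, min_conf):
    """Derive rules X -> Y with confidence = support(X u Y) / support(X)."""
    out = []
    for itemset, sup in freq.items():
        if len(itemset) < 2:
            continue
        for r in range(1, len(itemset)):
            for lhs in combinations(itemset, r):
                conf = sup / freq[lhs]   # subsets of frequent sets are frequent
                if conf >= min_conf:
                    rhs = tuple(x for x in itemset if x not in lhs)
                    out.append((lhs, rhs, conf))
    return out

# Hypothetical innovation-factor records, one set of factors per enterprise:
data = [{'R&D', 'patents', 'talent'}, {'R&D', 'patents'}, {'R&D', 'funding'},
        {'patents', 'talent'}, {'R&D', 'patents', 'funding'}]
freq = frequent_itemsets(data, min_support=0.4)
found = rules(freq, min_conf=0.7)
```

Each surviving rule (e.g. "enterprises with strong talent pools also tend to hold patents") is the kind of correlation the proposed method would surface for decision support.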
Pub Date: 2023-04-03 | DOI: 10.15837/ijccc.2023.2.5118
Lorena Popa
The ranking of intuitionistic fuzzy numbers is paramount to decision making in fuzzy and uncertain environments. In this paper, a new ranking function is defined, based on the robust ranking index of the membership function and the non-membership function of trapezoidal intuitionistic fuzzy numbers. The function also incorporates a parameter for the attitude of the decision-makers. The method is illustrated through several numerical examples and compared with existing methods. Starting from the new ranking method, an algorithm for solving fuzzy multi-criteria decision-making (MCDM) problems is proposed. Applying the algorithm means accepting the subjectivity of the decision-makers, and it offers a clear perspective on how this subjective attitude influences the decision-making process. Finally, an MCDM problem is solved to outline the advantages of the algorithm proposed in this paper.
Title: "A new ranking method for trapezoidal intuitionistic fuzzy numbers and its application to multi-criteria decision making" (Int. J. Comput. Commun. Control).
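The robust ranking index mentioned above integrates the α-cut midpoints of a fuzzy number, which for a trapezoid (a, b, c, d) reduces to (a + b + c + d) / 4. The attitude-weighted score below is a hypothetical composition of the membership and non-membership indices; the paper's exact function is not given in the abstract, so `tifn_score` and the parameter `lam` are assumptions:

```python
def robust_rank(a, b, c, d):
    """Robust ranking index of a trapezoidal fuzzy number (a, b, c, d):
    the integral over alpha in [0, 1] of the alpha-cut midpoint,
    which for a trapezoid evaluates to (a + b + c + d) / 4."""
    return (a + b + c + d) / 4.0

def tifn_score(membership, non_membership, lam=0.5):
    """Hypothetical attitude-weighted score for a trapezoidal intuitionistic
    fuzzy number: lam (the decision-maker's optimism) weights the membership
    index against the complement of the non-membership index."""
    return lam * robust_rank(*membership) + \
        (1 - lam) * (1 - robust_rank(*non_membership))

# Two alternatives, each a (membership trapezoid, non-membership trapezoid) pair:
alts = {'A': ((0.2, 0.4, 0.6, 0.8), (0.1, 0.2, 0.3, 0.4)),
        'B': ((0.1, 0.3, 0.5, 0.7), (0.2, 0.3, 0.4, 0.5))}
ranked = sorted(alts, key=lambda k: tifn_score(*alts[k]), reverse=True)
```

Varying `lam` shows how the decision-makers' attitude can reorder alternatives, which is the sensitivity the paper's MCDM algorithm makes explicit.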
Pub Date: 2023-04-03 | DOI: 10.15837/ijccc.2023.2.4666
L. Jenila, R. Canessane
Video transmission over sensor networks plays a significant role in industrial and surveillance applications. Multimedia transmission is also a challenging task when guaranteeing quality of service under conditions such as limited bandwidth, high congestion, and multi-hop routing. A cross-layer approach is adopted to handle multimedia transmission over sensor networks and improve network adaptivity. A cross-layer-based energy-aware and packet scheduling algorithm is proposed here to reduce the congestion ratio and improve link quality between the routing nodes. Link quality between nodes is estimated using a semi-Markov process. The node congestion rate is determined to identify each node's data channel rate. The packet scheduling process identifies the highest-priority packets using a queue scheduler component; active nodes are then selected through the link quality process, and packets are transmitted to the sink according to their priority level. Simulation analysis shows that the proposed mechanism is more efficient than conventional schemes.
Title: "Cross Layer based Energy Aware and Packet Scheduling Algorithm for Wireless Multimedia Sensor Network" (Int. J. Comput. Commun. Control).
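The queue-scheduler component described above can be sketched as a priority queue that dequeues urgent multimedia packets first and keeps FIFO order within a priority level. In the paper the priority would be derived from link quality and congestion estimates; here it is supplied directly, and the class and packet names are assumptions:

```python
import heapq
from itertools import count

class PriorityPacketScheduler:
    """Sketch of a queue scheduler: packets leave in priority order
    (lower number = more urgent), FIFO within the same priority level."""
    def __init__(self):
        self._heap = []
        self._seq = count()   # tie-breaker preserving arrival order

    def enqueue(self, packet, priority):
        heapq.heappush(self._heap, (priority, next(self._seq), packet))

    def dequeue(self):
        return heapq.heappop(self._heap)[2]

sched = PriorityPacketScheduler()
sched.enqueue({'id': 'audio-1'}, priority=1)   # delay-sensitive media: urgent
sched.enqueue({'id': 'bulk-1'}, priority=3)    # background telemetry
sched.enqueue({'id': 'video-1'}, priority=1)
order = [sched.dequeue()['id'] for _ in range(3)]
```

Running this yields `audio-1`, `video-1`, `bulk-1`: both priority-1 packets go first, in arrival order, before the background packet.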
Pub Date: 2023-04-03 | DOI: 10.15837/ijccc.2023.2.5088
Xiaoxuan Ma, Yibo Deng, Lei Zhang, Zhiwen Li
Inpainting of damaged images is one of the most active research fields in computer image processing. The development of deep learning, especially Convolutional Neural Networks (CNNs), has significantly enhanced the effect of image inpainting. However, direct connections between convolution layers may increase the risk of vanishing gradients or overfitting during training. In addition, pixel artifacts or visual inconsistencies may occur if the damaged area is inpainted directly. To solve these problems, we propose a novel Dense Gated Convolutional Network (DGCN) for generative image inpainting by modifying the gated convolutional network structure. First, a holistically-nested edge detector (HED) is used to predict the edge information of the missing areas, assisting the subsequent inpainting task and reducing the generation of artifacts. Then, dense connections are added to the generative network to reduce the number of network parameters while lowering the risk of instability during training. Finally, experimental results on the CelebA and Places2 datasets show that the proposed model achieves better inpainting results in terms of PSNR, SSIM, and visual quality than other classical image inpainting models. DGCN combines the advantages of gated convolution and dense connections, reducing network parameters and improving the inpainting effect.
Title: "A Novel Generative Image Inpainting Model with Dense Gated Convolutional Network" (Int. J. Comput. Commun. Control).
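The gated convolution at the heart of DGCN multiplies a feature response by a learned sigmoid mask, letting the network down-weight invalid (hole) pixels. Below is a single-channel NumPy sketch with random, untrained weights; the shapes, activations, and function names are illustrative assumptions, not the paper's architecture:

```python
import numpy as np

def conv2d(x, w):
    """'Valid' 2-D cross-correlation of a single-channel image with kernel w."""
    kh, kw = w.shape
    H, W = x.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * w)
    return out

def gated_conv(x, w_feat, w_gate):
    """Gated convolution: a sigmoid gate in (0, 1) scales each feature
    response, so the layer can suppress contributions from hole pixels."""
    features = np.tanh(conv2d(x, w_feat))
    gate = 1.0 / (1.0 + np.exp(-conv2d(x, w_gate)))
    return features * gate

rng = np.random.default_rng(0)
img = rng.standard_normal((8, 8))
out = gated_conv(img, rng.standard_normal((3, 3)), rng.standard_normal((3, 3)))
```

In the full model, the gate weights are learned jointly with the features, and dense connections feed each layer's output to all subsequent layers.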
During the COVID-19 epidemic, the online prescription pattern of Internet healthcare provides a guarantee of care for patients with chronic diseases and reduces the risk of cross-infection, but it also increases the decision-making burden on doctors. An online drug recommendation system can effectively assist doctors by analysing patients' electronic medical records (EMR). Unlike commercial recommendations, drug recommendations must be highly accurate because of their relevance to patient health. Moreover, concept drift may occur in drug treatment data streams, so handling drift and locating its causes is critical to the accuracy and reliability of the recommended results. This paper proposes a multi-model fusion online drug recommendation system based on the association of drug and pathological features, with an online-nearline-offline architecture. The system transforms drug recommendation into pattern classification and adopts interpretable concept drift detection and adaptive ensemble classification algorithms. We apply the system to the Percutaneous Coronary Intervention (PCI) treatment process. The experimental results show that our system performs nearly as well as doctors, with accuracy close to 100%.
Title: "Ensemble Learning for Interpretable Concept Drift and Its Application to Drug Recommendation" (Int. J. Comput. Commun. Control). Authors: Yunjuan Peng, Qi Qiu, Dalin Zhang, Tianyu Yang, Hailong Zhang. Pub Date: 2023-02-09 | DOI: 10.15837/ijccc.2023.1.5011
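The paper's interpretable drift detector is not specified in the abstract; as background, a standard baseline such as the Drift Detection Method (DDM) monitors a classifier's error rate on the stream and signals drift when it rises well above its historical minimum. A sketch with a simulated error stream (the thresholds, warm-up length, and stream are assumptions):

```python
import math

class DDM:
    """Drift Detection Method sketch: consumes a stream of 0/1 prediction
    errors and flags drift when the running error rate p plus its std s
    exceeds p_min + 3 * s_min (warning at p_min + 2 * s_min)."""
    def __init__(self, min_samples=30):
        self.min_samples = min_samples
        self.reset()

    def reset(self):
        self.n, self.p = 0, 0.0
        self.p_min, self.s_min = float('inf'), float('inf')

    def update(self, error):
        self.n += 1
        self.p += (error - self.p) / self.n          # running error rate
        s = math.sqrt(self.p * (1 - self.p) / self.n)
        if self.n < self.min_samples:                # warm-up period
            return 'stable'
        if self.p + s < self.p_min + self.s_min:     # track the best regime
            self.p_min, self.s_min = self.p, s
        if self.p + s > self.p_min + 3 * self.s_min:
            self.reset()                             # drift: restart tracking
            return 'drift'
        if self.p + s > self.p_min + 2 * self.s_min:
            return 'warning'
        return 'stable'

# Simulated stream: ~10% errors, then a concept change pushes errors to ~50%.
stream = [1 if i % 10 == 0 else 0 for i in range(100)] + \
         [1 if i % 2 == 0 else 0 for i in range(100)]
detector = DDM()
signals = [detector.update(e) for e in stream]
```

In the drug recommendation setting, a detected drift would trigger retraining of the ensemble and an inspection of which pathological features caused the change.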