Pub Date : 2024-01-09. DOI: 10.3390/computers13010021
Sabrine Belmekki, Dominique Gruyer
In the dynamic landscape of vehicular communication systems, connected vehicles (CVs) present unprecedented capabilities in perception, cooperation, and, notably, collision probability management. This paper’s main concern is the estimation of collision probability. Achieving effective collision estimation relies heavily on the sensor perception of obstacles and a critical collision probability prediction system. This paper is dedicated to refining the estimation of collision probability through the intentional integration of CV communications, with a specific focus on the collective perception of connected vehicles. The primary objective is to enhance awareness of potential collisions in the surrounding environment by harnessing the collective insights gathered through inter-vehicular communication and collaboration. This improvement enables a superior anticipation capacity for both the driving system and the human driver, thereby enhancing road safety. Furthermore, the incorporation of extended perception strategies holds the potential for more accurate collision probability estimation, providing the driving system or human driver with more time to react and make informed decisions, further fortifying road safety measures. The results underscore a significant enhancement in collision probability awareness, as connected vehicles collectively contribute to a more comprehensive picture of collision probability. Consequently, this heightened collective perception of collision probability improves the anticipation capacity of both the driving system and the human driver, contributing to an elevated level of road safety. For future work, we propose exploring our extended perception techniques to achieve real-time collision probability estimation. Such endeavors aim to drive the development of robust and anticipatory autonomous driving systems that truly harness the benefits of connected vehicle technologies.
Title: Advanced Road Safety: Collective Perception for Probability of Collision Estimation of Connected Vehicles
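To make the idea of collective collision-probability perception concrete, here is a minimal sketch of one way independent estimates could be pooled (the fusion rule, function name, and independence assumption are illustrative, not the paper's actual estimator):

```python
def fuse_collision_estimates(local_p, neighbor_ps):
    """Pool a vehicle's own collision-probability estimate with
    estimates shared by neighboring connected vehicles.

    Treats the sources as independent detectors: the fused value is
    the probability that at least one source flags the threat.
    """
    p_all_miss = 1.0 - local_p
    for p in neighbor_ps:
        p_all_miss *= 1.0 - p
    return 1.0 - p_all_miss

# A vehicle whose own sensors see only a weak threat (0.2) gains
# confidence from two neighbors with better viewpoints (0.5 and 0.4):
# 1 - 0.8 * 0.5 * 0.6 = 0.76.
fused = fuse_collision_estimates(0.2, [0.5, 0.4])
```

A real system would weight each report by sensor confidence and communication latency; the sketch only illustrates why pooled perception raises anticipation capacity.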
Pub Date : 2024-01-09. DOI: 10.3390/computers13010020
Faraz Sasani, Mohammad Moghareh Dehkordi, Zahra Ebrahimi, Hakimeh Dustmohammadloo, Parisa Bouzari, P. Ebrahimi, E. Lencsés, M. Fekete-Farkas
Liquidity is the ease of converting an asset (physical or digital) into cash or another asset without loss, and it is reflected in the relationship between the time scale and the price scale of an investment. This article examines the illiquidity of Bitcoin (BTC). Bitcoin hash rate information was collected at three different time intervals; in parallel, textual information related to these intervals was collected from Twitter for each day. Given the regression nature of illiquidity prediction, approaches based on recurrent networks were suggested. Seven approaches (ANN, SVM, SANN, LSTM, Simple RNN, GRU, and IndRNN) were tested on these data. To evaluate them, three evaluation methods were used: random split (paper), random split (run), and linear split (run). The results indicate that the IndRNN approach provided the best results.
Title: Forecasting of Bitcoin Illiquidity Using High-Dimensional and Textual Features
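The difference between the random and linear (chronological) splits mentioned above can be sketched as follows (a generic illustration; the paper's exact split ratios and procedure are not specified here):

```python
import random

def random_split(data, train_frac=0.8, seed=0):
    """Shuffle, then split: train and test samples are interleaved in
    time, which can leak future information into training."""
    idx = list(range(len(data)))
    random.Random(seed).shuffle(idx)
    cut = int(len(data) * train_frac)
    return [data[i] for i in idx[:cut]], [data[i] for i in idx[cut:]]

def linear_split(data, train_frac=0.8):
    """Chronological split: train strictly on the past, test on the
    future, which matches how a forecaster would be deployed."""
    cut = int(len(data) * train_frac)
    return data[:cut], data[cut:]
```

For time-series targets such as illiquidity, the linear split is the stricter test, which is why reporting both is informative.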
Pub Date : 2024-01-08. DOI: 10.3390/computers13010018
Raja Rao Budaraju, Sastry Kodanda Rama Jammalamadaka
Many data mining studies have focused on mining positive associations among frequent and regular item sets. However, none have considered time and regularity when mining such associations. The sets of frequent and regular item sets can be huge when regularity and frequency are considered without any time constraint. Negative associations are equally important in medical databases, reflecting considerable discrepancies in the medications used to treat various disorders. It is important to find the most effective negative associations, and the mined associations should be as small as possible so that the most important disconnections can be found. This paper proposes a mining method that mines medical databases for regular, frequent, closed, and maximal item sets that reflect minimal negative associations. The proposed algorithm reduces the number of negative associations by 70% when the maximal and closed properties are used, for any sample size, regularity threshold, or frequency threshold.
Title: Mining Negative Associations from Medical Databases Considering Frequent, Regular, Closed and Maximal Patterns
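The closed and maximal properties that shrink the mined set can be illustrated on a small table of frequent item sets (an illustrative filter only; the paper's actual algorithm also accounts for regularity, time, and negative associations):

```python
def closed_and_maximal(freq):
    """Filter a dict {itemset (frozenset): support} of frequent item sets.

    Closed: no proper superset has the same support.
    Maximal: no proper superset is frequent at all.
    """
    closed, maximal = set(), set()
    for s, sup in freq.items():
        supersets = [t for t in freq if s < t]  # proper supersets only
        if all(freq[t] != sup for t in supersets):
            closed.add(s)
        if not supersets:
            maximal.add(s)
    return closed, maximal

freq = {frozenset("a"): 3, frozenset("ab"): 3, frozenset("abc"): 2}
closed, maximal = closed_and_maximal(freq)
# {a} is absorbed by {a,b} (same support), so only {a,b} and {a,b,c}
# are closed, and {a,b,c} alone is maximal.
```

Keeping only closed or maximal patterns is what lets the reported associations stay small without losing support information.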
Pub Date : 2024-01-08. DOI: 10.3390/computers13010019
M. Cruz, Abílio Oliveira, Alessandro Pinheiro
With the evolution of technology, virtual reality allows us to dive into cyberspace through different devices and have immersive experiences in different contexts; put simply, we call these virtual worlds or the multiverse (integrating Metaverse versions). Through virtual reality, it is possible to create infinite simulated environments in which to immerse ourselves. The future internet may be slightly different from what we use today. Virtual immersion situations are common (particularly in gaming), and the Metaverse has become a lived and almost real experience claiming its presence in our daily lives. To investigate perspectives and concepts regarding the Metaverse, virtual reality, and immersion, we considered a main research question: to what extent can a film centered on the multiverse be associated with adults’ perceptions of the Metaverse? Given that all participants were adults, the objectives of this study were to verify: (1) the representations of the Metaverse; (2) the representations of immersion; (3) the representations of the multiverse; and (4) the influence of a film (related to the Metaverse and the multiverse) on the representations found. This study, framed within a Ph.D. research project, analyzed participants’ answers to an online survey that used two films to gather thoughts, ideas, emotions, sentiments, and reactions in line with our research objectives. Some limitations were considered, such as the number of participants, the number of questionnaire questions, and participants’ familiarity (or lack thereof) with the main concepts. Our results showed that a virtual world created by a movie might stimulate the perception of almost living in that supposed reality, so that the multiverse and the Metaverse are accepted not as distant concepts but as close experiences, even unconsciously. This finding also contributes positively to an ongoing discussion aimed at an essential understanding of the Metaverse as a complex concept.
Title: Faraway, so Close: Perceptions of the Metaverse on the Edge of Madness
Pub Date : 2024-01-03. DOI: 10.3390/computers13010014
Dimitrios Stamatakis, Dimitrios G. Kogias, Pericles Papadopoulos, Panagiotis A. Karkazis, Helen C. Leligou
The advancement and acceptance of new technologies often hinge on the level of understanding and trust among potential users. Blockchain technology, despite its broad applications across diverse sectors, is often met with skepticism due to a general lack of understanding and incidents of illicit activity in the cryptocurrency domain. This study aims to demystify blockchain technology through an in-depth examination of its application in a novel blockchain-based card game centered on renewable energy and sustainable resource management. The paper introduces a serious game that uses blockchain to enhance user interaction, ownership, and gameplay, demonstrating the technology’s potential to transform the gaming industry. Notable aspects of the game are analyzed, including ownership of virtual assets, transparent transaction histories, trustless game mechanics, user-driven content creation, gasless transactions, and mechanisms for in-game asset trading and cross-platform asset reuse. The paper discusses how these features not only provide a richer gaming experience but also serve as effective tools for raising awareness about sustainable energy and resource management, thereby bridging the gap between entertainment and education. The case study offers valuable insights into how blockchain can create dynamic, secure, and participatory virtual environments, shifting the paradigm of traditional online gaming.
Title: Blockchain-Powered Gaming: Bridging Entertainment with Serious Game Objectives
Pub Date : 2023-12-30. DOI: 10.3390/computers13010012
Ju Zhang, Bin Chen, Jiahui Qiu, Lingfan Zhuang, Zhiyuan Wang, Liu Liu
In recent years, Long-Term Evolution Vehicle-to-Everything (LTE-V2X) communication technology has received extensive attention. Timing synchronization is a crucial step in the receiving process, addressing Timing Offsets (TOs) resulting from random propagation delays, sampling-frequency mismatches between the transmitter and receiver, or a combination of both. However, high-speed relative movement between nodes and low antenna heights lead to a significant Doppler frequency offset, resulting in a low Signal-to-Noise Ratio (SNR) for received signals in LTE-V2X scenarios. This paper investigates LTE-V2X technology with a specific focus on time synchronization. The research centers on time synchronization using the Primary Sidelink Synchronization Signal (PSSS) and provides a comprehensive analysis of existing algorithms, highlighting their respective advantages and disadvantages. On this basis, a robust timing synchronization algorithm for LTE-V2X scenarios is proposed. The algorithm comprises three key steps: coarse synchronization, frequency offset estimation, and fine synchronization. Robustness is enhanced through algorithm fusion, optimal decision-threshold design, and predefined frequency-offset values. Furthermore, a hardware-in-the-loop simulation platform is established. The simulation results demonstrate a substantial performance improvement over existing methods under adverse channel conditions characterized by high frequency offsets and low SNR.
Title: A Robust Timing Synchronization Algorithm Based on PSSS for LTE-V2X
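The coarse-synchronization step can be pictured as a sliding correlation against the known synchronization sequence (a toy sketch with a made-up sequence, not the actual PSSS waveform or the authors' fused metric):

```python
import numpy as np

def coarse_timing(rx, ref):
    """Return the sample offset where the received signal correlates
    most strongly with the known reference sequence."""
    n = len(ref)
    metric = [abs(np.vdot(ref, rx[k:k + n])) for k in range(len(rx) - n + 1)]
    return int(np.argmax(metric))

# Toy received signal: complex noise with the sequence embedded at offset 40.
rng = np.random.default_rng(1)
ref = np.exp(1j * 0.5 * np.pi * rng.integers(0, 4, 64))  # QPSK-like symbols
rx = 0.1 * (rng.standard_normal(200) + 1j * rng.standard_normal(200))
rx[40:104] += ref
offset = coarse_timing(rx, ref)  # the metric peaks at the embedded offset, 40
```

A real receiver would follow this with frequency-offset estimation and fine synchronization, as in the paper's three-step pipeline; a large Doppler shift rotates the correlation, which is what motivates testing predefined frequency-offset values.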
Pub Date : 2023-12-30. DOI: 10.3390/computers13010013
Luis Manuel Pereira, A. Salazar, L. Vergara
Automatic data fusion is an important field of machine learning that has been increasingly studied. The objective is to improve the classification performance obtained from several individual classifiers in terms of accuracy and stability of the results. This paper presents a comparative study of recent data fusion methods. The fusion step can be applied at early and/or late stages of the classification procedure. Early fusion consists of combining features from different sources or domains to form the observation vector before the individual classifiers are trained. In contrast, late fusion consists of combining the results from the individual classifiers after the testing stage. Late fusion has two setups: combining the posterior probabilities (scores), which is called soft fusion, and combining the decisions, which is called hard fusion. A theoretical analysis of the conditions for applying the three kinds of fusion (early, late soft, and late hard) is introduced. We then propose a comparative analysis of different fusion schemes, including the weaknesses and strengths of the state-of-the-art methods studied, from the following perspectives: sensors, features, scores, and decisions.
Title: A Comparative Study on Recent Automatic Data Fusion Methods
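The soft/hard distinction in late fusion can be shown in a few lines (a minimal illustration using an unweighted mean and a majority vote; the surveyed methods use more elaborate combination rules):

```python
import numpy as np

def soft_fusion(scores):
    """Soft late fusion: average the classifiers' posterior scores,
    then decide once on the combined score vector."""
    return int(np.argmax(np.mean(scores, axis=0)))

def hard_fusion(decisions):
    """Hard late fusion: each classifier decides first; the fused
    output is the majority vote over those decisions."""
    return int(np.bincount(decisions).argmax())

# Three classifiers, two classes. Classifier 1 leans to class 0,
# the other two lean to class 1.
scores = np.array([[0.6, 0.4], [0.2, 0.8], [0.3, 0.7]])
decisions = scores.argmax(axis=1)  # per-classifier labels: 0, 1, 1
soft = soft_fusion(scores)         # mean scores [0.37, 0.63] -> class 1
hard = hard_fusion(decisions)      # majority of {0, 1, 1}    -> class 1
```

Note the information loss in hard fusion: once scores are reduced to labels, a confident minority classifier can no longer outweigh two weakly confident ones.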
Pub Date : 2023-12-26. DOI: 10.3390/computers13010010
Jhon Fernando Sánchez-Álvarez, Gloria Patricia Jaramillo-Álvarez, J. A. Jiménez-Builes
Augmentative and alternative communication (AAC) techniques are essential to assist individuals facing communication difficulties. (1) Background: it is acknowledged that, in the context of neuromuscular diseases, dynamic solutions that adjust to the changing needs of patients are necessary. (2) Methods: in order to address this concern, a differential approach was suggested that entails prior identification of the disease state. This approach employs fuzzy logic to ascertain the disease stage by analyzing intuitive patterns, and it is contrasted with two intelligent systems. (3) Results: the results indicate that the AAC system’s adaptability improves as the disease progresses through its phases, ensuring its utility throughout the individual’s lifespan. Although the adaptive AAC system shows signs of improvement, an expanded assessment involving a greater number of patients is required. (4) Conclusions: qualitative assessments of comparative studies shed light on the difficulties associated with enhancing accuracy and adaptability. This research highlights the significance of investigating fuzzy logic and other artificial intelligence methods to address symptom variability in disease staging.
Title: Facilitating Communication in Neuromuscular Diseases: An Adaptive Approach with Fuzzy Logic and Machine Learning in Augmentative and Alternative Communication Systems
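A sketch of how fuzzy logic can map a symptom score to a disease stage (the stage names, the 0-10 scale, and the triangular membership functions are illustrative assumptions, not the authors' clinical model):

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def stage_memberships(severity):
    """Degrees of membership of a 0-10 severity score in three stages."""
    return {
        "early": tri(severity, -1, 0, 5),
        "intermediate": tri(severity, 2, 5, 8),
        "advanced": tri(severity, 5, 10, 11),
    }

def crisp_stage(severity):
    """Defuzzify by picking the stage with the highest membership."""
    memberships = stage_memberships(severity)
    return max(memberships, key=memberships.get)
```

An adaptive AAC interface could then switch input modalities when the crisp stage changes, while the overlapping memberships smooth the transition between stages instead of forcing an abrupt switch.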
Pub Date : 2023-12-26. DOI: 10.3390/computers13010011
Lahlou Imane, Motaki Noureddine, Sarsri Driss, L’yarfi Hanane
In the face of numerous challenges in supply chain management, new technologies are being implemented to overcome obstacles and improve overall performance. Among these technologies, blockchain, a member of the distributed ledger family, offers several advantages when integrated with ERP systems, such as transparency, traceability, and data security. However, blockchain remains a novel, complex, and costly technology. The purpose of this paper is to guide decision-makers in determining, during the pre-implementation phase, whether integrating blockchain technology with ERP systems is appropriate. The paper draws on literature reviews, theories, and expert opinions to achieve its objectives. It first provides an overview of blockchain technology, then discusses its potential benefits for the supply chain, and finally proposes a framework to assist decision-makers in determining whether blockchain meets the needs of their consortium and whether the integration aligns with available resources. The results highlight the complexity of blockchain, the importance of detailed and in-depth research when deciding whether to integrate blockchain technology into ERP systems, and prospects for future research. The findings also identify the critical decisions to be made prior to implementation, should decision-makers choose to proceed with blockchain integration. They augment the existing literature and can be applied in real-world contexts by stakeholders involved in blockchain integration projects with ERP systems.
Towards Blockchain-Integrated Enterprise Resource Planning: A Pre-Implementation Guide
Lahlou Imane, Motaki Noureddine, Sarsri Driss, L’yarfi Hanane
DOI: 10.3390/computers13010011 (Computers, published 2023-12-26)
Pub Date : 2023-12-25 DOI: 10.3390/computers13010009
Lucas Daudt Franck, G. Ginja, J. P. Carmo, José A. Afonso, M. Luppe
The growth of digital communications has driven the development of numerous cryptographic methods for secure data transfer and storage. The SHA-256 algorithm is a cryptographic hash function widely used for validating data authenticity, identity, and integrity. The inherent SHA-256 computational overhead has motivated the search for more efficient hardware solutions, such as application-specific integrated circuits (ASICs). This work presents a custom ASIC hardware accelerator for the SHA-256 algorithm created entirely with open-source electronic design automation tools. The integrated circuit was synthesized using SkyWater SKY130 130 nm process technology through the OpenLANE automated workflow. The proposed final design is compatible with 32-bit microcontrollers, has a total area of 104,585 µm², and operates at a maximum clock frequency of 97.9 MHz. Several optimization configurations were tested and analyzed during the synthesis phase to enhance the performance of the final design.
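As a software point of reference for the function the accelerator implements (this is not part of the paper's hardware design), SHA-256 can be exercised in a few lines of Python via the standard library; the digest of the well-known test vector "abc" from FIPS 180-4 is shown below.

```python
import hashlib

# SHA-256 maps an arbitrary-length message to a fixed 256-bit digest,
# printed here as 64 hexadecimal characters.
message = b"abc"
digest = hashlib.sha256(message).hexdigest()
print(digest)
# ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad
```

A hardware accelerator computes the same 64 rounds of the compression function per 512-bit block, but in dedicated logic rather than on a general-purpose CPU, which is the source of the throughput gain the paper targets.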
Custom ASIC Design for SHA-256 Using Open-Source Tools (Computers, published 2023-12-25)