
Latest publications in Open Computer Science

Study on the random walk classification algorithm of polyant colony
IF 1.5 Q2 Computer Science Pub Date: 2022-01-01 DOI: 10.1515/comp-2022-0248
Wenhai Qiu
Abstract With the sustained and healthy development of the economy, saving energy, reducing consumption, and improving the energy utilization rate are major tasks that enterprises need to solve. As chemical processes have become complex and large in scale, heat exchange networks have become complex and diverse. In increasingly complex and large-scale industrial heat exchange networks, there are many kinds of heat exchangers and the flows are complex, so the network exhibits a high degree of complexity: because of system coupling, a change in the status of one node propagates as a disturbance that affects the stability of the nodes associated with it, and thus the controllability and reliability of the whole heat exchanger network. Process optimization design of heat exchange networks is one of the main energy-saving methods in industry. Combining the ant colony algorithm, a typical simulated evolutionary algorithm among swarm intelligence algorithms, with a random walk classification algorithm, this article proposes an optimized heat transfer network based on a multi-ant colony random walk classification algorithm. Each heat exchanger is abstracted as a node and each heat exchanger pipeline as an edge. Based on the maximum geometric multiplicity of the eigenvalues of the adjacency matrix and the linearly dependent row vectors of the matrix, and combining the importance of each edge of the heat exchange network with the controllable range of the driving edges, the optimal control driving edges of the heat exchange network are identified. The results show that, compared with a traditional heat exchanger, the size of the enhanced heat transfer equipment and the influence of the pressure drop both change. Compared with the results of sizing the enhanced heat transfer equipment and stepwise optimization of the heat exchange network in this study, the utility cost is reduced by 5.98% and the total cost by 8.83%.
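The graph-theoretic step in the abstract can be made concrete. The sketch below is a minimal illustration, not the author's implementation: it models a hypothetical exchanger network as a graph and computes the maximum geometric multiplicity of the adjacency matrix's eigenvalues, the quantity that exact-controllability results relate to the minimum number of driving inputs. The topology and node labels are invented for illustration.

```python
import numpy as np

# Hypothetical heat exchange network: exchangers are nodes, pipelines are edges.
# The topology below is invented purely for illustration.
edges = [(0, 1), (0, 2), (1, 3), (2, 3), (3, 4)]
n = 5

A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0  # undirected adjacency matrix

# Geometric multiplicity of eigenvalue lam is n - rank(A - lam*I).
eigvals = np.unique(np.round(np.linalg.eigvalsh(A), 8))
geo_mult = {
    float(lam): n - np.linalg.matrix_rank(A - lam * np.eye(n))
    for lam in eigvals
}

# The maximum geometric multiplicity gives a lower bound on the number
# of independent driving inputs needed to control the network.
mu = max(geo_mult.values())
print("eigenvalues and geometric multiplicities:", geo_mult)
print("minimum number of driving inputs:", mu)
```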
Citations: 0
Design of a web laboratory interface for ECG signal analysis using MATLAB builder NE
IF 1.5 Q2 Computer Science Pub Date: 2022-01-01 DOI: 10.1515/comp-2022-0244
Hussain A. Jaber, Hadeel K. Aljobouri, Ilyas Çankaya
Abstract An electrocardiogram (ECG) is a noninvasive test; detecting any defect in the heart rate or rhythm, or changes in the shape of the QRS complex, is very significant for detecting cardiac arrhythmia. In this study, novel web-ECG simulation tools were proposed using MATLAB Builder NE with WebFigure and the ASP.NET platform. The proposed web-ECG simulation tools consist of two components. The first involves the analysis of normal real ECG signals by calculating the P, Q, R, S, and T values and detecting the heart rate, while the second extracts the features of several types of abnormal real ECG. For calculating the PQRST values, simple new mathematical equations are proposed in the current study using MATLAB. The web ECG is capable of plotting normal ECG signals and five arrhythmia cases, so users are able to calculate PQRST easily using the proposed simple method. The ECG simulation tools have been tested for validity and educational contribution with 62 undergraduate and graduate students at the Biomedical Engineering Department of Al-Nahrain University, Iraq. The proposed ECG simulation tools are designed for academic learning and can be run easily by a student using any web browser, without installing MATLAB or any extra programs. The tools can provide a laboratory course in ECG signal analysis using a few buttons, as well as develop the educational skills of students and researchers.
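The paper's MATLAB equations are not reproduced in this abstract. As a stand-in, the sketch below shows the general shape of such an analysis: naive threshold-based R-peak detection on a synthetic trace, with heart rate derived from the R-R intervals. The sampling rate, threshold, and signal are all assumptions for illustration, not the authors' method.

```python
import numpy as np

fs = 250  # assumed sampling rate in Hz

# Synthetic 10 s "ECG": low-amplitude noise with sharp R-like spikes every 0.8 s.
t = np.arange(0, 10, 1 / fs)
ecg = 0.05 * np.random.randn(t.size)
ecg[(np.arange(0, 10, 0.8) * fs).astype(int)] += 1.0  # R peaks

# Naive R-peak detection: samples above a threshold that are local maxima.
thresh = 0.5
above = np.where(ecg > thresh)[0]
r_peaks = [i for i in above if ecg[i] == ecg[max(0, i - 25):i + 25].max()]

# Heart rate from the mean R-R interval.
rr = np.diff(r_peaks) / fs                       # seconds between beats
print(f"heart rate = {60 / rr.mean():.1f} bpm")  # expect ~75 bpm
```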
Citations: 1
A study on the big data scientific research model and the key mechanism based on blockchain
IF 1.5 Q2 Computer Science Pub Date: 2022-01-01 DOI: 10.1515/comp-2022-0258
Shen Wen
Abstract In an era of open data sharing, the scientific research field has an urgent need to realize the value of big data. However, big data still form "data islands," which seriously hampers the level and progress of scientific research. In this regard, this article proposes the design and realization of a big data scientific research model and its key mechanisms based on blockchain. The article uses the K-means algorithm to cluster scientific research data and exploits the decentralization, smart contracts, and tamper resistance of the blockchain to design a blockchain-based distributed data model. It proposes a BIZi network built from a blockchain, the InterPlanetary File System (IPFS), and Zigzag code (blockchain, IPFS, and Zigzag code: BIZi for short) to achieve reliable data connection and, through a set of data access control mechanisms and a data service customization mechanism, to effectively serve the data requirements of scientific research. Finally, the transmission speed of the IPFS network can better meet the needs of scientific research. The larger the number of file blocks, the higher the fault tolerance of the scheme and the better the storage efficiency. In a completely open data-sharing scenario, a very high Byzantine fault tolerance is needed to ensure the stability of the blockchain; the fault tolerance of the current optimal consensus algorithm reaches 49%.
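As one concrete reading of the clustering step (a sketch under assumed data, not the paper's pipeline), the code below groups hypothetical research-data feature vectors with K-means; each resulting cluster's contents could then be hashed and anchored on the blockchain or stored in IPFS.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical feature vectors for research-data records
# (e.g., size, access frequency, sharing score) - invented for illustration.
rng = np.random.default_rng(0)
records = np.vstack([
    rng.normal(loc, 0.3, size=(40, 3))
    for loc in ([0, 0, 0], [3, 3, 0], [0, 3, 3])
])

# Cluster the records; each cluster could then be indexed and its
# content hash anchored on-chain while the payload lives in IPFS.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(records)
print("cluster sizes:", np.bincount(km.labels_))
```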
Citations: 0
Sign language identification and recognition: A comparative study
IF 1.5 Q2 Computer Science Pub Date: 2022-01-01 DOI: 10.1515/comp-2022-0240
Ahmed A. Sultan, Walied Makram, Mohammed Kayed, Abdelmaged Amin Ali
Abstract Sign Language (SL) is the main language for handicapped and disabled people. Each country has its own SL that differs from those of other countries, and each sign in a language is represented with distinct hand gestures, body movements, and facial expressions. Researchers in this field aim to remove any obstacles that prevent communication with deaf people by replacing device-based techniques with vision-based techniques using Artificial Intelligence (AI) and Deep Learning. This article highlights two main SL processing tasks: Sign Language Recognition (SLR) and Sign Language Identification (SLID). The latter task identifies the signer's language, while the former translates the signer's conversation into tokens (signs). The article surveys the most common datasets used in the literature for the two tasks (static and dynamic datasets collected from different corpora), whose contents include numerals, alphabets, words, and sentences from different SLs. It also discusses the devices required to build these datasets, as well as the preprocessing steps applied before training and testing. The article compares the approaches and techniques applied to these datasets. It discusses both vision-based and data-glove-based approaches, aiming to analyze the main methods used in vision-based approaches, such as hybrid methods and deep learning algorithms. Furthermore, the article presents a graphical depiction and a tabular representation of various SLR approaches.
Citations: 5
Construction of a gas condensate field development model
IF 1.5 Q2 Computer Science Pub Date: 2022-01-01 DOI: 10.1515/comp-2020-0226
A. Skiba
Abstract This article develops and verifies an aggregated approximate mathematical model of developing a gas condensate field using a cyclic process. The essence of the cyclic process is to pump the dry (stripped) gas back into the productive formation to reduce the pressure drop in the deposit, which allows increased condensate recovery in the future. The model discussed in this article is a continuous dynamic system with control parameters. It is a modification of the dynamic aggregated model of a purely gas field, designed for planning over a sufficiently long period with limited information about the state of the reservoir (the initial flow rate of wells, the initial recoverable gas reserves, the initial reservoir pressure, and the dependence of the potential condensate content per unit volume of rich gas on the reservoir pressure). A non-standard approach underlies the model's construction: logical simplifications and a priori assumptions about the processes occurring in the field during its development are at its core. The controls in the model are the growth in the numbers of production and injection wells and the proportion of injection wells involved in production. The purpose of the article is to calculate, at the preliminary design stage, various variants of the dynamics of the fundamental indicators of gas condensate field development over a sufficiently long period.
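The abstract does not state the model's equations. Purely as a toy illustration of a "continuous dynamic system with control parameters", the sketch below integrates an invented pressure-decline law in which the injection-well fraction u slows depletion; every constant and the functional form are assumptions, not the author's model.

```python
import numpy as np

# Toy aggregated model (NOT the paper's equations): reservoir pressure p
# declines with withdrawal and is partially sustained by re-injection.
# u is the fraction of wells used for injection, i.e., the control parameter.
def simulate(u, p0=30.0, q=0.5, years=20, dt=0.1):
    p = p0
    history = []
    for _ in np.arange(0, years, dt):
        dp = -q * (1 - u) * p / p0 + 0.6 * q * u * p / p0  # assumed dynamics
        p = max(p + dp * dt, 0.0)
        history.append(p)
    return np.array(history)

# Compare depletion trajectories for different injection fractions.
for u in (0.0, 0.3, 0.6):
    print(f"u={u:.1f}: pressure after 20 years = {simulate(u)[-1]:.1f} (arbitrary units)")
```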
Citations: 1
Rainfall prediction system for Bangladesh using long short-term memory
IF 1.5 Q2 Computer Science Pub Date: 2022-01-01 DOI: 10.1515/comp-2022-0254
M. Billah, Md. Nasim Adnan, Mostafijur Rahman Akhond, Romana Rahman Ema, Md. Alam Hossain, S. Galib
Abstract Rainfall prediction is a challenging task of great significance in weather forecasting. Accurate rainfall prediction plays a major role in agriculture, aviation, natural phenomena, flood control, construction, transport, etc. Weather and climate are assumed to be among the most complex systems, and chaos, also known as the "butterfly effect," limits our ability to make weather predictable. It is therefore not easy to predict rainfall with conventional machine learning approaches, although several lines of research have proposed predicting rainfall with different computational methods. To build a rainfall prediction system for Bangladesh under these chaotic conditions, this study uses a historical data set-driven long short-term memory (LSTM) network, which overcomes the complexity- and chaos-related problems faced by other approaches. The proposed method has three principal phases: (i) the 10 most useful features are chosen from 20 data attributes; (ii) a two-layer LSTM model is designed; (iii) both conventional machine learning approaches and recent works are compared with the LSTM model. The approach attains 97.14% accuracy in predicting rainfall (in millimeters), outperforming state-of-the-art solutions. This work is also a pioneering effort toward a rainfall prediction system for Bangladesh.
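A minimal sketch of the two-layer architecture of phase (ii), written with Keras. The window length, layer widths, and training setup are assumptions; the 10 input features follow phase (i), and the random arrays merely stand in for the study's Bangladesh dataset.

```python
import numpy as np
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Input, LSTM, Dense

# Assumed shapes: 30-step windows of the 10 selected features (phase i),
# predicting next-step rainfall in millimeters. Placeholder data only.
timesteps, n_features = 30, 10
X = np.random.rand(512, timesteps, n_features).astype("float32")
y = np.random.rand(512, 1).astype("float32")

model = Sequential([
    Input(shape=(timesteps, n_features)),
    LSTM(64, return_sequences=True),  # first LSTM layer
    LSTM(32),                         # second LSTM layer (phase ii)
    Dense(1),                         # rainfall in mm (regression head)
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
print(model.predict(X[:1], verbose=0))
```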
Citations: 2
An alternative C++-based HPC system for Hadoop MapReduce
IF 1.5 Q2 Computer Science Pub Date: 2022-01-01 DOI: 10.1515/comp-2022-0246
Vignesh Srinivasakumar, Muthumanikandan Vanamoorthy, Siddarth Sairaj, S. Ganesh
Abstract MapReduce (MR) is a technique used to vastly improve distributed data processing and can massively speed up computation. Hadoop and MR rely on the memory-intensive JVM and Java. An MR framework based on High-Performance Computing (HPC) could be used instead, one that is both more memory-efficient and faster than standard MR. This article explores a C++-based approach to MR and its feasibility along multiple dimensions such as developer friendliness, deployment interface, efficiency, and scalability. This article also introduces Eager Reduction and Delayed Reduction techniques to speed up MR.
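To make the MR pattern itself concrete, here is a tiny word count showing the generic map, shuffle, and reduce phases. This illustrates the pattern in Python, not the article's C++/HPC system.

```python
from collections import defaultdict
from functools import reduce

documents = ["hpc speeds up mapreduce", "mapreduce on hpc beats jvm mapreduce"]

# Map phase: emit (key, value) pairs from each input split.
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle phase: group intermediate values by key.
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# Reduce phase: fold each key's values into a final result.
counts = {key: reduce(lambda a, b: a + b, values)
          for key, values in groups.items()}
print(counts)  # {'hpc': 2, 'speeds': 1, ..., 'mapreduce': 3}
```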
Citations: 0
Deep learning-based ensemble model for brain tumor segmentation using multi-parametric MR scans
IF 1.5 Q2 Computer Science Pub Date: 2022-01-01 DOI: 10.1515/comp-2022-0242
Suchismita Das, S. Bose, G. K. Nayak, Sanjay Saxena
Abstract Glioma is a type of fast-growing brain tumor in which the shape, size, and location of the tumor vary from patient to patient. Manual extraction of the region of interest (the tumor) with the help of a radiologist is a very difficult and time-consuming task. To overcome this problem, we propose a fully automated deep learning-based ensemble method for brain tumor segmentation on four different 3D multimodal magnetic resonance imaging (MRI) scans. The segmentation is performed by the three most efficient encoder–decoder deep segmentation models, and their results are measured with well-known segmentation metrics. A statistical analysis of the models is then performed, and an ensemble model is designed by selecting, for a particular MRI modality, the model with the highest Matthews correlation coefficient. The article makes two main contributions: first, a detailed comparison of the three models, and second, an ensemble model that combines the three models based on their segmentation accuracy. The model is evaluated on the brain tumor segmentation (BraTS) 2017 dataset, and the F1 scores of the final combined model are 0.92, 0.95, 0.93, and 0.84 for the whole tumor, core, enhancing tumor, and edema sub-tumors, respectively. Experimental results show that the model outperforms the state of the art.
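A hedged sketch of the selection rule described above: for one modality, compute the Matthews correlation coefficient of each candidate model's binary mask against the ground truth and keep the best. The masks below are random placeholders standing in for the three encoder–decoder models, not BraTS data.

```python
import numpy as np
from sklearn.metrics import matthews_corrcoef

rng = np.random.default_rng(42)
ground_truth = rng.integers(0, 2, size=10_000)  # flattened binary tumor mask

# Placeholder predictions: each "model" matches the ground truth with
# probability p and flips the label otherwise (invented accuracies).
predictions = {
    name: np.where(rng.random(10_000) < p, ground_truth, 1 - ground_truth)
    for name, p in [("model_a", 0.90), ("model_b", 0.95), ("model_c", 0.85)]
}

# For this modality, keep the model with the highest MCC.
scores = {name: matthews_corrcoef(ground_truth, pred)
          for name, pred in predictions.items()}
best = max(scores, key=scores.get)
print(scores, "-> ensemble keeps:", best)
```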
Citations: 7
Multisource data acquisition based on single-chip microcomputer and sensor technology
IF 1.5 Q2 Computer Science Pub Date: 2022-01-01 DOI: 10.1515/comp-2022-0261
Yahui Huang, Daozhong Lei
Abstract Today, we are flooded with data and information every day. Data are a reliable basis for scientific research: their function is not only to show real problems in various fields clearly, but also to guide people to the key factors that cause those problems. The emergence of big data responds to this era of information explosion, and it is precisely by virtue of the accumulation of quantity that patterns are presented more clearly. Political, economic, cultural, and other fields are all closely related to data. The application of single-chip microcomputer (microcontroller) and sensor technology can help explore new branches of multisource data; however, the collection and analysis of multisource data have so far remained confined to computer and communication technology. In view of these problems, this article carries out scientific collection and analysis of multisource data based on single-chip microcomputer and sensor technology. The results showed that, building on the two algorithms of random early detection and weighted fair queuing, the analysis algorithm based on the Genetic Algorithm had a higher successful conversion rate, and the power consumption of a node with better antenna performance was 9–10% lower than that of a node with poor antenna performance, providing a basis for multisource data collection and analysis.
Citations: 1
Wormhole attack detection techniques in ad-hoc network: A systematic review
IF 1.5 Q2 Computer Science Pub Date: 2022-01-01 DOI: 10.1515/comp-2022-0245
C. Gupta, Laxman Singh, Rajdev Tiwari
Abstract Mobile ad hoc networks (MANETs) are decentralized networks that can communicate without pre-existing infrastructure. Owing to their open medium access and dynamically changing network topology, MANETs are vulnerable to different types of attacks such as the blackhole, gray hole, Sybil, rushing, jellyfish, wormhole (WHA), byzantine, selfishness, and network partition attacks. Of these, the wormhole attack is the most common and severe: it substantially undermines the performance of the network and disrupts most routing protocols. In the past two decades, numerous researchers have explored techniques to detect and mitigate the effect of WHAs to ensure the safe operation of wireless networks. Hence, in this article, we focus on WHAs and present the state-of-the-art methods that have been employed in previous years to discern WHAs in wireless networks. The existing WHA detection techniques suffer from the use of additional hardware, higher delay, and higher energy consumption. Round trip time (RTT) based detection methods show better results, as they do not require additional hardware. Machine learning (ML) techniques can also be applied to ad-hoc networks for anomaly detection and will have a great influence in the future; therefore, ML techniques for WHA detection are also analyzed in this article. The SVM technique is most often used by researchers, with outstanding results. The analysis shows that hybrid approaches, which combine traditional detection techniques with ML, give better results for WHA detection. Finally, we identify areas where further research can be focused, so that WHA detection methods can be applied over larger topological areas for more flexibility and more accurate results.
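To illustrate why RTT-based detection needs no extra hardware, here is a simplified sketch (not any specific published scheme): a node flags a neighbor whose apparent one-hop RTT is a robust outlier among its neighbors, since a wormhole tunnel adds hidden latency. All RTT values and the threshold rule are invented for illustration.

```python
import statistics

# Measured RTTs (ms) to one-hop neighbors; values are invented.
# A wormhole endpoint relays traffic through a hidden long tunnel,
# so its apparent one-hop RTT is inflated.
rtt_ms = {"n1": 2.1, "n2": 2.4, "n3": 1.9, "n4": 2.2, "n5": 11.7}

def suspect_wormholes(rtts, k=3.0):
    """Flag neighbors whose RTT exceeds median + k * MAD (robust threshold)."""
    med = statistics.median(rtts.values())
    mad = statistics.median(abs(v - med) for v in rtts.values())
    cutoff = med + k * max(mad, 0.1)  # floor avoids a zero threshold
    return [n for n, v in rtts.items() if v > cutoff]

print(suspect_wormholes(rtt_ms))  # ['n5']
```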
Citations: 3