
Neural Computing & Applications: Latest Publications

Performance analysis and comparison of Machine Learning and LoRa-based Healthcare model.
IF 6 | CAS Tier 3 (Computer Science) | Q2 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE | Pub Date: 2023-01-01 | Epub Date: 2023-03-07 | DOI: 10.1007/s00521-023-08411-5
Navneet Verma, Sukhdip Singh, Devendra Prasad

Diabetes Mellitus (DM) is a widespread condition and one of the main causes of health crises around the world, and health monitoring is one of the key topics in sustainable development. Currently, Internet of Things (IoT) and Machine Learning (ML) technologies work together to provide a reliable method of monitoring and predicting Diabetes Mellitus. In this paper, we present the performance of a model for real-time patient data collection that employs the Hybrid Enhanced Adaptive Data Rate (HEADR) algorithm for the Long-Range (LoRa) IoT protocol. On the Contiki Cooja simulator, the LoRa protocol's performance is measured in terms of high dissemination and dynamic data transmission range allocation. Machine Learning prediction is then performed by applying classification methods to detect diabetes severity levels in the data acquired via the LoRa (HEADR) protocol. A variety of Machine Learning classifiers are employed for prediction, implemented in Python, and the final results are compared with existing models; the Random Forest and Decision Tree classifiers outperform the others in terms of precision, recall, F-measure, and receiver operating characteristic (ROC) curve. We also found that using k-fold cross-validation with the k-nearest neighbors, Logistic Regression (LR), and Gaussian Naïve Bayes (GNB) classifiers boosted the accuracy.
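As a rough illustration of the classifier comparison described in the abstract above (not the authors' code), a minimal scikit-learn sketch running stratified k-fold cross-validation over the classifiers mentioned might look as follows; the synthetic dataset, feature count, and hyperparameters are placeholder assumptions.

```python
# Minimal sketch: comparing several classifiers with stratified k-fold
# cross-validation. X, y are stand-ins for the sensor-derived patient data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

X, y = make_classification(n_samples=1000, n_features=8, random_state=42)  # stand-in data

models = {
    "RandomForest": RandomForestClassifier(random_state=42),
    "DecisionTree": DecisionTreeClassifier(random_state=42),
    "LogisticRegression": LogisticRegression(max_iter=1000),
    "GaussianNB": GaussianNB(),
    "kNN": KNeighborsClassifier(n_neighbors=5),
}

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=cv, scoring="f1_weighted")
    print(f"{name}: F1 = {scores.mean():.3f} +/- {scores.std():.3f}")
```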

{"title":"Performance analysis and comparison of Machine Learning and LoRa-based Healthcare model.","authors":"Navneet Verma,&nbsp;Sukhdip Singh,&nbsp;Devendra Prasad","doi":"10.1007/s00521-023-08411-5","DOIUrl":"10.1007/s00521-023-08411-5","url":null,"abstract":"<p><p>Diabetes Mellitus (DM) is a widespread condition that is one of the main causes of health disasters around the world, and health monitoring is one of the sustainable development topics. Currently, the Internet of Things (IoT) and Machine Learning (ML) technologies work together to provide a reliable method of monitoring and predicting Diabetes Mellitus. In this paper, we present the performance of a model for patient real-time data collection that employs the Hybrid Enhanced Adaptive Data Rate (HEADR) algorithm for the Long-Range (LoRa) protocol of the IoT. On the Contiki Cooja simulator, the LoRa protocol's performance is measured in terms of high dissemination and dynamic data transmission range allocation. Furthermore, by employing classification methods for the detection of diabetes severity levels on acquired data via the LoRa (HEADR) protocol, Machine Learning prediction takes place. For prediction, a variety of Machine Learning classifiers are employed, and the final results are compared with the already existing models where the Random Forest and Decision Tree classifiers outperform the others in terms of precision, recall, <i>F</i>-measure, and receiver operating curve (ROC) in the Python programming language. We also discovered that using <i>k</i>-fold cross-validation on <i>k</i>-neighbors, Logistic regression (LR), and Gaussian Nave Bayes (GNB) classifiers boosted the accuracy.</p>","PeriodicalId":49766,"journal":{"name":"Neural Computing & Applications","volume":"35 17","pages":"12751-12761"},"PeriodicalIF":6.0,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9989556/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9479074","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
Special issue on neuro, fuzzy and their hybridization.
IF 6 | CAS Tier 3 (Computer Science) | Q2 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE | Pub Date: 2023-01-01 | DOI: 10.1007/s00521-022-08181-6
Longzhi Yang, Vijayakumar Varadarajan, Yanpeng Qu
{"title":"Special issue on neuro, fuzzy and their hybridization.","authors":"Longzhi Yang,&nbsp;Vijayakumar Varadarajan,&nbsp;Yanpeng Qu","doi":"10.1007/s00521-022-08181-6","DOIUrl":"https://doi.org/10.1007/s00521-022-08181-6","url":null,"abstract":"","PeriodicalId":49766,"journal":{"name":"Neural Computing & Applications","volume":"35 10","pages":"7147-7148"},"PeriodicalIF":6.0,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9822807/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9489892","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
E-learningDJUST: E-learning dataset from Jordan university of science and technology toward investigating the impact of COVID-19 pandemic on education.
IF 4.5 | CAS Tier 3 (Computer Science) | Q2 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE | Pub Date: 2023-01-01 | Epub Date: 2021-11-13 | DOI: 10.1007/s00521-021-06712-1
Malak Abdullah, Mahmoud Al-Ayyoub, Saif AlRawashdeh, Farah Shatnawi

Recently, the COVID-19 pandemic has triggered different behaviors in education, especially during the lockdowns imposed to contain the virus outbreak around the world. As a result, educational institutions worldwide are currently using online learning platforms to maintain their education presence. This research paper introduces and examines a dataset, E-LearningDJUST, that represents a sample of students' study progress during the pandemic at Jordan University of Science and Technology (JUST). The dataset depicts a sample of the university's students, as it includes 9,246 students from 11 faculties taking four courses in the spring 2020, summer 2020, and fall 2021 semesters. To the best of our knowledge, it is the first collected dataset that reflects students' study progress within a Jordanian institute using e-learning system records. One of this work's key findings is a high correlation between e-learning events and the final grades out of 100. The E-LearningDJUST dataset was therefore evaluated with two robust machine learning models (Random Forest and XGBoost) and one simple deep learning model (Feed Forward Neural Network) to predict students' performance. Using RMSE as the primary evaluation criterion, the RMSE values range between 7 and 17. Among the other main findings, applying feature selection with the random forest leads to better prediction results for all courses, with the RMSE improvement ranging between 0 and 0.20. Finally, a comparison study examined students' grades before and after the Coronavirus pandemic to understand how it impacted their grades. A higher success rate was observed during the pandemic than before it, which is expected because the exams were online. However, the proportion of students with high marks remained similar to that of pre-pandemic courses.
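A minimal sketch (not the authors' code) of the kind of RMSE comparison described above, with and without random-forest-based feature selection; the synthetic regression data and model settings are placeholder assumptions.

```python
# Minimal sketch: grade regression scored by RMSE, with and without
# feature selection driven by random-forest importances. X, y are stand-ins
# for e-learning event features and final grades.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import SelectFromModel
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=2000, n_features=30, noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

def rmse_of(model, X_train, X_test):
    model.fit(X_train, y_tr)
    return np.sqrt(mean_squared_error(y_te, model.predict(X_test)))

baseline = rmse_of(RandomForestRegressor(random_state=0), X_tr, X_te)

# Keep only the features the forest itself ranks as important, then refit.
selector = SelectFromModel(RandomForestRegressor(random_state=0)).fit(X_tr, y_tr)
selected = rmse_of(RandomForestRegressor(random_state=0),
                   selector.transform(X_tr), selector.transform(X_te))

print(f"RMSE without selection: {baseline:.2f}, with selection: {selected:.2f}")
```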

{"title":"E-learningDJUST: E-learning dataset from Jordan university of science and technology toward investigating the impact of COVID-19 pandemic on education.","authors":"Malak Abdullah, Mahmoud Al-Ayyoub, Saif AlRawashdeh, Farah Shatnawi","doi":"10.1007/s00521-021-06712-1","DOIUrl":"10.1007/s00521-021-06712-1","url":null,"abstract":"<p><p>Recently, the COVID-19 pandemic has triggered different behaviors in education, especially during the lockdown, to contain the virus outbreak in the world. As a result, educational institutions worldwide are currently using online learning platforms to maintain their education presence. This research paper introduces and examines a dataset, E-LearningDJUST, that represents a sample of the student's study progress during the pandemic at Jordan University of Science and Technology (JUST). The dataset depicts a sample of the university's students as it includes 9,246 students from 11 faculties taking four courses in spring 2020, summer 2020, and fall 2021 semesters. To the best of our knowledge, it is the first collected dataset that reflects the students' study progress within a Jordanian institute using e-learning system records. One of this work's key findings is observing a high correlation between e-learning events and the final grades out of 100. Thus, the E-LearningDJUST dataset has been experimented with two robust machine learning models (Random Forest and XGBoost) and one simple deep learning model (Feed Forward Neural Network) to predict students' performances. Using RMSE as the primary evaluation criteria, the RMSE values range between 7 and 17. Among the other main findings, the application of feature selection with the random forest leads to better prediction results for all courses as the RMSE difference ranges between (0-0.20). Finally, a comparison study examined students' grades before and after the Coronavirus pandemic to understand how it impacted their grades. A high success rate has been observed during the pandemic compared to what it was before, and this is expected because the exams were online. However, the proportion of students with high marks remained similar to that of pre-pandemic courses.</p>","PeriodicalId":49766,"journal":{"name":"Neural Computing & Applications","volume":"35 16","pages":"11481-11495"},"PeriodicalIF":4.5,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8590139/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9492167","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
A neural network approach to optimising treatments for depression using data from specialist and community psychiatric services in Australia, New Zealand and Japan.
IF 6 | CAS Tier 3 (Computer Science) | Q2 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE | Pub Date: 2023-01-01 | DOI: 10.1007/s00521-021-06710-3
Aidan Cousins, Lucas Nakano, Emma Schofield, Rasa Kabaila

This study investigated the application of a recurrent neural network for optimising pharmacological treatment for depression. A clinical dataset of 458 participants from specialist and community psychiatric services in Australia, New Zealand and Japan was extracted from an existing custom-built, web-based tool called Psynary. This data, which included baseline and self-completed reviews, was used to train and refine a novel algorithm: a fully connected feature-extraction network and a long short-term memory (LSTM) network were first trained in isolation and then integrated and annealed using slow learning rates, owing to the low dimensionality of the data. The accuracy of predicting depression remission before processing patient review data was 49.8%. After processing only two reviews, the accuracy was 76.5%. When considering a change in medication, the precision of changing medications was 97.4% and the recall was 71.4%. The medications with the best predicted results were antipsychotics (88%) and selective serotonin reuptake inhibitors (87.9%). This is the first study to create an all-in-one algorithm for optimising treatments for all subtypes of depression. Reducing treatment optimisation time for patients suffering from depression may lead to earlier remission and hence reduce the high levels of disability associated with the condition. Furthermore, in a setting where mental health conditions are placing increasing strain on mental health services, the use of web-based tools for remote monitoring and machine/deep learning algorithms may assist clinicians in both specialist and primary care in extending specialist mental healthcare to a larger patient community.
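A minimal sketch (not the Psynary algorithm itself) of a fully connected per-review feature extractor feeding an LSTM, compiled with a slow learning rate as the abstract describes; the review count, feature dimension, and layer sizes are assumptions.

```python
# Minimal sketch: per-review dense feature extractor + LSTM over the review
# sequence, predicting remission probability. Shapes are illustrative only.
import tensorflow as tf
from tensorflow.keras import layers, models, optimizers

n_reviews, n_features = 5, 64   # hypothetical: 5 reviews per patient, 64 features each

model = models.Sequential([
    layers.Input(shape=(n_reviews, n_features)),
    layers.TimeDistributed(layers.Dense(32, activation="relu")),  # per-review extractor
    layers.LSTM(16),                                              # temporal integration
    layers.Dense(1, activation="sigmoid"),                        # remission probability
])
model.compile(optimizer=optimizers.Adam(learning_rate=1e-4),      # slow rate for annealing
              loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```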

{"title":"A neural network approach to optimising treatments for depression using data from specialist and community psychiatric services in Australia, New Zealand and Japan.","authors":"Aidan Cousins,&nbsp;Lucas Nakano,&nbsp;Emma Schofield,&nbsp;Rasa Kabaila","doi":"10.1007/s00521-021-06710-3","DOIUrl":"https://doi.org/10.1007/s00521-021-06710-3","url":null,"abstract":"<p><p>This study investigated the application of a recurrent neural network for optimising pharmacological treatment for depression. A clinical dataset of 458 participants from specialist and community psychiatric services in Australia, New Zealand and Japan were extracted from an existing custom-built, web-based tool called <i>Psynary</i> . This data, which included baseline and self-completed reviews, was used to train and refine a novel algorithm which was a fully connected network feature extractor and long short-term memory algorithm was firstly trained in isolation and then integrated and annealed using slow learning rates due to the low dimensionality of the data. The accuracy of predicting depression remission before processing patient review data was 49.8%. After processing only 2 reviews, the accuracy was 76.5%. When considering a change in medication, the precision of changing medications was 97.4% and the recall was 71.4% . The medications with predicted best results were antipsychotics (88%) and selective serotonin reuptake inhibitors (87.9%). <i>This is the first study that has created an all-in-one algorithm for optimising treatments for all subtypes of depression.</i> Reducing treatment optimisation time for patients suffering with depression may lead to earlier remission and hence reduce the high levels of disability associated with the condition. Furthermore, in a setting where mental health conditions are increasing strain on mental health services, the utilisation of web-based tools for remote monitoring and machine/deep learning algorithms may assist clinicians in both specialist and primary care in extending specialist mental healthcare to a larger patient community.</p>","PeriodicalId":49766,"journal":{"name":"Neural Computing & Applications","volume":"35 16","pages":"11497-11516"},"PeriodicalIF":6.0,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8754538/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9503950","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 3
Toward real-time and efficient cardiovascular monitoring for COVID-19 patients by 5G-enabled wearable medical devices: a deep learning approach.
IF 6 | CAS Tier 3 (Computer Science) | Q2 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE | Pub Date: 2023-01-01 | Epub Date: 2021-07-04 | DOI: 10.1007/s00521-021-06219-9
Liang Tan, Keping Yu, Ali Kashif Bashir, Xiaofan Cheng, Fangpeng Ming, Liang Zhao, Xiaokang Zhou

Patients who die from COVID-19 often have comorbid cardiovascular disease. Real-time cardiovascular disease monitoring based on wearable medical devices may effectively reduce COVID-19 mortality rates. However, due to technical limitations, there are three main issues. First, traditional wireless communication technologies for wearable medical devices struggle to fully satisfy real-time requirements. Second, current monitoring platforms lack efficient streaming data processing mechanisms to cope with the large amount of cardiovascular data generated in real time. Third, diagnosis on such monitoring platforms is usually manual, making it challenging to ensure that enough doctors are online to provide a timely, efficient, and accurate diagnosis. To address these issues, this paper proposes a 5G-enabled real-time cardiovascular monitoring system for COVID-19 patients using deep learning. First, we employ 5G to send and receive data from wearable medical devices. Second, the Flink streaming data processing framework is applied to ingest electrocardiogram data. Finally, we use a combined convolutional neural network and long short-term memory network model to automatically predict the COVID-19 patient's cardiovascular health. Theoretical analysis and experimental results show that our proposal can solve the above issues well and improves the prediction accuracy for cardiovascular disease to 99.29%.
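A minimal sketch (not the authors' model) of a 1-D CNN feeding an LSTM for ECG classification, in the spirit of the abstract; the window length, channel count, class count, and layer sizes are assumptions.

```python
# Minimal sketch: 1-D convolutional feature extraction over an ECG segment,
# followed by an LSTM and a softmax classifier over cardiovascular-risk classes.
import tensorflow as tf
from tensorflow.keras import layers, models

window, channels, n_classes = 1000, 1, 5   # hypothetical ECG segment shape and label set

model = models.Sequential([
    layers.Input(shape=(window, channels)),
    layers.Conv1D(32, kernel_size=7, activation="relu"),
    layers.MaxPooling1D(4),
    layers.Conv1D(64, kernel_size=5, activation="relu"),
    layers.MaxPooling1D(4),
    layers.LSTM(64),
    layers.Dense(n_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```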

{"title":"Toward real-time and efficient cardiovascular monitoring for COVID-19 patients by 5G-enabled wearable medical devices: a deep learning approach.","authors":"Liang Tan,&nbsp;Keping Yu,&nbsp;Ali Kashif Bashir,&nbsp;Xiaofan Cheng,&nbsp;Fangpeng Ming,&nbsp;Liang Zhao,&nbsp;Xiaokang Zhou","doi":"10.1007/s00521-021-06219-9","DOIUrl":"10.1007/s00521-021-06219-9","url":null,"abstract":"<p><p>Patients with deaths from COVID-19 often have co-morbid cardiovascular disease. Real-time cardiovascular disease monitoring based on wearable medical devices may effectively reduce COVID-19 mortality rates. However, due to technical limitations, there are three main issues. First, the traditional wireless communication technology for wearable medical devices is difficult to satisfy the real-time requirements fully. Second, current monitoring platforms lack efficient streaming data processing mechanisms to cope with the large amount of cardiovascular data generated in real time. Third, the diagnosis of the monitoring platform is usually manual, which is challenging to ensure that enough doctors online to provide a timely, efficient, and accurate diagnosis. To address these issues, this paper proposes a 5G-enabled real-time cardiovascular monitoring system for COVID-19 patients using deep learning. Firstly, we employ 5G to send and receive data from wearable medical devices. Secondly, Flink streaming data processing framework is applied to access electrocardiogram data. Finally, we use convolutional neural networks and long short-term memory networks model to obtain automatically predict the COVID-19 patient's cardiovascular health. Theoretical analysis and experimental results show that our proposal can well solve the above issues and improve the prediction accuracy of cardiovascular disease to 99.29%.</p>","PeriodicalId":49766,"journal":{"name":"Neural Computing & Applications","volume":"35 19","pages":"13921-13934"},"PeriodicalIF":6.0,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1007/s00521-021-06219-9","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9526794","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 58
Machine learning-based diffusion model for prediction of coronavirus-19 outbreak.
IF 4.5 | CAS Tier 3 (Computer Science) | Q2 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE | Pub Date: 2023-01-01 | Epub Date: 2021-08-12 | DOI: 10.1007/s00521-021-06376-x
Supriya Raheja, Shreya Kasturia, Xiaochun Cheng, Manoj Kumar

The coronavirus pandemic has been impacting the health and prosperity of people globally. A persistent increase in the number of positive cases has heightened the stress on governments across the globe. There is a need for an approach that gives more accurate predictions of the outbreak. This paper presents a novel approach, called the diffusion prediction model, for predicting the number of coronavirus cases in four countries: India, France, China and Nepal. The diffusion prediction model works on the diffusion process of human contact. The model considers two forms of spread: when the spread takes time after infecting one person, and when the spread is immediate after infecting one person. This distinguishes the proposed model from other state-of-the-art models and allows it to give more accurate results. The proposed diffusion prediction model forecasts the number of new cases expected to occur in the next 4 weeks. The model has predicted the numbers of confirmed cases, recovered cases, deaths and active cases. The model can help governments to be well prepared for any abrupt rise in this pandemic. The performance is evaluated in terms of accuracy and error rate and compared with the prediction results of a support vector machine, a logistic regression model and a convolutional neural network. The results prove the efficiency of the proposed model.
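The abstract gives no equations, so the following toy simulation (an assumption-laden illustration, not the authors' diffusion prediction model) merely contrasts the two spread modes it mentions: immediate onward transmission versus transmission that begins only after a delay. The transmission rate, delay length, and seed count are made-up values.

```python
# Toy sketch: discrete-time contact diffusion with immediate vs. delayed
# onward transmission, accumulated over a 4-week horizon.
import numpy as np

def simulate(days=28, i0=100, rate=0.15, delay=0):
    """Each infected person seeds `rate` new cases per day, starting
    `delay` days after their own infection."""
    new_cases = [float(i0)]
    for t in range(1, days + 1):
        infectious = sum(new_cases[:t - delay]) if t > delay else 0.0
        new_cases.append(rate * infectious)
    return np.cumsum(new_cases)

immediate = simulate(delay=0)   # spread starts right after infection
delayed = simulate(delay=3)     # spread starts 3 days after infection
print(f"Cumulative cases after 4 weeks: immediate={immediate[-1]:.0f}, delayed={delayed[-1]:.0f}")
```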

{"title":"Machine learning-based diffusion model for prediction of coronavirus-19 outbreak.","authors":"Supriya Raheja, Shreya Kasturia, Xiaochun Cheng, Manoj Kumar","doi":"10.1007/s00521-021-06376-x","DOIUrl":"10.1007/s00521-021-06376-x","url":null,"abstract":"<p><p>The coronavirus pandemic has been globally impacting the health and prosperity of people. A persistent increase in the number of positive cases has boost the stress among governments across the globe. There is a need of approach which gives more accurate predictions of outbreak. This paper presents a novel approach called diffusion prediction model for prediction of number of coronavirus cases in four countries: India, France, China and Nepal. Diffusion prediction model works on the diffusion process of the human contact. Model considers two forms of spread: when the spread takes time after infecting one person and when the spread is immediate after infecting one person. It makes the proposed model different over other state-of-the art models. It is giving more accurate results than other state-of-the art models. The proposed diffusion prediction model forecasts the number of new cases expected to occur in next 4 weeks. The model has predicted the number of confirmed cases, recovered cases, deaths and active cases. The model can facilitate government to be well prepared for any abrupt rise in this pandemic. The performance is evaluated in terms of accuracy and error rate and compared with the prediction results of support vector machine, logistic regression model and convolution neural network. The results prove the efficiency of the proposed model.</p>","PeriodicalId":49766,"journal":{"name":"Neural Computing & Applications","volume":"35 19","pages":"13755-13774"},"PeriodicalIF":4.5,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8358916/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9526796","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
PA during the COVID-19 outbreak in China: a cross-sectional study.
IF 6 | CAS Tier 3 (Computer Science) | Q2 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE | Pub Date: 2023-01-01 | Epub Date: 2021-10-01 | DOI: 10.1007/s00521-021-06538-x
Yingjun Nie, Yuanyan Ma, Xiaodong Li, Yankong Wu, Weixin Liu, Zhenke Tan, Jiahui Li, Ce Zhang, Chennan Lv, Ting Liu

COVID-19 has undergone several mutations and is still spreading in most countries. Physical activity (PA) has positive benefits in preventing COVID-19 infection and counteracting the negative physical and mental effects caused by COVID-19. However, relevant evidence has indicated a high prevalence of physical inactivity among the general population, which has worsened due to the outbreak of the pandemic, and in most countries there is a severe lack of exercise guidance and mitigation strategies to advance the knowledge and role of PA in improving physical and mental health during the epidemic. This study surveyed the effects of COVID-19 on PA among Chinese residents during the pandemic and provides important reference and evidence to inform policymakers and to shape policies and planning for health promotion and for strengthening residents' PA during public health emergencies. ANOVA, the Kolmogorov-Smirnov test, the chi-square test and Spearman correlation analysis were used for statistical analysis. A total of 14,715 participants were included. The results show that nearly 70% of Chinese residents had inadequate PA (95% CI 58.0%-82.19%) during the COVID-19 outbreak, which was more than double the global level (27.5%, 95% CI 25.0%-32.2%). The content, intensity, duration, and frequency of PA were all affected during the period of home isolation, and the types of PA may vary among different age groups. The lack of physical facilities and cultural environment is the main factor affecting PA. However, there was no significant correlation between insufficient PA and the infection rate. During periods of home isolation and social distancing for epidemic prevention, it is necessary to strengthen scientific remote network monitoring and guidance of PA in China.
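A minimal sketch (not the authors' analysis) showing how the listed tests could be run with SciPy; the arrays, group splits, and contingency table are invented illustration values only.

```python
# Minimal sketch: the four tests named in the abstract (ANOVA, Kolmogorov-Smirnov,
# chi-square, Spearman) applied to made-up survey-style data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pa_minutes_by_age = [rng.normal(loc, 30, 200) for loc in (120, 100, 90)]  # three age groups

f_stat, p_anova = stats.f_oneway(*pa_minutes_by_age)            # ANOVA across age groups
sample = pa_minutes_by_age[0]
ks_stat, p_ks = stats.kstest(sample, "norm",
                             args=(sample.mean(), sample.std())) # normality check
table = np.array([[300, 700], [450, 550]])                      # adequate vs. inadequate PA by group
chi2, p_chi, dof, expected = stats.chi2_contingency(table)       # chi-square test of independence
rho, p_rho = stats.spearmanr(rng.random(100), rng.random(100))  # Spearman rank correlation

print(p_anova, p_ks, p_chi, rho)
```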

{"title":"PA during the COVID-19 outbreak in China: a cross-sectional study.","authors":"Yingjun Nie,&nbsp;Yuanyan Ma,&nbsp;Xiaodong Li,&nbsp;Yankong Wu,&nbsp;Weixin Liu,&nbsp;Zhenke Tan,&nbsp;Jiahui Li,&nbsp;Ce Zhang,&nbsp;Chennan Lv,&nbsp;Ting Liu","doi":"10.1007/s00521-021-06538-x","DOIUrl":"10.1007/s00521-021-06538-x","url":null,"abstract":"<p><p>COVID-19 has undergone several mutations and is still spreading in most countries now. PA has positive benefits in the prevention of COVID-19 infection and counteracting the negative physical and mental effects caused by COVID-19. However, relevant evidence has indicated a high prevalence of physical inactivity among the general population, which has worsened due to the outbreak of the pandemic, and there is a severe lack of exercise guidance and mitigation strategies to advance the knowledge and role of PA to improve physical and mental health in most countries during the epidemic. This study surveyed the effects of COVID-19 on PA in Chinese residents during the pandemic and provided important reference and evidence to inform policymakers and formulate policies and planning for health promotion and strengthening residents' PA during periods of public health emergencies. ANOVA, Kolmogorov-Smirnov, the chi-square test and Spearman correlation analysis were used for statistical analysis. A total of 14,715 participants were included. The results show that nearly 70% of Chinese residents had inadequate PA (95%CI 58.0%-82.19%) during the COVID-19 outbreak, which was more than double the global level (27.5%, 95%CI 25.0%-32.2%). The content, intensity, duration, and frequency of PA were all affected during the period of home isolation, and the types of PA may vary among different ages. The lack of physical facilities and cultural environment is the main factor affecting PA. However, there was no significant correlation between insufficient PA and the infection rate. During the period of home isolation and social distance of epidemic prevention, it is necessary to strengthen the scientific remote network monitoring and guidance for the process of PA in China.</p>","PeriodicalId":49766,"journal":{"name":"Neural Computing & Applications","volume":"35 19","pages":"13739-13754"},"PeriodicalIF":6.0,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8485310/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9529747","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 2
Topical collection on machine learning for big data analytics in smart healthcare systems.
IF 6 | CAS Tier 3 (Computer Science) | Q2 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE | Pub Date: 2023-01-01 | Epub Date: 2023-05-09 | DOI: 10.1007/s00521-023-08627-5
Mian Ahmad Jan, Houbing Song, Fazlullah Khan, Ateeq Ur Rehman, Lie-Liang Yang
{"title":"Topical collection on machine learning for big data analytics in smart healthcare systems.","authors":"Mian Ahmad Jan,&nbsp;Houbing Song,&nbsp;Fazlullah Khan,&nbsp;Ateeq Ur Rehman,&nbsp;Lie-Liang Yang","doi":"10.1007/s00521-023-08627-5","DOIUrl":"10.1007/s00521-023-08627-5","url":null,"abstract":"","PeriodicalId":49766,"journal":{"name":"Neural Computing & Applications","volume":"35 20","pages":"14469-14471"},"PeriodicalIF":6.0,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10169121/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9576949","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 1
Towards automated check-worthy sentence detection using Gated Recurrent Unit.
IF 6 | CAS Tier 3 (Computer Science) | Q2 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE | Pub Date: 2023-01-01 | DOI: 10.1007/s00521-023-08300-x
Ria Jha, Ena Motwani, Nivedita Singhal, Rishabh Kaushal

People are exposed to a great deal of information daily, which is a mix of facts, opinions, and false claims. The rate at which information is created and spread has necessitated an automated fact-checking mechanism. In this work, we focus on the first step of the fact-checking system, which is to identify whether a given sentence is factual. We propose a GloVe-embedding-based gated recurrent unit pipeline for check-worthy sentence detection, referred to as the G2CW framework. It detects whether a given sentence has check-worthy content in it and, if so, whether that content is important from a fact-checking perspective. We evaluate our proposed framework on two datasets: the standard ClaimBuster dataset commonly used by the research community for this problem, and a self-curated IndianClaim dataset. Our G2CW framework outperforms prior work with an F1-score of 0.92. Furthermore, our G2CW framework, when trained on the ClaimBuster dataset, performs best on the IndianClaims dataset.
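A minimal sketch (not the authors' G2CW implementation) of a GRU sentence classifier whose embedding layer would be initialized from pretrained GloVe vectors; the vocabulary size, sequence length, and the random glove_matrix stand-in are assumptions.

```python
# Minimal sketch: GloVe-initialized embedding + GRU, classifying sentences as
# check-worthy or not. The glove_matrix below is a random placeholder for real
# pretrained GloVe weights.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models, initializers

vocab_size, embed_dim, max_len = 20000, 100, 40
glove_matrix = np.random.rand(vocab_size, embed_dim)  # stand-in for pretrained vectors

model = models.Sequential([
    layers.Input(shape=(max_len,), dtype="int32"),
    layers.Embedding(vocab_size, embed_dim,
                     embeddings_initializer=initializers.Constant(glove_matrix),
                     trainable=False),
    layers.GRU(64),
    layers.Dense(1, activation="sigmoid"),   # check-worthy vs. not check-worthy
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```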

{"title":"Towards automated check-worthy sentence detection using Gated Recurrent Unit.","authors":"Ria Jha,&nbsp;Ena Motwani,&nbsp;Nivedita Singhal,&nbsp;Rishabh Kaushal","doi":"10.1007/s00521-023-08300-x","DOIUrl":"https://doi.org/10.1007/s00521-023-08300-x","url":null,"abstract":"<p><p>People are exposed to a lot of information daily, which is a mix of facts, opinions, and false claims. The rate at which information is created and spread has necessitated an automated fact-checking mechanism. In this work, we focus on the first step of the fact-checking system, which is to identify whether a given sentence is factual. We propose a glove embedding-based gated recurrent unit pipeline for check-worthy sentence detection, referred to as G2CW framework. It detects whether a given sentence has check-worthy content in it or not; furthermore, if it has check-worthy content, whether it is important or not, from a fact-checking perspective. We evaluate our proposed framework on two datasets: a standard ClaimBuster dataset commonly used by the research community for this problem and a self-curated IndianClaim dataset. Our G2CW framework outperforms prior work with 0.92 as F1-score. Furthermore, our G2CW framework, when trained on the ClaimBuster dataset, performs the best on the IndianClaims dataset.</p>","PeriodicalId":49766,"journal":{"name":"Neural Computing & Applications","volume":"35 15","pages":"11337-11357"},"PeriodicalIF":6.0,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9916500/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9372483","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 2
Enhanced balancing GAN: minority-class image generation.
IF 6 | CAS Tier 3 (Computer Science) | Q2 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE | Pub Date: 2023-01-01 | DOI: 10.1007/s00521-021-06163-8
Gaofeng Huang, Amir Hossein Jafari

Generative adversarial networks (GANs) are among the most powerful generative models, but they always require a large and balanced dataset to train. Traditional GANs are not suitable for generating minority-class images in a highly imbalanced dataset. Balancing GAN (BAGAN) was proposed to mitigate this problem, but it is unstable when images in different classes look similar, e.g., flowers and cells. In this work, we propose a supervised autoencoder with an intermediate embedding model to disperse the labeled latent vectors. With the enhanced autoencoder initialization, we also build an architecture of BAGAN with gradient penalty (BAGAN-GP). Our proposed model overcomes the instability of the original BAGAN and converges faster to high-quality generations. Our model achieves high performance on imbalanced, scaled-down versions of MNIST Fashion and CIFAR-10, and on one small-scale medical image dataset. Code: https://github.com/GH920/improved-bagan-gp.
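A minimal sketch (not the authors' BAGAN-GP code) of the gradient penalty term the "-GP" suffix refers to, computed on interpolations between real and generated images; the `discriminator` handle and the 4-D image tensor layout are assumptions. Penalizing the discriminator's gradient norm toward 1 on these interpolations is the standard WGAN-GP device for stabilizing adversarial training.

```python
# Minimal sketch: WGAN-style gradient penalty on real/fake interpolations.
import tensorflow as tf

def gradient_penalty(discriminator, real, fake):
    batch = tf.shape(real)[0]
    eps = tf.random.uniform([batch, 1, 1, 1], 0.0, 1.0)    # per-sample mixing weight
    interp = eps * real + (1.0 - eps) * fake
    with tf.GradientTape() as tape:
        tape.watch(interp)
        scores = discriminator(interp, training=True)
    grads = tape.gradient(scores, interp)
    norms = tf.sqrt(tf.reduce_sum(tf.square(grads), axis=[1, 2, 3]) + 1e-12)
    return tf.reduce_mean(tf.square(norms - 1.0))           # penalize deviation from unit norm
```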

{"title":"Enhanced balancing GAN: minority-class image generation.","authors":"Gaofeng Huang,&nbsp;Amir Hossein Jafari","doi":"10.1007/s00521-021-06163-8","DOIUrl":"https://doi.org/10.1007/s00521-021-06163-8","url":null,"abstract":"<p><p>Generative adversarial networks (GANs) are one of the most powerful generative models, but always require a large and balanced dataset to train. Traditional GANs are not applicable to generate minority-class images in a highly imbalanced dataset. Balancing GAN (BAGAN) is proposed to mitigate this problem, but it is unstable when images in different classes look similar, e.g., flowers and cells. In this work, we propose a supervised autoencoder with an intermediate embedding model to disperse the labeled latent vectors. With the enhanced autoencoder initialization, we also build an architecture of BAGAN with gradient penalty (BAGAN-GP). Our proposed model overcomes the unstable issue in original BAGAN and converges faster to high-quality generations. Our model achieves high performance on the imbalanced scale-down version of MNIST Fashion, CIFAR-10, and one small-scale medical image dataset. https://github.com/GH920/improved-bagan-gp.</p>","PeriodicalId":49766,"journal":{"name":"Neural Computing & Applications","volume":"35 7","pages":"5145-5154"},"PeriodicalIF":6.0,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1007/s00521-021-06163-8","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10698449","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 32