
Latest Publications: 2023 3rd International Conference on Advances in Computing, Communication, Embedded and Secure Systems (ACCESS)

An Enhanced Image Loading Framework for Social Media Applications
Siji Rani S, Lekshmi S. Nair, Vaisakh M S
Online social networks (OSNs) are the most popular platforms on which users share images and videos. Image loading in social media applications is time-consuming when internet bandwidth is low. Uploading an image to a social media platform demands the correct size, quality, format, and resolution. Users often upload duplicate images accidentally; the uploading of images or videos by individual users on platforms like Facebook or Instagram is called content loading. In this article, we propose a method for reducing content loading time by detecting duplicate images and replacing them with the original image that is already loaded, using ANNOY (Approximate Nearest Neighbors Oh Yeah). With this methodology we could successfully reduce image loading time by checking for duplication.
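The duplicate-lookup step at the heart of this approach can be illustrated with a small sketch. The paper uses ANNOY for approximate nearest-neighbour search over image feature vectors; the snippet below substitutes an exact cosine-similarity search in NumPy to show the idea, and the stored embeddings, similarity threshold, and `find_duplicate` helper are illustrative assumptions, not the authors' code.

```python
import numpy as np

def find_duplicate(index_vectors, query, threshold=0.98):
    """Return the index of a stored image whose feature vector is
    near-identical to the query (cosine similarity >= threshold),
    or None if no duplicate exists."""
    index = np.asarray(index_vectors, dtype=float)
    q = np.asarray(query, dtype=float)
    # Cosine similarity between the query and every stored vector.
    sims = index @ q / (np.linalg.norm(index, axis=1) * np.linalg.norm(q))
    best = int(np.argmax(sims))
    return best if sims[best] >= threshold else None

# Three stored image embeddings; the query is a near-copy of the second one,
# so it would be served from the already-loaded original instead of re-uploading.
stored = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.5, 0.5, 0.0]]
dup = find_duplicate(stored, [0.01, 1.0, 0.0])
```

An approximate index such as ANNOY replaces the linear scan with a forest of random-projection trees, which is what makes the lookup fast enough for large image collections.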
{"title":"An Enhanced Image Loading Framework for Social Media Applications","authors":"Siji Rani S, Lekshmi S. Nair, Vaisakh M S","doi":"10.1109/ACCESS57397.2023.10200278","DOIUrl":"https://doi.org/10.1109/ACCESS57397.2023.10200278","url":null,"abstract":"Online Social network (OSN) is the most popular platform where users prefer to share images and videos. Image loading time in social media applications is time-consuming due to significantly less internet bandwidth. Uploading an image on a social media platform demands accurate size, highest quality, format, and resolution. Often, duplicates of images may be uploaded by the user accidentally. Uploading images or videos by individual users on platforms like Facebook or Instagram is Content loading. In this article, we suggest a suitable method for reducing the content loading time by finding the duplicate images and replacing those images with the original image that is already loaded using ANNOY (Artificial Neural Network Oh Yeah). In the methodology we could successfully reduce the image loading time by checking the duplication.","PeriodicalId":345351,"journal":{"name":"2023 3rd International Conference on Advances in Computing, Communication, Embedded and Secure Systems (ACCESS)","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-05-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131513327","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Pandemic Outbreak Prediction using Optimization-based Machine Learning Model
Soni Singh, S. Mittal
Pandemics affect people in many ways, and disease modelling is important for predicting and evaluating their effects. A variety of statistical and machine learning (ML) models have been built to produce such forecasts. These models employ several ML strategies but, given the available datasets, are unable to achieve high accuracy. To avoid this problem and improve forecast accuracy, we propose an ML-based prediction model. This study optimises the parameters of commonly employed machine learning models using the proposed Ant Colony Optimization (ACO) approach. A comparison of different ML approaches, namely Polynomial Regression (PR), Support Vector Machine (SVM), and Linear Regression (LR), is provided for predicting the pandemic outbreak. The performance of the proposed model is assessed by its accuracy and Root Mean Square Error (RMSE) score on COVID-19 datasets. The results show that, as measured by the RMSE score, the suggested method delivers good accuracy for daily prediction, and the forecasts show that PR-ACO outperforms the other ML strategies. According to the prediction results, the proposed ACO parameter-optimising algorithm will increase the capacity of current ML techniques to anticipate outbreaks across diverse countries.
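As a rough illustration of ACO-style hyperparameter selection (not the authors' implementation), the sketch below lets "ants" pick a polynomial degree with pheromone-proportional probability, deposits more pheromone on degrees that achieve lower RMSE on an invented noisy case series, and evaporates trails each iteration. The data series, candidate grid, and all constants are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "daily cases" curve with noise (a stand-in for a COVID-19 series).
t = np.linspace(0, 1, 60)
cases = 100 * t ** 2 + rng.normal(0, 2, t.size)

degrees = [1, 2, 3, 4, 5]          # candidate polynomial degrees for PR
pheromone = np.ones(len(degrees))  # one pheromone trail per candidate

def rmse(deg):
    coeffs = np.polyfit(t, cases, deg)
    pred = np.polyval(coeffs, t)
    return np.sqrt(np.mean((cases - pred) ** 2))

for _ in range(20):                # iterations of ant exploration
    probs = pheromone / pheromone.sum()
    ant_choices = rng.choice(len(degrees), size=5, p=probs)
    for i in set(ant_choices.tolist()):
        err = rmse(degrees[i])
        pheromone[i] += 1.0 / (1.0 + err)  # deposit: better fit, more trail
    pheromone *= 0.9               # evaporation keeps exploration alive

best_degree = degrees[int(np.argmax(pheromone))]
```

The pheromone trail concentrates on parameter choices that repeatedly yield low RMSE, which is the mechanism by which ACO tunes the underlying regression models.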
{"title":"Pandemic Outbreak Prediction using Optimization-based Machine Learning Model","authors":"Soni Singh, S. Mittal","doi":"10.1109/ACCESS57397.2023.10199872","DOIUrl":"https://doi.org/10.1109/ACCESS57397.2023.10199872","url":null,"abstract":"Many pandemic epidemics have a variety of effects on people. Disease modelling is important in order to predict and evaluate the effects of these pandemics. A variety of statistical and machine learning (ML) models are built to produce the forecast. The models employ several ML strategies, but due to the available dataset, they are unable to achieve higher accuracy. To avoid the problem and improve forecast accuracy, we suggested an ML-based prediction model. This study optimises the parameters of the already employed machine learning models using the proposed Ant Colony Optimization approach (ACO). A comparison of different machine learning (ML) approaches, such as Polynomial Regression (PR), Support Vector Machine (SVM), and Linear Regression, is provided to predict the pandemic outbreak (LR). For COVID-19 datasets, the accuracy and Root Mean Square Error (RMSE) score of the proposed model are used to assess its performance. The results show that, as assessed by the RMSE score, the suggested method delivers good accuracy for daily prediction. The outcome forecast shows that PR-ACO outperforms other ML strategies in terms of final results. 
According to the results of the predictions, the proposed ACO parameter optimising algorithm will increase the capacity of current ML techniques to anticipate outbreaks across diverse countries.","PeriodicalId":345351,"journal":{"name":"2023 3rd International Conference on Advances in Computing, Communication, Embedded and Secure Systems (ACCESS)","volume":"147 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-05-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122332492","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Comparative Analysis of Deep Learning Techniques for Assistive Soft Wearable Robots
Ranjeesh R Chandran, Sreedeep Krishnan, Y. Chakrapani, D. Dharmaraj
A new innovation, wearable assistive robotics, has the potential to assist people with sensorimotor disabilities in performing routine tasks. Soft robots are widely researched because of their adaptability, deformability, and flexibility. In contrast to rigid robots, however, soft robots face obstacles in control, calibration, and modelling, because the properties of soft materials give rise to complex behaviors through hysteresis and non-linearity. The use of deep learning techniques in real time poses additional challenges for researchers in terms of accuracy and equipment cost. Recent research has used various deep learning algorithms to address these constraints. This paper gives a deep insight into and analysis of existing deep learning techniques in the area of assistive soft wearable robotics and classifies their applicability to various soft robotic applications. The current constraints of the field are presented, along with an analysis of deep learning models for the various types of assistive soft wearable robot applications, followed by a description of the deep learning techniques available for assistive soft wearable robotics.
{"title":"Comparitive Analysis of Deep Learning Techniques for Assistive Soft Wearable Robots","authors":"Ranjeesh R Chandran, Sreedeep Krishnan, Y. Chakrapani, D. Dharmaraj","doi":"10.1109/ACCESS57397.2023.10199699","DOIUrl":"https://doi.org/10.1109/ACCESS57397.2023.10199699","url":null,"abstract":"A new innovation called wearable assistive robotics has the potential of assisting those with sensorimotor disabilities in doing routine tasks. A lot of research is done on soft robots because of its adaptability, deformability, and flexibility. In contrast to soft robots and rigid robots, face obstacles in control, calibration, and modelling because the properties of soft materials which result in complex behaviors due to hysteresis and non-linearity. The use of deep learning techniques in real-time poses additional challenges for researchers when it comes to accuracy and equipment cost. Recent research has used various deep learning algorithms to address these constraints. This paper gives deep insight and analysis of existing deep learning techniques in the area of assistive soft wearable robotics and classifies their applicability in various soft robotic applications. 
The current constraints in the study field, along with an analysis of various deep learning models with regard to various types of assistive soft wearable robot applications, are provided, followed by a description and implementation of the available deep learning techniques for assistive soft wearable robotics.","PeriodicalId":345351,"journal":{"name":"2023 3rd International Conference on Advances in Computing, Communication, Embedded and Secure Systems (ACCESS)","volume":"98 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-05-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114963899","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Heart Disease Prediction System using Machine Learning Model
P. S. Tomar, Kalpana Rai, Mohd. Zuber
Computer-based computation has played a tremendous role over the last two decades, as the demand for computers and technology grows day by day. Data science is a discipline for handling the large amounts of data generated by computers and other computational devices, and such data plays a crucial role nowadays. Artificial intelligence techniques can handle large amounts of data in various fields, such as computer vision, medical imaging, object detection and tracking, and security surveillance. The healthcare sector is one of the most promising and necessity-driven of these sectors; machine learning-based models can therefore provide the computation and solutions required for computer-aided disease diagnosis. In this work we present a comparative machine learning model to predict heart disease on the basis of different features, improving accuracy and other performance parameters in detecting the disease. Our experimental results show better accuracy and other performance parameter values than existing techniques.
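A minimal sketch of the kind of classifier such a prediction system rests on, assuming two synthetic standardized features and noisily generated labels (the paper's actual features and dataset are not reproduced here): logistic regression trained from scratch with batch gradient descent.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for two standardized heart-disease features
# (e.g. age and cholesterol); label 1 means disease present.
n = 400
X = rng.normal(size=(n, 2))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, n) > 0).astype(float)

# Logistic regression via batch gradient descent on the log-loss.
w = np.zeros(2)
b = 0.0
lr = 0.5
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid predictions
    grad_w = X.T @ (p - y) / n              # gradient of the log-loss
    grad_b = np.mean(p - y)
    w -= lr * grad_w
    b -= lr * grad_b

accuracy = np.mean(((1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5) == y)
```

A comparative study like the paper's would train several such models (KNN, SVM, random forest, and so on) on the same split and compare accuracy and related metrics.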
{"title":"Heart Disease Prediction System using Machine Learning Model","authors":"P. S. Tomar, Kalpana Rai, Mohd. Zuber","doi":"10.1109/ACCESS57397.2023.10200757","DOIUrl":"https://doi.org/10.1109/ACCESS57397.2023.10200757","url":null,"abstract":"Computer-based computation play a tremendous role over the last two decades, as the demand for computers and technology is growing day by day, Data science is a technique to handle the large amount of data which is generated from the computer and other computational-based devices, the data is playing very crucial role nowadays. Artificial intelligence techniques can handle a large amount of data in various filed, like computer-based vision, medical imaging, object detection and tracking, security surveillance, and other fields. The healthcare sector is one of the most promising and necessity-based sectors among the other sectors, therefore machine learning-based models provide all the required computation and solutions for the computer-aided disease diagnosis. In this work we present the comparative machine learning model to predict heart disease on basis of different features and, also improve the accuracy and some other performance parameters to detect the respective diseases. 
Our experimental result shows better accuracy and other performance parameters value than existing techniques.","PeriodicalId":345351,"journal":{"name":"2023 3rd International Conference on Advances in Computing, Communication, Embedded and Secure Systems (ACCESS)","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-05-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133418210","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Modeling of high-speed ISL links in Satellite Communication
Meet Kumari
In this paper, a high-speed inter-satellite link (ISL) over different orbits in satellite communication is presented. The results show that a maximum transmission speed of 400 Gbps can be obtained over a 5000 km ISL. The transmission range can be extended up to 40000 km for a geostationary-orbit satellite at an additional loss of 1 dB. Besides this, the comparative performance of the proposed work shows its superiority over other work on satellite communication.
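The distance dependence of such a link can be sanity-checked with the standard free-space path-loss formula, FSPL(dB) = 20·log10(d_km) + 20·log10(f_GHz) + 92.45. The 193 THz optical carrier (1550 nm) below is an assumption for scale only, not a parameter taken from the paper; note that the change in FSPL between two distances is independent of frequency.

```python
import math

def fspl_db(distance_km, freq_ghz):
    """Free-space path loss in dB for distance in km and frequency in GHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

# Assumed optical carrier: 193 THz = 193000 GHz (1550 nm band).
loss_5000 = fspl_db(5000, 193000.0)
loss_geo = fspl_db(40000, 193000.0)

# Distance alone adds 20*log10(40000/5000) ~= 18.06 dB of free-space loss
# when stretching the link from 5000 km to geostationary range.
extra = loss_geo - loss_5000
```

Any real link budget would add transmitter power, antenna/telescope gains, and pointing losses on top of this term.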
{"title":"Modeling of high-speed ISL links in Satellite Communication","authors":"Meet Kumari","doi":"10.1109/ACCESS57397.2023.10199144","DOIUrl":"https://doi.org/10.1109/ACCESS57397.2023.10199144","url":null,"abstract":"In this paper, a high-speed inter satellite link (ISL) link over different orbits in satellite communication has been presented. The results show that the maximum transmission speed of 400Gbps can be obtained over 5000km ISL link. However, the transmission range can be exceeded upto 40000km for geostationary orbit satellite with additional loss of 1dB. Besides this, the comparative performance of proposed work shows its superiority over other work on satellite communication.","PeriodicalId":345351,"journal":{"name":"2023 3rd International Conference on Advances in Computing, Communication, Embedded and Secure Systems (ACCESS)","volume":"46 5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-05-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126003791","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Implementation of Deep Learning Technique for Citrus Leaf Blotch Disease Severity Detection
Nutika, Rishabh Sharma, V. Kukreja, Prince Sood, Ankit Bansal
Using computer vision methods, there has been ongoing study of recognizing and categorizing plant diseases. Although citrus is highly susceptible to disease, little research has been done on citrus disease detection. Citrus leaf blotch (CLB) disease can be detected and categorized by severity through a model for citrus leaf disease detection and classification. To categorize 8000 real-phase images of citrus leaves, including healthy and CLB-infected images, a deep learning (DL) model based on convolutional neural networks (CNN) is presented. The classification accuracy for CLB disease is 97.81% for binary classification and 98.81% for multi-classification. Additionally, the model has been compared with cutting-edge pre-trained models and outperforms them on the multiple classifications of CLB disease.
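The convolution operation at the heart of such a CNN can be shown in isolation. In this sketch the tiny "leaf" image and the Sobel kernel are invented for illustration and are not the paper's architecture; the snippet slides a 3×3 edge filter over the image, producing the kind of feature map a CNN's first layer learns to extract before deeper layers classify blotch severity.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2-D convolution (cross-correlation, as CNN layers compute)."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            out[r, c] = np.sum(image[r:r + kh, c:c + kw] * kernel)
    return out

# A 5x5 toy "leaf patch" whose right half is brighter, and a vertical-edge
# (Sobel) filter that responds to the boundary of a dark blotch.
leaf = np.zeros((5, 5))
leaf[:, 3:] = 1.0
sobel_x = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
feature_map = conv2d(leaf, sobel_x)
```

In a trained CNN the kernels are learned from the 8000 labelled images rather than hand-designed, but the sliding-window arithmetic is exactly this.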
{"title":"Implementation of Deep Learning Technique for Citrus Leaf Blotch Disease Severity Detection","authors":"Nutika, Rishabh Sharma, V. Kukreja, Prince Sood, Ankit Bansal","doi":"10.1109/ACCESS57397.2023.10199890","DOIUrl":"https://doi.org/10.1109/ACCESS57397.2023.10199890","url":null,"abstract":"Utilising computer vision methods, there has been an ongoing study in recognizing and categorizing plant diseases. Citrus is a member of the plant family and is highly susceptible to disease, there hasn't been much research done on citrus disease detection. Citrus leaf blotch (CLB) disease can be detected and categorized based on how severe the illness is through a model for citrus leaf disease detection and classification. To categorize 8000 real-phase images of citrus leaves which include healthy and CLB-infected images, A deep learning (DL) model based on convolutional neural networks (CNN) has been presented.. This ranking accuracy of the CLB disease is 97.81% for binary classification and 98.81% for multi-classification, respectively. Additionally, cutting-edge pre-trained models have been compared., showing that it outperforms them in terms of multiple classifications of CLB sickness.","PeriodicalId":345351,"journal":{"name":"2023 3rd International Conference on Advances in Computing, Communication, Embedded and Secure Systems (ACCESS)","volume":"78 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-05-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130341332","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Experimental study of a novel technique of data hiding with high PSNR
S. Verdiyev, A. Naghiyeva, Ajay Kumar
The concealment of information is an important challenge in the context of an advanced infrastructure of open and uncontrolled Internet communication through computer networks. Various steganographic methods have been developed to protect transmitted information. The aim of the present paper is an experimental study of the robustness of the algorithm developed by the authors for the concealment of information. The reliability of the algorithm against steganalysis attacks has been investigated using the RS method of steganalysis. In addition, the computational complexity of the algorithm, which should not exceed a certain threshold, is investigated. The results of the experimental studies demonstrate the algorithm's high robustness and minimal computational time.
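The authors' algorithm is not reproduced here, but the quantity in the title, embedding imperceptibility measured by PSNR, can be illustrated with classic LSB substitution as a baseline; the random cover image and message below are placeholders, and RS steganalysis specifically targets the statistical traces this naive embedding leaves.

```python
import numpy as np

def embed_lsb(cover, bits):
    """Hide a bit sequence in the least-significant bits of pixel values."""
    stego = cover.flatten().copy()
    stego[:len(bits)] = (stego[:len(bits)] & 0xFE) | np.asarray(bits, dtype=np.uint8)
    return stego.reshape(cover.shape)

def psnr(cover, stego):
    """Peak signal-to-noise ratio in dB for 8-bit images."""
    mse = np.mean((cover.astype(float) - stego.astype(float)) ** 2)
    return float('inf') if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

rng = np.random.default_rng(7)
cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
message = rng.integers(0, 2, size=512, dtype=np.uint8)
stego = embed_lsb(cover, message)

quality = psnr(cover, stego)            # LSB changes a pixel by at most 1
recovered = stego.flatten()[:512] & 1   # extraction is just reading the LSBs
```

Because each embedded bit perturbs a pixel by at most one grey level, PSNR stays very high; robust schemes aim to keep it that way while also defeating RS-style statistical detection.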
{"title":"Experimental study of a novel technique of data hiding with high PSNR","authors":"S. Verdiyev, A. Naghiyeva, Ajay Kumar","doi":"10.1109/ACCESS57397.2023.10200743","DOIUrl":"https://doi.org/10.1109/ACCESS57397.2023.10200743","url":null,"abstract":"The task of concealment of information is an important challenge in the context of an advance infrastructure of open and uncontrolled Internet users' communication through computer networks. Various steganographic methods have been developed to solve the problem of the protection of transmitted information. The aim of the present paper is the experimental study of the robustness of the algorithm developed by authors for the concealment of information. The reliability of the algorithm for stego analytics attacks has been investigated by the RS method of steganalysis. In addition, the computational complexity of the algorithm is investigated which in turn should not exceed a certain threshold.The results of the experimental studies demonstrate the algorithm’s high robustness and minimal computational time.","PeriodicalId":345351,"journal":{"name":"2023 3rd International Conference on Advances in Computing, Communication, Embedded and Secure Systems (ACCESS)","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-05-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129062992","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Protecting Contactless Credit Card Payments from Fraud through Ambient Authentication and Machine Learning
Divneet Singh
Credit card fraud (CCF) is a persistent issue in the financial sector with serious consequences. Data mining has proven extremely useful in detecting fraud in online transactions. However, detecting CCF through data mining is difficult for two reasons: the profiles of normal and fraudulent behaviour change constantly, and the data sets are highly skewed. The outcome of fraud detection in credit card transactions depends on the sampling approach, detection techniques, and variable selection. This work studies the performance of the K-Nearest Neighbor, Naive Bayes, Logistic Regression, and Random Forest algorithms on a highly skewed dataset of 284,807 European cardholder transactions. A hybrid of under-sampling and over-sampling techniques has been applied to the skewed data. The four techniques were applied to both the preprocessed and the raw data, and the results are evaluated using specificity, accuracy, sensitivity, and F1-score. The outcomes show that the optimal accuracies of the Naive Bayes, Logistic Regression, K-Nearest Neighbor, and Random Forest classifiers are 98.72%, 52.34%, 96.89%, and 91.67%, respectively. The comparative results indicate that K-Nearest Neighbor performs better than the Logistic Regression, Random Forest, and Naive Bayes techniques.
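The hybrid under-/over-sampling step can be sketched as follows. The balancing target (the mean of the two class counts) and the toy 950:50 class split are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np

def hybrid_resample(X, y, rng):
    """Undersample the majority class and oversample the minority class
    until both reach the same size (the mean of the two original counts)."""
    X, y = np.asarray(X), np.asarray(y)
    maj, mino = (0, 1) if np.sum(y == 0) >= np.sum(y == 1) else (1, 0)
    maj_idx = np.flatnonzero(y == maj)
    min_idx = np.flatnonzero(y == mino)
    target = (len(maj_idx) + len(min_idx)) // 2
    keep = rng.choice(maj_idx, size=target, replace=False)  # undersample
    dup = rng.choice(min_idx, size=target, replace=True)    # oversample
    idx = np.concatenate([keep, dup])
    rng.shuffle(idx)
    return X[idx], y[idx]

rng = np.random.default_rng(3)
# 950 legitimate transactions vs 50 frauds: a heavily skewed toy set.
X = rng.normal(size=(1000, 4))
y = np.array([0] * 950 + [1] * 50)
Xb, yb = hybrid_resample(X, y, rng)
```

Balancing the classes before training prevents classifiers like KNN from trivially predicting "legitimate" for every transaction on skewed data.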
{"title":"Protecting Contactless Credit Card Payments from Fraud through Ambient Authentication and Machine Learning","authors":"Divneet Singh","doi":"10.1109/ACCESS57397.2023.10200022","DOIUrl":"https://doi.org/10.1109/ACCESS57397.2023.10200022","url":null,"abstract":"Credit card fraud (CCF) is a persistent issue in the financial sector with serious consequences. Data mining has proven to be extremely useful in detecting fraud in online transactions. However, detecting CCF through data mining is quite a difficult task because of two causes: constant changes in the profiles of normal and fraudulent behaviour, and the highly skewed nature of the data sets. The outcome of fraud detection in credit card transactions depends on the sampling approach, detection techniques, and variable selection. This work studies the performance of K-Nearest Neighbor, Naive Bayes, Logistic Regression and Random Forest algorithms on a highly skewed dataset. The dataset contains 2,84,807 transactions and has been collected from European cardholder transactions. A hybrid of under-sampling and oversampling techniques has been used on the skewed data. The four techniques were utilized on both data namely preprocessed and raw, and the results are evaluated using specificity, accuracy, sensitivity, and F1-score. The outcomes show that the optimal accuracy for Naive Bayes, Logistic Regression, K-Nearest Neighbor and Random Forest classifiers are 98.72%, 52.34%, 96.89%, 91.67%, respectively. 
The comparative results indicate that K-Nearest Neighbor performs better than Logistic Regression, Random Forest and Naive Bayes techniques.","PeriodicalId":345351,"journal":{"name":"2023 3rd International Conference on Advances in Computing, Communication, Embedded and Secure Systems (ACCESS)","volume":"33 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-05-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117031935","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Hybrid Model for Sentiment Analysis of Whatsapp Data
Royal Kaushal, Raman Chadha
Sentiment analysis, also called opinion mining (OM), is an approach to analyzing the sentiments expressed in data. It uses Natural Language Processing to classify data into different classes based on emotion. Various social media data are analyzed to determine sentiment, and machine learning (ML) techniques classify the data. This study utilizes ML models to analyze sentiment in WhatsApp data. The sentiment analysis process includes several steps: pre-processing the data, extracting features, and classifying the data. The initial stage cleans the raw data and transforms it to make it suitable for analysis. Feature extraction retrieves from the pre-processed data the relevant features that contribute to determining sentiment. Finally, machine learning algorithms classify the data to determine the sentiments expressed in the text. This work proposes a voting classifier, a hybrid architecture comprising SVM, KNN, and a decision tree. Python is used to simulate the suggested algorithm, and its performance is evaluated using accuracy, precision, and recall metrics. These parameters measure the efficiency of the algorithm in accurately classifying the sentiments present in the data.
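The voting step that combines the SVM, KNN, and decision-tree outputs can be sketched directly as hard (majority) voting. The per-classifier predictions below are invented, and breaking ties in favour of the earlier classifier is a design choice of this sketch, not necessarily the paper's rule.

```python
from collections import Counter

def hard_vote(*prediction_lists):
    """Majority vote across classifiers; ties go to the earliest classifier."""
    combined = []
    for votes in zip(*prediction_lists):
        counts = Counter(votes)
        top = counts.most_common(1)[0][1]
        # Among labels tied at the top count, keep the first one voted.
        winner = next(v for v in votes if counts[v] == top)
        combined.append(winner)
    return combined

# Hypothetical per-message predictions from the three base classifiers.
svm_pred = ['pos', 'neg', 'neg', 'pos']
knn_pred = ['pos', 'neg', 'pos', 'neg']
tree_pred = ['neg', 'neg', 'pos', 'neg']
ensemble = hard_vote(svm_pred, knn_pred, tree_pred)
```

A soft-voting variant would instead average the classifiers' class probabilities, which often helps when the base models are well calibrated.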
{"title":"Hybrid Model for Sentiment Analysis of Whatsapp Data","authors":"Royal Kaushal, Raman Chadha","doi":"10.1109/ACCESS57397.2023.10200411","DOIUrl":"https://doi.org/10.1109/ACCESS57397.2023.10200411","url":null,"abstract":"Sentiment analysis, also called opinion mining (OM), is an approach to analyzing the sentiments expressed in data. This approach uses Natural Language Processing to classify the data based on emotions in different classes. Various Social media data are analyzed to determine sentiment, and Machine Learning (ML) techniques classify the data. This study utilizes ML models to analyze sentiment in WhatsApp data. The sentiment analysis process includes some steps, such as to pre-process the data, extract the features, and classify the data. The initial stage contributes to the clean-up of raw data and transforms it to make it suitable for analysis. Feature extraction is a stage to retrieve a relevant feature from the pre-processed data that contribute to determining sentiment. Finally, machine learning algorithms classify data to determine the sentiments expressed in the text. This work proposes a voting classifier which is hybrid architecture comprising SVM, KNN, and a Decision tree. Python is executed to simulate the suggested algorithm, and its performance is evaluated based on accuracy, precision, and recall metrics. 
These parameters are useful in measuring the efficiency of the algorithm in accurately classifying the sentiments existing in the data.","PeriodicalId":345351,"journal":{"name":"2023 3rd International Conference on Advances in Computing, Communication, Embedded and Secure Systems (ACCESS)","volume":"33 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-05-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121493691","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Development of an Innovative Optimal Route Selection Model Based on an Improved Multi-Objective Genetic Algorithm (IMOGA) Method in IoT Healthcare
Jeejo K P, Bobby Mathews C
Healthcare, one of the most widely used IoT applications, aims to reduce the need for hospitalisation by monitoring patients' vital signs throughout the day over several weeks. Many sensors, including vital-sign and unstructured-message sensors as well as environmental monitoring sensors, are deployed in healthcare systems to collect patient information and lower costs for patients. For a variety of reasons, data transmitted by the sensors integrated into medical equipment may be lost along the transmission channel, so creating a safe communication method in IoT networks is crucial for the healthcare industry. To identify near-optimal routes and create a cutting-edge optimal route selection model for IoT healthcare, this article employs the Improved Multi-Objective Genetic Algorithm (IMOGA) technique: the best path for medical data is chosen while taking energy, distance, and delay into account. The performance of the adopted approach is then compared with other work. According to the experimental findings, energy, distance, and delay have been improved by the suggested strategy by 14%, 2%, and 5.6%, respectively.
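One common way to rank routes on several objectives at once, weighted-sum scalarization after min-max normalization, can be sketched as below. The candidate routes, objective values, and weights are invented for illustration, and the full IMOGA evolutionary loop (crossover, mutation, Pareto selection) is omitted.

```python
import numpy as np

# Each candidate route is scored on three objectives (lower is better):
# energy cost, hop distance, and end-to-end delay.
routes = np.array([
    # energy, distance, delay
    [0.30, 120.0, 0.040],
    [0.20, 150.0, 0.035],
    [0.45,  90.0, 0.060],
    [0.25, 110.0, 0.030],
])

def fitness(pop, weights=(0.4, 0.3, 0.3)):
    """Weighted-sum scalarization of the objectives after min-max
    normalization; lower fitness means a better route."""
    norm = (pop - pop.min(axis=0)) / (pop.max(axis=0) - pop.min(axis=0))
    return norm @ np.asarray(weights)

scores = fitness(routes)
best_route = int(np.argmin(scores))  # the route a GA would favour for breeding
```

In a genetic algorithm this fitness would drive selection across generations; a true multi-objective variant would instead keep a whole Pareto front of non-dominated routes.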
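The abstract describes route selection that jointly considers energy, distance, and delay. As a rough illustration of that idea, here is a self-contained genetic-algorithm sketch over a toy sensor network; the graph, the per-edge (energy, distance, delay) weights, the equal-weight scalarisation, and all GA parameters are assumptions for the example, not the paper's IMOGA design.

```python
import random

# Toy sensor network: directed edge -> (energy, distance, delay). All values assumed.
EDGES = {
    ("s", "a"): (4, 2, 3), ("s", "b"): (2, 5, 2),
    ("a", "c"): (3, 2, 4), ("b", "c"): (3, 3, 1),
    ("a", "d"): (5, 1, 2), ("c", "d"): (1, 2, 2),
}
GRAPH = {}
for (u, v), _ in EDGES.items():
    GRAPH.setdefault(u, []).append(v)

def random_path(src, dst, rng, limit=8):
    """Random simple walk from src to dst; returns None on a dead end."""
    path = [src]
    while path[-1] != dst and len(path) < limit:
        nxt = [v for v in GRAPH.get(path[-1], []) if v not in path]
        if not nxt:
            return None
        path.append(rng.choice(nxt))
    return path if path[-1] == dst else None

def cost(path, w=(1.0, 1.0, 1.0)):
    """Scalarised multi-objective cost: weighted sum of energy, distance, delay."""
    e = d = t = 0
    for u, v in zip(path, path[1:]):
        we, wd, wt = EDGES[(u, v)]
        e, d, t = e + we, d + wd, t + wt
    return w[0] * e + w[1] * d + w[2] * t

def ga_route(src, dst, pop_size=20, gens=30, seed=1):
    rng = random.Random(seed)
    pop = []
    while len(pop) < pop_size:
        p = random_path(src, dst, rng)
        if p:
            pop.append(p)
    for _ in range(gens):
        pop.sort(key=cost)
        survivors = pop[: pop_size // 2]           # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            common = [n for n in a[1:-1] if n in b[1:-1]]
            if common:                             # crossover: splice at a shared node
                n = rng.choice(common)
                child = a[: a.index(n)] + b[b.index(n):]
                if len(set(child)) == len(child):  # keep the spliced path simple
                    children.append(child)
                    continue
            child = random_path(src, dst, rng)     # mutation: fresh random route
            if child:
                children.append(child)
        pop = survivors + children
    return min(pop, key=cost)

best = ga_route("s", "d")  # lowest-cost route found for the toy network
```

Weighting the three objectives into one scalar is the simplest multi-objective treatment; a Pareto-based scheme such as NSGA-II would instead keep a front of non-dominated routes and pick among them.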
{"title":"Development of an Innovative Optimal Route Selection Model Based on an Improved Multi-Objective Genetic Algorithm (IMOGA) Method in IoT Healthcare","authors":"Jeejo K P, Bobby Mathews C","doi":"10.1109/ACCESS57397.2023.10201024","DOIUrl":"https://doi.org/10.1109/ACCESS57397.2023.10201024","url":null,"abstract":"Healthcare is one of the most widely used IoT applications; it aims to reduce the need for hospitalisation by monitoring patients' vital signs throughout the day over several weeks. Many sensors, including vital-sign and unstructured-message sensors as well as environmental monitoring sensors, are deployed in healthcare systems to collect patient information and lower costs for patients. However, several issues along the transmission channel can result in the loss of data gathered by sensors integrated into medical equipment, so establishing a safe communication method in IoT networks is crucial for the healthcare industry. To identify near-optimal routes and build a state-of-the-art optimal route selection model for IoT healthcare, this article employs the Improved Multi-Objective Genetic Algorithm (IMOGA) technique: the best path for medical data is chosen while taking energy, distance, and delay into account. The performance of the proposed work is then compared against existing approaches. According to experimental findings, the suggested strategy improves energy, distance, and delay by 14%, 2%, and 5.6%, respectively.","PeriodicalId":345351,"journal":{"name":"2023 3rd International Conference on Advances in Computing, Communication, Embedded and Secure Systems (ACCESS)","volume":"210 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-05-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114668213","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0