Pub Date: 2023-05-18 | DOI: 10.1109/ACCESS57397.2023.10200278
Siji Rani S, Lekshmi S. Nair, Vaisakh M S
Online Social Networks (OSNs) are the most popular platforms on which users share images and videos. Image loading in social media applications is time-consuming when internet bandwidth is limited. Uploading an image to a social media platform demands the right size, quality, format, and resolution, and users often upload duplicate images accidentally. The uploading of images or videos by individual users on platforms like Facebook or Instagram is called content loading. In this article, we propose a method for reducing content loading time by detecting duplicate images and replacing them with the already-loaded original using ANNOY (Approximate Nearest Neighbors Oh Yeah). With this methodology we were able to reduce image loading time by checking for duplication.
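The duplicate-detection idea can be sketched without the ANNOY library itself: hash each image to a compact fingerprint and treat small Hamming distances as duplicates. The `average_hash` scheme and the tiny 2x2 "images" below are illustrative stand-ins of my own, not the paper's pipeline; a production system would extract real feature vectors and query them through an ANNOY index instead of a linear scan.

```python
def average_hash(pixels):
    """Hash a grayscale image (list of rows of 0-255 ints):
    each bit records whether a pixel is above the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(h1, h2):
    return sum(b1 != b2 for b1, b2 in zip(h1, h2))

def is_duplicate(img, library, threshold=2):
    """Return a stored hash that matches `img` within `threshold`, else None."""
    h = average_hash(img)
    for stored in library:
        if hamming(h, stored) <= threshold:
            return stored
    return None

original = [[10, 200], [220, 30]]
near_copy = [[12, 198], [221, 29]]   # same content, slight pixel noise
different = [[200, 10], [30, 220]]

library = {average_hash(original)}
print(is_duplicate(near_copy, library) is not None)   # True: serve cached copy
print(is_duplicate(different, library) is not None)   # False: upload normally
```

On a match, the platform can serve the already-loaded original instead of re-transferring the upload, which is where the loading-time saving comes from.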
Title: An Enhanced Image Loading Framework for Social Media Applications
Venue: 2023 3rd International Conference on Advances in Computing, Communication, Embedded and Secure Systems (ACCESS)
Pub Date: 2023-05-18 | DOI: 10.1109/ACCESS57397.2023.10199872
Soni Singh, S. Mittal
Pandemics affect people in many ways, and disease modelling is important for predicting and evaluating their effects. A variety of statistical and machine learning (ML) models have been built to produce such forecasts. These models employ several ML strategies but, given the available datasets, fail to achieve high accuracy. To address this problem and improve forecast accuracy, we propose an ML-based prediction model. This study optimises the parameters of existing machine learning models using the proposed Ant Colony Optimization (ACO) approach. A comparison of different ML approaches, namely Polynomial Regression (PR), Support Vector Machine (SVM), and Linear Regression (LR), is provided for predicting pandemic outbreaks. The proposed model's performance is assessed on COVID-19 datasets using accuracy and the Root Mean Square Error (RMSE) score. The results show that, as measured by RMSE, the proposed method delivers good accuracy for daily prediction, with PR-ACO outperforming the other ML strategies. These predictions suggest that the proposed ACO parameter-optimisation algorithm can increase the capacity of current ML techniques to anticipate outbreaks across diverse countries.
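The ACO idea can be illustrated with a small sketch (not the authors' implementation): pheromone on candidate polynomial degrees evaporates each round and is reinforced in inverse proportion to that degree's validation RMSE. The RMSE table below is invented, and a deterministic expected-value update stands in for random ant sampling so the demo is reproducible.

```python
rmse = {1: 9.0, 2: 4.5, 3: 1.2, 4: 3.8, 5: 7.1}   # assumed validation RMSEs
pheromone = {d: 1.0 for d in rmse}
rho = 0.3                                          # evaporation rate

for _ in range(60):
    total = sum(pheromone.values())
    for d in pheromone:
        share = pheromone[d] / total               # fraction of ants picking d
        # evaporate, then deposit: lower RMSE earns a larger deposit
        pheromone[d] = (1 - rho) * pheromone[d] + share * (1.0 / rmse[d])

best = min(rmse, key=lambda d: rmse[d])            # ground truth for the demo
chosen = max(pheromone, key=pheromone.get)
print(chosen)   # 3: pheromone concentrates on the lowest-RMSE degree
```

The positive feedback (more pheromone, more ants, more deposits) drives the search toward the best-scoring hyperparameter, which is the mechanism a PR-ACO pipeline exploits.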
Title: Pandemic Outbreak Prediction using Optimization-based Machine Learning Model
Venue: 2023 3rd International Conference on Advances in Computing, Communication, Embedded and Secure Systems (ACCESS)
Pub Date: 2023-05-18 | DOI: 10.1109/ACCESS57397.2023.10199699
Ranjeesh R Chandran, Sreedeep Krishnan, Y. Chakrapani, D. Dharmaraj
Wearable assistive robotics is a new innovation with the potential to assist people with sensorimotor disabilities in routine tasks. Soft robots attract substantial research interest because of their adaptability, deformability, and flexibility. In contrast to rigid robots, however, soft robots face obstacles in control, calibration, and modelling, because the properties of soft materials lead to complex behaviours driven by hysteresis and non-linearity. Applying deep learning techniques in real time poses additional challenges in accuracy and equipment cost. Recent research has used various deep learning algorithms to address these constraints. This paper gives a deep analysis of existing deep learning techniques for assistive soft wearable robotics and classifies their applicability to various soft robotic applications. The current constraints of the field are presented, along with an analysis of deep learning models across types of assistive soft wearable robot applications, followed by a description of the available deep learning techniques for assistive soft wearable robotics.
Title: Comparitive Analysis of Deep Learning Techniques for Assistive Soft Wearable Robots
Venue: 2023 3rd International Conference on Advances in Computing, Communication, Embedded and Secure Systems (ACCESS)
Pub Date: 2023-05-18 | DOI: 10.1109/ACCESS57397.2023.10200757
P. S. Tomar, Kalpana Rai, Mohd. Zuber
Computer-based computation has played a tremendous role over the last two decades as the demand for computers and technology grows day by day. Data science provides techniques for handling the large amounts of data generated by computers and other computational devices, and this data now plays a crucial role. Artificial intelligence techniques can handle large amounts of data in various fields, such as computer vision, medical imaging, object detection and tracking, and security surveillance. The healthcare sector is among the most promising and necessity-driven of these, and machine-learning-based models can provide the computation and solutions required for computer-aided disease diagnosis. In this work we present a comparative machine learning model that predicts heart disease on the basis of different features, improving accuracy and other performance parameters for detecting the disease. Our experimental results show better accuracy and other performance-parameter values than existing techniques.
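As one generic instance of the kind of classifier such comparisons include, here is a minimal logistic regression trained by stochastic gradient descent. The two scaled "features" and their values are invented toy data, not the paper's dataset or its chosen model.

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

def train(X, y, lr=0.5, epochs=2000):
    """Per-sample gradient descent on the logistic loss."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi                      # gradient of the logistic loss
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

# Hypothetical scaled features, e.g. [age, cholesterol] -> disease yes/no.
X = [[0.2, 0.1], [0.3, 0.2], [0.8, 0.9], [0.9, 0.8]]
y = [0, 0, 1, 1]
w, b = train(X, y)
pred = [1 if sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) > 0.5 else 0
        for xi in X]
print(pred)   # [0, 0, 1, 1]: separates the toy data
```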
Title: Heart Disease Prediction System using Machine Learning Model
Venue: 2023 3rd International Conference on Advances in Computing, Communication, Embedded and Secure Systems (ACCESS)
Pub Date: 2023-05-18 | DOI: 10.1109/ACCESS57397.2023.10199144
Meet Kumari
In this paper, a high-speed inter-satellite link (ISL) over different orbits in satellite communication is presented. The results show that a maximum transmission speed of 400 Gbps can be obtained over a 5000 km ISL. The transmission range can be extended up to 40,000 km for a geostationary-orbit satellite with an additional loss of 1 dB. A comparison with other work on satellite communication shows the superiority of the proposed design.
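For context on the ranges involved, the standard free-space path loss (FSPL) formula gives the raw geometric spreading between the two distances; the 1 dB figure above is the paper's reported system-level penalty, which the link design must deliver on top of this spreading. The 26 GHz carrier below is an assumed example (the ratio of distances makes the FSPL delta frequency-independent), and none of this arithmetic is taken from the paper.

```python
import math

def fspl_db(distance_km, freq_ghz):
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_GHz) + 92.45."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

f_ghz = 26.0                                   # assumed Ka-band carrier
delta = fspl_db(40000, f_ghz) - fspl_db(5000, f_ghz)
print(round(delta, 2))   # 18.06: extra spreading loss from 5000 km to 40000 km
```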
Title: Modeling of high-speed ISL links in Satellite Communication
Venue: 2023 3rd International Conference on Advances in Computing, Communication, Embedded and Secure Systems (ACCESS)
Pub Date: 2023-05-18 | DOI: 10.1109/ACCESS57397.2023.10199890
Nutika, Rishabh Sharma, V. Kukreja, Prince Sood, Ankit Bansal
Recognizing and categorizing plant diseases using computer vision methods is an ongoing area of study. Although citrus is highly susceptible to disease, relatively little research has addressed citrus disease detection. Citrus leaf blotch (CLB) disease can be detected and categorized by severity through a model for citrus leaf disease detection and classification. A deep learning (DL) model based on convolutional neural networks (CNNs) is presented to categorize 8000 real-phase images of citrus leaves, including both healthy and CLB-infected samples. The classification accuracy for CLB disease is 97.81% for binary classification and 98.81% for multi-classification. Additionally, the model has been compared with cutting-edge pre-trained models and outperforms them on multi-class classification of CLB disease.
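The building blocks of such a CNN can be shown in a few lines of plain Python: a valid 2-D convolution, ReLU activation, and 2x2 max-pooling. This is a didactic sketch of CNN feature extraction, not the paper's network; the 4x4 "leaf patch" and edge kernel are invented.

```python
def conv2d(img, kernel):
    """Valid 2-D convolution (cross-correlation, as in most DL frameworks)."""
    kh, kw = len(kernel), len(kernel[0])
    return [[sum(img[i + a][j + b] * kernel[a][b]
                 for a in range(kh) for b in range(kw))
             for j in range(len(img[0]) - kw + 1)]
            for i in range(len(img) - kh + 1)]

def relu(fm):
    return [[max(0, v) for v in row] for row in fm]

def maxpool2(fm):
    """Non-overlapping 2x2 max-pooling."""
    return [[max(fm[i][j], fm[i][j + 1], fm[i + 1][j], fm[i + 1][j + 1])
             for j in range(0, len(fm[0]) - 1, 2)]
            for i in range(0, len(fm) - 1, 2)]

# 4x4 patch with a sharp vertical boundary, like a lesion edge on a leaf.
img = [[0, 0, 9, 9]] * 4
edge = [[-1, 1]]                  # horizontal-gradient kernel
fmap = relu(conv2d(img, edge))    # responds only at the boundary column
pooled = maxpool2(fmap)
print(fmap[0], pooled)            # [0, 9, 0] [[9], [9]]
```

Stacks of such filters, with learned kernels, are what let a CNN separate healthy leaf texture from CLB lesions.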
Title: Implementation of Deep Learning Technique for Citrus Leaf Blotch Disease Severity Detection
Venue: 2023 3rd International Conference on Advances in Computing, Communication, Embedded and Secure Systems (ACCESS)
Pub Date: 2023-05-18 | DOI: 10.1109/ACCESS57397.2023.10200743
S. Verdiyev, A. Naghiyeva, Ajay Kumar
Concealment of information is an important challenge given today's advanced infrastructure of open and uncontrolled user communication over computer networks. Various steganographic methods have been developed to protect transmitted information. The aim of the present paper is an experimental study of the robustness of the authors' information-concealment algorithm. Its resistance to steganalytic attack has been investigated using the RS method of steganalysis. In addition, the computational complexity of the algorithm, which should not exceed a certain threshold, is investigated. The experimental results demonstrate the algorithm's high robustness and minimal computational time.
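A generic spatial-domain example of the kind of hiding such experiments evaluate is least-significant-bit (LSB) embedding, whose distortion can be quantified with PSNR. The authors' own algorithm is not reproduced here; this sketch and its pixel values are illustrative only.

```python
import math

def embed(pixels, bits):
    """Write message bits into the least-significant bits of the cover."""
    stego = list(pixels)
    for i, b in enumerate(bits):
        stego[i] = (stego[i] & ~1) | b
    return stego

def extract(stego, n):
    return [p & 1 for p in stego[:n]]

def psnr(cover, stego):
    """Peak signal-to-noise ratio for 8-bit samples."""
    mse = sum((c - s) ** 2 for c, s in zip(cover, stego)) / len(cover)
    return float('inf') if mse == 0 else 10 * math.log10(255 ** 2 / mse)

cover = [52, 55, 61, 66, 70, 61, 64, 73]   # made-up 8-bit pixel run
bits = [1, 0, 1, 1]
stego = embed(cover, bits)
print(extract(stego, 4))        # [1, 0, 1, 1]: message recovered
print(psnr(cover, stego) > 48)  # True: LSB changes keep distortion tiny
```

Plain LSB embedding like this is exactly what the RS method is designed to detect, which is why robustness against RS steganalysis is the benchmark the paper reports.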
Title: Experimental study of a novel technique of data hiding with high PSNR
Venue: 2023 3rd International Conference on Advances in Computing, Communication, Embedded and Secure Systems (ACCESS)
Pub Date: 2023-05-18 | DOI: 10.1109/ACCESS57397.2023.10200022
Divneet Singh
Credit card fraud (CCF) is a persistent issue in the financial sector with serious consequences. Data mining has proven extremely useful for detecting fraud in online transactions, but detecting CCF this way is difficult for two reasons: the profiles of normal and fraudulent behaviour change constantly, and the datasets are highly skewed. The outcome of fraud detection in credit card transactions depends on the sampling approach, detection techniques, and variable selection. This work studies the performance of the K-Nearest Neighbor, Naive Bayes, Logistic Regression, and Random Forest algorithms on a highly skewed dataset of 284,807 European cardholder transactions. A hybrid of under-sampling and over-sampling techniques was applied to the skewed data. The four techniques were evaluated on both raw and preprocessed data using specificity, accuracy, sensitivity, and F1-score. The optimal accuracies for the Naive Bayes, Logistic Regression, K-Nearest Neighbor, and Random Forest classifiers are 98.72%, 52.34%, 96.89%, and 91.67%, respectively. The comparative results indicate that K-Nearest Neighbor performs better than the Logistic Regression, Random Forest, and Naive Bayes techniques.
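The resampling-plus-classifier pipeline can be sketched on toy data (the European cardholder dataset is not reproduced here): randomly under-sample the majority class, over-sample the minority by duplication, then classify with k-NN. Cluster locations and sample counts below are invented for illustration.

```python
import random

random.seed(1)

def hybrid_resample(X, y, target=6):
    """Balance a binary dataset: under-sample class 0, over-sample class 1."""
    maj = [i for i, c in enumerate(y) if c == 0]
    mino = [i for i, c in enumerate(y) if c == 1]
    maj = random.sample(maj, min(target, len(maj)))          # under-sample
    mino = [random.choice(mino) for _ in range(target)]      # duplicate-based
    idx = maj + mino
    return [X[i] for i in idx], [y[i] for i in idx]

def knn_predict(Xtr, ytr, x, k=3):
    """Majority label among the k nearest training points."""
    order = sorted(range(len(Xtr)),
                   key=lambda i: sum((a - b) ** 2 for a, b in zip(Xtr[i], x)))
    votes = [ytr[i] for i in order[:k]]
    return max(set(votes), key=votes.count)

# 12 legitimate transactions near (0,0), only 2 frauds near (5,5): skewed.
X = [(random.gauss(0, 0.5), random.gauss(0, 0.5)) for _ in range(12)] \
    + [(5.0, 5.2), (5.1, 4.9)]
y = [0] * 12 + [1] * 2

Xb, yb = hybrid_resample(X, y)
print(knn_predict(Xb, yb, (5.0, 5.0)))   # 1: flagged as fraud
print(knn_predict(Xb, yb, (0.1, 0.0)))   # 0: legitimate
```

Without the resampling step, the two fraud points could be outvoted by majority-class neighbours, which is the skew problem the abstract describes.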
Title: Protecting Contactless Credit Card Payments from Fraud through Ambient Authentication and Machine Learning
Venue: 2023 3rd International Conference on Advances in Computing, Communication, Embedded and Secure Systems (ACCESS)
Pub Date: 2023-05-18 | DOI: 10.1109/ACCESS57397.2023.10200411
Royal Kaushal, Raman Chadha
Sentiment analysis, also called opinion mining (OM), analyzes the sentiments expressed in data, using Natural Language Processing to classify the data into emotion classes. Social media data are analyzed to determine sentiment, and Machine Learning (ML) techniques classify the data. This study utilizes ML models to analyze sentiment in WhatsApp data. The sentiment analysis process includes pre-processing the data, extracting features, and classifying the data. The initial stage cleans the raw data and transforms it into a form suitable for analysis. Feature extraction retrieves from the pre-processed data the relevant features that contribute to determining sentiment. Finally, machine learning algorithms classify the data to determine the sentiments expressed in the text. This work proposes a voting classifier, a hybrid architecture comprising SVM, KNN, and a Decision Tree. The suggested algorithm is simulated in Python, and its performance is evaluated using accuracy, precision, and recall, metrics that measure how accurately the algorithm classifies the sentiments present in the data.
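Hard majority voting, the core of the proposed hybrid, can be sketched as follows. The three base models here are deliberately trivial stand-ins of my own invention; a real implementation would plug in the trained SVM, KNN, and Decision Tree.

```python
def clf_keyword(text):            # stand-in for the Decision Tree
    return 1 if "good" in text or "great" in text else 0

def clf_lexicon(text):            # stand-in for the SVM
    score = text.count("love") + text.count("great") - text.count("bad")
    return 1 if score > 0 else 0

def clf_length(text):             # stand-in for KNN (deliberately weak)
    return 1 if len(text) > 40 else 0

def vote(text, classifiers):
    """Hard voting: predict the majority class among the base models."""
    preds = [c(text) for c in classifiers]
    return 1 if sum(preds) > len(preds) / 2 else 0

models = [clf_keyword, clf_lexicon, clf_length]
print(vote("what a great day, love this chat", models))   # 1 (positive)
print(vote("bad service", models))                          # 0 (negative)
```

Note that the first message is classified correctly even though one base model votes wrong, which is the robustness argument for ensemble voting.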
Title: Hybrid Model for Sentiment Analysis of Whatsapp Data
Venue: 2023 3rd International Conference on Advances in Computing, Communication, Embedded and Secure Systems (ACCESS)
Pub Date: 2023-05-18 | DOI: 10.1109/ACCESS57397.2023.10201024
Jeejo K P, Bobby Mathews C
Healthcare, one of the most widely used IoT applications, aims to reduce the need for hospitalisation by monitoring patients' vital signs throughout the day over several weeks. Many sensors, including vital-sign and unstructured-message sensors as well as environmental monitoring sensors, are deployed in healthcare systems to collect patient information and lower costs for patients. Several issues along the transmission channel can cause loss of the data gathered by sensors integrated into medical equipment, so creating a safe communication method in IoT networks is crucial for the healthcare industry. To identify near-optimal routes and create a cutting-edge optimal route selection model for IoT healthcare, this article employs the Improved Multi-Objective Genetic Algorithm (IMOGA) technique: the best path for medical data is chosen while taking energy, distance, and delay into account. The performance of the adopted approach is then compared with existing methods. According to experimental findings, the suggested strategy improves energy, distance, and delay by 14%, 2%, and 5.6%, respectively.
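The multi-objective selection criterion can be illustrated with a Pareto-dominance filter over candidate routes scored on energy, distance, and delay. A full IMOGA would additionally evolve routes with crossover and mutation; the route table below is invented for illustration.

```python
def dominates(a, b):
    """Route a dominates b if it is no worse on every objective and
    strictly better on at least one (all objectives minimised)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# (energy mJ, distance m, delay ms) for each hypothetical candidate route
routes = {
    "A": (10.0, 120.0, 5.0),
    "B": (12.0, 100.0, 4.0),   # trades energy for distance/delay vs A
    "C": (11.0, 130.0, 6.0),   # dominated by A
    "D": (15.0, 140.0, 7.0),   # dominated by A and C
}

pareto = [r for r, obj in routes.items()
          if not any(dominates(other, obj)
                     for o, other in routes.items() if o != r)]
print(sorted(pareto))   # ['A', 'B']: the non-dominated front
```

A GA then searches among such non-dominated routes, since no single path is best on energy, distance, and delay simultaneously.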
Title: Development of an Innovative Optimal Route Selection Model Based on an Improved Multi-Objective Genetic Algorithm (IMOGA) Method in IoT Healthcare
Venue: 2023 3rd International Conference on Advances in Computing, Communication, Embedded and Secure Systems (ACCESS)