Pub Date: 2023-09-10  DOI: 10.12694/scpe.v24i3.2138
Jatla Srikanth, Avula Damodaram Shanmugam
In today's world, advanced internet technologies have significantly increased people's affinity for social networks as a way to stay updated on current events and communicate with others living in different cities. Analyses of social opinion helped determine the optimal public health response during the COVID-19 pandemic, and analysis of tweets can reveal the public's perception of social distancing. Sentiment analysis is used to classify text data and analyze people's emotions. The proposed work uses an LSTM-RNN combined with the SMOTE oversampling method to categorize Twitter data. The suggested approach is built on an LSTM-RNN network whose features are enhanced and weighted by attention layers, and it quantifies the advantage of this improved information-transformation framework, provided by the attention mechanism, over existing Bi-LSTM and LSTM models. Four publicly available class labels, happy, sad, neutral, and angry, are analyzed. Tweet messages are analyzed for polarity and subjectivity using TextBlob, VADER (Valence Aware Dictionary and sEntiment Reasoner), and SentiWordNet. The model is built and evaluated using two feature extraction methods, TF-IDF (Term Frequency-Inverse Document Frequency) and Bag of Words (BoW). Compared to previous methodologies, the proposed deep learning model improves considerably on performance measures including accuracy, precision, and recall, demonstrating that the approach is effective, practical, and simple to employ for sentiment categorization of COVID-19 reviews. The proposed method achieves 97% accuracy in text classification, whereas the best existing Bi-LSTM achieves 88%.
{"title":"A Deep LSTM-RNN Classification Method for Covid-19 Twitter Review Based on Sentiment Analysis","authors":"Jatla Srikanth, Avula Damodaram Shanmugam","doi":"10.12694/scpe.v24i3.2138","DOIUrl":"https://doi.org/10.12694/scpe.v24i3.2138","url":null,"abstract":"In today’s world, advanced internet technologies have significantly increased people’s affinity towards social networks to stay updated on current events and communicate with others residing in different cities. Social opinion analyses helped determine the optimal public health response during the COVID-19 pandemic. Analysis of articulating tweets from Twitter can reveal the public’s perceptions of social distance. Sentiment Analysis is used for classifying text data and analyzing people’s emotions. The proposed work uses LSTM-RNN with the SMOTE method for categorizing Twitter data. The suggested approach uses increased characteristics weighted by attention layers and an LSTM-RNN-based network as its foundation. This method computes the advantage of an improved information transformation framework through the attention mechanism compared to existing BI-LSTM and LSTM models. A combination of four publicly accessible class labels such as happy, sad, neutral, and angry, is analyzed. The message of tweets is analyzed for polarization and subjectivity using TextBlob, VADER (Valence Aware Dictionary for Sentiment Reasoning), and SentiWordNet. The model has been successfully built and evaluated using two feature extraction methods, TF-IDF (Term Frequency-Inverse Document Frequency) and Bag of Words (BoW). Compared to the previous methodologies, the suggested deep learning model improved considerably in performance measures, including accuracy, precision, and recall. This demonstrates how effective and practical the recommended deep learning strategy is and how simple it is to employ for sentiment categorization of COVID-19 reviews. The proposed method achieves 97% accuracy in classifying the text whereas, among existing Bi-LSTM, achieves 88% maximum in the text classification.","PeriodicalId":43791,"journal":{"name":"Scalable Computing-Practice and Experience","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"136071353","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-09-10  DOI: 10.12694/scpe.v24i3.2272
Joseph Bamidele Awotunde, Tarek Gaber, L V Narasimha Prasad, Sakinat Oluwabukonla Folorunso, Vuyyuru Lakshmi Lalitha
The emergence of the Internet of Things (IoT) has accelerated the implementation of various smart city applications and initiatives. However, the rapid adoption of IoT-powered smart cities faces a number of security and privacy challenges that hinder their application in areas such as critical infrastructure. Safety is one of the most crucial elements of any smart city: without the right safeguards, bad actors can quickly exploit weak systems to access networks or sensitive data. Security is a major concern alongside safety; smart cities become easy targets for attackers attempting to steal data or disrupt services if they are not adequately protected against cyberthreats such as malware or distributed denial-of-service (DDoS) attacks. To safeguard their systems from potential threats, organizations must therefore employ strong security protocols, including encryption, authentication, and access control measures, and should deploy capable network firewalls and intrusion detection systems (IDS) to keep network traffic secure. This article proposes a blockchain-supported hybrid Convolutional Neural Network (CNN) with Kernel Principal Component Analysis (KPCA) to provide privacy and security for smart city users and systems. Blockchain is used to provide trust, and a KPCA-enabled CNN is used to classify threats. The proposed solution comprises three steps: preprocessing, feature extraction, and classification. During preprocessing, the standard features of the datasets are converted to a numeric format and passed to KPCA for feature extraction, which reduces the dimensionality of the relevant features before the resulting dataset is passed to the CNN to classify and detect malicious activities. Two prominent datasets, ToN-IoT and BoT-IoT, were used to measure the performance of the proposed method against its best rivals in the literature. Experimental results show improved threat prediction accuracy and, hence, increased security, privacy, and maintainability of IoT-enabled smart cities.
{"title":"Privacy and Security Enhancement of Smart Cities using Hybrid Deep Learning-enabled Blockchain","authors":"Joseph Bamidele Awotunde, Tarek Gaber, L V Narasimha Prasad, Sakinat Oluwabukonla Folorunso, Vuyyuru Lakshmi Lalitha","doi":"10.12694/scpe.v24i3.2272","DOIUrl":"https://doi.org/10.12694/scpe.v24i3.2272","url":null,"abstract":"The emergence of the Internet of Things (IoT) accelerated the implementation of various smart city applications and initiatives. The rapid adoption of IoT-powered smart cities is faced by a number of security and privacy challenges that hindered their application in areas such as critical infrastructure. One of the most crucial elements of any smart city is safety. Without the right safeguards, bad actors can quickly exploit weak systems to access networks or sensitive data. Security issues are a big worry for smart cities in addition to safety issues. Smart cities become easy targets for attackers attempting to steal data or disrupt services if they are not adequately protected against cyberthreats like malware or distributed denial-of-service (DDoS) attacks. Therefore, in order to safeguard their systems from potential threats, businesses must employ strong security protocols including encryption, authentication, and access control measures. In order to ensure that their network traffic remains secure, organizations should implement powerful network firewalls and intrusion detection systems (IDS). This article proposes a blockchain-supported hybrid Convolutional Neural Network (CNN) with Kernel Principal Component Analysis (KPCA) to provide privacy and security for smart city users and systems. Blockchain is used to provide trust, and CNN enabled with KPCA is used for classifying threats. The proposed solution comprises three steps, preprocessing, feature selection, and classification. The standard features of the datasets used are converted to a numeric format during the preprocessing stage, and the result is sent to KPCA for feature extraction. Feature extraction reduces the dimensionality of relevant features before it passes the resulting dataset to the CNN to classify and detect malicious activities. Two prominent datasets namely ToN-IoT and BoT-IoT were used to measure the performance of this anticipated method compared to its best rivals in the literature. Experimental evaluation results show an improved performance in terms of threat prediction accuracy, and hence, increased security, privacy, and maintainability of IoT-enabled smart cities.","PeriodicalId":43791,"journal":{"name":"Scalable Computing-Practice and Experience","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"136071900","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The existing approach for monitoring the insulation state of photoelectric composite submarine cables primarily relies on detecting the current of the cable protection layer. However, this conventional method suffers from limited monitoring accuracy because no parameter identification is performed for the cable, so the monitoring methodology needs to be improved by incorporating robust parameter identification techniques to enhance the accuracy of insulation state evaluation. In this regard, a real-time monitoring method based on thermoelectric coupling is proposed to monitor the insulation status of the photoelectric composite submarine cable. An equivalent composite circuit model and a thermodynamic function are combined into a thermoelectric coupling model that is used to identify the parameters of the submarine cable; by extracting the frequency extremes in the spectrum of the submarine cable current signal, an equivalent insulation characteristic function is constructed to determine the insulation state. The insulation state monitoring performance of the proposed method is verified experimentally. The results show that when the proposed method is used to monitor the insulation state of the photoelectric composite submarine cable, the calculated partial discharge quantity has a small error and the monitoring accuracy is high.
{"title":"Real-time Monitoring Method of Insulation Status of Photoelectric Composite Submarine Cable based on Thermoelectric Coupling","authors":"Xinli Lao, Jiajian Zhang, Chuanlian Gao, Huakun Deng, Yanlei Wei, Zhenzhong Liu","doi":"10.12694/scpe.v24i3.2282","DOIUrl":"https://doi.org/10.12694/scpe.v24i3.2282","url":null,"abstract":"The existing approach for monitoring the insulation state of photoelectric composite submarine cables primarily relies on detecting the current of the cable protection layer. However, this conventional method suffers from limited monitoring accuracy due to the absence of parameter identification processing for the cable. As a result, there is a need to improve the monitoring methodology by incorporating robust parameter identification techniques to enhance the accuracy of insulation state evaluation. In this regard, a real-time monitoring method based on thermoelectric coupling is proposed to monitor the insulation status of the photoelectric composite submarine cable. By constructing an equivalent composite circuit model and a thermodynamic function, a thermoelectric coupling model is constructed and used to identify the parameters of the submarine cable; by extracting the frequency extremes in the spectral values of the submarine cable current signal, an equivalent insulation characteristic function is constructed to realize the determination of the insulation state. The proposed method is verified for the insulation state monitoring effect in the experiment. The experimental results show that when the proposed method is used to monitor the insulation state of the photoelectric composite submarine cable, the calculated partial discharge quantity has a small error, and the monitoring accuracy is high.","PeriodicalId":43791,"journal":{"name":"Scalable Computing-Practice and Experience","volume":"38 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"136072012","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-09-10  DOI: 10.12694/scpe.v24i3.2318
Lu Lu
In the actual operation of conventional digital agricultural greenhouse monitoring systems, there are problems such as a limited monitoring scope and large deviations between the monitoring results and the actual situation in the greenhouse. To solve this problem, a new remote monitoring system is proposed by introducing Internet of Things technology. After the hardware design of the remote monitoring system is completed, the optimal fused data value for remote monitoring of the digital agricultural greenhouse is obtained by establishing a monitoring data fusion model. A particle swarm optimization fuzzy control algorithm is designed to dynamically optimize the system's adaptive remote monitoring process. Internet of Things technology is used to deploy the remote monitoring system for digital agricultural greenhouses online, ensuring the quality and timeliness of remote monitoring. The test results show that the new system significantly reduces the greenhouse remote monitoring deviation, with monitoring values close to the actual values.
{"title":"Remote Monitoring System of Digital Agricultural Greenhouse Based on Internet of Things","authors":"Lu Lu","doi":"10.12694/scpe.v24i3.2318","DOIUrl":"https://doi.org/10.12694/scpe.v24i3.2318","url":null,"abstract":"In the actual operation process of the conventional digital agricultural greenhouse monitoring system, there are problems such as limited monitoring scope and large deviation between the monitoring results and the actual situation of the greenhouse. To solve this problem, a new remote monitoring system is proposed by introducing the technology of the Internet of Things. On the basis of the completion of the hardware design of the remote monitoring system, the optimal fusion data value of the remote monitoring of the digital agricultural greenhouse is obtained by establishing the monitoring data fusion model. The particle swarm optimization fuzzy control algorithm is designed to optimize the adaptive remote monitoring process of the system dynamically. The Internet of Things technology is used to deploy the remote monitoring system of digital agricultural greenhouses online to fully ensure the quality and timeliness of the remote monitoring system. The test results show that the new system can significantly improve the greenhouse remote monitoring deviation, and the monitoring value is close to the actual value.","PeriodicalId":43791,"journal":{"name":"Scalable Computing-Practice and Experience","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"136072015","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-09-10  DOI: 10.12694/scpe.v24i3.2438
Jia Lin, Bigerng Zheng, Zhijian Chen
A visual and interactive network topology model is studied based on real-time features generated during spaceflight. The work begins by establishing a consistent set of data and logical interaction interfaces. This paper presents a method for scenario model construction and application programming based on virtual reality technology. Scene elements are abstracted into two types of primitives, logical primitives and simulated-object primitives, providing a unified architecture for editing and processing graphic elements, and the system can create scenes automatically. The point cloud obtained from sparse structure-from-motion (SfM) reconstruction is then refined by Poisson surface reconstruction, yielding a dense, uniform mesh. Experiments show that the proposed algorithm can perform 3D reconstruction of non-cooperative objects, and the spatial feature points obtained during spatial positioning of non-cooperative objects can provide the technical support needed for orbit positioning. The model can quickly generate new scenario pages according to the characteristics of the task. This method replaces a display mode that could previously only be static or, at best, limitedly dynamic, and it improves the efficiency of space mission preparation.
{"title":"Application of Data Visualization Interaction Technology in Aerospace Data Processing","authors":"Jia Lin, Bigerng Zheng, Zhijian Chen","doi":"10.12694/scpe.v24i3.2438","DOIUrl":"https://doi.org/10.12694/scpe.v24i3.2438","url":null,"abstract":"A visualization and interactive network topology model are studied based on real-time features generated during spaceflight. Start by establishing a consistent set of data and logical interaction interfaces. This paper presents a method of scenario model construction and application programming based on virtual reality technology. The scene elements are extracted into two types of primitives, namely logical type and simulated object type. This provides a unified architecture for the editing and processing of graphic elements. This system can realize the automatic creation of the scene. Then the point cloud data obtained by sparse reconstruction of SFM is reconstructed to the Poisson surface. You get a dense, uniform grid. Experiments show that the proposed algorithm can realize the 3D reconstruction of non-cooperative objects. The spatial feature points obtained in the spatial positioning of non-cooperative objects can provide necessary technical support for its orbit positioning. The model can quickly generate new model scenario pages according to the characteristics of the task. This method changes the display mode, which can only be static or limited dynamic before. It has also improved the efficiency of space mission preparation.","PeriodicalId":43791,"journal":{"name":"Scalable Computing-Practice and Experience","volume":"3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"136072159","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-09-10  DOI: 10.12694/scpe.v24i3.2163
Xiaoteng Liu
The article addresses the challenges posed by the inadequacy of traditional detection methods in effectively handling the substantial volume of software behavior samples, particularly in big data settings. A novel approach is proposed that leverages big data technology to detect malicious computer code signals. Additionally, it tackles the issues associated with machine learning-based mobile malware detection, namely the presence of a large number of features, low detection accuracy, and imbalanced data distribution. To resolve these challenges, this paper presents a multifaceted methodology. First, it introduces a feature selection technique based on mean and variance analysis to eliminate irrelevant features that hinder classification accuracy. Next, a comprehensive classification method is implemented, utilizing feature extraction techniques such as principal component analysis (PCA), the Karhunen-Loève transform (KLT), and independent component analysis (ICA); together these techniques improve the precision of the detection process. Recognizing the unbalanced data distribution among software samples, the study proposes a multi-level classification integration model grounded in decision trees. The research thus focuses on enhancing accuracy and mitigating the impact of data imbalance through a combination of feature selection, feature extraction, and a multi-level classification model. The empirical results highlight the effectiveness of the proposed methodologies, showing accuracy improvements ranging from 3.36% to 6.41% over different detection methods on the Android platform. The introduced malware detection technology, grounded in source code analysis, demonstrates a promising capacity to identify Android malware effectively.
{"title":"Computer Malicious Code Signal Detection based on Big Data Technology","authors":"Xiaoteng Liu","doi":"10.12694/scpe.v24i3.2163","DOIUrl":"https://doi.org/10.12694/scpe.v24i3.2163","url":null,"abstract":"The article addresses the challenges modelled by the inadequacy of traditional detection methods in effectively handling the substantial volume of software behavior samples, particularly in big data. A novel approach is proposed for leveraging big data technology to detect malicious computer code signals. Additionally, it seeks to attack the issues associated with machine learning-based mobile malware detection, namely the presence of a large number of features, low accuracy in detection, and imbalanced data distribution. To resolve these challenges, this paper presents a multifaceted methodology. First, it introduces a feature selection technique based on mean and variance analysis to eliminate irrelevant features hindering classification accuracy. Next, a comprehensive classification method is implemented, utilizing various feature extraction techniques such as principal component analysis (PCA), Kaehunen-Loeve transform (KLT), and independent component analysis (ICA). These techniques collectively contribute to enhancing the Precision of the detection process. Recognizing the issue of unbalanced data distribution among software samples, the study proposes a multi-level classification integration model grounded in decision trees. In response, the research focuses on enhancing accuracy and mitigating the impact of data imbalance through a combination of feature selection, extraction techniques, and a multi-level classification model. The empirical results highlight the effectiveness of the proposed methodologies, showcasing notable accuracy improvements ranging from 3.36% to 6.41% across different detection methods on the Android platform. The introduced malware detection technology, grounded in source code analysis, demonstrates a promising capacity to identify Android malware effectively.","PeriodicalId":43791,"journal":{"name":"Scalable Computing-Practice and Experience","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"136071356","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-09-10  DOI: 10.12694/scpe.v24i3.2342
Yang Xu
Watching live sporting events reflects people's emotions and intentions: it keeps spectators entertained, shifts their mindset from stressed to joyful, and encourages athletes and sportspersons to participate. A live event draws a large crowd of spectators, and the emotions and intentions of this crowd directly influence how the event unfolds; positive energy from the audience can push demotivated participants to perform better. This study analyzes the facial emotions of spectators at live events: the expressions are decoded computationally and an outcome is produced, giving an understanding of the emotions and sentiments that shape the event's atmosphere. Computer software analysis is used to compute the effect of the audience's emotions and behaviors in the crowd and the impact of those reactions on the event. The data are gathered from secondary sources, including articles and journals on the topic, and analyzed by comparing them with spectators' reactions and expressions at live sporting events.
{"title":"Analyzing Spectator Emotions and Behaviors at Live Sporting Events using Computer Vision and Sentiment Analysis Techniques","authors":"Yang Xu","doi":"10.12694/scpe.v24i3.2342","DOIUrl":"https://doi.org/10.12694/scpe.v24i3.2342","url":null,"abstract":"It makes the reflection of humans’ emotions and intentions from watching live sports events. Watching the event keeps people entertained and changes their mindset from being stressed to joyful. Watching sports events encourages the athletes and the sports persons to participate. Reflection of the live sports event consists of many crowds as the event’s audience. This crowd’s emotions and intentions directly impact the changes in the event’s performance. It provides positive energy to the demotivated sports participants, making them perform better in the event. This study reflects the understanding of the facial emotions of the spectators from the live event. Then, they are decoded in the computer programming language, and an outcome is provided. It understands the emotions and sentiments of the people that affect the event’s environment. The representation by the computer analysis makes the understanding of the changes provided by the spectators of the live event. The effect of the audience’s emotions and behaviors in the crowd are computed by the utilization of computer software analysis and the effect of those reactions in the event. The collection of data is taken from the secondary sources of data collection, including the collection of information from the article and the journal based on the topic. The gathered data is analyzed by comparing them with their reaction and expressions in the live sports event.","PeriodicalId":43791,"journal":{"name":"Scalable Computing-Practice and Experience","volume":"77 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"136071359","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-09-10  DOI: 10.12694/scpe.v24i3.2122
Sandeep Yelisetti, Nellore Geethanjali
Sentiment analysis has gained increasing attention from educational and social perspectives with the huge expansion of user interactions driven by the Web's growth. Aspect-based sentiment analysis defines the connection between an opinion target's polarity scores and other aspects of the content. Identifying aspects and determining their polarities is complicated because they are frequently implicit. To overcome these difficulties, efficient hybrid methods are used for aspect-based text classification in sentiment analysis. The existing approach evaluates aspect polarity with a convolutional neural network and does not scale to big data. In this work, aspect-based text classification and attention mechanisms are used to filter out irrelevant information and quickly locate the essential features in big data. The data is first collected and then preprocessed using tokenization, stop-word removal, stemming, and lemmatization. After preprocessing, features are vectorized and extracted using Bag-of-Words and TF-IDF, and the extracted features are passed to word embeddings produced by GloVe and Word2vec. A deep recurrent bidirectional long short-term memory network (RU-BiLSTM) is used for aspect-based sentiment analysis. The RU-BiLSTM method integrates aspect-based embeddings with an attention mechanism for text classification: the attention mechanism focuses on the most important aspects, while the bidirectional LSTM maintains context in both directions. Finally, binary and ternary classification outcomes are obtained from a dense softmax output layer. The proposed RU-BiLSTM is evaluated on four review datasets and two Twitter datasets. The results demonstrate the efficacy of the RU-BiLSTM model, which outperformed existing aspect-based classification approaches on both lengthy reviews and short tweets.
{"title":"Aspect-based Text Classification for Sentimental Analysis using Attention mechanism with RU-BiLSTM","authors":"Sandeep Yelisetti, None Nellore Geethanjali","doi":"10.12694/scpe.v24i3.2122","DOIUrl":"https://doi.org/10.12694/scpe.v24i3.2122","url":null,"abstract":"Sentiment analysis has gained increasing attention from an educational and social perspective with the huge expansion of user interactions due to the Web’s significant improvement. The connection between an opinion target’s polarity scores and other aspects of the content is defined by aspect-based sentiment analysis. Identifying aspects and determining their different polarities is quite complicated because they are frequently implicit. To overcome these difficulties, efficient hybrid methods are used in aspect-based text classification in sentiment analysis. The existing process evaluates the aspects of polarity by using a Convolutional neural network, and it does not work with Big data. In this work, aspect-based text classification and attention mechanisms are used to assist in filtering out irrelevant information and quickly locating the essential features in big data. Initially, the data is collected, and then the data is preprocessed by using Tokenization, Stop word removal, Stemming, and Lemmatization. After preprocessing, the features are vectorized and extracted using Bag-of-Words and TF-IDF. Then, the extracted features are given into word embeddings by GloVe and Word2vec. It uses Deep Recurrent based Bidirectional Long Short Term Memory (RUBiLSTM) for aspect-based sentiment analysis. The RU-Bi-LSTM method integrates aspect-based embeddings and an attention mechanism for text classification. The attention mechanism focuses on more crucial aspects and the bidirectional LSTM to maintain context in both ways. Finally, the binary and ternary classification outcomes are obtained using the final dense softmax output layer. The proposed RU-BiLSTM uses four reviews and two Twitter datasets. The results of the studies demonstrate the efficacy of the RU-BiLSTM model, which outperformed aspect-based classifications on lengthy reviews and short tweets in terms of evaluation.","PeriodicalId":43791,"journal":{"name":"Scalable Computing-Practice and Experience","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"136071664","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-09-10  DOI: 10.12694/scpe.v24i3.2372
Xin Chen
The dynamic vehicle routing problem presents a significant difficulty in the logistics sector, an unavoidable consequence of today's rapidly changing technological society. The study proposes a dynamic vehicle routing problem with time windows model in order to establish an effective, low-energy dynamic response method. The fundamental concept is to replace the conventional strategy of responding to dynamic customers in fixed time slots by dividing the dynamic time window into static time windows with several time-slice intervals. The study draws on ideas including the degree of dynamism, before-and-after time slicing, and continuous optimisation, and proposes a new solution method that optimises dynamic vehicle routes effectively and affordably. The study uses the Solomon optimisation dataset and runs simulation experiments on the Java platform to confirm its efficacy. The experimental findings show that the optimisation technique reduced the travelling cost by 83.8 miles while considerably increasing average vehicle utilisation by 3.6%. Because driving-distance cost and vehicle-number cost are typically positively correlated with the degree of dynamism, the solutions employed in the study can increase dynamic response efficiency and save money, and their robustness is therefore higher.
{"title":"Time Window Oriented IoT Vehicle Pathway Study for the Dynamically Changing Needs of E-Commerce Customers","authors":"Xin Chen","doi":"10.12694/scpe.v24i3.2372","DOIUrl":"https://doi.org/10.12694/scpe.v24i3.2372","url":null,"abstract":"The main dynamic truck routing problem also presents a significant difficulty in the logistics sector, which is an unavoidable development trend of the contemporary technological changing society. A dynamic vehicle routing problem with time window model is suggested by the study in order to establish an effective and low-energy dynamic response method. The fundamental concept is to disrupt the conventional strategy of static dynamic consumers responding in time slots by dividing the dynamic time window into a static time window with several time slice intervals. The study makes use of cutting-edge ideas including dynamic attitude, before-and-after time slicing, and continuous optimisation while proposing a new method for model solution to optimise dynamic vehicle route issues effectively and affordably. The study employs the Solomon optimisation dataset and runs simulation studies on the Java platform to confirm its efficacy. The experimental findings demonstrated that the optimisation technique employed in the study reduced the cost of travelling by 83.8 miles while also considerably increasing the average vehicle utilisation by 3.6%. Because driving distance cost and vehicle number cost are typically positively connected with dynamic attitude, the study employs solutions that can increase dynamic response efficiency and save money. As a result, their robustness is higher.","PeriodicalId":43791,"journal":{"name":"Scalable Computing-Practice and Experience","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"136071684","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Conventional online monitoring data release for composite submarine cables mostly adopts the method and principle of blockchain dynamic zoning consensus. The release process suffers from omissions and takes a long time to complete, which reduces the timeliness of online monitoring data release for composite submarine cables. Based on this, a new data publishing method is proposed by introducing horizontal federated learning. First, the online monitoring data of composite submarine cables are collected and preprocessed to eliminate the high-frequency capacitive effect of the submarine cable. Second, the composite submarine cable data nodes are managed and their status relationships transformed to ensure the quality of subsequent data release. A horizontal federated learning model is then established to design the online monitoring data release process. The experimental results show that the new data release method is highly feasible: as the volume of online monitoring data for composite submarine cables grows, the time required for data release remains short and the timeliness is high.
{"title":"A Method for Online Monitoring Data Release of Composite Submarine Cable Based on Horizontal Federated Learning","authors":"Xinli Lao, Jiajian Zhang, Chuanlian Gao, Huakun Deng, Yanlei Wei, Zhenzhong Liu","doi":"10.12694/scpe.v24i3.2275","DOIUrl":"https://doi.org/10.12694/scpe.v24i3.2275","url":null,"abstract":"Conventional online composite submarine cable monitoring data release mostly adopts the method and principle of blockchain dynamic zoning consensus. In the data release process, there are omissions, and it takes a long time to complete the task, which reduces the timeliness of online composite submarine cable monitoring data release. Based on this, a new data publishing method is proposed by introducing horizontal federation learning. First, the online monitoring data of composite submarine cables are collected and preprocessed to eliminate the high-frequency capacitive effect of submarine cables. Secondly, manage composite submarine cable data nodes, transform the status relationship of data nodes, and ensure the quality of subsequent data release. A horizontal federation learning model is established to design the online monitoring data release process. The experimental results show that the new data release method is highly feasible. With the increasing online monitoring data of composite submarine cables, the time required for data release is short, and the timeliness is high.","PeriodicalId":43791,"journal":{"name":"Scalable Computing-Practice and Experience","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"136071655","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}