
Journal of Advances in Information Technology: Latest Publications

Evaluation of Illumination in 3D Scenes Based on Heat Maps Comparison
IF 1 Q3 Computer Science Pub Date: 2023-01-01 DOI: 10.12720/jait.14.3.601-605
A. Mezhenin, V. Izvozchikova, Ivan A. Mezhenin
We consider the problem of assessing the quality of lighting in computer-generated 3D scenes that use different lighting systems. Quality lighting increases realism and immersion and improves the perception of the shape, color, and texture of objects in the image. Existing professional engineering programs for lighting calculation are not well suited to design work, artistic solutions, or gaming scenes. To obtain objective estimates of illumination, we propose to use metrics for evaluating the quality of rendering systems. Particular attention is paid to tools such as heat maps. Visual analysis of heat maps by hue or intensity helps to compare and evaluate the quality of scene illumination. However, such a comparison does not yield a cumulative score. A possible solution is to treat heat maps as images and use them as the basis for a generalized heat map that produces a single cumulative statistic. To create the generalized heat map, several ways of constructing a difference matrix based on normalization methods are proposed. The proposed approach is implemented as a prototype application, and experiments were carried out on test scenes with different illumination systems. The generalized heat maps made it possible to obtain cumulative estimates for comparing different lighting approaches and to identify the areas most sensitive to changes in illumination. According to the authors, the proposed approach to illuminance estimation for staged lighting can be used to improve the realism of visualization in 3D modeling.
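The abstract does not specify which normalization is used to build the difference matrix; the sketch below assumes simple min-max normalization and takes the mean of the difference matrix as the single cumulative statistic (function names are illustrative):

```python
import numpy as np

def normalize(m):
    """Min-max normalize a heat map so maps with different dynamic
    ranges become comparable (assumed normalization, not the paper's)."""
    m = np.asarray(m, dtype=float)
    rng = m.max() - m.min()
    return (m - m.min()) / rng if rng > 0 else np.zeros_like(m)

def difference_matrix(map_a, map_b):
    """Element-wise absolute difference of two normalized heat maps."""
    return np.abs(normalize(map_a) - normalize(map_b))

def cumulative_score(map_a, map_b):
    """Single cumulative statistic: mean of the difference matrix
    (0 = identical illumination distribution)."""
    return float(difference_matrix(map_a, map_b).mean())

a = [[0.0, 1.0], [2.0, 3.0]]   # heat map of lighting setup A
b = [[3.0, 2.0], [1.0, 0.0]]   # heat map of lighting setup B
print(cumulative_score(a, a))  # 0.0
print(cumulative_score(a, b))  # ~0.667
```

Per-pixel values of the difference matrix also indicate which scene areas are most sensitive to the change of lighting system.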
Citations: 0
Detecting Unusual Activities in Local Network Using Snort and Wireshark Tools
IF 1 Q3 Computer Science Pub Date: 2023-01-01 DOI: 10.12720/jait.14.4.616-624
N. Alsharabi, Maha Alqunun, Belal Abdullah Hezam Murshed
Many organizations worldwide encounter security risks on their local networks caused by malware, which can result in the loss of sensitive data. Network administrators should therefore use efficient tools to observe instantaneous network traffic and detect any suspicious activity. This project aims to detect incidents in local networks using the Snort and Wireshark tools, combining their advantages to achieve maximum benefit, enhance the security level of local networks, and protect data. The Snort Intrusion Detection System (Snort-IDS) is a network security tool: Snort-IDS rules are used to match packet traffic, and when packets match the rules, Snort-IDS generates alert messages. First, the project uses a virtual dataset that includes normal and abnormal traffic for the performance evaluation test, and local rules are designed to detect anomalous activities. Second, Wireshark software is used to analyze the data packets. The project categorizes the detected patterns into two groups: anomaly-based detection and signature-based detection. The results revealed the efficiency of the Snort-IDS system in detecting unusual activities under both patterns and in generating further information through Wireshark analysis, such as the source, destination, and protocol type. The proposed method was tested on a virtual local network to confirm its effectiveness.
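The project's custom local rules are not reproduced in the abstract; the following is an illustrative Snort local rule of the kind described (the sid, message text, and ICMP trigger are hypothetical; $HOME_NET and $EXTERNAL_NET are the usual Snort configuration variables):

```text
# local.rules — illustrative example only
alert icmp $EXTERNAL_NET any -> $HOME_NET any \
    (msg:"LOCAL ICMP echo request"; itype:8; sid:1000001; rev:1;)
```

A rule like this raises an alert on inbound ICMP echo requests; the generated alerts can then be correlated with the corresponding packets in a Wireshark capture.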
Citations: 0
Intrusion Detection System in IoT Based on GA-ELM Hybrid Method
IF 1 Q3 Computer Science Pub Date: 2023-01-01 DOI: 10.12720/jait.14.4.625-629
Elijah M. Maseno, Z. Wang, Fangzhou Liu
In recent years, we have witnessed rapid growth in the application of IoT globally, in both governmental and non-governmental institutions. The integration of a large number of electronic devices exposes IoT technologies to various forms of cyber-attacks, and cybercriminals have shifted their focus to the IoT because it provides a broad network intrusion surface. To better protect IoT devices, we need intelligent intrusion detection systems. This work proposes a hybrid detection system based on a Genetic Algorithm (GA) and an Extreme Learning Machine (ELM). The main limitation of ELM is that the initial parameters (weights and biases) are chosen randomly, which affects the algorithm's performance. To overcome this, a GA is used to select the input weights. In addition, the choice of activation function is key to the optimal performance of a model; in this work, we use different activation functions to demonstrate their importance in the construction of GA-ELM. The proposed model was evaluated on the TON_IoT network dataset, an up-to-date heterogeneous dataset that captures sophisticated cyber threats in the IoT environment. The results show that the GA-ELM model achieves higher accuracy than a single ELM. In addition, ReLU outperformed the other activation functions, which can be attributed to its fast learning and to the fact that it avoids the vanishing-gradient problem seen with the sigmoid activation function.
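The abstract gives no implementation details; below is a minimal, illustrative sketch of the GA-ELM idea on toy regression data, assuming a ReLU hidden layer with least-squares output weights and a stripped-down GA (elitist selection plus Gaussian mutation only, no crossover):

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_predict(X, y, W, b):
    """ELM: the GA-supplied input weights W and biases b stay fixed;
    hidden layer H = ReLU(XW + b); output weights solved by least squares."""
    H = np.maximum(X @ W + b, 0.0)                    # ReLU activation
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return H @ beta

def fitness(X, y, W, b):
    """Negative mean squared error: higher is better."""
    return -np.mean((elm_predict(X, y, W, b) - y) ** 2)

# Toy data: learn y = x1 + x2.
X = rng.normal(size=(200, 2))
y = X.sum(axis=1)

n_hidden = 16
pop = [(rng.normal(size=(2, n_hidden)), rng.normal(size=n_hidden))
       for _ in range(20)]
for _ in range(15):                                   # GA generations
    pop.sort(key=lambda wb: fitness(X, y, *wb), reverse=True)
    parents = pop[:10]                                # elitist selection
    children = [(W + 0.1 * rng.normal(size=W.shape),  # Gaussian mutation
                 b + 0.1 * rng.normal(size=b.shape)) for W, b in parents]
    pop = parents + children

best_W, best_b = pop[0]
print(fitness(X, y, best_W, best_b))
```

The paper presumably uses a full GA with crossover, a classifier rather than a regressor, and the TON_IoT data instead of this toy problem; the sketch only shows where the GA plugs into the ELM (the input weights).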
Citations: 1
Fusion of CNN-QCSO for Content Based Image Retrieval
IF 1 Q3 Computer Science Pub Date: 2023-01-01 DOI: 10.12720/jait.14.4.668-673
Sarva Naveen Kumar, Ch. Sumanth Kumar
As the number of digital images on the Internet has grown rapidly over the last few years, retrieving a required image has become a significant problem. In this paper, a combined approach is designed for retrieving images from big data. The approach, CNN-QCSO, couples a deep learning technique, the Convolutional Neural Network (CNN), with an optimization technique, Quantum Cuckoo Search Optimization (QCSO). The CNN extracts features for a given query image, and the optimization technique helps reach the globally best features by changing the internal parameters of the processing layers. A Content-Based Image Retrieval (CBIR) method is proposed in this study. CNN is widely used in big data analysis, with many applications such as object identification, medical imaging, and security analysis. The combination of the two techniques helps identify images and achieves good results: CNN alone achieves an accuracy of 94.8%, and the accuracy improves by 1.6% when combined with QCSO. All experimental values were evaluated using MATLAB.
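The abstract does not describe the ranking step, but in a typical CBIR pipeline, once CNN features are extracted, database images are ranked by feature-space distance to the query. A minimal sketch with made-up 2-D descriptors standing in for CNN feature vectors:

```python
import numpy as np

def retrieve(query_feat, db_feats, k=3):
    """Rank database images by Euclidean distance between descriptors
    and return the indices of the k nearest ones."""
    d = np.linalg.norm(db_feats - query_feat, axis=1)
    return np.argsort(d)[:k]

db = np.array([[1.0, 0.0],   # image 0
               [0.9, 0.1],   # image 1
               [0.0, 1.0],   # image 2
               [0.5, 0.5]])  # image 3
q = np.array([1.0, 0.05])    # query descriptor
print(retrieve(q, db))       # nearest first: [0 1 3]
```

In the paper's setup the descriptors would come from the CNN tuned by QCSO; the ranking itself is independent of how the features were produced.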
Citations: 1
Open Banking API Framework to Improve the Online Transaction between Local Banks in Egypt Using Blockchain Technology
IF 1 Q3 Computer Science Pub Date: 2023-01-01 DOI: 10.12720/jait.14.4.729-740
Mohamed Hamed Mohamed Hefny, Y. Helmy, M. Abdelsalam
Blockchain technology is considered to have a high impact on the banking industry due to its potential to enable new ways of organizing and handling banking activities. It reduces the costs and time associated with intermediaries and improves trust and security. This study explores how blockchain technology could enhance fund transfer transactions between local banks in Egypt by providing a blockchain-based framework for conducting instant payments and financial transactions. Due to its properties, blockchain is qualified to play a vital role in the financial sector by helping financial institutions protect their routine financial transactions with a more secure, instant, and low-cost model. The findings show that blockchain technology's characteristics (enhanced security, transparency, data integrity, information immutability, and instant settlement), together with an open Application Programming Interface (API) architecture, enable seamless integration of financial services and applications. This approach will improve financial transactions between local banks in Egypt and support the growth of e-payments and digital transformation. The proposed framework, which uses blockchain and an open banking API architecture for fund transfers between local banks, gives banks a great opportunity to improve and positively impact digital transformation strategy, financial inclusion, digitization of payments, online SME finance, additional access points, partnerships with FinTechs, and the use of innovative technologies to bring further efficiency to banking and payments. By using a blockchain network for a domestic remittance Automated Clearing House (ACH), banks should be able to offer customers a faster, cheaper, and more efficient service.
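The framework itself is architectural, but the immutability property it relies on can be illustrated with a minimal hash-chained ledger (a sketch only, not the paper's actual ledger format; bank names and amounts are made up):

```python
import hashlib
import json

def block_hash(body):
    """Deterministic SHA-256 over the canonical JSON form of a block body."""
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def make_block(payload, prev_hash):
    """Minimal hash-chained record: each block commits to its payload
    and to the previous block's hash."""
    body = {"payload": payload, "prev": prev_hash}
    return {**body, "hash": block_hash(body)}

def valid_chain(chain):
    """A chain is valid if every hash matches its body and every block
    links to its predecessor; any tampering breaks validation."""
    for prev, cur in zip(chain, chain[1:]):
        if cur["prev"] != prev["hash"]:
            return False
    return all(b["hash"] == block_hash({"payload": b["payload"], "prev": b["prev"]})
               for b in chain)

genesis = make_block({"note": "genesis"}, "0" * 64)
transfer = make_block({"from": "Bank A", "to": "Bank B", "amount": 100},
                      genesis["hash"])
chain = [genesis, transfer]
print(valid_chain(chain))                 # True
chain[1]["payload"]["amount"] = 999       # tamper with the recorded transfer
print(valid_chain(chain))                 # False
```

This is the mechanism behind the "information immutability" the abstract lists: altering a settled transfer invalidates the stored hashes, so all participating banks can detect it.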
Citations: 1
An Effective Time-Sharing Switch Migration Scheme for Load Balancing in Software Defined Networking
IF 1 Q3 Computer Science Pub Date: 2023-01-01 DOI: 10.12720/jait.14.4.846-856
Thangaraj Ethilu, Abirami Sathappan, P. Rodrigues
Distributed control in Software Defined Networking (SDN) delivers additional flexibility to network management and has been a significant breakthrough in network innovation. Switch migration is often used to balance workload across distributed controllers. The Time-Sharing Switch Migration (TSSM) scheme proposed a strategy in which multiple controllers are allowed to share the workload of a switch via time sharing during an overload condition, resulting in reduced ping-pong controller difficulty, fewer overload occurrences, and improved controller efficiency. However, it requires more than one controller, incurs greater migration costs, and uses more controller resources during the TSSM operating time. We therefore present a coalitional game strategy that optimizes controller selection throughout the TSSM phase based on flow characteristics. The new TSSM method reduces migration costs and controller resource usage while retaining the TSSM benefits. For practicality, the proposed strategy is implemented on an open network operating system. The experimental findings reveal that, compared to the typical TSSM scheme, the proposed technique reduces migration costs and controller resource usage by approximately 18%.
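As a rough illustration of the time-sharing idea (not the paper's coalitional-game formulation), an overloaded switch's load can be split among controllers in proportion to their spare capacity; the controller names and numbers below are hypothetical:

```python
def time_share(switch_load, load, capacity):
    """Split an overloaded switch's message load among controllers in
    proportion to spare capacity — a simplified stand-in for the
    coalitional-game controller selection in the paper."""
    spare = {c: capacity[c] - load[c] for c in load}
    members = [c for c in spare if spare[c] > 0]   # coalition: controllers with headroom
    total = sum(spare[c] for c in members)
    return {c: switch_load * spare[c] / total for c in members}

shares = time_share(
    switch_load=90,                              # load of the migrating switch (requests/s)
    load={"c1": 80, "c2": 40, "c3": 60},         # current controller loads
    capacity={"c1": 100, "c2": 100, "c3": 100},  # controller capacities
)
print(shares)   # {'c1': 15.0, 'c2': 45.0, 'c3': 30.0}
```

The least-loaded controller (c2) takes the largest time share, which is the intuition behind letting several controllers jointly absorb an overloaded switch instead of migrating it wholesale.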
Citations: 0
Resource Allocation in Cloud Computing
Q3 Computer Science Pub Date: 2023-01-01 DOI: 10.12720/jait.14.5.1063-1072
G. Senthilkumar, K. Tamilarasi, N. Velmurugan, J. K. Periasamy
Cloud computing is currently the dominant trend in data storage, processing, visualization, and analysis, and its use has risen significantly as government organizations and commercial businesses have migrated to the cloud. It relies on dynamic, on-demand resource allocation to provide guaranteed services to clients, and it is one of the fastest-growing segments of the computing business: a new approach to delivering IT services through the Internet that lets consumers access computing resources as pooled services. Resource allocation and planning in cloud computing are therefore both necessary and challenging. In this work, a Random Forest (RF), a supervised machine-learning technique, and a Genetic Algorithm (GA) are combined in a hybrid strategy for virtual machine allocation. The goal is to maximize resource usage: power consumption is minimized while resources are better distributed and utilized. An approach is described for producing the training data used to train the random forest, and PlanetLab's real-time workload traces are used to test the method. The suggested GA-RF model performed best in terms of data center and host resource utilization, energy consumption, and execution time, which were employed as the performance measures in this work. The Random Forest provides better results compared with the Genetic Algorithm.
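The RF half of the GA-RF pipeline is trained on PlanetLab traces that are not reproduced here; this sketch keeps only the GA search over virtual-machine placements and substitutes a toy linear power model for the learned predictor. The host names, VM loads, and power constants are all hypothetical:

```python
import random

random.seed(1)

HOSTS = ["h1", "h2", "h3"]
VM_LOAD = [30, 20, 50, 10, 40]         # hypothetical VM CPU demands (%)

def power(util):
    """Toy power model: 100 W idle + 1.5 W per % utilization; off hosts draw 0 W."""
    return 100 + 1.5 * util if util > 0 else 0

def fitness(assign):
    """Total power of a placement (lower is better); in the paper a
    Random Forest trained on workload traces would predict utilization."""
    util = {h: 0 for h in HOSTS}
    for vm, h in enumerate(assign):
        util[h] += VM_LOAD[vm]
    if any(u > 100 for u in util.values()):
        return float("inf")            # overcommitted host: infeasible
    return sum(power(u) for u in util.values())

pop = [[random.choice(HOSTS) for _ in VM_LOAD] for _ in range(30)]
for _ in range(40):                    # GA: elitism + per-gene mutation
    pop.sort(key=fitness)
    parents = pop[:10]
    pop = parents + [[random.choice(HOSTS) if random.random() < 0.2 else g
                      for g in p]
                     for p in parents for _ in range(2)]
best = min(pop, key=fitness)
print(best, fitness(best))
```

Under this toy model the optimum packs the five VMs onto two hosts (100% + 50% utilization, 425 W total), consolidating load so idle hosts can be powered down, which is the energy-saving behavior the paper targets.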
Citations: 0
Research Opportunities in Microservices Quality Assessment: A Systematic Literature Review
Q3 Computer Science Pub Date: 2023-01-01 DOI: 10.12720/jait.14.5.991-1002
Verónica C. Tapia, Carlos M. Gaona
The growth in the development of microservices has sparked interest in evaluating their quality. This study seeks to determine the key criteria and challenges in evaluating microservices in order to drive research and optimize processes. The systematic literature review presented in this research identified that the most commonly used evaluation criteria are performance, scalability, security, cohesion, coupling, and granularity. Although evaluation tools exist, they mainly measure performance aspects such as latency and resource consumption. Challenges were identified in security, granularity, throughput, monitoring, organizational strategy, orchestration, choreography, scalability, decomposition, and monolith refactoring. In addition, research opportunities are noted in empirical studies, analysis of quality trade-offs, and the broadening of relevant perspectives and tools. Challenges in the interrelation of quality attributes, metrics and patterns, automatic evaluation, architectural decisions and technical debt, domain-based design, testing, monitoring, and performance modeling are also highlighted, as are challenges in orchestration, communication management and consistency between microservices, independent evolution, and scalability. It is therefore critical to address these challenges in microservices and to continue research to improve the understanding of, and practices related to, quality.
Citations: 0
An Intelligent Deep Learning Architecture Using Multi-scale Residual Network Model for Image Interpolation
Q3 Computer Science Pub Date: 2023-01-01 DOI: 10.12720/jait.14.5.970-979
Diana Earshia V., Sumathi M.
Learning-based image interpolation techniques have recently been shown to be efficient, owing to their promising results. According to recent studies, deep neural networks can considerably enhance the quality of image super-resolution, and current research commonly uses convolutional neural networks with deeper layers to improve interpolation performance. As a network's depth grows, however, more training issues arise: a network cannot be substantially improved merely by increasing its depth, and new training strategies are required to improve the accuracy of interpolated images. This research implements an advanced deep learning mechanism called the Deep Multi-Scaled Residual Network (DMResNet) for effective image interpolation. Using the proposed framework, Low Resolution (LR) images are reconstructed into High Resolution (HR) images with low computational burden and time complexity. To dynamically discover image features at multiple scales, convolution kernels of various sizes based on residual blocks are utilized, and the multi-scaled residual architecture allows these features to interact with one another to obtain the most accurate image data. The interpolation performance and image reconstruction efficiency of the proposed model are validated using PSNR, SSIM, RMSE, FSIM, and run-time analysis on the popular IAPR TC-12, DIV 2K, and CVDS datasets.
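PSNR, the first of the validation measures listed above, is derived directly from the mean squared error between a reference image and its reconstruction. A minimal sketch in plain Python (the pixel values are illustrative, not taken from the paper's datasets):

```python
import math

def psnr(reference, reconstructed, max_val=255.0):
    """Peak Signal-to-Noise Ratio in dB between two equal-sized images,
    given as flat sequences of pixel intensities in [0, max_val]."""
    mse = sum((r - x) ** 2 for r, x in zip(reference, reconstructed)) / len(reference)
    if mse == 0:
        return float("inf")  # identical images: no noise at all
    return 10.0 * math.log10(max_val ** 2 / mse)

# Toy example: a 2x2 "image" and a slightly perturbed reconstruction.
ref = [52, 55, 61, 59]
rec = [52, 54, 61, 60]
print(round(psnr(ref, rec), 2))  # about 51.14 dB
```

Higher is better: the 8% PSNR gain reported for DMResNet means its reconstructions sit measurably closer to the ground-truth HR images than those of the baselines.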
The model outperforms state-of-the-art interpolation techniques, yielding an increase of 8% in PSNR, 6% in SSIM, and 1.2% in FSIM, a decrease of 38.79% in RMSE, and a 5.875-fold improvement in run time.
Citations: 0
An Optimized Deep Learning Based Malicious Nodes Detection in Intelligent Sensor-Based Systems Using Blockchain
Q3 Computer Science Pub Date: 2023-01-01 DOI: 10.12720/jait.14.5.1037-1045
Swathi Darla, C. Naveena
In this research work, a blockchain-based secure routing model is proposed for the Internet of Sensor Things (IoST), assisted by a deep learning-based hybrid meta-heuristic optimization model. The proposed model includes four major phases: (a) optimal cluster head selection, (b) a lightweight blockchain-based registration and authentication mechanism, (c) optimized deep learning-based malicious node identification, and (d) optimal path identification. Initially, the network is constructed with N nodes, from which a certain number are selected as optimal cluster heads by a hybrid optimization model with two-fold objectives (energy consumption and delay). The proposed Chimp social incentive-based Mutated Poor Rich Optimization (CMPRO) algorithm is a conceptual amalgamation of the standard Chimp Optimization Algorithm (ChOA) and the Poor and Rich Optimization (PRO) approach. The blockchain is deployed on the optimal cluster heads and the base station because they have sufficient storage and computational resources, and a lightweight blockchain-based registration and authentication mechanism is then carried out. After the network is authenticated, malicious nodes are detected using a new optimized Deep Belief Network (DBN), whose hidden layers are tuned with the CMPRO model to enhance detection accuracy. Once malicious nodes are detected, the source node selects the shortest path to the destination, identified using the Dijkstra algorithm, and performs secure routing that avoids the malicious nodes, securing the network as a whole. Finally, the performance of the model is validated to demonstrate its efficiency over existing models.
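The routing phase described above rests on Dijkstra's shortest-path algorithm over the authenticated topology. A minimal sketch, assuming a toy topology of trusted cluster heads after malicious nodes have been pruned (the node names and link costs are hypothetical):

```python
import heapq

def dijkstra(graph, source, target):
    """Dijkstra's algorithm; graph maps each node to {neighbour: link_cost}.
    Returns (total_cost, path) or (inf, []) if target is unreachable."""
    dist = {source: 0.0}
    prev = {}
    heap = [(0.0, source)]
    visited = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in visited:
            continue
        visited.add(u)
        if u == target:  # reconstruct path back through predecessors
            path = [u]
            while u in prev:
                u = prev[u]
                path.append(u)
            return d, path[::-1]
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    return float("inf"), []

# Source node S routes to the base station BS via trusted cluster heads.
topology = {
    "S":   {"CH1": 2, "CH2": 5},
    "CH1": {"CH3": 2, "CH2": 1},
    "CH2": {"BS": 4},
    "CH3": {"BS": 1},
}
cost, route = dijkstra(topology, "S", "BS")
print(cost, route)  # 5.0 ['S', 'CH1', 'CH3', 'BS']
```

In the paper's setting the link costs would come from the energy/delay objectives used for cluster head selection rather than the fixed weights shown here.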
Citations: 0