
2022 14th International Conference on Computational Intelligence and Communication Networks (CICN): Latest Publications

Web Application Security Threats and Mitigation Strategies when Using Cloud Computing as Backend
Asma Z. Yamani, Khawlah Bajbaa, Reem Aljunaid
Cloud computing plays an important role in businesses' digital transformation, as cloud platforms offer easy-to-use services that save time and effort. Despite the powerful features these platforms provide, they have become desirable targets for attackers. This study surveys the literature for security threats related to web applications built on cloud computing services and then provides a set of recommendations to mitigate those threats. We first surveyed the literature for documented cases of threats faced when relying on cloud computing, then distributed an online survey to Computer Science students and web developers. The survey asked whether respondents were aware of these web threats and whether they had already applied any prevention measures against them. A set of recommendations was then provided to help them mitigate these threats. Finally, a tool was designed for generating security policies for the Broken Access Control threat in Firebase. Eighty-five responses were considered for this study. Average participant awareness across all threats was 61%, even though 92% of participants had taken at least one security course. The main causes for neglecting mitigation techniques were a lack of training and developers relying on the web services themselves to provide security measures, followed by the process being time-consuming. The tool designed for mitigating Broken Access Control showed promising results in easing the implementation of mitigation techniques. We conclude that, due to the lack of awareness and negligence in implementing mitigation techniques, many existing web apps may be compromised. Developing security tools for novice users, wherever possible, addresses the main causes of this neglect and should be investigated further.
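The abstract does not reproduce the tool itself, but the Python sketch below illustrates the kind of owner-only access policy such a generator might emit, assuming a Firestore backend; the collection name and rule template are illustrative assumptions, not the authors' actual tool output.

```python
# Hypothetical sketch of a policy generator for Firebase/Firestore access control.
# The collection name ("userDocs") and the rule template are illustrative assumptions;
# the paper's tool and its output format are not reproduced here.

RULES_TEMPLATE = """rules_version = '2';
service cloud.firestore {{
  match /databases/{{database}}/documents {{
    match /{collection}/{{userId}} {{
      // Only an authenticated user may read or write their own document.
      allow read, write: if request.auth != null && request.auth.uid == userId;
    }}
  }}
}}
"""

def generate_owner_only_rules(collection: str) -> str:
    """Return Firestore security rules restricting each document to its owner."""
    return RULES_TEMPLATE.format(collection=collection)

if __name__ == "__main__":
    print(generate_owner_only_rules("userDocs"))
```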
Citations: 0
Processing of Images Based on Machine Learning to Avoid Unauthorized Entry
C. Peña, Ciro Rodríguez, Israel Arellano Romero
This paper proposes a facial recognition system to increase security, with multiple uses such as facilitating access for people who follow adequate protection measures in times of Covid-19, as well as improving security against those seeking to hide their identity. The methodology uses tools such as Python and OpenCV and models such as Eigenfaces, Fisherfaces, and LBPH Faces; the units of analysis are photographs and video segments that capture facial expressions, whose patterns are then trained with facial recognition algorithms. The results show that LBPH Faces obtained confidence values lower than 70, with 95% certainty of recognition and a shorter recognition time, improving the accuracy of facial recognition; accuracy also improved as the amount of data increased, along with confidence regarding people's safety.
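A minimal Python/OpenCV sketch of the LBPH step described above, assuming the opencv-contrib-python package (which provides cv2.face) and pre-cropped grayscale face images; the file names, labels, and threshold policy are illustrative, not the authors' exact setup.

```python
# Minimal sketch of LBPH-based recognition with OpenCV, assuming pre-cropped
# grayscale face images. Paths, label ids, and the threshold policy are assumptions.
import cv2
import numpy as np

# Hypothetical training data: grayscale face crops and integer person labels.
faces = [cv2.imread(p, cv2.IMREAD_GRAYSCALE) for p in ["person0_01.png", "person0_02.png"]]
labels = np.array([0, 0], dtype=np.int32)

recognizer = cv2.face.LBPHFaceRecognizer_create()
recognizer.train(faces, labels)

# LBPH returns (label, confidence), where a LOWER confidence value means a better
# match. The paper reports values below 70 as reliable, so that threshold is used here.
test_face = cv2.imread("unknown.png", cv2.IMREAD_GRAYSCALE)
label, confidence = recognizer.predict(test_face)
if confidence < 70:
    print(f"Recognized person {label} (confidence {confidence:.1f})")
else:
    print("Unknown person, access denied")
```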
Citations: 1
Robotic Welding Path Identification Using FPGA-Based Image Processing
Abdulkadir Saday, I. Ozkan
The robotic welding process is widely used in many industry sectors, and its use in production lines is becoming more common by the day. Obtaining a smooth weld seam in robot welding depends on the geometric structure of the welding path and the stability of the control loop. However, the weld path and the weld gap are usually not fixed, and their variation negatively affects automatic control. For some welding jobs, programming a complex welding path may take the operator more time than executing the task itself. In addition, a variable weld gap degrades weld quality under a constant control loop. This study proposes a system that provides a real-time definition of the weld path and its geometry on an embedded system to address this issue. The weld path image is captured using a camera, and the weld path is determined by image processing techniques using an embedded Linux operating system running on system-on-chip (SoC) hardware. Images captured through the Hard Processor System (HPS) unit are stored in memory, processed in the FPGA unit, and output by the HPS unit. Unprocessed SoC images and measurement images of weld pieces are presented with their values. When the values obtained from the processed weld path image are compared to manually measured path values, the proposed system produces successful results.
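The FPGA pipeline itself is not shown in the abstract; the CPU-side Python/OpenCV sketch below only approximates one plausible seam-detection stage (edge detection plus a Hough line fit), and the filter parameters and input file are assumptions.

```python
# CPU-side sketch of a weld-seam detection step (edge detection + Hough line fit).
# This only approximates, in Python/OpenCV, the kind of image-processing stage the
# paper implements on FPGA; the filter parameters and input file are assumptions.
import cv2
import numpy as np

frame = cv2.imread("weld_frame.png", cv2.IMREAD_GRAYSCALE)
blurred = cv2.GaussianBlur(frame, (5, 5), 0)          # suppress sensor noise
edges = cv2.Canny(blurred, 50, 150)                   # highlight seam edges

# Fit line segments to the edge map; the longest segment is taken as the seam.
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                        minLineLength=100, maxLineGap=10)
if lines is not None:
    x1, y1, x2, y2 = max(lines[:, 0, :], key=lambda l: np.hypot(l[2] - l[0], l[3] - l[1]))
    print(f"Estimated weld path from ({x1}, {y1}) to ({x2}, {y2})")
else:
    print("No weld path detected")
```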
Citations: 0
The Validity of Using Technical Indicators When forecasting Stock Prices Using Deep Learning Models: Empirical Evidence Using Saudi Stocks
S. Mohammed
Many researchers use deep learning and technical indicators to forecast future stock prices. There are several hundred technical indicators, and each one has a number of parameters. Finding the optimal combination of indicators with their optimal parameter values is very challenging. The aim of this work is to study whether there is any benefit in feeding deep learning models with technical indicators instead of only feeding them price and volume. After all, technical indicators are just functions of price and volume. Empirical studies conducted in this work using Saudi stocks show that deep learning models can benefit from technical indicators only if the right combination of indicators, together with the right parameter values, is used. The experimental results show that the right combination of technical indicators can improve the forecasting accuracy of deep learning models. They also show that using the wrong combination of indicators is worse than using no indicators at all, even when they are assigned their best parameter values.
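As a concrete illustration of indicators being "just functions of price and volume", the pandas sketch below derives a few common indicators as model features; the choice of indicators and window lengths is an assumption, not the combination the paper found optimal.

```python
# Sketch of deriving technical indicators from raw price/volume with pandas.
# The indicator choice (SMA, RSI, volume ratio) and window lengths are illustrative
# assumptions; the paper searches for the combination and parameters that actually help.
import pandas as pd

def add_indicators(df: pd.DataFrame) -> pd.DataFrame:
    """df has 'close' and 'volume' columns; returns df with indicator features added."""
    out = df.copy()
    out["sma_20"] = out["close"].rolling(20).mean()          # 20-day simple moving average

    delta = out["close"].diff()
    gain = delta.clip(lower=0).rolling(14).mean()
    loss = (-delta.clip(upper=0)).rolling(14).mean()
    out["rsi_14"] = 100 - 100 / (1 + gain / loss)            # 14-day relative strength index

    out["vol_ratio"] = out["volume"] / out["volume"].rolling(20).mean()
    return out.dropna()

# Usage: features = add_indicators(prices)[["close", "volume", "sma_20", "rsi_14", "vol_ratio"]]
```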
Citations: 1
Intrusion Classification for Cloud Computing Network: A Step Towards an Intelligent Classification System
Kanda Alamer, Abdulaziz Aldribi
One of the most rapidly spreading areas of information technology is cloud computing. However, this raises significant security issues that attract intruders. This paper presents a machine learning-based framework for intrusion classification in cloud computing networks. It offers new features derived from cloud network flow. By dividing the flow into windows of time, a method known as the Riemann Chunking Scheme computes these features. After experimenting with this dataset, we extracted the 40 features that best describe the anomaly classification problem and improve the accuracy of a multilayer perceptron for anomaly classification in cloud network traffic.
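The Riemann Chunking Scheme and the 40 engineered features are not specified in the abstract; the sketch below uses plain fixed-interval windowing and a scikit-learn multilayer perceptron as a stand-in, with flow column names assumed for illustration.

```python
# Sketch of window-based feature aggregation plus an MLP classifier, assuming a
# DataFrame of flow records with a 'timestamp', numeric flow columns, and a 'label'.
# The fixed-interval windowing here is only a stand-in for the paper's Riemann
# Chunking Scheme and its 40 engineered features.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

def windowed_features(flows: pd.DataFrame, window: str = "10s") -> pd.DataFrame:
    grouped = flows.set_index("timestamp").resample(window)
    feats = grouped[["bytes", "packets", "duration"]].agg(["mean", "std", "sum"])
    feats.columns = ["_".join(c) for c in feats.columns]
    feats["label"] = grouped["label"].max()      # window is anomalous if any flow in it is
    return feats.dropna()

def train_detector(flows: pd.DataFrame) -> MLPClassifier:
    data = windowed_features(flows)
    X, y = data.drop(columns="label"), data["label"]
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y)
    clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500).fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))
    return clf
```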
Citations: 0
JUX - A Cloud Hosted Learning Management System Based on OpenedX
Muhammad Noman Saeed, Ahmad Mufarreh Al Mufarreh, K. M. Noaman, Muhammad Arshad, Atiq Rafiq Shaikh
During the COVID pandemic, higher education across the world changed its knowledge delivery mode from on-campus studies to off-campus studies, i.e. E-Learning. The e-education provider must be competent in order to create a robust learning environment that can handle the difficulties facing teachers, students, and system administrators at this rapid pace of change. The system administrator needs to improve network connectivity, bandwidth, and so on to provide seamless connectivity for E-Learning alongside the campus network services. Providing smooth e-learning services can sometimes hinder the other network services of the campus, so management and administrators suggest deploying the e-learning services on the cloud and keeping them separate from the campus network services. This solves the problem of network capacity limits that the institute can face due to limited hardware and bandwidth. Furthermore, cloud deployment reduces the capital as well as the recurring cost of running the services. This paper focuses on addressing the problem defined above and provides a cost-effective, Amazon Web Services (AWS) based cloud architecture for OpenedX based learning solutions. This study is expected to demonstrate a technological solution for the process of implementing a cloud-based LMS.
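The paper's AWS architecture is not detailed in the abstract; as a loosely related sketch, the boto3 snippet below provisions a single EC2 instance on which an Open edX stack could be installed. The AMI id, instance type, key pair, and tags are placeholders, not the authors' configuration.

```python
# Hedged sketch: provisioning a single EC2 instance on which an Open edX stack
# could be installed. The AMI id, instance type, and key pair are placeholders;
# the paper's actual AWS architecture is not reproduced here.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-xxxxxxxxxxxxxxxxx",      # placeholder AMI id
    InstanceType="t3.large",              # assumed size for a small LMS deployment
    KeyName="lms-keypair",                # hypothetical key pair name
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Name", "Value": "openedx-lms"}],
    }],
)
print("Launched instance:", response["Instances"][0]["InstanceId"])
```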
Citations: 0
Prediction of Downhole Pressure while Tripping
A. Mohammad, Subankan Karunakaran, Mithushankar Panchalingam, R. Davidrajuh
During drilling operations for oil and gas, swab and surge pressures occur while tripping in and out of a wellbore. High tripping speed can lead to fracturing the well's formation, whereas low tripping speed can increase non-productive time and cost. Hence, there is a need to predict surge/swab pressure accurately. Several analytical and machine learning models have already been developed to predict surge/swab pressure. However, these existing models use numerical calculations to generate the data. This paper explored four supervised machine learning models: Linear Regression, XGBoost, Feedforward Neural Network (FFNN), and Long Short-Term Memory (LSTM). In this study, actual field data from the Norwegian Continental Shelf, provided by an Exploration & Production company, is utilized to develop the four machine learning models. The results indicated that XGBoost was the best-performing model, with an R2 score of 0.99073. Therefore, this trained model can be applied during a tripping operation to regulate tripping speed where repetitive surge/swab pressure calculation is needed.
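A minimal sketch of the XGBoost regression setup on tabular drilling data, scored with R2; the file and column names are hypothetical, since the field dataset is proprietary and not reproduced here.

```python
# Sketch of an XGBoost regression setup, assuming a tabular dataset with numeric
# drilling parameters and a downhole-pressure target. Column names are hypothetical.
import pandas as pd
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor

df = pd.read_csv("tripping_data.csv")                      # hypothetical file
X = df[["tripping_speed", "bit_depth", "mud_density", "flow_rate"]]
y = df["downhole_pressure"]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)
model = XGBRegressor(n_estimators=500, learning_rate=0.05, max_depth=6)
model.fit(X_tr, y_tr)

print("R2 on held-out data:", r2_score(y_te, model.predict(X_te)))
```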
Citations: 0
Deep Pre-trained Contrastive Self-Supervised Learning: A Cyberbullying Detection Approach with Augmented Datasets
Lulwah M. Al-Harigy, H. Al-Nuaim, N. Moradpoor, Zhiyuan Tan
Cyberbullying is a widespread problem that has only increased in recent years due to the massive dependence on social media. Although there are many approaches for detecting cyberbullying, they still need to be improved for more accurate detection. We need new approaches that understand the context of the words used in cyberbullying by generating different representations of each word. In addition, there is a large amount of unlabelled data on the Internet that needs to be labelled for a more accurate detection process. Even though multiple methods for annotating datasets exist, the most widely used are still manual approaches, either using experts or crowdsourcing. However, the time needed and the high labor cost of manual annotation result in a lack of annotated social network datasets for training a robust cyberbullying detector. Automated approaches can be relied upon to label data, such as Self-Supervised Learning (SSL) models. This paper has two main parts. The first part proposes a parallel BERT + Bi-LSTM model for detecting cyberbullying terms. The second part utilizes Contrastive Self-Supervised Learning (a form of SSL) to augment the training set from unlabeled data using a small portion of another manually annotated dataset. Our proposed model, which uses deep pre-trained contrastive self-supervised learning to detect cyberbullying with augmented datasets, achieved a macro-average F1 score of 0.9311. This result shows our model outperformed the baseline models, the top three teams in the SemEval-2020 Task 12 competition.
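A minimal PyTorch sketch of a parallel BERT + Bi-LSTM head of the kind described, where the [CLS] representation and a Bi-LSTM pass over the token embeddings are concatenated before classification; the hidden sizes and checkpoint name are assumptions, and the contrastive self-supervised pre-training stage is not shown.

```python
# Minimal sketch of a parallel BERT + Bi-LSTM classifier: the BERT [CLS] state and a
# Bi-LSTM pass over the token embeddings are concatenated before the classification
# layer. Hidden sizes and the checkpoint name are assumptions, and the paper's
# contrastive self-supervised pre-training stage is not shown.
import torch
import torch.nn as nn
from transformers import AutoModel

class BertBiLSTMClassifier(nn.Module):
    def __init__(self, model_name="bert-base-uncased", lstm_hidden=128, num_classes=2):
        super().__init__()
        self.bert = AutoModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        self.lstm = nn.LSTM(hidden, lstm_hidden, batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(hidden + 2 * lstm_hidden, num_classes)

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        token_states = out.last_hidden_state            # (batch, seq, hidden)
        pooled = token_states[:, 0]                     # [CLS] representation
        lstm_out, _ = self.lstm(token_states)
        lstm_pooled = lstm_out.mean(dim=1)              # average over tokens
        return self.classifier(torch.cat([pooled, lstm_pooled], dim=-1))
```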
Citations: 0
A Study of Global Temperature Anomalies and their Changing Trends due to Global Warming
Bikash Sadhukhan, S. Mukherjee, R. Samanta
The analysis of global temperature trends at various regional and temporal dimensions has received considerable interest from the scientific community over the past century due to the growing awareness of the effects of climate change on the earth. The objective of this research is to determine and analyse the trend of monthly fluctuations in land and ocean temperatures around the world. This was accomplished by scraping a database maintained by the National Oceanic and Atmospheric Administration (NOAA) for monthly global land and ocean temperature anomaly data between 1881 and 2020. This study uses the Mann-Kendall trend test and Sen's estimator for slope to examine the global impact of climate change by comparing the trends of global land temperature anomalies, global ocean temperature anomalies, and their combined global land and ocean temperature anomaly records. The substantial magnitude of several statistical parameters demonstrates that the temperature anomalies (land, ocean, and combined) have significantly increased during the previous five decades, mostly as a consequence of strong anthropogenic sources. This necessitates the development of a proper action plan to limit global warming and the design of policies to reduce the elements that are likely to have detrimental effects on the climate on a local and global scale.
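For reference, the sketch below is a self-contained implementation of the Mann-Kendall S statistic (normal approximation, ties ignored for simplicity) and Sen's slope estimator for a one-dimensional anomaly series; it simplifies the full test applied in the paper.

```python
# Self-contained sketch of the Mann-Kendall trend test (normal approximation, no tie
# correction) and Sen's slope estimator, applied to a 1-D series of temperature
# anomalies in time order. This simplifies the full test used in the paper.
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j.
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0            # variance without tie correction
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - norm.cdf(abs(z)))                      # two-sided p-value
    # Sen's slope: median of all pairwise slopes.
    slopes = [(x[j] - x[i]) / (j - i) for i in range(n - 1) for j in range(i + 1, n)]
    return s, z, p, np.median(slopes)

# Usage: s, z, p, slope = mann_kendall(monthly_anomalies)
```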
Citations: 3
Text Classification and Categorization through Deep Learning
Saiman Quazi, Sarhan M. Musa
Text classification is one of the important fields in Natural Language Processing (NLP). It assigns text documents to two or more categories by deriving a set of features that describe each document and selecting the correct category for each one from a set of pre-defined tags or categories based on content. It is used in several real-life applications such as engineering, science, and marketing, and it can be quite effective in addressing problems with labeled data. Certain algorithms can be handy for categorizing text data, such as Support Vector Machine (SVM), K-Nearest Neighbors (KNN), and Naïve Bayes. This paper illustrates how the text in each document is reviewed and grouped into different sets through the above-mentioned techniques. In this way, it determines which method is best suited for higher accuracy and what problems the deep learning model may face in text classification and categorization, so that new solutions can be devised to resolve these issues without interfering with the processes in the future.
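A short scikit-learn sketch comparing the three classifiers named above on TF-IDF features; the 20 Newsgroups corpus is used purely as a stand-in labeled dataset, not the paper's data.

```python
# Sketch comparing SVM, KNN, and Naive Bayes on TF-IDF features with scikit-learn.
# The 20 Newsgroups corpus is only a stand-in labeled dataset for illustration.
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import MultinomialNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

data = fetch_20newsgroups(subset="train", categories=["sci.space", "rec.autos"],
                          remove=("headers", "footers", "quotes"))

models = {
    "SVM": LinearSVC(),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "Naive Bayes": MultinomialNB(),
}
for name, clf in models.items():
    pipeline = make_pipeline(TfidfVectorizer(stop_words="english"), clf)
    scores = cross_val_score(pipeline, data.data, data.target, cv=5)
    print(f"{name}: mean cross-validated accuracy {scores.mean():.3f}")
```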
Citations: 0