Pub Date: 2020-01-01 | DOI: 10.1109/Confluence47617.2020.9057849
Marita Turpin, M. Matthee, S. Kruger, Jean-Paul Van Belle
The Fourth Industrial Revolution (4IR) will dramatically change our work and personal lives. New developments in the fields of artificial intelligence, big data and the Internet of Things (IoT) hold great promise but also present challenges to our entire society. However, only a very small fraction of the population is sufficiently versed in the new technologies to make informed decisions on matters that will affect all our future lives. In addition, many people feel threatened and fear that they will lose their jobs. Even students in the fields of Information Systems (IS) and IT management do not feel familiar with, or confident about, their ability to navigate the world of 4IR. This study reports on a series of projects undertaken in South Africa to encourage IS/IT students and professionals to embrace IoT technologies and to upskill themselves in this field. The projects were carried out from 2016 to 2019 at a South African university, using a makerspace as well as a maker philosophy. Results indicate that the students and professionals were able to increase their skills, as well as their confidence and willingness to engage with IoT technology. The contribution of this study is to suggest good practices for the use of IoT and a maker philosophy to prepare students and professionals for the world of 4IR.
Title: "Assisting Information Systems students to Engage with the Internet of Things (IoT)" | Venue: 2020 10th International Conference on Cloud Computing, Data Science & Engineering (Confluence)
Pub Date: 2020-01-01 | DOI: 10.1109/Confluence47617.2020.9058017
Archana Singh, R. Kumar
Today, cloud computing is a trending technology and has proven to be a viable business model. It has seen rapid growth in recent years because of its easy-to-access mechanism. As the number of cloud users increases exponentially, load balancing is needed to minimize overhead and maximize throughput on the cloud. In this paper, we analyze three cloud load-balancing algorithms, namely Round Robin, Throttled and Equally Spread Current Execution Load, and compare their performance in terms of average response time, hourly data-center response time and Virtual Machine (VM) cost, with the help of the Cloud Analyst simulator, a widely used tool for testing algorithms in a cloud environment. Simulation results demonstrate that Throttled outperforms the other algorithms as the number of users increases.
Title: "Performance Evaluation of Load Balancing Algorithms Using Cloud Analyst"
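The core contrast between the compared policies can be sketched in a few lines. This is illustrative, not Cloud Analyst's implementation: a Round Robin allocator hands requests to VMs in fixed rotation even when they are busy, while a Throttled allocator dispatches only to an idle VM. The class names and the `busy` set are hypothetical.

```python
from itertools import cycle

class RoundRobinBalancer:
    """Cycles through VMs in order, regardless of whether they are busy."""
    def __init__(self, vm_ids):
        self._cycle = cycle(vm_ids)

    def allocate(self, busy):
        # The busy set is ignored: the next VM in rotation gets the request.
        return next(self._cycle)

class ThrottledBalancer:
    """Assigns a request only to an idle VM; returns None when all are busy."""
    def __init__(self, vm_ids):
        self.vm_ids = list(vm_ids)

    def allocate(self, busy):
        for vm in self.vm_ids:
            if vm not in busy:
                return vm
        return None  # request waits in the queue until a VM frees up

rr = RoundRobinBalancer([0, 1, 2])
th = ThrottledBalancer([0, 1, 2])
busy = {0, 1}             # VMs 0 and 1 are currently serving requests
print(rr.allocate(busy))  # may hand the request to a busy VM
print(th.allocate(busy))  # picks the idle VM 2
```

The difference explains the simulated results: under heavy load, Round Robin keeps piling requests onto saturated VMs, while Throttled holds them back, which tends to improve average response time.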
Pub Date: 2020-01-01 | DOI: 10.1109/Confluence47617.2020.9058151
A. Bhatt, S. Dubey, A. Bhatt
This paper analyzes the Coronary Artery Calcium (CAC) score metric with respect to the age and gender of the individual. The CAC score is a fair predictor of the risk of cardiovascular disease. With the number of heart patients rising every day, a person's cardiac health should never be neglected. The calcium score is obtained from a non-invasive CT scan of the heart and tracks the amount of calcified plaque in the coronary arteries. The paper presents statistical comparisons of the scores obtained for males and females across different age groups. These findings are useful for predicting cardiovascular disease early and can also raise awareness of the importance of maintaining a healthy lifestyle.
Title: "Age-Gender Analysis of Coronary Artery Calcium (CAC) Score to predict early Cardiovascular Diseases"
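For context, CAC scores are commonly bucketed into risk bands before comparisons like the paper's age/gender analysis. A minimal sketch, assuming the widely cited Agatston-score cut-offs of 0 / 1-99 / 100-399 / 400+; the paper's own groupings may differ.

```python
def cac_risk_category(agatston_score):
    """Map an Agatston CAC score to a commonly cited risk band.

    The cut-offs (0 / 1-99 / 100-399 / >=400) follow one widely used
    convention; this is illustrative, not the paper's own stratification.
    """
    if agatston_score == 0:
        return "no identifiable plaque"
    if agatston_score < 100:
        return "mild plaque burden"
    if agatston_score < 400:
        return "moderate plaque burden"
    return "extensive plaque burden"

for score in (0, 42, 250, 900):
    print(score, "->", cac_risk_category(score))
```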
Pub Date: 2020-01-01 | DOI: 10.1109/Confluence47617.2020.9058029
M. Anbarasi, K. S. Sendhil Kumar, R. Balamurugan, Thejasswini
Interest in Swarm Intelligence (SI) is growing day by day across various research fields. Many swarm-based optimization methods have been introduced since the early 1960s; Evolutionary Algorithms (EAs) are among the most recent, and they have proved capable of solving most optimization problems. In this paper, such algorithms are used for training neural networks. The main difficulty in any optimization problem is selecting the correct parameter values to obtain good results; the key to the best convergence rate and performance is to vary the parameters of the algorithms. This paper compares widely used swarm-based optimization algorithms, namely Particle Swarm Optimization (PSO) and Multi-Verse Optimization (MVO), before and after tuning their parameters, on three different datasets.
Title: "Disease Prediction using Hybrid Optimization Methods based on Tuning Parameters"
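A minimal PSO implementation makes the tunable parameters concrete: the inertia weight `w` and the cognitive/social coefficients `c1`, `c2` are exactly the kind of values varied in before/after-tuning comparisons. This is a generic sketch on a toy sphere function, not the paper's hybrid method.

```python
import random

def pso(fitness, dim, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal Particle Swarm Optimization minimizing `fitness`.

    w, c1, c2 are the parameters whose tuning changes convergence behavior.
    """
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]              # each particle's best position
    pbest_val = [fitness(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = fitness(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

sphere = lambda x: sum(v * v for v in x)
best, best_val = pso(sphere, dim=2)
print(best_val)  # close to 0 with well-chosen w, c1, c2
```

Re-running with, say, `w=1.2` typically diverges on the same problem, which is the point of the parameter-tuning study.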
Pub Date: 2020-01-01 | DOI: 10.1109/Confluence47617.2020.9058033
Rohit Kumar Sachan, D. S. Kushwaha
Software Effort Estimation (SEE) is an important activity during the development and production of software projects. The estimated effort directly affects planning and financial activities, and is therefore directly associated with business success. The Constructive Cost Model (COCOMO) is a widely accepted SEE model, but in current development scenarios its existing parameters do not give realistic results. In the recent past, many researchers have improved the performance of COCOMO by optimizing its parameters with the help of various Nature-Inspired Algorithms (NIAs). In this paper, a recently proposed NIA based on the frog's anti-predator behavior is used to optimize the parameters of basic COCOMO for SEE on 18 software projects listed in a NASA data set. The performance of the proposed Anti-Predatory NIA (APNIA) based approach is evaluated on the NASA18 data set in terms of Mean Absolute Error (MAE). The results show that the proposed approach improves effort estimation by 93.41% in terms of MAE compared to basic COCOMO, by 40.69% compared to a Genetic Algorithm (GA), and by 0.93% compared to Particle Swarm Optimization (PSO) with inertia weight.
Title: "Anti-Predatory NIA Based Approach for Optimizing Basic COCOMO Model"
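The basic COCOMO equation being optimized is Effort = a * (KLOC)^b, so the NIA's job is to find better values of `a` and `b` for the data set. A sketch using Boehm's original organic-mode constants (the tuned APNIA values are not reproduced here):

```python
def basic_cocomo(kloc, a=2.4, b=1.05):
    """Basic COCOMO effort in person-months for an 'organic' project.

    a and b are the coefficients that NIAs such as APNIA tune;
    2.4 and 1.05 are Boehm's original organic-mode values.
    """
    return a * kloc ** b

def mae(actual, predicted):
    """Mean Absolute Error, the metric used to compare the tuned models."""
    return sum(abs(x - y) for x, y in zip(actual, predicted)) / len(actual)

print(round(basic_cocomo(10), 2))  # ~26.93 person-months for 10 KLOC
```

An optimizer would search over (a, b) to minimize `mae(actual_efforts, [basic_cocomo(k, a, b) for k in klocs])` on the NASA18 projects.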
Pub Date: 2020-01-01 | DOI: 10.1109/Confluence47617.2020.9058311
Rajendra Kumar Dwivedi, Arun Kumar Rai, Rakesh Kumar
Wireless sensor networks (WSNs) have become very popular in the last few years. They are deployed in a distributed manner to collect a variety of data, and they present many research issues and challenges, such as energy efficiency, security and localization. Outlier (anomaly) detection is one such area: it helps prevent malicious attacks and reduces errors and noisy data across millions of sensor readings. Outlier detection models should not compromise data quality, and anomalies must be identified, in offline or online mode, with accuracy, good performance and minimal resource consumption in the network. Several machine learning techniques have been used in recent years to detect outliers. This paper presents a survey of outlier detection in WSN data using various machine learning techniques.
Title: "A Study on Machine Learning Based Anomaly Detection Approaches in Wireless Sensor Network"
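As a baseline for the surveyed ML approaches, even a simple statistical detector illustrates the offline-mode task: flag readings that deviate strongly from the rest. A minimal z-score sketch; the threshold and the temperature trace are illustrative, not from any surveyed paper.

```python
import statistics

def zscore_outliers(readings, threshold=3.0):
    """Flag readings more than `threshold` standard deviations from the
    mean -- a classic statistical baseline against which ML-based
    detectors are often compared."""
    mean = statistics.fmean(readings)
    stdev = statistics.pstdev(readings)
    if stdev == 0:
        return []
    return [x for x in readings if abs(x - mean) / stdev > threshold]

# Temperature trace with one faulty spike from a misbehaving node
trace = [21.0, 21.2, 20.9, 21.1, 21.0, 58.0, 21.3, 21.1]
print(zscore_outliers(trace, threshold=2.0))  # flags the 58.0 spike
```

ML-based detectors aim to beat this kind of baseline on accuracy while respecting the energy and memory limits of sensor nodes.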
Pub Date: 2020-01-01 | DOI: 10.1109/Confluence47617.2020.9058040
Priyanka Mahajan
Over the past few years, Generative Adversarial Networks (GANs) have gained more and more interest from Artificial Intelligence researchers, owing to the availability of huge amounts of data, well-designed network architectures and smart training techniques, which together allow them to produce highly realistic images, texts and sounds. The inspiration behind GANs comes from game theory, namely the zero-sum game. A GAN consists of two components, a generator and a discriminator, which act like two players competing against each other. This paper focuses on the basic theory and underlying mechanism of GANs. Next, it discusses a few variants based on architecture as well as on different loss functions. Finally, the last section presents further variants of GANs applied in computer vision and other real-world problems. The area has wide scope in terms of virtual-real interaction and integration along with parallel learning, and is therefore considered a promising implementation area for GANs in the coming future.
Title: "Recent Advances in Generative Adversarial Networks: An Analysis along with its outlook"
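The zero-sum game referred to above is usually written as the minimax value function from the original GAN formulation, in which the discriminator $D$ maximizes and the generator $G$ minimizes:

```latex
\min_{G}\max_{D} V(D,G) =
  \mathbb{E}_{x \sim p_{\mathrm{data}}(x)}\big[\log D(x)\big]
  + \mathbb{E}_{z \sim p_{z}(z)}\big[\log\big(1 - D(G(z))\big)\big]
```

Here $D(x)$ is the discriminator's probability that $x$ is real, and $G(z)$ maps noise $z$ to a generated sample; many of the loss-function variants the paper surveys replace this objective while keeping the two-player structure.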
Pub Date: 2020-01-01 | DOI: 10.1109/Confluence47617.2020.9058266
Tushar Bhardwaj, Himanshu Upadhyay, S. Sharma
User requests change dynamically in service-based cloud applications, which requires an optimal amount of computing resources to meet service-level agreements (SLAs). Existing server-side resource allocation mechanisms have limitations in provisioning the resources required to handle the incoming load based on user requests. To overcome this, cloud computing provides an ample amount of computing resources to meet the SLAs, but those resources might not be properly utilized and may suffer from over- and under-utilization. In this study, the authors propose an autonomic resource allocation framework that automatically provisions (allocates and deallocates) the required computing resources according to the load. The proposed model leverages a queuing model to optimize the resource allocation process. The primary goal is to improve virtual resource utilization and response time with respect to existing methods, and the results show that both response time and resource utilization are improved.
Title: "Autonomic Resource Provisioning Framework for Service-based Cloud Applications: A Queuing-Model Based Approach"
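The queuing intuition can be sketched without the full framework: model each VM as an M/M/1 queue fed an equal share of traffic, and scale out until the mean response time meets the SLA. This is a simplified stand-in for the paper's model, with made-up arrival and service rates.

```python
import math

def vms_needed(arrival_rate, service_rate, sla_response_time):
    """Smallest number of identical VMs, each modelled as an M/M/1 queue
    receiving arrival_rate/n requests per second, whose mean response
    time W = 1 / (mu - lambda/n) stays within the SLA.

    Simplified illustration, not the paper's actual queuing model.
    """
    n = max(1, math.ceil(arrival_rate / service_rate))  # need utilization < 1
    while True:
        per_vm_load = arrival_rate / n
        if per_vm_load < service_rate:
            response_time = 1.0 / (service_rate - per_vm_load)
            if response_time <= sla_response_time:
                return n
        n += 1

# 90 req/s arriving, each VM serves 20 req/s, SLA: 0.2 s mean response time
print(vms_needed(90, 20, 0.2))
```

An autonomic controller would re-evaluate this as the measured arrival rate changes, allocating VMs when load rises and deallocating them when it falls.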
Pub Date: 2020-01-01 | DOI: 10.1109/Confluence47617.2020.9058188
M. Sánchez, Sergio, Corchado Rodríguez, J. Manuel
Wearable technologies have begun to play an important role in the workplace. This paper presents a Smart PPE (Personal Protective Equipment) solution which employs a sensor network, located on a helmet and a belt, to monitor the state of the worker and the environment. Most workplace accidents are caused by a lack of prevention measures, poor safety training and obsolete safety systems that do not adapt technologically to the needs of today's work environments. We review the solutions provided by Industry 4.0 for hazard prevention and propose a wireless PPE model that incorporates intelligent tools and fabrics capable of reacting in real time to a risk situation. This novel model continuously monitors the worker's biometrics; detects external impacts, shocks, luminosity, gases and the temperature of the environment; and provides real-time recommendations to workers. The motivation behind this work is to improve health and safety in work sectors with a high risk of accidents.
Title: "Smart Protective Protection Equipment for an accessible work environment and occupational hazard prevention"
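The monitoring loop of such a PPE system reduces to comparing sensor readings against safe limits and raising alerts. A minimal sketch; the sensor names and thresholds below are illustrative, not values from the paper's prototype.

```python
def check_readings(readings, limits):
    """Compare wearable sensor readings against safe (low, high) limits
    and return a list of alert messages for any out-of-range value.

    Sensor names and limits here are hypothetical examples."""
    alerts = []
    for sensor, value in readings.items():
        low, high = limits.get(sensor, (float("-inf"), float("inf")))
        if not (low <= value <= high):
            alerts.append(f"{sensor}: {value} outside safe range {low}-{high}")
    return alerts

limits = {"temperature_c": (-5, 40), "co_ppm": (0, 35), "impact_g": (0, 10)}
readings = {"temperature_c": 22.5, "co_ppm": 80, "impact_g": 1.2}
print(check_readings(readings, limits))  # the CO level triggers an alert
```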
Pub Date: 2020-01-01 | DOI: 10.1109/confluence47617.2020.9058024
Lakshya Tyagi, Abhishek Singhal
Explosion consequence analysis applies explosives science and engineering to determine potential hazards to targets through objective processes and scientific evidence. This paper proposes the implementation of an adaptive neuro-fuzzy inference system (ANFIS) to provide decision support for accurate and effective explosion consequence analysis. The model is trained on data obtained from the United Nations SaferGuard platform and covers the consequence analysis of seven different types of explosives on brick structures over a range of twenty meters. The model has been implemented using block diagrams in MATLAB Simulink. This work adds to the body of evidence that soft computing techniques can be used to design accurate artificial intelligence decision support and expert systems for both military and civilian applications.
Title: "Neuro-Fuzzy Approach to Explosion Consequence Analysis"
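To make the neuro-fuzzy idea concrete, a zero-order Sugeno system with hand-picked triangular memberships shows the inference step that an ANFIS would learn automatically from data. Everything here (the two inputs, membership ranges, rules and the 0-1 damage index) is hypothetical, not the model trained on SaferGuard data.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def damage_index(charge_kg, distance_m):
    """Zero-order Sugeno fuzzy sketch of explosion severity on a 0-1 scale.

    Membership functions and rule outputs are hypothetical illustrations;
    an ANFIS would fit them to training data instead."""
    small = tri(charge_kg, -1, 0, 50)    # membership of "small charge"
    large = tri(charge_kg, 0, 50, 101)   # membership of "large charge"
    near = tri(distance_m, -1, 0, 20)    # membership of "near target"
    far = tri(distance_m, 0, 20, 41)     # membership of "far target"
    # Rule firing strengths (product t-norm) paired with constant outputs
    rules = [
        (small * near, 0.5),
        (small * far, 0.1),
        (large * near, 1.0),
        (large * far, 0.4),
    ]
    total = sum(w for w, _ in rules)
    # Weighted average defuzzification (zero-order Sugeno)
    return sum(w * out for w, out in rules) / total if total else 0.0

print(round(damage_index(40, 5), 3))  # large charge detonated close by
```

Training an ANFIS amounts to adjusting the membership parameters and rule outputs above by gradient descent and least squares rather than choosing them by hand.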