Pub Date: 2018-12-01 | DOI: 10.1109/ICACAT.2018.8933668
Pooja Gupta, Sanket Choudhary, Ayoush Johari
Ring oscillators are basic building blocks of key digital modules such as PLLs, microprocessors, and other timing- and memory-driven circuits. The purpose of this paper is to compare the transient responses, analyze the FFT spectra, and carry out the physical design of multistage ring oscillators. DRC and NCC checks are additionally applied to the design, which uses a 300 nm (0.3 µm) process technology. The ring oscillator presented is driven and controlled by a bias voltage that can be varied through SPICE code. The resulting VCO-based design has a smaller floor-plan area, lower power dissipation and consumption, and a better noise profile, making it suitable for wideband analog mixed-signal circuits. Ring oscillators with 11, 21, and 51 stages are presented, and transient and FFT analyses are performed on each design. The paper also covers a range of design considerations, including SCMOS cell design, supply and biasing circuitry, tool-based implementation, and SPICE-based simulation of the multistage oscillator.
{"title":"Transients, FFT analysis and Physical design of multi stage Ring Oscillators using 0.3µm CMOS technology","authors":"Pooja Gupta, Sanket Choudhary, Ayoush Johari","doi":"10.1109/ICACAT.2018.8933668","DOIUrl":"https://doi.org/10.1109/ICACAT.2018.8933668","url":null,"abstract":"Ring Oscillators are basic blocks of key digital modules like PLL, microprocessor or any other time related and memory driven circuits. The purpose of this paper is to compare transients and analyze FFT of multistage ring oscillators and physical design of the same. Additional DRC and NCC checks are applied on the design using 300 nm process technologies. Ring oscillator design presented is driven and controlled by a voltage bias that can be varied by SPICE code. Here a VCO based design is presented that had less floor plan area, lesser power dissipation and consumption and also providing better noise profile which is helpful and applicable for wideband analog mixed signal circuits. Ring Oscillators in this paper, 11 stages, 21 stages and 51 stages are presented. and suitable transients and FFT is done on the design. The paper will also consider a variety of design considerations which are SCMOS cell designs, supply or biasing circuitries and tool based implementation and SPICE based simulations of the multistage oscillator design.","PeriodicalId":6575,"journal":{"name":"2018 International Conference on Advanced Computation and Telecommunication (ICACAT)","volume":"48 1","pages":"1-5"},"PeriodicalIF":0.0,"publicationDate":"2018-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"85659369","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2018-12-01 | DOI: 10.1109/ICACAT.2018.8933810
A. Gadicha, M. Sarode, V. Thakare
Content-based video retrieval is a young field with roots in artificial intelligence, digital signal processing, statistics, and natural language understanding; although researchers draw on all of these fast-growing fields, none of them alone has been able to solve the retrieval problem directly. This paper presents a step-by-step content-based video retrieval (CBVR) mechanism, covering analysis of the entire video, video segmentation, key-frame mining, and feature extraction, for retrieving videos from large video datasets. The proposed system focuses on key-frame mining using an adaptive thresholding algorithm and on the Canny operator for feature extraction. To validate this claim, content-based video retrieval systems were built using color-histogram feature extraction, and different approaches were applied to capture the semantic nature of each frame in the video.
{"title":"Aggregating and Searching frame in Video Using Semantic Analysis","authors":"A. Gadicha, M. Sarode, V. Thakare","doi":"10.1109/ICACAT.2018.8933810","DOIUrl":"https://doi.org/10.1109/ICACAT.2018.8933810","url":null,"abstract":"The idea of Video content reclamation is a youthful field that has its genetics grounded forebears instinctive intelligence, numerical signal rectification, statistics, natural language understanding, If researchers are concentrating all these fast growing fields so none of these parental fields alone antiquated able to directly solve the retrieval problem. In this paper shows the path towards a step by step mechanism of CBVR i.e. analysis of entire video, video segmentation, key frames mining, feature extraction mining for retrieving the video from large video datasets. The proposed system inclination focuses on performing key frame mining using adaptive thresholding algorithm and canny mechanism for feature extraction purpose. In order to legalize this claim, content based video reclamation systems were furnished using color histogram, features extraction and different approaches are applied for the supervision of the semantic temperament of each frame in the video.","PeriodicalId":6575,"journal":{"name":"2018 International Conference on Advanced Computation and Telecommunication (ICACAT)","volume":"50 1","pages":"1-5"},"PeriodicalIF":0.0,"publicationDate":"2018-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"85707046","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2018-12-01 | DOI: 10.1109/ICACAT.2018.8933707
Vrinda Sachdeva, Sachin Gupta
In recent years the usage of information technology has increased unexpectedly, resulting in huge volumes of generated data. Many companies have taken the initiative to use non-relational databases for managing this data, since they offer powerful functions for handling big data. A huge amount of data is generated every day, which can allow malicious information to enter the database. NoSQL databases are gaining popularity due to powerful features such as scalability, flexibility, faster data access, and availability. In this paper, we analyze injection attacks on NoSQL databases and propose a defense method using PHP and JavaScript. MongoDB is one of the most secure and powerful NoSQL databases. We demonstrate a basic NoSQL injection attack and propose a defense method to secure the NoSQL database. In this way, NoSQL database programmers can become aware of the basic NoSQL injection attack mechanism and create a more secure database for storing large volumes of data.
{"title":"Basic NOSQL Injection Analysis And Detection On MongoDB","authors":"Vrinda Sachdeva, Sachin Gupta","doi":"10.1109/ICACAT.2018.8933707","DOIUrl":"https://doi.org/10.1109/ICACAT.2018.8933707","url":null,"abstract":"In recent years the usage of Information technology has unexpectedly increasing resulting in huge data generation. Many companies have taken initiative to use no relational database for managing the data. It has powerful function to manage big data. Huge amount of data has been generated every day. It possibly results in malicious information into the database. No sql database gaining popularity due to its powerful features like scalability, flexibility, faster data access and availability. In this paper, we will analyze the injection on NOSQL database. We also propose defense method by using php and java script. MongoDB is one of the most secure and powerful no sql database. In this paper we demonstrate, basic no sql injection attack and propose defense method to secure the no sql database. In this way, no sql database programmer be aware of the basic no sql injection attack mechanism and create a more secure database to store huge data.","PeriodicalId":6575,"journal":{"name":"2018 International Conference on Advanced Computation and Telecommunication (ICACAT)","volume":"66 1","pages":"1-5"},"PeriodicalIF":0.0,"publicationDate":"2018-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"76472284","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2018-12-01 | DOI: 10.1109/ICACAT.2018.8933624
Jayati Bodkhe, Harshita Dighe, Aparna R. Gupta, Litesh Bopche
Animal identification has become necessary in livestock management at the global level. It is one of the three pillars of traceability, along with premises identification and animal movement. RFID tags transmit animal identification information using radio waves. This paper describes and illustrates the different methods of animal identification, focusing on radio frequency identification and how it works.
{"title":"Animal Identification","authors":"Jayati Bodkhe, Harshita Dighe, Aparna R. Gupta, Litesh Bopche","doi":"10.1109/ICACAT.2018.8933624","DOIUrl":"https://doi.org/10.1109/ICACAT.2018.8933624","url":null,"abstract":"Animal identification has become necessary in livestock management at global level. It is one of the three pillars of trace ability, along with premises identification and movement. RFID tags transmit animal identification information using radio waves. This paper contributes and illustrates the different methods of animal identification focusing on radio frequency identification and how it works.","PeriodicalId":6575,"journal":{"name":"2018 International Conference on Advanced Computation and Telecommunication (ICACAT)","volume":"83 1","pages":"1-4"},"PeriodicalIF":0.0,"publicationDate":"2018-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"76336970","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2018-12-01 | DOI: 10.1109/ICACAT.2018.8933783
A. Gaur, R. Dubey
Customer churn analysis and prediction in the telecom sector is an important problem nowadays, because it is critical for telecommunication companies to analyze the behavior of their customers and predict which of them are about to cancel their subscriptions. Data mining techniques and algorithms therefore play an important role for companies in today's commercial conditions, since acquiring a new customer costs more than retaining an existing one. In this paper we focus on various machine learning techniques for predicting customer churn, building classification models such as Logistic Regression, SVM, Random Forest, and Gradient Boosted Trees, and comparing the performance of these models.
{"title":"Predicting Customer Churn Prediction In Telecom Sector Using Various Machine Learning Techniques","authors":"A. Gaur, R. Dubey","doi":"10.1109/ICACAT.2018.8933783","DOIUrl":"https://doi.org/10.1109/ICACAT.2018.8933783","url":null,"abstract":"Customer churn analysis and prediction in telecom sector is an issue now a days because it’s very important for telecommunication industries to analyze behaviors of various customer to predict which customers are about to leave the subscription from telecom company. So data mining techniques and algorithm plays an important role for companies in today’s commercial conditions because gaining a new customer’s cost is more than retaining the existing ones. In this paper we can focuses on various machine learning techniques for predicting customer churn through which we can build the classification models such as Logistic Regression, SVM, Random Forest and Gradient boosted tree and also compare the performance of these models.","PeriodicalId":6575,"journal":{"name":"2018 International Conference on Advanced Computation and Telecommunication (ICACAT)","volume":"191 1 1","pages":"1-5"},"PeriodicalIF":0.0,"publicationDate":"2018-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"78956372","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2018-12-01 | DOI: 10.1109/ICACAT.2018.8933664
Deepika Rai, N. Chaudhari, M. Ingle
The Graph k-Coloring Problem (GKCP) is a renowned NP-complete problem (NPC) that has received noteworthy contributions from diverse research areas. One important application of GKCP is the Sudoku puzzle, which is also NP-complete. We first encode the problem of solving a Sudoku puzzle of size n × n as an instance of GKCP. We then reduce GKCP to 3-SAT clauses to obtain the solution of the n × n Sudoku puzzle. Encoding a Sudoku puzzle directly into 3-SAT clauses leads to a large number of clauses. We therefore developed an algorithm, SP2G23SAT, based on a 3-SAT formulation of the Sudoku puzzle via GKCP. The algorithm generates [(n²/2) · (3n² + (1 - 2√n) · n - 4) + m] 3-SAT clauses, which can be solved by a SAT solver to obtain the solution of the n × n Sudoku puzzle. This 3-SAT reduction of Sudoku via GKCP provides an efficient way to solve an n × n Sudoku puzzle, as it generates fewer clauses than earlier approaches.
{"title":"An Efficient Algorithmic 3-SAT Formulation for Sudoku Puzzle using Graph Coloring","authors":"Deepika Rai, N. Chaudhari, M. Ingle","doi":"10.1109/ICACAT.2018.8933664","DOIUrl":"https://doi.org/10.1109/ICACAT.2018.8933664","url":null,"abstract":"Graph k-Coloring Problem (GKCP) is a renowned NP Complete Problem (NPC) that has been received noteworthy contribution in diverse research areas. One of the important applications of GKCP is the Sudoku puzzle which is also an NPC. We encoded the problem of solving Sudoku puzzle of size (n$times$n) into GKCP firstly. Further, we reduced GKCP into 3-SAT clauses to obtain the solution of Sudoku puzzle (n$times$ n). Encoding of Sudoku puzzle to 3-SAT clauses straightforwardly leads to large number of clauses. In this way, we developed an algorithm ${SP}_{2}mathrm{G}_{2}3$ SAT on the basis of 3-SAT formulation of Sudoku puzzle using GKCP. This algorithm generates $[(n^{2}/2)^{star}(3n^{2}+(1-2sqrt{n})^{star}n-4)+m]3$-SAT clauses that can be solved by SAT solver to obtain the solution of Sudoku puzzle $(ntimes n)$. 3-SAT reduction of Sudoku puzzle using GKCP provides an efficient way to acquire the solution of Sudoku puzzle of size $(ntimes n)$ as it generates fewer clauses than earlier approach.","PeriodicalId":6575,"journal":{"name":"2018 International Conference on Advanced Computation and Telecommunication (ICACAT)","volume":"40 1","pages":"1-6"},"PeriodicalIF":0.0,"publicationDate":"2018-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"80506953","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2018-12-01 | DOI: 10.1109/ICACAT.2018.8933629
S. Vijayalakshmi, S. Muruganand
A Gaussian mixture model, the Lucas-Kanade (LK) optical flow method, and background subtraction are used to extract fire and smoke regions from the foreground of video images. Multiple features of fire are used to extract this information. Color features of suspected regions are extracted according to the RGB and HSI color models. A background-blur feature is extracted using the two-dimensional discrete wavelet transform: if smoke appears in the scene, the contour edges of the background become blurry. The motion-direction feature is extracted using the LK optical flow method and the Gaussian mixture model. A DHT11 digital temperature and humidity sensor in the sensor node provides temperature and humidity measurements, and a TI MSP430 microcontroller processes the information. The information extracted by the video node and the sensor node is combined to detect the possibility of fire in the area even during the worst seasonal conditions. With this method, the accuracy of fire and smoke detection is improved even in adverse environmental conditions such as rainy weather. Simulated and experimental results show that the proposed method improves accuracy and detection rate. Combining sensor output and video output gives excellent results in finding smoke or fire in videos, and reduces the false detection rate when distinguishing smoke videos from non-smoke videos. The method can be used in large outdoor environments.
{"title":"Fire Recognition Based on Sensor node and Feature of Video Smoke","authors":"S. Vijayalakshmi, S. Muruganand","doi":"10.1109/ICACAT.2018.8933629","DOIUrl":"https://doi.org/10.1109/ICACAT.2018.8933629","url":null,"abstract":"Gaussian mixed model, LK optical flow method and background subtraction from foreground method are used to extract the fire and smoke region in foreground of video image. Multi feature of fire characteristics are used to extract the information. Colour feature of suspected region are extracted according to the colour model RGB and HSI spaces. Background blur feature is extracted using two dimensional discrete wavelet transform. If smoke appears in scene, the contour edge of the background would become blurry. The motion direction feature is extracted using LK optical flow method and gaussion mixed model. The DHT 11 digital temperature - humidity sensor in sensor node is used to extract temperature and humidity values for measurement and TIMSP430 micro controller for processing the information. The video node and sensor node extracted information are combined to detect the possibility of fire in the area during worst season conditions. By this method, the accuracy of fire and smoke detection is improved even in the worst environmental condition such as rainy weather. From the simulated and experimental results, the proposed method improves the accuracy and detection rate. Combination of sensor output and video output give excellent value in finding smoke or fire from videos. They reduces false detection rate of detecting smoke from non-smoke videos. It can be used in outdoor large environment.","PeriodicalId":6575,"journal":{"name":"2018 International Conference on Advanced Computation and Telecommunication (ICACAT)","volume":"1021 1","pages":"1-7"},"PeriodicalIF":0.0,"publicationDate":"2018-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"74140080","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2018-12-01 | DOI: 10.1109/ICACAT.2018.8933579
Shreerag Marar, Debabrata Swain, Vivek Hiwarkar, Nikhil Motwani, A. Awari
Parkinson's is a disease that directly degrades the functioning of the central nervous system, more specifically the motor system. If diagnosed at a late stage, the disease may become incurable; hence, it is necessary to diagnose it at an early stage. Voice frequency plays a vital role in the prediction of Parkinson's disease. This paper presents a study of the diagnosis of Parkinson's disease using various machine learning algorithms applied to voice data obtained from the UCI repository [1]. The voice dataset consists of voice frequencies of 31 people with early-stage Parkinson's disease recruited to a six-month trial of a telemonitoring device for remote symptom progression monitoring. Various machine learning algorithms were applied to the dataset; among them, an ANN showed the highest accuracy (94.87%), Random Forest, a classification algorithm, showed good accuracy (87.17%), and Naïve Bayes showed the lowest accuracy (71.79%). We summarize all the results using confusion matrices.
{"title":"Predicting the occurrence of Parkinson’s Disease using various Classification Models","authors":"Shreerag Marar, Debabrata Swain, Vivek Hiwarkar, Nikhil Motwani, A. Awari","doi":"10.1109/ICACAT.2018.8933579","DOIUrl":"https://doi.org/10.1109/ICACAT.2018.8933579","url":null,"abstract":"Parkinson is a disease that directly degrades the functioning of central nervous system, more specifically the motor system. If diagnosed in a later stage, this disease may become incurable. Hence, it is necessary to diagnose the disease at an early stage. Voice frequency plays a vital role in the prediction of Parkinson disease. This paper presents the study for the diagnosis of Parkinson disease using various machine learning algorithms through the amount of voice data attained from UCI repository [1]. The voice dataset consists of voice frequencies of 31 people with early-stage Parkinson's disease recruited to a six-month trial of a telemonitoring device for remote symptom progression monitoring. Various machine learning algorithms were applied on the dataset and among them ANN has shown highest accuracy (94.87%). Random Forest which is a Classification algorithm has shown good accuracy (87.17%) while Naïve Bayes has shown least accuracy (71.79%). We have summarized all the results using the confusion matrix.","PeriodicalId":6575,"journal":{"name":"2018 International Conference on Advanced Computation and Telecommunication (ICACAT)","volume":"167 1","pages":"1-5"},"PeriodicalIF":0.0,"publicationDate":"2018-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"85175638","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2018-12-01 | DOI: 10.1109/ICACAT.2018.8933739
J. Mehra, R. S. Thakur
The WWW is a huge repository of information that is growing exponentially. More and more people visit various web sites and search engines to find relevant information. Providing this huge amount of information is not the problem; the problem is that, day by day, more and more people with different needs and requirements search through this huge WWW, get lost in complex web structures, and hence miss their search goals. Web personalization can be the solution to this problem. Web personalization is the process in which web site contents are tailored to the needs of a user. For personalization, interesting access patterns can be mined from web usage data. In many web personalization applications, dynamic recommendations of items are made based on a user's browsing behavior and profile. With the continuing explosion of e-commerce, there is strong competition among companies and other sectors to attract customers. Web server analysis, although difficult, helps reveal web user behavior for an organization and is useful for future web site improvement and design. This paper proposes a fuzzy dynamic approach for finding web user session clusters from web log data. Direct elimination of small estimated sessions may bring about the loss of an essential amount of data, especially when small sessions are large in number; the proposed "Fuzzy Dynamic" approach is designed to manage this issue.
{"title":"Efficiently reducing the size of web log data using Fuzzy Dynamic Approach","authors":"J. Mehra, R. S. Thakur","doi":"10.1109/ICACAT.2018.8933739","DOIUrl":"https://doi.org/10.1109/ICACAT.2018.8933739","url":null,"abstract":"WWW is a huge repository of information which is growing exponentially. More and more people visit various web sites and search engines to find relevant information. To provide the huge information is not the problem, but the problem is that day by day more and more people having different needs and requirements search through this huge WWW and get lost in complex web structures and hence miss their inquiry goals. Web personalization can be the solution to this problem. Web personalization is the process where web site contents are tailored as per the needs of a user. For the personalization, the interesting access patterns can be mined from web usage data. In many applications of web personalization, dynamic recommendations of items are made based on user's browsing behavior and his/her profile. The regular explosion of e-Commerce, there is strong competition amongst companies and other sectors to be a focus for the customers. Web server analysis is very difficult to find out the web user behavior for any organization. It is useful for future web site improvement and design. In this paper proposed a Fuzzy dynamic approach for finding the web user session clusters from web log data. Direct elimination of the small-sized estimated sessions may bring about loss of an essential measure of data specially when small session large in number. This proposes a \"Fuzzy Dynamic\" approach to deal with manage this issue.","PeriodicalId":6575,"journal":{"name":"2018 International Conference on Advanced Computation and Telecommunication (ICACAT)","volume":"26 2 1","pages":"1-5"},"PeriodicalIF":0.0,"publicationDate":"2018-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"80384923","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2018-12-01 | DOI: 10.1109/ICACAT.2018.8933587
Dilip Singh Sisodia, Namrata Verma
HTTP flood attacks are carried out through enormous numbers of HTTP requests generated by automated software agents within a short period. The application layer is particularly vulnerable to HTTP flood attacks, which exhaust the computing and communication resources of the web server and disrupt its web services. All HTTP requests are stored at the server in a web log file. However, malicious automated software agents camouflage their behavior in the web server logs, which makes detecting their HTTP requests a great challenge. It is assumed that the navigational behavior of actual visitors and that of automated software agents are fundamentally different. In this paper, a framework for weblog preprocessing and for extracting various predefined features from raw web server logs is implemented. The most effective features, which are potentially useful in differentiating legitimate users from automated software agents, are identified. The sessionized HTTP feature vectors are also labeled as actual visitors or possible web robots. The experiments are performed on raw weblogs of a commercial web portal.
{"title":"Framework for Preprocessing and Feature Extraction from Weblogs for Identification of HTTP Flood Request Attacks","authors":"Dilip Singh Sisodia, Namrata Verma","doi":"10.1109/ICACAT.2018.8933587","DOIUrl":"https://doi.org/10.1109/ICACAT.2018.8933587","url":null,"abstract":"The HTTP flood attacks are carried out through enormous HTTP requests generated by automated software agents within a short period. The application layer is more vulnerable to HTTP flood attacks and exhausted computing and communication resources of the web server to disrupt the different web services. All HTTP requests are stored at the server as a web log file. However, malicious automated software agents camouflage their behavior on the web server logs and pose a great challenge to detect their HTTP requests. It is assumed that navigational behavior of actual visitors and automated software agents are fundamentally different. In this paper, a framework for weblog preprocessing and extracting various predefined features from raw web server logs is implemented. The most effective features are identified which are potentially useful in differentiating legitimate users and automated software agents. The sessionized HTTP feature vectors are also labeled as an actual visitor or possible web robots. The experiments are performed on raw weblogs of a commercial web portal.","PeriodicalId":6575,"journal":{"name":"2018 International Conference on Advanced Computation and Telecommunication (ICACAT)","volume":"64 1","pages":"1-4"},"PeriodicalIF":0.0,"publicationDate":"2018-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"84008185","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}