Pub Date: 2013-04-15 | DOI: 10.1109/ICPRIME.2013.6496509
S. Priyadarshini, S. Karthik
Agents are probably the fastest growing area in information technology. A software agent is a piece of software that functions as an agent for a user or another program, working autonomously and independently in a particular environment; agents have the ability to migrate to any system, perform tasks, and return the results. The field of software agents is a broad and rapidly developing area of research, encompassing a diverse range of topics and interests. A great deal of research draws on functional similarities with the human immune system to make agents more adaptable with respect to security. An agent-based system has been proposed that uses functional commonalities of the human nervous system to adapt to threats such as host attacks. The agents become aware of malicious hosts' attacks through learning, and coordination is maintained by a co-agent to make the system work successfully; the ideas of learning and coordination are taken from the functioning of the human nervous system. This system has shown better performance by making the agents aware of malicious hosts and by producing a limited number of clones.
{"title":"Analysis of agent based system in agile methodology","authors":"S. Priyadarshini, S. Karthik","doi":"10.1109/ICPRIME.2013.6496509","DOIUrl":"https://doi.org/10.1109/ICPRIME.2013.6496509","url":null,"abstract":"Agents are probably the fastest growing area in Information Technology. A software agent is a piece of software that functions as an agent for a user or another program, working autonomously and independently in a particular environment. As agents has ability to migrate to any system, perform the tasks and return the results. The field of software agents is a broad and rapidly developing area of research, which encompasses a diverse range of topics and interests. Huge number of researches is going on by comparing the functional similarities of the Human Immune System for making the agents more adaptable with respect to security. An agent-based system was proposed by using functional commonalities of human Nervous system to adapt themselves to threats such as host attack. The agents become aware of the malicious hosts' attack by learning and coordination is maintained by a Co-Agent to make this system work successfully. The idea of learning and coordination are taken from the Human Nervous system functionality. 
This system has shown a better functioning in maintaining the system performance by making the agents aware of malicious hosts and by producing limited number of clones.","PeriodicalId":123210,"journal":{"name":"2013 International Conference on Pattern Recognition, Informatics and Mobile Engineering","volume":"3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131091842","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2013-04-15 | DOI: 10.1109/ICPRIME.2013.6496459
R. Nithiavathy
Cloud computing is an emerging technology that has gained considerable attention in the current IT world and is often called the next big thing in computing after the Internet. Though its benefits are huge, such a service also means users relinquish physical possession of their outsourced data, which inevitably poses new security risks to the correctness of the data in the cloud. To address this problem and achieve a secure and dependable cloud storage service, we propose in this paper a flexible distributed storage integrity auditing mechanism that utilizes homomorphic tokens and distributed erasure-coded data for dynamically stored data. The proposed design allows users or a third-party auditor to audit the cloud storage with very lightweight communication and low computation cost.
The auditing result ensures the correctness of data in cloud storage and simultaneously achieves fast data error localization, i.e., identifying which server is misbehaving. The scheme also supports efficient dynamic operations on outsourced data, including block modification, deletion, and append, and is resilient against Byzantine failures, malicious data modification attacks, and even server-colluding attacks.
{"title":"Data integrity and data dynamics with secure storage service in cloud","authors":"R. Nithiavathy","doi":"10.1109/ICPRIME.2013.6496459","DOIUrl":"https://doi.org/10.1109/ICPRIME.2013.6496459","url":null,"abstract":"Cloud computing is an emerging technology that has gained considerable attention in the current IT world and is often called the next big thing in computing after the Internet. Though its benefits are huge, such a service also means users relinquish physical possession of their outsourced data, which inevitably poses new security risks to the correctness of the data in the cloud. To address this problem and achieve a secure and dependable cloud storage service, we propose in this paper a flexible distributed storage integrity auditing mechanism that utilizes homomorphic tokens and distributed erasure-coded data for dynamically stored data. The proposed design allows users or a third-party auditor to audit the cloud storage with very lightweight communication and low computation cost. The auditing result ensures the correctness of data in cloud storage and simultaneously achieves fast data error localization, i.e., identifying which server is misbehaving. The scheme also supports efficient dynamic operations on outsourced data, including block modification, deletion, and append, and is resilient against Byzantine failures, malicious data modification attacks, and even server-colluding attacks.","PeriodicalId":123210,"journal":{"name":"2013 International Conference on Pattern Recognition, Informatics and Mobile Engineering","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125363172","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
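The precomputed-token auditing idea in the abstract above can be pictured with a toy challenge-response scheme. This is a minimal sketch only: it uses plain HMACs over deterministic index sets rather than the paper's homomorphic tokens and erasure-coded layout, and all names are invented for illustration.

```python
import hashlib
import hmac
import os

def make_tokens(key: bytes, blocks: list, n_rounds: int):
    # Audit round r challenges block indices r, r + n_rounds, r + 2*n_rounds, ...
    # so every block is covered by some round (a stand-in for random challenges).
    tokens = []
    for r in range(n_rounds):
        idx = list(range(r, len(blocks), n_rounds))
        mac = hmac.new(key, b"".join(blocks[i] for i in idx), hashlib.sha256).digest()
        tokens.append((idx, mac))
    return tokens

def audit(key: bytes, stored: list, token) -> bool:
    # The server recomputes the MAC over its stored copies of the challenged
    # blocks; a mismatch localizes misbehavior to this round's block indices.
    idx, expected = token
    got = hmac.new(key, b"".join(stored[i] for i in idx), hashlib.sha256).digest()
    return hmac.compare_digest(got, expected)

blocks = [os.urandom(32) for _ in range(8)]
key = os.urandom(16)
tokens = make_tokens(key, blocks, n_rounds=4)
print(all(audit(key, blocks, t) for t in tokens))  # an intact server passes every round
```

A tampered block then fails exactly the rounds whose index sets cover it, which is the error-localization property the abstract describes.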
Pub Date: 2013-04-15 | DOI: 10.1109/ICPRIME.2013.6496486
M. Kalaiselvan, A. V. Kathiravan
Text summarization condenses the source text into a shorter version while preserving its information content and overall meaning. It is very difficult for human beings to manually summarize large text documents. The text summarization method consists of selecting important sentences, paragraphs, etc. from the original document and converting them into a star map. The importance of sentences is decided based on their statistical and linguistic features. The star map summarization method consists of understanding the original text and re-telling it in fewer words. The star map is used for duplicate elimination, exam paper evaluation, lesson planning, and shingling identification. A pioneering tool for text summarization using a star map is presented.
{"title":"A pioneering tool for text summarization using star map","authors":"M. Kalaiselvan, A. V. Kathiravan","doi":"10.1109/ICPRIME.2013.6496486","DOIUrl":"https://doi.org/10.1109/ICPRIME.2013.6496486","url":null,"abstract":"Text Summarization is condensing the source text into a shorter version preserving its information content and overall meaning. It is very difficult for human beings to manually summarize large documents of text. The text summarization method consists of selecting important sentences, paragraphs etc. from the original document and converting them into a star map. The importance of sentences is decided based on statistical and linguistic features of sentences. A star map summarization method consists of understanding the original text and re-telling it in fewer words. This star map is used for duplicate elimination, exam paper evaluator, lesson planning, Identify Shingling. A Pioneering Tool For Text Summarization using Star Map has been presented.","PeriodicalId":123210,"journal":{"name":"2013 International Conference on Pattern Recognition, Informatics and Mobile Engineering","volume":"39 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124097790","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
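As a concrete stand-in for the sentence-selection step described above, the following sketch scores sentences by word frequency and keeps the top-ranked ones in document order. It illustrates only generic statistical scoring, not the star-map construction itself; the function names are invented.

```python
import re
from collections import Counter

def summarize(text: str, n_sentences: int = 2) -> str:
    # Split into sentences, score each by the average corpus frequency of its
    # words, and emit the top-scoring sentences in their original order.
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))
    def score(s: str) -> float:
        toks = re.findall(r"[a-z']+", s.lower())
        return sum(freq[t] for t in toks) / max(len(toks), 1)
    top = sorted(sentences, key=score, reverse=True)[:n_sentences]
    return " ".join(s for s in sentences if s in top)
```

Linguistic features (cue phrases, sentence position) would be added as further terms in `score`.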
Pub Date: 2013-04-15 | DOI: 10.1109/ICPRIME.2013.6496718
I. L. Aroquiaraj, laurence. raj
Feature Selection (FS) aims to determine a minimal feature subset from a problem domain while retaining suitably high accuracy in representing the original features. Rough set theory (RST) has been used as such a tool with much success. In supervised FS methods, various feature subsets are evaluated using an evaluation function or metric to select only those features that are related to the decision classes of the data under consideration. However, for many data mining applications, decision class labels are often unknown or incomplete, which indicates the significance of unsupervised feature selection; in unsupervised learning, decision class labels are not provided. The problem is that not all features are important: some may be redundant, and others may be irrelevant or noisy. In this paper, a novel unsupervised feature selection method for mammogram images, based on a tolerance rough set relative reduct, is proposed and compared with the Tolerance Quick Reduct and PSO-Relative Reduct unsupervised feature selection methods. A typical mammogram image processing system generally consists of image acquisition, pre-processing, segmentation, feature extraction, feature selection, and classification. The proposed method reduces the extracted feature set and is compared with existing unsupervised feature selection methods. The proposed method is evaluated through clustering and classification algorithms using K-means and WEKA.
{"title":"Mammogram image feature selection using unsupervised tolerance rough set relative reduct algorithm","authors":"I. L. Aroquiaraj, laurence. raj","doi":"10.1109/ICPRIME.2013.6496718","DOIUrl":"https://doi.org/10.1109/ICPRIME.2013.6496718","url":null,"abstract":"Feature Selection (FS) aims to determine a minimal feature subset from a problem domain while retaining a suitably high accuracy in representing the original features. Rough set theory (RST) has been used as such a tool with much success. In the supervised FS methods, various feature subsets are evaluated using an evaluation function or metric to select only those features which are related to the decision classes of the data under consideration. However, for many data mining applications, decision class labels are often unknown or incomplete, thus indicating the significance of unsupervised feature selection. However, in unsupervised learning, decision class labels are not provided. The problem is that not all features are important. Some of the features may be redundant, and others may be irrelevant and noisy. In this paper, a novel unsupervised feature selection in mammogram image, using tolerance rough set based relative reduct is proposed. And also, compared with Tolerance Quick Reduct and PSO - Relative Reduct unsupervised feature selection methods. A typical mammogram image processing system generally consists of mammogram image acquisition, pre-processing of image segmentation, feature extraction, feature selection and classification. The proposed method is used to reduce features from the extracted features and the method is compared with existing unsupervised features selection methods. 
The proposed method is evaluated through clustering and classification algorithms in K-means and WEKA.","PeriodicalId":123210,"journal":{"name":"2013 International Conference on Pattern Recognition, Informatics and Mobile Engineering","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131035859","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
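A minimal sketch of the tolerance-rough-set idea behind such a reduct: two objects are indiscernible when all attribute values in a feature subset differ by at most a tolerance threshold, and a feature is dispensable when removing it leaves that relation unchanged. The backward-elimination loop and the threshold value here are illustrative assumptions, not the paper's exact algorithm.

```python
def tolerance_pairs(data, features, tol=0.1):
    # Two objects are tolerant (indiscernible) on `features` when every
    # attribute value differs by at most `tol`.
    n = len(data)
    return {(i, j) for i in range(n) for j in range(n)
            if all(abs(data[i][f] - data[j][f]) <= tol for f in features)}

def relative_reduct(data, tol=0.1):
    # Backward elimination: drop a feature when removing it leaves the
    # tolerance relation of the full feature set unchanged.
    features = list(range(len(data[0])))
    full = tolerance_pairs(data, features, tol)
    reduct = features[:]
    for f in features:
        trial = [g for g in reduct if g != f]
        if trial and tolerance_pairs(data, trial, tol) == full:
            reduct = trial
    return reduct
```

On mammogram data the rows would be normalized feature vectors extracted from the images, and the surviving indices are the selected features.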
Pub Date: 2013-04-15 | DOI: 10.1109/ICPRIME.2013.6496725
S. U. Kumar, H. Inbarani, S. S. Kumar
Classification is one of the main issues in data mining research. Classification problems in the medical domain frequently classify medical datasets based on the results of medical diagnoses or descriptions of medical treatments given by medical specialists. The extensive amounts of information and data warehoused in medical databases require the development of specialized tools for storing, retrieving, analyzing, and effectively using stored knowledge and data. Intelligent methods such as neural networks, fuzzy sets, decision trees, and expert systems are, slowly but steadily, being applied in medical fields. Recently, bijective soft set theory has been proposed as a new intelligent technique for the discovery of data dependencies, data reduction, classification, and rule generation from databases. In this paper, we present a novel approach based on bijective soft sets for generating classification rules from a dataset. Experimental results from applying bijective soft set analysis to sample datasets are given and evaluated. In addition, the generated rules are compared to the well-known decision tree classifier and Naïve Bayes. The study illustrates that bijective soft set theory appears to be a valuable tool for inductive learning and provides valuable support for building expert systems.
{"title":"Bijective soft set based classification of medical data","authors":"S. U. Kumar, H. Inbarani, S. S. Kumar","doi":"10.1109/ICPRIME.2013.6496725","DOIUrl":"https://doi.org/10.1109/ICPRIME.2013.6496725","url":null,"abstract":"Classification is one of the main issues in Data Mining Research fields. The classification difficulties in medical area frequently classify medical dataset based on the result of medical diagnosis or description of medical treatment by the medical specialist. The Extensive amounts of information and data warehouse in medical databases need the development of specialized tools for storing, retrieving, investigation, and effectiveness usage of stored knowledge and data. Intelligent methods such as neural networks, fuzzy sets, decision trees, and expert systems are, slowly but steadily, applied in the medical fields. Recently, Bijective soft set theory has been proposed as a new intelligent technique for the discovery of data dependencies, data reduction, classification and rule generation from databases. In this paper, we present a novel approach based on Bijective soft sets for the generation of classification rules from the data set. Investigational results from applying the Bijective soft set analysis to the set of data samples are given and evaluated. In addition, the generated rules are also compared to the well-known decision tree classifier algorithm and Naïve bayes. 
The learning illustrates that the theory of Bijective soft set seems to be a valuable tool for inductive learning and provides a valuable support for building expert systems.","PeriodicalId":123210,"journal":{"name":"2013 International Conference on Pattern Recognition, Informatics and Mobile Engineering","volume":"10 4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114489803","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
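The rule-generation step can be pictured with a plain decision-table reduction: group records by their condition-attribute values and emit the majority class per group. This is a generic illustration only — it does not implement the bijective soft set operators the paper uses — and the attribute names are invented.

```python
from collections import Counter

def generate_rules(rows, target):
    # One rule per distinct combination of condition-attribute values:
    # IF those values match THEN the majority class of the matching records.
    groups = {}
    for row in rows:
        key = tuple(sorted((a, v) for a, v in row.items() if a != target))
        groups.setdefault(key, []).append(row[target])
    return {key: Counter(labels).most_common(1)[0][0] for key, labels in groups.items()}
```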
Pub Date: 2013-04-15 | DOI: 10.1109/ICPRIME.2013.6496473
S. Prabha, R. Anitha
With the increasing trend of application services in large-scale Internet scenarios over both wired and wireless interfaces, attempts to restrain application services through Distributed Denial of Service (DDoS) attacks have become a prominent issue. Most present DDoS resistance methods work on application services in wired and wireless networks individually; no method has so far addressed both kinds of networks together. Since present Internet application services must switch between wired and wireless platforms, a well-matched resistance method against DDoS attacks has to be devised for better security, which is the present requirement in this environment. With these issues in mind, the proposed model develops a counter mechanism to mitigate the potency of resource attacks and evaluates its efficacy. An Application Service Network Request Identification (ASNRI) scheme is presented to provide a clear demarcation between wired and wireless service requests, which are then fed to a Bayes packet classifier to identify their associated denial-of-service attack characteristics. Based on the classifier's output, resistance filters are activated to restrict denial-of-service attacks on the respective platform, i.e., wired or wireless. The simulation of the proposed ASNRI scheme is conducted in the NS-2 simulator to show its effectiveness in restricting DDoS attacks in terms of response time, application service throughput, and load variance at the application server.
{"title":"Mitigation of application DDoS attacks using ASNRI scheme for IP and MAC frames","authors":"S. Prabha, R. Anitha","doi":"10.1109/ICPRIME.2013.6496473","DOIUrl":"https://doi.org/10.1109/ICPRIME.2013.6496473","url":null,"abstract":"With increasing trend in application services on large-scale internet scenario of both wired and wireless interface, intimidation to restrain the application service by Distributed Denial of Service (DDoS) attacks become a high-flying issue. Most of the present DDoS attacks resistance method work on application services in wired network and wireless network individually. No method is offered herewith for the two kinds of networks up to now. Though the present internet application services must switch between wired and wireless platform, well-matched resistance method for Distributed Denial of Service attacks have to be coined for better security which is the present requirement in the environment. With these issues in mind, the proposed model develops counter mechanism to mitigate the potency of the resource attacks and evaluate the efficacy. Application Service Network Request Identification (ASNRI) scheme is presented to provide an apparent demarcation of wired service and wireless services request, which is then fed to the Bayes packet classifier for its associated denial of service attack characteristics. From the Bayes packet classifier, resistance filters are stimulated to restrict denial of service attacks in the respective platform, that is., wired or wireless. 
The simulation of the proposed ASNRI scheme is conducted with NS-2 simulator to show its effectiveness of restricting Distributed Denial of Service attacks in terms of RESPONSE TIME, APPLICATION SERVICE THROUGHPUT, LOAD VARIANCE in the application server.","PeriodicalId":123210,"journal":{"name":"2013 International Conference on Pattern Recognition, Informatics and Mobile Engineering","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129336440","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
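The Bayes-classifier stage can be sketched as a tiny categorical naive Bayes with add-one smoothing. The features here (platform, request-rate bucket) are invented for illustration; the paper does not specify its feature set, so treat this as a sketch of the classification idea only.

```python
import math
from collections import Counter, defaultdict

class NaiveBayesPacketClassifier:
    # Categorical naive Bayes over per-request features, add-one smoothed.

    def fit(self, X, y):
        self.class_counts = Counter(y)
        self.total = len(y)
        self.value_counts = defaultdict(Counter)  # (class, feature index) -> value counts
        for xi, yi in zip(X, y):
            for j, v in enumerate(xi):
                self.value_counts[(yi, j)][v] += 1
        return self

    def predict(self, xi):
        def log_posterior(c):
            lp = math.log(self.class_counts[c] / self.total)
            for j, v in enumerate(xi):
                cnt = self.value_counts[(c, j)]
                # add-one smoothing, with one extra slot for unseen values
                lp += math.log((cnt[v] + 1) / (sum(cnt.values()) + len(cnt) + 1))
            return lp
        return max(self.class_counts, key=log_posterior)
```

The classifier's output would drive the per-platform resistance filters described above.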
Pub Date: 2013-04-15 | DOI: 10.1109/ICPRIME.2013.6496478
B. Kurhade, M. Kshirsagar
E-voting systems are important tools for community participation in essential decisions of society. Compared with traditional voting systems, e-voting systems have special advantages. Any e-voting system is based on an e-voting protocol. The applied pi calculus, a language for describing concurrent processes and their interactions, is used to formalise the protocol. Properties of processes described in the applied pi calculus can be proved by manual techniques or by automated tools such as ProVerif. A potentially much more secure system could be implemented based on formal protocols that specify the messages sent to electronic voting machines. Such protocols have been studied for several decades; they offer the possibility of abstract analysis of a protocol against formally stated properties. Formal verification techniques are notoriously difficult to design and analyse, and our aim is to use such verification techniques to analyse the protocol. This review paper focuses on modelling a known election protocol, Borda, in the applied pi calculus, and on formalizing some of its expected properties, namely eligibility, fairness, receipt-freeness, individual verifiability, and privacy. The applied pi calculus has a family of proof techniques, is supported by the ProVerif tool, and has been used to analyse a variety of security protocols.
{"title":"Formalization and analysis of Borda protocol using pi calculus","authors":"B. Kurhade, M. Kshirsagar","doi":"10.1109/ICPRIME.2013.6496478","DOIUrl":"https://doi.org/10.1109/ICPRIME.2013.6496478","url":null,"abstract":"E-voting systems are important tools for community participation in essential decisions of society. In comparison with traditional voting systems, e-voting systems have special advantages. Any e-voting system is based on an e-voting protocol. The applied pi calculus is a language used to formalise the protocol. It is a language for describing concurrent processes and their intersections. Properties of processes described in the applied pi calculus can be proved by employing manual techniques or by automated tool such as proverif. A potentially much more secure system could be implemented, based on formal protocols that specify the messages sent to electronic voting machines. Such protocols have been studied for several decades. They offer the possibility of abstract analysis of protocol against formally stated properties. Formal verification techniques are notoriously difficult to design and analyse. Our aim is use verification technique to analyse the protocol. This review paper focus on modelling a known protocol for elections known as BORDA in the applied pi calculus, and this paper also focus on formalizing some of its expected properties, namely eligibility, fairness, Receipt freeness, individual verifiability and privacy. 
The applied pi calculus has a family of proof techniques which we can use is supported by the proverif tool and has been used to analyse a variety of security protocols.","PeriodicalId":123210,"journal":{"name":"2013 International Conference on Pattern Recognition, Informatics and Mobile Engineering","volume":"28 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132973909","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2013-04-15 | DOI: 10.1109/ICPRIME.2013.6496471
M. Ilayaraja, T. Meyyappan
Data mining is the process of analyzing huge volumes of data from different perspectives and summarizing them into useful information, which can be converted into knowledge about historical patterns and future trends. Data mining plays a significant role in the field of information technology. The healthcare industry today generates large amounts of complex data about patients, hospital resources, diseases, diagnosis methods, electronic patient records, etc. Data mining techniques are very useful for making medical decisions in curing diseases. The healthcare industry collects huge amounts of healthcare data which, unfortunately, are not "mined" to discover hidden information for effective decision making. The discovered knowledge can be used by healthcare administrators to improve the quality of service. In this paper, the authors develop a method to identify the frequency of diseases in a particular geographical area over a given time period with the aid of the association rule based Apriori data mining technique.
{"title":"Mining medical data to identify frequent diseases using Apriori algorithm","authors":"M. Ilayaraja, T. Meyyappan","doi":"10.1109/ICPRIME.2013.6496471","DOIUrl":"https://doi.org/10.1109/ICPRIME.2013.6496471","url":null,"abstract":"The data mining is a process of analyzing a huge data from different perspectives and summarizing it into useful information. The information can be converted into knowledge about historical patterns and future trends. Data mining plays a significant role in the field of information technology. Health care industry today generates large amounts of complex data about patients, hospitals resources, diseases, diagnosis methods, electronic patients records, etc,. The data mining techniques are very useful to make medicinal decisions in curing diseases. The healthcare industry collects huge amount of healthcare data which, unfortunately, are not “mined” to discover hidden information for effective decision making. The discovered knowledge can be used by the healthcare administrators to improve the quality of service. In this paper, authors developed a method to identify frequency of diseases in particular geographical area at given time period with the aid of association rule based Apriori data mining technique.","PeriodicalId":123210,"journal":{"name":"2013 International Conference on Pattern Recognition, Informatics and Mobile Engineering","volume":"192 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133527956","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
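The frequent-disease mining described above rests on the classic Apriori level-wise search, which can be sketched as follows. The transactions here are toy sets of disease labels per patient record, not the paper's data.

```python
from itertools import combinations

def apriori(transactions, min_support):
    # Level-wise search: frequent itemsets of size k are joined to build
    # size-(k+1) candidates, pruned by the Apriori property that every
    # subset of a frequent itemset must itself be frequent.
    def support(itemset):
        return sum(itemset <= t for t in transactions) / len(transactions)

    frequent = {}
    k_sets = {frozenset([i]) for t in transactions for i in t}
    k_sets = {s for s in k_sets if support(s) >= min_support}
    while k_sets:
        frequent.update({s: support(s) for s in k_sets})
        k = len(next(iter(k_sets))) + 1
        # join step: unions of frequent itemsets that yield size-k candidates
        candidates = {a | b for a in k_sets for b in k_sets if len(a | b) == k}
        # prune step: keep candidates whose every (k-1)-subset is frequent
        k_sets = {c for c in candidates
                  if all(frozenset(sub) in frequent for sub in combinations(c, k - 1))
                  and support(c) >= min_support}
    return frequent
```

Grouping records by area and time period before mining would yield the per-region, per-period disease frequencies the paper targets.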
Pub Date: 2013-04-15 | DOI: 10.1109/ICPRIME.2013.6496464
G. Indirani, K. Selvakumar, V. Sivaaamasundari
Due to its nature, a MANET is much more vulnerable to attack than a wired network. As a result, an Intrusion Detection System (IDS) for MANETs is designed to detect anomalous behavior and misuse. Swarm intelligence is a relatively novel field; its algorithms, like the natural systems that inspire them, show the desirable properties of being adaptive, scalable, and robust. The objective of this project is to detect and defend against the packet replication attack. The packet replication problem is reproduced in the NS2 simulator for a MANET: each data packet is replicated in the IP layer at its source, the number of replications is determined by the number of route entries for the destination, each data packet is uniquely identified by the tree-id provided by NS2, and redundant packets are discarded in the IP layer at the receiver. The performance of the intrusion detection system is verified using parameters such as throughput, packets received, packet delivery ratio, end-to-end latency, and bandwidth.
{"title":"Intrusion detection and defense mechanism for packet replication attack over MANET using swarm intelligence","authors":"G. Indirani, K. Selvakumar, V. Sivaaamasundari","doi":"10.1109/ICPRIME.2013.6496464","DOIUrl":"https://doi.org/10.1109/ICPRIME.2013.6496464","url":null,"abstract":"Due to the nature of a MANET, it is much more vulnerable to attack than a wired network. As a result, an Intrusion Detection System (IDS) for MANETs is designed to detect anomalous behavior and misuse. Swarm intelligence is a relatively novel field. These algorithms, like their natural systems of inspiration, show the desirable properties of being adaptive, scalable, and robust. The objective of this project is to detect and perform defense mechanism for packet replication attack. A Packet replication problem reduced by the NS2 simulator for MANET, each data packet is replicated in the IP layer at its source. The number of replication times is determined by the number of route entries for the destination, each data packet is uniquely identified with the tree-id provided by NS2 and the redundant packets are discarded in the IP layer at the receiver. The Performance of the Intrusion Detection system is verified using the parameters like Throughput, Packet-Received, Packet-Delivery Ratio, End-to-End Latency, Bandwidth and Latency.","PeriodicalId":123210,"journal":{"name":"2013 International Conference on Pattern Recognition, Informatics and Mobile Engineering","volume":"52 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131538159","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
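The receiver-side discard step of the scheme above amounts to delivering the first copy of each unique packet id and dropping later replicas. A minimal sketch, with field names invented for illustration:

```python
def deduplicate(packets):
    # Deliver the first packet seen for each unique id (the simulator's
    # tree-id in the paper); drop every later replica.
    seen, delivered = set(), []
    for pkt in packets:
        if pkt["id"] not in seen:
            seen.add(pkt["id"])
            delivered.append(pkt)
    return delivered
```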
Pub Date: 2013-04-15 | DOI: 10.1109/ICPRIME.2013.6496441
S. Bhatt, T. Santhanam
The latest trend in authenticating users is to exploit the potential of biometrics. Keystroke dynamics is a behavioural biometric that captures the typing rhythms of users and then authenticates them based on the dynamics captured. In this paper, a detailed study of the evolution of keystroke dynamics as a means of authentication is carried out. The paper gives an insight into the domain from its infancy to current work, which can be used by researchers working on this topic.
{"title":"Keystroke dynamics for biometric authentication — A survey","authors":"S. Bhatt, T. Santhanam","doi":"10.1109/ICPRIME.2013.6496441","DOIUrl":"https://doi.org/10.1109/ICPRIME.2013.6496441","url":null,"abstract":"The latest trend in authenticating users is by using the potentiality of biometrics. Keystroke dynamics is a behavioural biometrics which captures the typing rhythms of users and then authenticates them based on the dynamics captured. In this paper, a detailed study on the evolution of keystroke dynamics as a measure of authentication is carried out. This paper gives an insight from the infancy stage to the current work done on this domain which can be used by researchers working on this topic.","PeriodicalId":123210,"journal":{"name":"2013 International Conference on Pattern Recognition, Informatics and Mobile Engineering","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129609524","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
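A minimal sketch of how a keystroke-dynamics verifier can work: extract dwell times (how long each key is held) and flight times (gaps between keys) from key events, enroll a per-feature mean template, and accept samples that stay within a tolerance. The simple distance rule and threshold are illustrative assumptions; the systems surveyed typically use statistical or machine-learning classifiers.

```python
import statistics

def timing_features(events):
    # events: list of (key, press_time, release_time) in milliseconds.
    # Dwell = hold time per key; flight = gap between releasing one key
    # and pressing the next.
    dwell = [release - press for _, press, release in events]
    flight = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return dwell + flight

def enroll(samples):
    # Per-feature mean over several typing samples of the same phrase.
    return [statistics.mean(col) for col in zip(*samples)]

def verify(template, sample, tol=30.0):
    # Accept when every timing feature is within `tol` ms of the template.
    return all(abs(t - s) <= tol for t, s in zip(template, sample))
```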