Message Integrity in the World Wide Web: Use of Nested Hash Function and a Fast Stream Cipher
S. Mathew, K. P. Jacob
Pub Date: 2006-12-20 | DOI: 10.1109/ADCOM.2006.4289872
The focus of this work is to provide authentication and confidentiality of messages in a swift and cost-effective manner suited to fast-growing Internet applications. A nested hash function with low computational and storage demands is designed to provide authentication; for confidentiality, both the message and the hash code are then encrypted using MAJE4, a fast stream cipher with a variable key size of 128 or 256 bits. Both the nested hash function and the MAJE4 stream cipher use primitive computational operators commonly found in microprocessors, which makes the method simple and fast to implement in both hardware and software. Since the memory requirement is small, the method is also suitable for securing handheld devices.
{"title":"Message Integrity in the World Wide Web: Use of Nested Hash Function and a Fast Stream Cipher","authors":"S. Mathew, K. P. Jacob","doi":"10.1109/ADCOM.2006.4289872","DOIUrl":"https://doi.org/10.1109/ADCOM.2006.4289872","url":null,"abstract":"The focus of this work is to provide authentication and confidentiality of messages in a swift and cost effective manner to suit the fast growing Internet applications. A nested hash function with lower computational and storage demands is designed with a view to providing authentication as also to encrypt the message as well as the hash code using a fast stream cipher MAJE4 with a variable key size of 128-bit or 256-bit for achieving confidentiality. Both nested Hash function and MAJE4 stream cipher algorithm use primitive computational operators commonly found in microprocessors; this makes the method simple and fast to implement both in hardware and software. Since the memory requirement is less, it can be used for handheld devices for security purposes.","PeriodicalId":296627,"journal":{"name":"2006 International Conference on Advanced Computing and Communications","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-12-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124801662","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Neuro - Genetic approaches to classification of Face Images with effective feature selection using hybrid classifiers
K. Umamaheswari, S. Sumathi, S. Sivanandam
Pub Date: 2006-12-01 | DOI: 10.1109/ADCOM.2006.4289901
The ever-increasing volume of image data collected in science, medicine, security and other fields has made knowledge extraction a necessity. Face classification/recognition is one of the challenging problems of computer vision, and data mining techniques offer a legitimate, enabling way to explore these large image collections using neuro-genetic approaches. A novel Symmetric Based Algorithm is proposed for face detection in still gray-level images, acting as a selective attentional mechanism. Three face classifiers/recognizers, Linear Discriminant Analysis (LDA), the Line Based Algorithm (LBA) and Kernel Direct Discriminant Analysis (KDDA), are fused using a radial basis function network for efficient feature extraction from the face images. A genetic algorithm optimizes the weights of the neural network so that only the essential features are retained, which improves classification/recognition accuracy. A total of 1024 images of 22 subjects, taken from the BioID Laboratory, Texas, USA, are used for the analysis.
{"title":"Neuro - Genetic approaches to classification of Face Images with effective feature selection using hybrid classifiers","authors":"K. Umamaheswari, S. Sumathi, S. Sivanandam","doi":"10.1109/ADCOM.2006.4289901","DOIUrl":"https://doi.org/10.1109/ADCOM.2006.4289901","url":null,"abstract":"The ever-increasing volume in the collection of image data in various fields of science, medicine, security and other fields has brought the necessity to extract knowledge. Face classification/recognition is one of the challenging problems of computer vision. The use of Data mining techniques has a legitimate and enabling ways to explore these large image collections using neuro-genetic approaches. A novel Symmetric Based Algorithm is proposed for face detection in still gray level images, which acts as a selective attentional mechanism. The three face classifiers/recognizers, Linear Discriminant Analysis (LDA), Line Based Algorithm (LBA) and Kernel Direct Discriminant Analysis (KDDA) are fused using Radial Basis network for efficient feature extraction of the face images. The use of Genetic algorithm approach optimizes the weights of neural network to extract only the essential features that effectively and successively improves the classification/recognition accuracy. A total of 1024 images for 22 subjects taken from BioID Laboratory, Texas, USA are used for analysis.","PeriodicalId":296627,"journal":{"name":"2006 International Conference on Advanced Computing and Communications","volume":"40 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116677956","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Decision Tree Classifier for Human Protein Function Prediction
M. Singh, P. Singh, Hardeep Singh
Pub Date: 2006-12-01 | DOI: 10.1109/ADCOM.2006.4289955
Drug discoverers need to predict the functions of the proteins that are responsible for various diseases in the human body. The proposed method uses priority-based packages of sequence-derived features (SDFs) so that the decision tree is built by exploring them in depth rather than excluding them. This work develops a new decision tree induction technique in which an uncertainty measure is used to select the best attribute. The model creates a deeper decision tree than the existing C4.5 technique; the greater depth ensures more tests before a functional class is assigned and thus yields more accurate predictions than the existing technique. For the same test data, the accuracy of the new human protein function (HPF) predictor is 72%, against 44% for the existing prediction technique.
{"title":"Decision Tree Classifier for Human Protein Function Prediction","authors":"M. Singh, P. Singh, Hardeep Singh","doi":"10.1109/ADCOM.2006.4289955","DOIUrl":"https://doi.org/10.1109/ADCOM.2006.4289955","url":null,"abstract":"Drug discoverers need to predict the functions of proteins which are responsible for various diseases in human body. The proposed method is to use priority based packages of SDFs (Sequence Derived Features) so that decision tree may be created by their depth exploration rather than exclusion. This research work develops a new decision tree induction technique in which uncertainty measure is used for best attribute selection. The model creates better decision tree in terms of depth than the existing C4.5 technique. The tree with greater depth ensures more number of tests before functional class assignment and thus results in more accurate predictions than the existing prediction technique. For the same test data, the percentage accuracy of the new HPF (human protein function) predictor is 72% and that of the existing prediction technique is 44%.","PeriodicalId":296627,"journal":{"name":"2006 International Conference on Advanced Computing and Communications","volume":"38 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125028657","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Methodology To Assess Security Risks Involved In Networked Systems
P. Nandini, K. Sarukesi
Pub Date: 2006-12-01 | DOI: 10.1109/ADCOM.2006.4289970
The security risks associated with networked systems have become an increasingly significant topic in this millennium. Increased visibility and the catastrophic financial effects of a number of corporate security breaches have made the management of risk of all types a front-burner issue across corporate sectors. This paper commences with a statement of the problem of information security risks and presents a comprehensive methodology for analyzing those risks, along with suitable parameters and the areas of risk involved. The final outcome of this research is a security assessment that is practical enough to be used in real applications with acceptable results, without requiring the assessor to be an expert in the security arena. It is built upon concepts drawn from the computer security leaders in the industry and has been tested.
{"title":"Methodology To Assess Security Risks Involved In Networked Systems","authors":"P. Nandini, K. Sarukesi","doi":"10.1109/ADCOM.2006.4289970","DOIUrl":"https://doi.org/10.1109/ADCOM.2006.4289970","url":null,"abstract":"The security risks associated with network systems is a topic that has become increasingly significant in this millennium. The increased visibility, the catastrophic financial effects of a number of corporate security breaches have made the management of risk of all types is the front-burner issue of all corporate sectors. This paper assesses Information Security that commences with a statement of the problem of information security risks and presents a comprehensive methodology to analyze security risks along with suitable parameters and areas of risks involved. The final outcome of this research is assessing security that is practical enough to be used in real applications with acceptable results, without having to be an expert in the security arena. It is built upon the concepts drawn from the computer security leaders in the industry and are tested.","PeriodicalId":296627,"journal":{"name":"2006 International Conference on Advanced Computing and Communications","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125386138","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A New Approach for Multicast Congestion Control using Asymmetric Paths
C. Kalaiarasan, S. Selvan
Pub Date: 2006-12-01 | DOI: 10.1109/ADCOM.2006.4289922
This paper introduces a new, efficient multicast congestion control algorithm for asymmetric paths. In the current Internet, streaming applications, which use a multicast approach, have become popular. Congestion control in multicast protocols is a difficult task, and much research is ongoing in this area. To improve bandwidth utilization, an asymmetric path is a viable alternative to a symmetric one, but the existing algorithms are not directly applicable to such applications. In the proposed protocol, the receiver explicitly adjusts its reception rate according to network conditions using the TCP throughput equation and a packet-pair probe. Its effectiveness has been verified through simulation with NS2, the most popular network simulator. The proposed protocol is found to meet the basic requirements of an effective Internet multicast protocol: responsiveness, efficient network utilization, scalability, fairness and TCP-friendliness.
{"title":"A New Approach for Multicast Congestion Control using Asymmetric Paths","authors":"C. Kalaiarasan, S. Selvan","doi":"10.1109/ADCOM.2006.4289922","DOIUrl":"https://doi.org/10.1109/ADCOM.2006.4289922","url":null,"abstract":"This paper introduces an efficient and new multicast congestion control algorithm for asymmetric paths. In the current Internet scenario streaming applications becomes popular and which uses multicast approach. Congestion control in multicast protocol is a difficult task and lot of research is going in this area. To enhance the bandwidth utilization, instead of symmetric path, an asymmetric path is a viable alternate. But the existing algorithms are not directly applicable for such application. Using new approach protocol the receiver explicitly adjusts its reception rate according to the network conditions using TCP throughput equation and Packet-Pair probe. Its effectiveness has been checked using simulation techniques. The most popular network simulator (NS2) was used in this analysis. It is found that the new approach protocol passes the basic requirements for an effective internet multicast protocol such as responsiveness, efficiency in network utilization, scalability, fairness and TCP-friendliness.","PeriodicalId":296627,"journal":{"name":"2006 International Conference on Advanced Computing and Communications","volume":"56 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116612449","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Study of Stochastic Characteristics of Traffic on a High Speed Network
P. Bhattacharjee, G. Sanyal
Pub Date: 2006-12-01 | DOI: 10.1109/ADCOM.2006.4289980
With ever-increasing backbone bandwidth, the design and planning of networks are usually done by simulating the influence of various traffic types on the network, and accurate traffic classification is the keystone of numerous network activities. The research in this paper results from a study and analysis of a packet trace obtained from a backbone link in NITNET. Millions of packets pass through the link, which makes exhaustive traffic measurement difficult or even impossible. An attempt has been made to analyze the online inbound and outbound packet traces, and a Poisson-based sampling methodology has been used to estimate the characteristics and distribution of the packets.
{"title":"Study of Stochastic Characteristics of Traffic on a High Speed Network","authors":"P. Bhattacharjee, G. Sanyal","doi":"10.1109/ADCOM.2006.4289980","DOIUrl":"https://doi.org/10.1109/ADCOM.2006.4289980","url":null,"abstract":"With the ever increasing backbone bandwidth, the design and planning of networks are usually done by simulating the influence of various traffic type on the network. Accurate traffic classification is the key stone of numerous network activities. The research in this paper results from a study and analysis of packet trace obtained from a backbone link in NITNET. Millions of packets are passed through the link that makes the traffic measurement difficult or even impossible. An attempt has been made to study the online inbound and outbound packet trace analysis. Poisson based sampling methodology has been used to estimate the characteristics and distribution of packets.","PeriodicalId":296627,"journal":{"name":"2006 International Conference on Advanced Computing and Communications","volume":"45 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128694453","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Efficient Mining of Frequent Rooted Continuous Directed Subgraphs
G J Sreenivasa, V. S. Ananthanarayana
Pub Date: 2006-12-01 | DOI: 10.1109/ADCOM.2006.4289953
Mining frequent rooted continuous directed (RCD) subgraphs is very useful in the Web usage mining domain. We formulate the problem of mining RCD subgraphs from a database of rooted, labeled, continuous directed graphs and propose a novel approach that merges like RCD subgraphs into a Pattern Super Graph (PSG). The PSG is a compact structure, ideal for extracting frequent patterns in the form of RCD subgraphs. PSG-based mining avoids costly repeated database scans and generates no candidates. The results obtained support the proposed approach.
{"title":"Efficient Mining of Frequent Rooted Continuous Directed Subgraphs","authors":"G J Sreenivasa, V. S. Ananthanarayana","doi":"10.1109/ADCOM.2006.4289953","DOIUrl":"https://doi.org/10.1109/ADCOM.2006.4289953","url":null,"abstract":"Mining frequent rooted continuous directed (RCD) subgraphs is very useful in Web usage mining domain. We formulate the problem of mining RCD subgraphs in a database of rooted labeled continuous directed graphs. We propose a novel approach of merging like RCD subgraphs. This approach builds a Pattern Super Graph (PSG) structure.This PSG is a compact structure and ideal for extracting frequent patterns in the form of RCD subgraphs. The PSG based mine avoids costly, repeated database scans and there is no generation of candidates. Results obtained are appreciating the approach proposed.","PeriodicalId":296627,"journal":{"name":"2006 International Conference on Advanced Computing and Communications","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129436612","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Uneven Background Extraction And Segmentation Of Good, Normal And Bad Quality Fingerprint Images
S. Jambhorkar, S. Gornale, V. Humbe, R. Manza, K V Kale
Pub Date: 2006-12-01 | DOI: 10.1109/ADCOM.2006.4289887
In this paper we consider the problem of uneven background extraction and segmentation in good-, normal- and bad-quality fingerprint images, for which we propose an algorithm based on morphological transformations. Our results show that the proposed algorithm successfully extracts the background of good-, normal- and bad-quality fingerprint images and segments the foreground area well. The algorithm has been tested on the FVC2002 database, and its performance has been evaluated through subjective and objective quality measures. It gives good, promising results, removes superfluous information without affecting the structure of the fingerprint image, and reduces the storage space for the resultant image by up to 77%. These results will be useful for precise feature extraction in automatic fingerprint recognition systems.
{"title":"Uneven Background Extraction And Segmentation Of Good, Normal And Bad Quality Fingerprint Images","authors":"S. Jambhorkar, S. Gornale, V. Humbe, R. Manza, K V Kale","doi":"10.1109/ADCOM.2006.4289887","DOIUrl":"https://doi.org/10.1109/ADCOM.2006.4289887","url":null,"abstract":"In this paper, we have considered a problem of uneven background extraction and segmentation of good, normal and bad quality fingerprint images, though we propose an algorithm based on morphological transformations. Our result shows that the proposed algorithm can successfully extract the background of good, normal and bad quality images of fingerprint and well segment the foreground area. The algorithm has been tested and executed on FVC2002 database and the performance of proposed algorithm is evaluated through subjective and objective quality measures. This algorithm gives good and promising result and found suitable to remove superfluous information without affecting the structure of fingerprint image as well as reduces the storage space for the resultant image upto 77%. Our results will be useful for precise feature extraction in automatic fingerprint recognition system.","PeriodicalId":296627,"journal":{"name":"2006 International Conference on Advanced Computing and Communications","volume":"744 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123874767","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Feature Extraction Learning for Stereovision Based Robot Navigation System
V. Rajpurohit, M. M. Manohara Pai
Pub Date: 2006-12-01 | DOI: 10.1109/ADCOM.2006.4289917
Stereovision-based systems represent real-world information as a gray-scale image known as a depth map, in which the intensity of each pixel encodes that pixel's distance from the cameras. In a static indoor environment with a smooth floor, the ground information remains constant and can be removed, making it easier to locate and identify the boundaries of the obstacles of interest. This paper proposes a novel approach to ground-surface removal using a trained multilayer neural network, together with a novel object-clustering algorithm that reconstructs the objects of interest from the depth map generated by the stereovision algorithm. Histogram analysis and the object reconstruction algorithm are used to test the results.
{"title":"Feature Extraction Learning for Stereovision Based Robot Navigation System","authors":"V. Rajpurohit, M. M. Manohara Pai","doi":"10.1109/ADCOM.2006.4289917","DOIUrl":"https://doi.org/10.1109/ADCOM.2006.4289917","url":null,"abstract":"Stereovision based systems represent the real-world information in the form of a gray scale image known as depth-map with intensity of each pixel representing the distance of that pixel from the cameras. For static indoor environment where the surface is smooth, the ground information remains constant and can be removed to locate and identify the boundaries of the obstacles of interest in a better way. This paper proposes a novel approach for ground surface removal using a trained multilayer neural network and a novel object-clustering algorithm to reconstruct the objects of interest from the depth-map generated by the stereovision algorithm. Histogram analysis and the object reconstruction algorithm are used to test the results.","PeriodicalId":296627,"journal":{"name":"2006 International Conference on Advanced Computing and Communications","volume":"2002 10","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"113966485","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Effective Classification with Hybrid Evolutionary Techniques
P. Jaganathan, K. Thangavel, Pethalakshmi A, M. V. M. Govt
Pub Date: 2006-12-01 | DOI: 10.1109/ADCOM.2006.4289911
Ant colony optimization (ACO) algorithms have been applied successfully to combinatorial optimization problems. More recently, Parpinelli et al. applied ACO to data mining classification problems, introducing a classification algorithm called Ant Miner. In this paper, we present a hybrid system that combines the proposed Enhanced Quickreduct algorithm for data preprocessing with Ant Miner. The system was tested on a standard data set, and its performance is better than that of the original Ant Miner algorithm.
{"title":"Effective Classification with Hybrid Evolutionary Techniques","authors":"P. Jaganathan, K. Thangavel, Pethalakshmi A, M. V. M. Govt","doi":"10.1109/ADCOM.2006.4289911","DOIUrl":"https://doi.org/10.1109/ADCOM.2006.4289911","url":null,"abstract":"Ant colony optimization (ACO) algorithms have been applied successfully to combinatorial optimization problems. More recently, Parpinelli et al have applied ACO to data mining classification problems, where they introduced a classification algorithm called Ant Miner. In this paper, we present a hybrid system that combines both the proposed Enhanced Quickreduct algorithm for data preprocessing and ant miner. The system was tested on standard data set and its performance is better than the original Ant Miner algorithm.","PeriodicalId":296627,"journal":{"name":"2006 International Conference on Advanced Computing and Communications","volume":"45 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132047619","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}