Rough set theory can be applied to the problem of pattern recognition with neural networks at three different stages: preprocessing, the learning rule, and the architecture. This paper discusses the use of rough set theory in the architecture of an unsupervised neural network, implemented through rough neurons. A rough neuron consists of two neurons, an upper-boundary neuron and a lower-boundary neuron, which operate on the upper and lower boundaries of the input vector. The proposed network uses the Kohonen learning rule. The problem of character recognition is taken up to verify the usefulness of such a network; the data set is formed by images of English alphabets in ten different fonts. The approximation quality of such a network is better than that of traditional networks, and the number of iterations, and hence the convergence time, is reduced significantly.
{"title":"Rough Neuron Based Neural Classifier","authors":"A. Kothari, A. Keskar, R. Chalasani, S. Srinath","doi":"10.1109/ICETET.2008.229","DOIUrl":"https://doi.org/10.1109/ICETET.2008.229","url":null,"abstract":"Rough sets theory can be applied to the problem of pattern recognition using neural networks in three different stages: preprocessing, learning rule and in the architecture. This paper discusses the use of rough set theory in the architecture of the unsupervised neural network, which is implemented, by the use of rough neuron. The rough neuron consists of two neurons: upper boundary neuron and lower boundary neuron, derived on the upper and lower boundaries of the input vector. The proposed neural network uses the Kohonen learning rule. Problem of character recognition is taken to verify the usefulness of such a network. The data set is formed by the images of English alphabets of ten different fonts. The approximation quality of such a network is better compared to the traditional networks. The number of iterations reduce significantly for such a network and hence the convergence time.","PeriodicalId":269929,"journal":{"name":"2008 First International Conference on Emerging Trends in Engineering and Technology","volume":"53 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133639836","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Wireless sensor networks (WSNs) have emerged as an interesting research area in the last few years. The applications envisioned for such networks require collaborative execution of a distributed task among a large set of sensor nodes. This collaboration is realized by exchanging messages that are time-stamped using the local clocks of the nodes, so time synchronization becomes indispensable in such distributed systems. For years, protocols such as the network time protocol (NTP) have kept the clocks of networked systems tightly synchronized. However, wireless sensor networks have a large density of nodes and very limited energy at every node, which raises the scalability requirements while constraining the available resources. This paper proposes a technique called level-based time synchronization that provides redundant ways for each node to synchronize its clock with a common source, so that it can tolerate partially missing or false synchronization information provided by compromised nodes. The efficacy of the technique is evaluated via simulations.
{"title":"Secure Time Synchronization against Malicious Attacks for Wireless Sensor Networks","authors":"V. Vijayalakshmi, T. G. Palanivelu, N. Agalya","doi":"10.1109/ICETET.2008.224","DOIUrl":"https://doi.org/10.1109/ICETET.2008.224","url":null,"abstract":"Wireless sensor networks (WSNs) have emerged as an interesting research area in the last few years. The applications envisioned for such networks require collaborative execution of a distributed task amongst a large set of sensor nodes. The collaborative execution is realized by exchanging messages that are time-stamped using the local clocks on the nodes. Hence, time synchronization becomes indispensable in such distributed systems. For years, protocols such as network time protocol (NTP) have kept the clocks of networked systems in perfect synchronization. However, wireless sensor networks has a large density of nodes and very limited energy resource at every node which leads to improved scalability requirements while limiting the resources. This paper proposes a technique called level-based time synchronization to provide redundant ways for each node to synchronize its clock with the common source, so that it can tolerate partially missing or false synchronization information provided by compromised nodes. The efficacy of this technique is evaluated via simulations.","PeriodicalId":269929,"journal":{"name":"2008 First International Conference on Emerging Trends in Engineering and Technology","volume":"221 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134000803","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
In 1992, Sakakibara introduced a well-known approach for learning context-free grammars from positive samples of structural descriptions (skeletons). In particular, Sakakibara's approach uses RT, a construction algorithm for reversible tree automata. Here, we introduce a modification of the learning algorithm RT for reversible tree automata. With respect to n, the sum of the sizes of the input skeletons, our modification of RT, called e_RT, needs O(n³) operations and achieves a storage-space saving by a factor of O(n) over RT. Using e_RT, we give an algorithm e_RC to learn reversible context-free grammars from positive samples of their structural descriptions. Furthermore, we modify e_RC to learn extended reversible context-free grammars from positive-only examples. Finally, we present a summary of the experiments carried out to compare our results with those of Sakakibara, which also confirms that our approach is efficient and useful.
{"title":"On Learning Context-Free Grammars Using Skeletons","authors":"G. L. Prajapati, N. Chaudhari, M. Chandwani","doi":"10.1109/ICETET.2008.167","DOIUrl":"https://doi.org/10.1109/ICETET.2008.167","url":null,"abstract":"In 1992, Sakakibara introduced a well-known approach for learning context-free grammars from positive samples of structural descriptions (skeletons). In particular, Sakakibarapsilas approach uses reversible tree automata construction algorithm RT. Here, we introduce a modification of the learning algorithm RT for reversible tree automata. With respect to n, where n is the sum of the sizes of the input skeletons, our modification for RT, called e_RT, needs O(n3) operations and achieves the storage space saving by a factor of O(n) over RT. Using our e_RT, we give an algorithm e_RC to learn reversible context-free grammars from positive samples of their structural descriptions. Furthermore, we modify e_RC to learn extended reversible context-free grammars from positive-only examples. Finally, we present summary of our experiments carried out to see how our results compare with those of Sakakibara, which also confirms our approach as efficient and useful.","PeriodicalId":269929,"journal":{"name":"2008 First International Conference on Emerging Trends in Engineering and Technology","volume":"84 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115718753","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Density-based clustering techniques like DBSCAN find arbitrarily shaped clusters along with noisy outliers. DBSCAN finds the density at a point by counting the number of points falling in a sphere of radius ε, and it has a time complexity of O(n²); hence it is not suitable for large data sets. The method proposed in this paper is an efficient and fast Parzen-window density-based clustering method which uses (i) prototypes to reduce the computational burden and (ii) a smooth kernel function to estimate the density at a point, so the estimated density also varies smoothly. Enriched prototypes are derived using the counted-leaders method. These are used with a special form of the Gaussian kernel function which is radially symmetric, so the function can be completely specified by a variance parameter alone. The proposed method is experimentally compared with DBSCAN, which shows that it is a suitable method for large data sets.
{"title":"An Efficient and Fast Parzen-Window Density Based Clustering Method for Large Data Sets","authors":"V. S. Babu, P. Viswanath","doi":"10.1109/ICETET.2008.166","DOIUrl":"https://doi.org/10.1109/ICETET.2008.166","url":null,"abstract":"Density based clustering technique like DBSCAN finds arbitrary shaped clusters along with noisy outliers. DBSCAN finds the density at a point by counting the number of points falling in a sphere of radius epsi and it has a time complexity of O(n2). Hence it is not suitable for large data sets. The proposed method in this paper is an efficient and fast Parzen-Window density based clustering method which uses (i) prototypes to reduce the computational burden, (ii) a smooth kernel function to estimate density at a point and hence the estimated density is also varies smoothly. Enriched prototypes are derived using counted leaders method. These are used with a special form of the Gaussian kernel function which is radially symmetrical and hence the function can be completely specified by a variance parameter only. The proposed method is experimentally compared with DBSCAN which shows that it is a suitable method for large data sets.","PeriodicalId":269929,"journal":{"name":"2008 First International Conference on Emerging Trends in Engineering and Technology","volume":"35 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114806208","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Climate change is a highly complex problem which has the potential to impact every sphere of life. If left unchecked, its impact on people and ecosystems could be drastic. In recent years, abnormal weather such as droughts, El Niño events, Hurricane Katrina and heavy rains have been commonly observed around the world. Temperatures at the planet's surface increased by an estimated 0.8 °C between 1900 and 2005. The past decade was the hottest on record during the last 150 years, and 2005 was the hottest year of that period. Sea levels have risen by 10 to 20 cm during the same time, while both the north and south poles have lost some of their ice. Scientists identify the rising emission of greenhouse gases (GHGs), a byproduct of burning fossil fuels, as the main cause of such phenomena.
{"title":"Emerging Trends in Environmental Engineering CDM and Carbon Trading","authors":"R. Daryapurkar","doi":"10.1109/ICETET.2008.262","DOIUrl":"https://doi.org/10.1109/ICETET.2008.262","url":null,"abstract":"Climate change is a highly complex problem which has the potential to impact every sphere of life. If left unchecked, its impact on people and ecosystems could be drastic. In recent years, abnormal weather such as droughts, EI Nino, hurricane Katrina and heavy rains have been commonly observed around the world. Temperatures at the planetpsilas surface increased by an estimated 0.8 OC between 1900 and 2005. The past decade was the hottest on record during the last 150 years, and 2005 was the hottest year of the last 150 years. Sea levels haver isen by 10 to 20 cm during the same time while both the north and south poles have lost some of its glaciers. Scientists claim that the rising emission of greenhouse gases (GHGs), a byproduct of burning fossil fuel, is the main cause of such phenomena.","PeriodicalId":269929,"journal":{"name":"2008 First International Conference on Emerging Trends in Engineering and Technology","volume":"41 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114498927","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
In this paper we compare the performance of a wireless sensor network when event detection and TDMA schemes are optimized using cross-layer interaction. We focus on the accuracy and lifetime of the WSN, since these are the parameters on which the working of the system depends. Lifetime maximization relates to various factors such as throughput, end-to-end delay, packet delivery rate, number of nodes, node efficiency and operating frequency, and to how these parameters relate to one another. It is very difficult to study and compare all of these parameters simultaneously in a wireless sensor network, so an improvement in any one of them, such as throughput, end-to-end delay or packet delivery, can be verified separately to optimize the operating capability of the whole network. Depending on the number of nodes, a corresponding hop structure develops, and routing over many hops can reduce the lifetime of the WSN.
{"title":"Lifetime Maximization in Wireless Sensor Network Using Cross Layer Design: A Design Review","authors":"Manisha S. Masurkar, G. Asutkar, K. Kulat","doi":"10.1109/ICETET.2008.142","DOIUrl":"https://doi.org/10.1109/ICETET.2008.142","url":null,"abstract":"In our paper we consider the comparison performance of wireless sensor network to optimize the event detection and TDMA schemes using cross layer interaction. We consider the accuracy and lifetime of WSN because accuracy and lifetime are the important parameters of wireless sensor network because working of system depends upon it. Lifetime maximization relates with various factors such as throughput, end to end delay, lifetime parameter such as time, output, packet delivery rates, no. of nodes, nodes efficiency, operating frequency to operate and relate each parameter. It is very difficult to study and compare all the above parameters simultaneously in the Wireless Sensor Network. So throughput is end to end delay or packet delivery of nodes can be verified any improvement in one of the parameter so that it can optimized the operation capability of the whole wireless network. Depending upon the no. of nodes corresponding hoping is developed. Routing through no. of hopes can disturbed the lifetime of WSN.","PeriodicalId":269929,"journal":{"name":"2008 First International Conference on Emerging Trends in Engineering and Technology","volume":"52 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124964321","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
An agent is a software entity with self-adaptation and intelligence that can accomplish a task through proactive service on behalf of users. The purpose of this paper is to formulate a functional architecture that supports the e-learning objectives: study individually and cooperatively, anytime and anywhere, and guide specific individuals to their most relevant community. In order to make the work more efficient and to improve performance, an answering system is designed that performs an intelligent search using algorithms such as full-text search, word segmentation and fuzzy search. It displays the result after analysing the user's profile, experience and needs so that the user is fully satisfied; this is made possible using the case-based reasoning (CBR) technique. A decentralized search engine is also built on top of grid technology with the help of software agents, allowing the user to search through heterogeneous resources stored in geographically distributed digital collections.
{"title":"Design of an Intelligent Answering System Through Agent Based Search Engine Using Grid Technology","authors":"D. Jayalatchumy, D. Kadhirvelu, P. Ramkumar","doi":"10.1109/ICETET.2008.29","DOIUrl":"https://doi.org/10.1109/ICETET.2008.29","url":null,"abstract":"Agent technology is a software entity with self-adaptation and intelligence that can accomplish a task by means of initiative service on behalf of users. The purpose of this paper is to formulate a functional architecture that supports the e-learning objectives: study individually and cooperatively, anytime and anywhere, guide specific individuals to their most relevant community. In order to make the work more efficient and to improve the performance an answering system is designed that does an intelligent search using algorithms like full text based search, word segmentation and fuzzy search. It displays the result by analyzing the userpsilas profile, experience and users needs so that the he is fully satisfied. It is made possible using case base reasoning (CBR) technique. A decentralized search engine is also built on top of grid technology with the help of software agents that allows the user to search through heterogeneous resources stored in geographically distributed digital collections.","PeriodicalId":269929,"journal":{"name":"2008 First International Conference on Emerging Trends in Engineering and Technology","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123844295","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Security is of great concern to the software engineering community in the development of online systems. To handle all security issues properly, we need a well-defined process for security engineering, and this process should be integrated with the software development life cycle (SDLC). For this integration, requirements engineers should clearly understand what security requirements are and be able to distinguish them from various architectural and behavioural constraints. In this paper we define a process for security requirements elicitation, presenting techniques for activities such as requirements discovery, analysis, prioritization and management. With true security requirements identified systematically and as early as possible, the architecture team can choose the most appropriate mechanisms to implement them.
{"title":"Security Requirements Elicitation Using View Points for Online System","authors":"A. Agarwal, D. Gupta","doi":"10.1109/ICETET.2008.91","DOIUrl":"https://doi.org/10.1109/ICETET.2008.91","url":null,"abstract":"Security is of great concern for software engineering community for the development of online systems. To handle all the security issues properly we need to have well defined process for Security engineering and this process should be integrated with software development life cycle (SDLC). For integrating security engineering with SDLC the requirement engineers should have clearly understanding of what security requirements are and he/she is able to distinguish security requirements from various architectural and behavioral constraints. In this paper we go on to define a process for security requirements elicitation presenting techniques for activities like requirements discovery, analysis, prioritization and management. With true security requirements identified as early as possible and systematically identified, Architecture team can choose most appropriate mechanism to implement them.","PeriodicalId":269929,"journal":{"name":"2008 First International Conference on Emerging Trends in Engineering and Technology","volume":"30 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125215550","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
In this paper, a digital image watermarking algorithm using the discrete wavelet transform (DWT) is proposed for copyright protection. The detail coefficients of the second-level DWT of the host image are used as the watermark key for the insertion and extraction of the watermark logo. We have also proposed learning a low-dimensional representation in the context of image recognition using principal component analysis (PCA). PCA is based on the second-order statistics of the image set and aims to find second-order correlations among patterns. The effectiveness of PCA is tested on the original and extracted watermark logos. We were able to design a prototype system that provides user authentication. The proposed system of watermark image recognition may be applied in identification systems, document control and access control.
{"title":"Watermark Image Recognition Using Principal Component Analysis","authors":"Amol R. Madane, M. M. Shah","doi":"10.1109/ICETET.2008.86","DOIUrl":"https://doi.org/10.1109/ICETET.2008.86","url":null,"abstract":"In this paper, the algorithm of digital image watermarking is proposed using discrete wavelet transform (DWT) for copyright protection. The detail coefficients of second level host image after taking DWT is used as a watermark key for the watermark logo insertion or addition and extraction process. We have proposed the learning of low dimensional representation in the context of image recognition using principle component analysis (PCA). It is based on the second order statistics of image set. The PCA aims to find second order correlation of patterns. The effectiveness of PCA is tested on original watermark logo, extracted watermark logo. We were able to design a prototype system, which provides user authentication. The proposed system of watermark image recognition may be applied in identification systems, document control and access control.","PeriodicalId":269929,"journal":{"name":"2008 First International Conference on Emerging Trends in Engineering and Technology","volume":"4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125559137","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Keeping in mind the emerging consumer-demand-based network traffic scenario, with its assortment of traffic types and the resulting traffic conditions, we propose a generalized analytical method to assess network traffic first and the resulting connections that follow. That is, we model traffic as embodying two fundamental components, transitions and transfer rates, which help us understand, establish, evaluate, qualify and quantify traffic patterns and hence traffic dynamics in optical networks.
{"title":"Modelling and Pre-Assessment of Traffic Transitions in Dynamic Optical Networks","authors":"T. Indumathi, V. Patel","doi":"10.1109/ICETET.2008.88","DOIUrl":"https://doi.org/10.1109/ICETET.2008.88","url":null,"abstract":"Keeping in mind the emerging consumer demand based network traffic scenario wherein assortment of traffics and resulting traffic conditions, we propose a generalized analytical method to assess network traffic first with resulting connections that follows. That is, we model traffic to embody two fundamental components transitions and transfer rates that helps us understand, establish, evaluate, qualify and quantify traffic patterns and hence traffic dynamics in optical networks.","PeriodicalId":269929,"journal":{"name":"2008 First International Conference on Emerging Trends in Engineering and Technology","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126389282","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}