A semi-supervised approach for the classification of network flows is analyzed and implemented. This traffic classification methodology uses only flow statistics to classify traffic. Specifically, it is a semi-supervised method that allows classifiers to be designed from training data consisting of only a few labeled and many unlabeled flows. The approach consists of two steps: clustering and classification. Clustering partitions the training data set into disjoint groups (“clusters”). Once the clusters are formed, classification is performed, in which the labeled data are used to assign class labels to the clusters. The KDD Cup 1999 data set is used to test this approach; it includes many kinds of attack data as well as normal data. The test results are then compared with an SVM-based classifier, and the results of our approach are comparable.
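The two-step cluster-then-label idea described above can be sketched in a few lines. The data, the k-means clusterer, and the label names here are illustrative stand-ins, not the paper's actual features or clustering method:

```python
import numpy as np

# Synthetic "flows" in a 2-D feature space; only 2 of 100 carry labels.
rng = np.random.default_rng(0)
flows = np.vstack([rng.normal(0, 0.3, (50, 2)),   # stand-in "normal" traffic
                   rng.normal(3, 0.3, (50, 2))])  # stand-in "attack" traffic
labels = {0: "normal", 60: "attack"}              # the few labeled flows

# Step 1: clustering (plain k-means with k=2, a stand-in for any clusterer).
centers = flows[[0, 60]].copy()
for _ in range(10):
    assign = np.argmin(((flows[:, None] - centers) ** 2).sum(-1), axis=1)
    centers = np.array([flows[assign == k].mean(0) for k in range(2)])

# Step 2: assign a class label to each cluster using its labeled members;
# clusters with no labeled member would stay "unknown".
cluster_label = {}
for idx, lab in labels.items():
    cluster_label[assign[idx]] = lab

predicted = [cluster_label.get(a, "unknown") for a in assign]
```

Every unlabeled flow then inherits the label of its cluster, which is how a handful of labeled flows can label the whole training set.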
{"title":"Network Traffic Classification Using Semi-Supervised Approach","authors":"A. Shrivastav, Aruna Tiwari","doi":"10.1109/ICMLC.2010.79","DOIUrl":"https://doi.org/10.1109/ICMLC.2010.79","url":null,"abstract":"A semi-supervised approach for classification of network flows is analyzed and implemented. This traffic classification methodology uses only flow statistics to classify traffic. Specifically, a semi-supervised method that allows classifiers to be designed from training data consisting of only a few labeled and many unlabeled flows. The approach consists of two steps, clustering and classification. Clustering partitions the training data set into disjoint groups (“clusters”). After making clusters, classification is performed in which labeled data are used for assigning class labels to the clusters. A KDD Cup 1999 data set is being taken for testing this approach. It includes many kind of attack data, also includes the normal data. The testing results are then compared with SVM based classifier. The result of our approach is comparable.","PeriodicalId":423912,"journal":{"name":"2010 Second International Conference on Machine Learning and Computing","volume":"48 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-02-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128092886","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Traditional input systems for interaction with machines include the keyboard, joystick, and mouse. Those suffering from physical handicaps such as Carpal Tunnel Syndrome, Rheumatoid Arthritis, or Quadriplegia may be unable to use such forms of input. In this paper, we propose a “Human Machine Interfacing Device” that uses hand gestures to communicate with computers and other embedded systems acting as intermediaries to appliances. Developments in the field of communication have enabled computer commands to be executed using hand gestures. An inertial sensor such as an accelerometer is used to capture the dynamic/static profile of a movement, either to navigate the mouse cursor on the computer or to issue commands to appliances; accelerometer profiles are thus converted into wireless interactivity. The device enables non-tactile interaction with machines, manipulating or controlling them in accordance with hand gestures. The applications envisioned include gesture-based interaction for effective communication, empowering the physically challenged to interact with machines and computing devices, including 3-D graphic interactions and simulations.
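The static-tilt-to-cursor mapping mentioned above can be sketched as follows. The gain and dead-zone values are hypothetical; real firmware would also filter sensor noise:

```python
# Map static accelerometer tilt (in g) to cursor movement steps.
# gain and dead_zone are illustrative parameters, not the device's.
def tilt_to_cursor(ax, ay, gain=10.0, dead_zone=0.05):
    """Return (dx, dy) cursor steps; small tilts inside the dead zone
    are ignored so a hand at rest does not drift the cursor."""
    dx = 0 if abs(ax) < dead_zone else gain * ax
    dy = 0 if abs(ay) < dead_zone else gain * ay
    return dx, dy
```

A dynamic gesture profile would instead be a short time series of such readings, matched against stored command templates.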
{"title":"Handicap Assistance Device for Appliance Control Using User-Defined Gestures","authors":"Sisil Mehta, Dhairya Dand, Shashank Sabesan, Ankit Daftery","doi":"10.1109/ICMLC.2010.18","DOIUrl":"https://doi.org/10.1109/ICMLC.2010.18","url":null,"abstract":"Traditional input systems for interaction with machines include keyboards, joystick or the mouse. Those suffering from physical handicaps such as Carpel Tunnel Syndrome, Rheumatoid Arthritis or Quadriplegia may be unable to use such forms of input . In this paper, we propose a “Human Machine Interfacing Device” utilizing hand gestures to communicate with computers and other embedded systems acting as an intermediary to an appliance. Developments in field of communication have enabled computer commands being executed using hand gestures. Inertial navigation sensor like an accelerometer is utilized to get dynamic/static profile of movement to navigate the mouse on the computer or provide commands to appliances, thus accelerometer profiles are converted into wireless interactivity. The device involves non-tactile interaction with machines to manipulate or control them in accordance with hand gestures. 
The applications envisioned: interaction using gesture technology for effective communication empowering physically challenged to interact with machines and computing devices including 3-D graphic interactions and simulations.","PeriodicalId":423912,"journal":{"name":"2010 Second International Conference on Machine Learning and Computing","volume":"62 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-02-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126933648","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
This paper introduces an approach to the Q-learning algorithm based on rough set theory, introduced by Zdzislaw Pawlak in 1981. During Q-learning, an agent selects actions in an effort to maximize a reward signal obtained from the environment. Based on this reward, the agent changes its policy for future actions. The problem considered in this paper is the overestimation of the expected value of cumulative future discounted rewards. This discounted reward is used to evaluate agent actions and policies during reinforcement learning; due to its overestimation, action evaluation and policy changes are inaccurate. The solution to this problem is a form of the Q-learning algorithm that combines approximation spaces with Q-learning to estimate the expected value of returns on actions. This is made possible by considering the behavior patterns of an agent within the scope of approximation spaces. The framework provided by an approximation space makes it possible to measure the degree to which agent behaviors are a part of (''covered by'') a set of accepted agent behaviors that serve as a behavior evaluation norm.
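For reference, the textbook tabular Q-learning loop that the paper builds on looks like this. The environment (a 5-state chain), rewards, and hyperparameters are illustrative, and the rough-set/approximation-space extension is not reproduced here:

```python
import random

# Tabular Q-learning on a 5-state chain: start at state 0, goal at state 4,
# actions step left (-1) or right (+1), reward 1.0 on reaching the goal.
N_STATES, GOAL = 5, 4
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1
ACTIONS = (-1, +1)
# Optimistic initialization encourages early exploration of both actions.
Q = {(s, a): 1.0 for s in range(N_STATES) for a in ACTIONS}

random.seed(1)
for _ in range(200):
    s = 0
    while s != GOAL:
        # Epsilon-greedy action selection.
        if random.random() < EPS:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: Q[(s, act)])
        s2 = min(max(s + a, 0), N_STATES - 1)
        r = 1.0 if s2 == GOAL else 0.0
        best_next = 0.0 if s2 == GOAL else max(Q[(s2, b)] for b in ACTIONS)
        # Core update: nudge Q(s, a) toward reward + discounted future value.
        Q[(s, a)] += ALPHA * (r + GAMMA * best_next - Q[(s, a)])
        s = s2

# The learned greedy policy should step right toward the goal everywhere.
policy = [max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(GOAL)]
```

It is the `r + GAMMA * best_next` target (a max over estimated values) whose overestimation the paper's approximation-space method aims to correct.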
{"title":"Approximate Q-Learning: An Introduction","authors":"Deepshikha Pandey, Punit Pandey","doi":"10.1109/ICMLC.2010.38","DOIUrl":"https://doi.org/10.1109/ICMLC.2010.38","url":null,"abstract":"This paper introduces an approach to Q-learning algorithm with rough set theory introduced by Zdzislaw Pawlak in 1981. During Q-learning, an agent makes action selections in an effort to maximize a reward signal obtained from the environment. Based on reward, agent will make changes in its policy for future actions. The problem considered in this paper is the overestimation of expected value of cumulative future discounted rewards. This discounted reward is used in evaluating agent actions and policy during reinforcement learning. Due to the overestimation of discounted reward action evaluation and policy changes are not accurate. The solution to this problem results from a form Q-learning algorithm using a combination of approximation spaces and Q-learning to estimate the expected value of returns on actions. This is made possible by considering behavior patterns of an agent in scope of approximation spaces. The framework provided by an approximation space makes it possible to measure the degree that agent behaviors are a part of (''covered by'') a set of accepted agent behaviors that serve as a behavior evaluation norm.","PeriodicalId":423912,"journal":{"name":"2010 Second International Conference on Machine Learning and Computing","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-02-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125243243","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
This is a web-based project that deals mainly with genomic compression. We use several compression techniques, e.g., Huffman coding and a four-base-to-single-byte technique, to compress nucleotide sequences of huge size. There are two roles: the administrator and the normal user. The administrator handles the data and maintains the database. Our initial aim was to generate the encoded file for a particular input file at runtime, with the signature of that file stored in a separate file to identify it during decoding. As we were not able to generate these at runtime, we instead store the encoded file along with the signature file in the database; when recovering the decoded data, we use the encoded data file together with the signature file. Storing and transmitting DNA sequences may require a huge amount of space. This web application helps reduce the space needed for storing and transmitting the data, and introduces one new technique alongside the existing Huffman compression routine. DNA and RNA sequences can be considered as texts over a four-letter alphabet, namely {a, t, g, c}. This algorithm can approach a compression rate of 2.1 bits/base and even lower. We tested the program on standard benchmark data. The greatest advantages of this program are fast execution, small memory occupation, and easy implementation.
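The four-bases-to-one-byte idea rests on the four-letter alphabet: each base needs only 2 bits, so 4 bases fit in one byte, giving exactly 2 bits/base before any entropy coding. A minimal sketch (the bit assignments are an assumption, not the paper's exact scheme):

```python
# Fixed 2-bit codes for the four bases (illustrative assignment).
CODE = {'a': 0b00, 't': 0b01, 'g': 0b10, 'c': 0b11}
BASE = {v: k for k, v in CODE.items()}

def pack(seq):
    """Pack an a/t/g/c string into bytes; returns (data, original length)."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        chunk, byte = seq[i:i + 4], 0
        for ch in chunk:
            byte = (byte << 2) | CODE[ch]
        byte <<= 2 * (4 - len(chunk))   # left-align a short final chunk
        out.append(byte)
    return bytes(out), len(seq)

def unpack(data, length):
    """Invert pack(); length trims padding bases from the final byte."""
    seq = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            seq.append(BASE[(byte >> shift) & 0b11])
    return ''.join(seq[:length])
```

Running a Huffman coder over the packed output (or over base frequencies directly) is what pushes the rate below 2 bits/base on skewed sequences.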
{"title":"Notice of RetractionWebpage Development for Genome Compression Technique","authors":"Md. Syed Mahamud Hossein, A. Mukherjee, S. Ghosh","doi":"10.1109/ICMLC.2010.59","DOIUrl":"https://doi.org/10.1109/ICMLC.2010.59","url":null,"abstract":"This is web based project which mainly deals with GENOMIC COMPRESSION. Here we have used several compression techniques i,e Huffman Compression Techniques, Four base to single base compression techniques..etc for compressing Nucleotide sequence of huge size. There are two phases one is ADMINISTRATOR and another NORMAL USER. ADMINISTRATOR handles the data and maintains the database. Initially our aim to generate the encoded file for a particular file at runtime and the signature of that particular file are stored in another file to identify that particular file while decoding but we were not able to generate at runtime but rather we store the encoded file along with signature file in the database and while retrieving decoded data from encoded data we use encoded data file along with the signature file. The DNA sequences storing and transmitting them may require a huge amount of space. This web page are help to reduce the space for storing and transmitting data, also introduce one new techniques along with exiting Huffman Technique of compression routine. DNA and RNA sequences can be considered as tests over a four letter alphabet, namely {a, t, g and c}. This algorithm can approach a compression rate of 2.1 bits /base and even lower. We tested the program on standard benchmark data used. 
The greatest advantage of this program is fast execution, small memory occupation and easy implementation.","PeriodicalId":423912,"journal":{"name":"2010 Second International Conference on Machine Learning and Computing","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-02-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129930732","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The large amount of multivariate data in many areas of science raises the problem of data analysis and visualization. Focusing on high-dimensional and nonlinear data analysis, an improved manifold learning algorithm is introduced, and a new approach is then proposed by combining adaptive local linear embedding (ALLE) with a recursively applied normalized cut algorithm (RANCA). The novel adaptive local linear embedding algorithm is employed for nonlinear dimension reduction of the original dataset; the recursively applied normalized cut algorithm is used to cluster the low-dimensional data. Simulation results on three UCI standard datasets show that the new algorithm maps high-dimensional data into a low-dimensional intrinsic space and largely resolves the traditional methods' strong dependence on the structure of the dataset. Classification accuracy and the robustness of the spectral clustering algorithm are thereby remarkably improved. Experimental results on the Tennessee-Eastman process (TEP) also demonstrate the feasibility and effectiveness of the approach in fault pattern recognition.
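A single normalized-cut bipartition (one step of a recursive scheme like RANCA) can be sketched via the Fiedler vector of the normalized graph Laplacian. The Gaussian affinity, its bandwidth, and the synthetic data are assumptions, and the ALLE dimension-reduction step is omitted:

```python
import numpy as np

# Two well-separated synthetic clusters of 30 points each.
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 0.2, (30, 2)), rng.normal(2, 0.2, (30, 2))])

# Gaussian affinity matrix W and node degrees D.
d2 = ((X[:, None] - X[None]) ** 2).sum(-1)
W = np.exp(-d2 / 0.5)
D = W.sum(1)

# Normalized Laplacian L = I - D^{-1/2} W D^{-1/2}; the eigenvector of its
# second-smallest eigenvalue (the Fiedler vector) approximates the minimum
# normalized cut, and its sign pattern gives the bipartition.
Dis = 1.0 / np.sqrt(D)
L = np.eye(len(X)) - Dis[:, None] * W * Dis[None, :]
vals, vecs = np.linalg.eigh(L)          # eigh returns ascending eigenvalues
fiedler = vecs[:, 1]
cut = (fiedler > 0).astype(int)
```

Recursing on each side of the cut until a stopping criterion fires yields the full clustering.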
{"title":"Fusion of Manifold Learning and Spectral Clustering Algorithmwith Applications to Fault Diagnosis","authors":"Yulin Zhang, Jian Zhuang, Sun'an Wang","doi":"10.1109/ICMLC.2010.10","DOIUrl":"https://doi.org/10.1109/ICMLC.2010.10","url":null,"abstract":"Large amount of multivariate data in many areas of science raises the problem of data analysis and visualization. Focusing on high dimensional and nonlinear data analysis, an improved manifold learning algorithm is introduced, then a new approach is proposed by combining adaptive local linear embedding (ALLE) and recursively applying normalized cut algorithm (RANCA). A novel adaptive local linear embedding algorithm is employed for nonlinear dimension reduction of original dataset. The recursively applying normalized cut algorithm is used for clustering of low dimensional data. The simulation results on three UCI standard datasets show that the new algorithm maps high-dimensional data into low-dimensional intrinsic space, and perfectly solves the problem of higher dependence on the structure of datasets in the traditional methods. Thus classification accuracy and robustness of spectral clustering algorithm are remarkably improved. The experiment results on Tennessee-Eastman process (TEP) also demonstrate the feasibility and effectiveness in fault pattern recognition.","PeriodicalId":423912,"journal":{"name":"2010 Second International Conference on Machine Learning and Computing","volume":"54 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-02-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132084776","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Expert systems have been used in many fields, such as diagnosis and troubleshooting of devices and systems of all kinds, configuration of manufactured objects from subassemblies, planning and scheduling, financial decision making, knowledge publishing, process monitoring and control, design and manufacturing, agriculture, and medicine. These systems are likewise used in training, education, and consultation because of their structured way of deriving knowledge and their explanation facility. This paper presents the development of a web-based expert system that acts as a Spiritual Guru, or Guru of Life Ethics. The expert system caters to people who ask questions about life and the world. It answers these questions as a spiritual guru, as a philosopher, and as a scientist, since the only difference between a real scientist and a real spiritual person is that the scientist does not believe in the existence of the soul, while the spiritual person knows the soul or is on the way to knowing it. The knowledge base of the proposed expert system contains questions and their answers with reasoning. The inference engine is the mechanism that fetches keywords from working memory and matches them against the questions stored in the knowledge base to answer the questions asked by the user. This paper presents how an expert system can be developed as a Spiritual Guru to serve mankind, along with the general work-flow of the Spiritual Guru.
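The keyword-matching inference step described above can be sketched as follows. The knowledge-base entries and the overlap-count scoring rule are hypothetical placeholders for the real system's content and matcher:

```python
# Hypothetical knowledge base: keyword sets mapped to stored answers.
knowledge_base = {
    frozenset({"purpose", "life"}): "The purpose of life is ...",
    frozenset({"soul", "exist"}): "On the existence of the soul ...",
}

def infer(question):
    """Fetch keywords into working memory, then pick the KB question
    with the largest keyword overlap; return None if nothing matches."""
    working_memory = set(question.lower().replace("?", "").split())
    best = max(knowledge_base, key=lambda ks: len(ks & working_memory))
    return knowledge_base[best] if best & working_memory else None
```

A production system would add stemming and stop-word removal so that, e.g., "existence" also matches "exist".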
{"title":"Development of an Expert System as Spiritual Guru","authors":"Puja Shrivastava, S. K. Satpathy, K. Nagwanshi","doi":"10.1109/ICMLC.2010.20","DOIUrl":"https://doi.org/10.1109/ICMLC.2010.20","url":null,"abstract":"Expert systems have been used in various fields of life like diagnosis and troubleshooting of devices and systems of all kinds, configuration of manufactured objects from subassemblies, planning and scheduling, financial decision making, knowledge publishing, process monitoring and control, design and manufacturing, agriculture, medicine, etc. Similarly these systems are also being used in field of training and education, and consultation because of its structured way of deriving knowledge and its explanation facility. This paper presents development of a web-based expert system as a Spiritual Guru or Guru of Life Ethics. This expert system is to cater those people who ask questions about the life and the world. It answers the questions as a spiritual guru, as a philosopher as well as a scientist. Since the difference between in being a real scientist and being a real spiritual person is only that the scientist doesn’t believe in the existence of soul and the spiritual person knows the soul or in the way of knowing it. The knowledge base of the proposed expert system will contain questions and their answers with reasoning. Inference engine will be a mechanism that will fetch keywords from working memory and match it with the questions stored in the knowledge base to answer the questions asked by the user. 
This paper presents how an expert system can be developed as Spiritual Guru to serve mankind and general work-flow of Spiritual Guru.","PeriodicalId":423912,"journal":{"name":"2010 Second International Conference on Machine Learning and Computing","volume":"63 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-02-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121674367","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
It is increasingly difficult for traditional fault diagnosis technologies to meet the complexity and automation requirements of electronic equipment, so combining them with artificial intelligence technology has become a development direction for fault diagnosis. BP neural networks have been widely used in fault diagnosis. To address the deficiencies of the BP network, this paper presents an improved BP network algorithm with dynamic parameter adjustment and applies it to the fault diagnosis of electronic equipment. As shown both theoretically and practically, the method can effectively overcome the deficiencies of the standard BP algorithm and provides an efficient approach to the fault diagnosis of electronic equipment.
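One common form of dynamic parameter adjustment for BP training is an adaptive learning rate: grow it while the error keeps falling, shrink it when the error rises. The sketch below uses that rule on a toy network; the data (XOR as a stand-in fault map), architecture, and adjustment factors are assumptions, not the paper's algorithm:

```python
import numpy as np

# Toy 2-8-1 sigmoid network trained by full-batch backpropagation.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)

W1, W2 = rng.normal(0, 1, (2, 8)), rng.normal(0, 1, (8, 1))
lr, prev_err = 0.5, np.inf
sig = lambda z: 1 / (1 + np.exp(-z))

for _ in range(5000):
    h = sig(X @ W1)
    out = sig(h @ W2)
    err = ((out - y) ** 2).mean()
    # Dynamic adjustment: accelerate (capped) on improvement, back off
    # when the error grows.
    lr = min(lr * 1.05, 1.0) if err < prev_err else lr * 0.7
    prev_err = err
    d_out = (out - y) * out * (1 - out)     # output-layer delta
    d_h = (d_out @ W2.T) * h * (1 - h)      # hidden-layer delta
    W2 -= lr * h.T @ d_out
    W1 -= lr * X.T @ d_h
```

The standard BP deficiency this targets is the fixed learning rate, which is either too slow on plateaus or unstable near minima.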
{"title":"Research on Electronic Equipment Fault Diagnosis Based on Improved BP Algorithm","authors":"Dong-Sheng Xu","doi":"10.1109/ICMLC.2010.14","DOIUrl":"https://doi.org/10.1109/ICMLC.2010.14","url":null,"abstract":"It is increasingly difficult for the traditional fault diagnosis technologies to meet the complex and automation requirements of electronic equipments, so the combination of artificial intelligence technology has become a development direction of fault diagnosis. In the fault diagnosis, BP neural network has also been widely used. As for the deficiency of BP network, the paper presented an improved BP network dynamic parameter adjust algorithm and applied it in the research of electronic equipment fault diagnosis. Proved theoretically and practically, the method can effectively overcome the deficiency of standard BP algorithm, and provides efficient way for the fault diagnosis of electronic equipments","PeriodicalId":423912,"journal":{"name":"2010 Second International Conference on Machine Learning and Computing","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-02-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124626945","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Efficient data collection is challenging in Wireless Sensor Networks. The use of Mobile Agents for operations such as data collection and software updates in Wireless Sensor Networks has recently received significant attention, as it increases the lifetime of the network. Mobile Agents are privileged, and hence they are attractive targets. Compromised mobile agents may be used to launch various attacks, which in turn may compromise the entire network. An adversary may also deploy malicious nodes that act as mobile agents. To secure the network, in this paper we propose a simple authentication scheme for heterogeneous sensor networks that use mobile agents for efficient data collection. The proposed scheme identifies malicious nodes acting as mobile agents. We also achieve confidentiality of the collected data using a simple key derivation technique, which allows a cluster head to encrypt the data each time with a different key that can easily be derived by the base station.
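The key-derivation property (a fresh key per report, re-derivable by the base station) can be illustrated with a simple hash chain over a shared secret. This is a hypothetical construction for illustration; the paper's exact scheme may differ:

```python
import hashlib

def derive_chain(shared_secret: bytes, rounds: int) -> bytes:
    """Derive the round-N key by hashing the shared secret N times.
    The cluster head and the base station, starting from the same
    secret, independently derive the same per-round key."""
    key = shared_secret
    for _ in range(rounds):
        key = hashlib.sha256(key).digest()
    return key
```

Because each key is a one-way function of the previous one, a key captured in round N reveals nothing about earlier rounds' keys.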
{"title":"Agent Based Secure Data Collection in Heterogeneous Sensor Networks","authors":"A. Poornima, B. B. Amberker","doi":"10.1109/ICMLC.2010.43","DOIUrl":"https://doi.org/10.1109/ICMLC.2010.43","url":null,"abstract":"Efficient data collection is challenging in Wireless Sensor Networks. Use of Mobile Agents for various operations like data collection and software updates in Wireless Sensor Networks is receiving significant attention recently as it increases the lifetime of the network. Mobile Agents are privileged hence they are attractive targets. Compromised mobile agents may be used to launch various attacks in turn may compromise the entire network. Also an adversary may deploy some malicious nodes which act as mobile agents. In order to provide security for the network in this paper we are proposing a simple authentication scheme for heterogeneous sensor network which uses mobile agents for efficient data collection. The proposed scheme is used to identify malicious nodes acting as mobile agents. Also we are achieving confidentiality of the data collected using simple key derivation technique which allows a cluster head to encrypt the data every time using a different key which can be easily derived by the base station.","PeriodicalId":423912,"journal":{"name":"2010 Second International Conference on Machine Learning and Computing","volume":"32 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-02-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124891625","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The continuing explosive growth of textual content on the World Wide Web has given rise to the need for sophisticated Text Classification (TC) techniques that combine efficiency with high quality of results. E-mail filtering is one application that has the potential to affect every user of the internet. Even though a large body of research has delved into this problem, there is a paucity of surveys indicating trends and directions. This paper attempts to categorize the prevalent techniques for classifying email as spam or legitimate and suggests possible techniques to fill in the lacunae. Our findings suggest that context-based email filtering has the greatest potential for improving quality, by learning various contexts such as n-gram phrases, linguistic constructs, or a user-profile-based context to tailor the user's filtering scheme.
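To make the n-gram-phrase context concrete, here is a toy bigram-overlap classifier. The training snippets and the raw-overlap scoring are deliberately simplistic stand-ins for a real n-gram language model or Bayesian filter:

```python
from collections import Counter

def bigrams(text):
    """Count word bigrams, the simplest n-gram phrase context."""
    words = text.lower().split()
    return Counter(zip(words, words[1:]))

# Tiny stand-in "training corpora" for each class.
spam_model = bigrams("win free money now click here to win free prizes")
ham_model = bigrams("the meeting is moved to monday please review the notes")

def classify(msg):
    grams = bigrams(msg)
    # Counter & Counter keeps the element-wise minimum of the counts,
    # i.e. how many of the message's bigrams each model has seen.
    spam_score = sum((grams & spam_model).values())
    ham_score = sum((grams & ham_model).values())
    return "spam" if spam_score > ham_score else "legitimate"
```

Moving from single words to bigrams is what lets such a filter tell "free money" apart from an innocent use of either word alone.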
{"title":"A Survey on Text Classification Techniques for E-mail Filtering","authors":"U. Pandey, Shampa Chakravarty","doi":"10.1109/ICMLC.2010.61","DOIUrl":"https://doi.org/10.1109/ICMLC.2010.61","url":null,"abstract":"The continuing explosive growth of textual content within the World Wide Web has given rise to the need for sophisticated Text Classification (TC) techniques that combine efficiency with high quality of results. E-mail filtering is one application that has the potential to affect every user of the internet. Even though a large body of research has delved into this problem, there is a paucity of survey that indicates trends and directions. This paper attempts to categorize the prevalent popular techniques for classifying email as spam or legitimate and suggest possible techniques to fill in the lacunae. Our findings suggest that context-based email filtering has the most potential in improving quality by learning various contexts such as n-gram phrases, linguistic constructs or users’ profile based context to tailor his/her filtering scheme.","PeriodicalId":423912,"journal":{"name":"2010 Second International Conference on Machine Learning and Computing","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-02-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127950723","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Optical systems are well suited for traffic observation and management; their real-time requirements can be met by implementing appropriate image processing algorithms in hardware. As one of the most important applications of optical sensors, vision-based vehicle detection and shape recognition, used to collect information about road congestion, to assist drivers, and to inform the future development of roads, has received considerable attention over the last one to two decades. There are many reasons for the intense research in this field, including national security requirements, the increasing number of road accidents, the increasing number of vehicles on the roads, and the availability of feasible computer technologies, which have brought tremendous progress to computer vision research. This paper provides a critical survey of recent vision-based road vehicle detection and shape recognition systems that have appeared in the literature.
{"title":"Vehicle Detection and Shape Recognition Using Optical Sensors: A Review","authors":"Hafiz Muhammad Atiq, U. Farooq, Rabbia Ibrahim, Oneeza Khalid, Muhammad Amar","doi":"10.1109/ICMLC.2010.73","DOIUrl":"https://doi.org/10.1109/ICMLC.2010.73","url":null,"abstract":"Optical systems are well suited for traffic observation and management. The real-time requirements can be met by implementation of appropriate image processing algorithms in hardware. Being one of the most important applications of optical sensors, vision-based vehicle detection and shape recognition for collecting information about road congestion, for driver assistance and for providing information for future development of roads has received considerable attention over the last one-two decades. There are many reasons for the intense research in this field including security requirements in the countries, the increased number of road accidents, the increased number of vehicles on the roads and the availability of feasible computer technologies that has brought a tremendous progress for computer vision research. This paper provides a critical survey of recent vision based road vehicle detection and shape recognition systems appeared in the literature.","PeriodicalId":423912,"journal":{"name":"2010 Second International Conference on Machine Learning and Computing","volume":"69 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-02-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128468384","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}