Pub Date : 2016-02-01DOI: 10.1109/STARTUP.2016.7583978
Ankita J. Gakare, Kavita R. Singh, J. Peters
A wavelet-based tolerance Nearness Measure (tNM) makes it possible to measure fine-grained changes in shapes in pairs of images. Image correspondence uses image matching tactics to establish closeness between two or more images, one of the central tasks in computer vision. The problem considered is how to measure the nearness or apartness of digital images, particularly when it is important to detect changes in the contour, position, and approximate orientation of bounded regions. The solution results from applying an anisotropic (direction-dependent) wavelet-based tolerance near set approach to detecting affinities in pairs of images. It has been shown that tolerance near sets can be used in a concept-based approach to discovering correspondences between images. In this paper we present a detailed survey of the near set approach, which provides an effective means of grouping together images that correspond to each other relative to small differences in the features of their bounded regions.
{"title":"Wavelet-based tolerance near set approach in classifying hand images: A review","authors":"Ankita J. Gakare, Kavita R. Singh, J. Peters","doi":"10.1109/STARTUP.2016.7583978","DOIUrl":"https://doi.org/10.1109/STARTUP.2016.7583978","url":null,"abstract":"A wavelet-based tolerance Nearness Measure (tNM) makes possible to measure fine-grained changes in shapes in pairs of images. The image correspondence utilizes image matching tactics to establish closeness between two or more images. This is one of the central tasks in computer vision. The problem considered that how can we measure the nearness or apartness of digital images. In case when it is important to detect conversion in the contour, position, and approximal orientation of bounded regions. However, the solution of this problem is that results from an application of anisotropic (direction dependent) a tolerance and wavelets near set approach to detecting affinities in pairs of images. It has been shown that tolerance near sets can be used in a concept-based approach to discovering correspondences between images. In this paper we are showing detail survey on near set approach. 
By near set approach an effective means of images is nothing but grouping together that correspond to each other relative to diminutive similarities in the features of bounded regions in the images.","PeriodicalId":355852,"journal":{"name":"2016 World Conference on Futuristic Trends in Research and Innovation for Social Welfare (Startup Conclave)","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133915225","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
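As a rough illustration of the nearness idea in the abstract above, the toy sketch below scores two images by the fraction of cross-image region pairs whose (single, scalar) feature values fall within a tolerance ε. This simplification — one feature, pairwise counting — is our own; the paper's wavelet-based tNM is defined over tolerance classes of richer region descriptors.

```python
# Toy sketch of a tolerance-based nearness score between two images,
# assuming each image is summarized by per-region feature values
# (e.g., the average grey level of small subimages). The function
# names and single-feature setup are illustrative, not the authors' code.

def tolerance_pairs(a_feats, b_feats, eps):
    """Count cross-image region pairs whose feature values are
    within tolerance eps of each other."""
    return sum(1 for a in a_feats for b in b_feats if abs(a - b) <= eps)

def tolerance_nearness(a_feats, b_feats, eps):
    """Fraction of cross-image region pairs within tolerance:
    1.0 means every region of A resembles every region of B;
    0.0 means none do."""
    total = len(a_feats) * len(b_feats)
    return tolerance_pairs(a_feats, b_feats, eps) / total

# Two 'images' summarized by average grey levels of three regions each.
img_a = [0.10, 0.12, 0.50]
img_b = [0.11, 0.52, 0.90]
print(tolerance_nearness(img_a, img_b, eps=0.05))  # 3 of 9 pairs are near
```

Shrinking ε makes the measure stricter, which is how fine-grained shape changes between otherwise similar images show up as a drop in nearness.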
Pub Date : 2016-02-01DOI: 10.1109/STARTUP.2016.7583955
K. S. Chaturbhuj, Gauri Chaudhary
Traditional data processing techniques are not enough to handle rapidly growing data; Hadoop can be used to process such large data sets. K-means is the traditional clustering method: simple, scalable, and easy to implement, but it converges to a local minimum that depends on the starting position, it is sensitive to the initial centers, and it requires the number of clusters in advance. Particle Swarm Optimization (PSO) is a swarm-behavior-based algorithm used here to introduce a connectivity principle into centroid-based clustering, yielding better centroids and hence better clusters. We use PSO to find the initial centroids and K-means to refine the clusters. Hadoop is used for fast, parallel processing of large datasets.
{"title":"Parallel clustering of large data set on Hadoop using data mining techniques","authors":"K. S. Chaturbhuj, Gauri Chaudhary","doi":"10.1109/STARTUP.2016.7583955","DOIUrl":"https://doi.org/10.1109/STARTUP.2016.7583955","url":null,"abstract":"Traditional data processing techniques are not enough to handle rapidly growing data. Hadoop can be used for processing such large data. K-means is the traditional clustering method which is simple, scalable and can easily implement but K-means converges to local minima from starting position and sensitive to initial centers. K-means required number of clusters in advance. Particle Swarm Optimization i.e PSO is mimic behavior based algorithm used to introduce the connectivity principle in the centroid based clustering algorithm that will gives optimum centroid and hence find better clusters. We used PSO for finding initial centroids and K-means to find better clusters. Hadoop is used for fast and parallel processing of large datasets.","PeriodicalId":355852,"journal":{"name":"2016 World Conference on Futuristic Trends in Research and Innovation for Social Welfare (Startup Conclave)","volume":"38 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117314948","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
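The PSO-then-K-means pipeline can be sketched in miniature as below: PSO searches for low-error candidate centroids, and Lloyd-style K-means refines them. This is a 1-D, single-machine toy with illustrative swarm parameters, not the paper's Hadoop implementation.

```python
import random

def kmeans(data, centroids, iters=20):
    # Standard Lloyd iterations: assign each point to its nearest
    # centroid, then move each centroid to its cluster mean.
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for x in data:
            nearest = min(range(len(centroids)), key=lambda j: abs(x - centroids[j]))
            clusters[nearest].append(x)
        centroids = [sum(c) / len(c) if c else centroids[j]
                     for j, c in enumerate(clusters)]
    return centroids

def sse(data, centroids):
    # Sum of squared distances to the nearest centroid (the PSO fitness).
    return sum(min((x - c) ** 2 for c in centroids) for x in data)

def pso_centroids(data, k, n_particles=8, iters=40, w=0.7, c1=1.5, c2=1.5):
    # Each particle encodes k candidate centroids; the swarm minimizes SSE.
    random.seed(0)  # fixed seed so the sketch is repeatable
    lo, hi = min(data), max(data)
    pos = [[random.uniform(lo, hi) for _ in range(k)] for _ in range(n_particles)]
    vel = [[0.0] * k for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=lambda p: sse(data, p))[:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(k):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if sse(data, pos[i]) < sse(data, pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest, key=lambda p: sse(data, p))[:]
    return gbest

data = [1.0, 1.2, 0.8, 9.0, 9.3, 8.7]
init = pso_centroids(data, k=2)   # PSO supplies the initial centroids
final = sorted(kmeans(data, init))
print(final)
```

The point of the hybrid is visible in the fitness: K-means started from PSO's centroids never ends with a higher SSE than the PSO result itself, and it avoids the bad local minimum a single unlucky random start can produce.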
Pub Date : 2016-02-01DOI: 10.1109/STARTUP.2016.7583922
D. S. Bhutada
This paper presents the design of low-voltage flip-flops based on Complementary Pass-transistor Adiabatic Logic (CPAL) circuits. CPAL is used to realize flip-flop circuits together with the Dual-Threshold CMOS (DTCMOS) technique. All circuits are simulated in 180 nm Tanner model technology while varying the supply voltage. The simulation results show that the flip-flop, combined with a power-gating technique, can be realized in CPAL operating at low voltage, which helps increase execution speed. An AC power supply is used, exploiting the low-power characteristics of complementary pass-transistor logic (CPL) circuits. The two-phase power-clock scheme is well suited to the design of flip-flops in sequential circuits because it reduces the transistor count. The adiabatic flip-flop achieves large energy savings over a wide range of frequencies.
{"title":"Design of low voltage flip-flop based on complementary pass-transistor adiabatic logic circuit","authors":"D. S. Bhutada","doi":"10.1109/STARTUP.2016.7583922","DOIUrl":"https://doi.org/10.1109/STARTUP.2016.7583922","url":null,"abstract":"This paper presents To design a low voltage flip-flops based on CPAL circuit. The Complementary Pass-Transistor Adiabatic Logic is used to release flip-flops circuits with DTCMOS (Dual Threshold CMOS) techniques. All circuits are simulated using 180nm Tanner model technology by varying supply voltages. Based on the simulation results, the flip-flop working along with the power-gating technique is realized by CPAL which work on low voltage medium which help to increase speed of the execution. We use Ac power supply which work as low power characteristics of complementary pass-transistor logic (CPL) circuit. Power-clock scheme is more suitable for the design of flip-flops using two phase sequential circuits because it helps to decrease more transistors. The Adiabatic flip-flop has large energy saving over wide range of frequencies.","PeriodicalId":355852,"journal":{"name":"2016 World Conference on Futuristic Trends in Research and Innovation for Social Welfare (Startup Conclave)","volume":"85 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115929444","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2016-02-01DOI: 10.1109/STARTUP.2016.7583970
Sonali J. Bagul, Rakhi D. Wajgi
E-commerce websites have become increasingly important in our daily lives because of the variety of information they provide; 75 percent of people use them to purchase products online. Buyers' comments play an important role in purchase decisions. As the number of online products, their sales, and their comments grow day by day, it is not possible for a potential consumer to review all comments and decide on that basis. Therefore, this paper designs a feedback analysis system that analyzes users' reviews of different products by applying data mining techniques such as opinion mining, information filtering, and sentiment analysis. This helps in rating the products and calculating a trust score for the e-commerce organization.
{"title":"Design feedback analysis system for E-commerce organization","authors":"Sonali J. Bagul, Rakhi D. Wajgi","doi":"10.1109/STARTUP.2016.7583970","DOIUrl":"https://doi.org/10.1109/STARTUP.2016.7583970","url":null,"abstract":"E-Commerce website becomes more important in our day todays life because of varieties of information provided by it. 75 percent People are using it for purchasing online products. Buyers' comments are playing important role in taking decision regarding purchasing of products. As number of online products, their sales and comments are increasing day by day, it is not possible for potential consumer to review all comments and take decision based on them. Therefore in this paper a feedback analysis system is designed which will analyze users' reviews regarding different products by applying different data mining techniques like opinion mining, information filtering and sentimental analysis. This helps in rating the products and calculating trust score for the E-commerce organization.","PeriodicalId":355852,"journal":{"name":"2016 World Conference on Futuristic Trends in Research and Innovation for Social Welfare (Startup Conclave)","volume":"117 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116336719","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
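The review-scoring idea can be illustrated with a minimal lexicon-based sketch: each review is labelled with a toy sentiment lexicon, and the trust score is taken as the share of positive reviews. The word lists and the trust-score formula are illustrative assumptions, not the paper's actual design.

```python
# Toy sentiment lexicon; a real system would use a full opinion-mining
# model rather than two hand-picked word sets.
POSITIVE = {"good", "great", "excellent", "fast", "reliable"}
NEGATIVE = {"bad", "poor", "slow", "broken", "fake"}

def review_polarity(text):
    # Count positive vs. negative lexicon hits in the review.
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

def trust_score(reviews):
    # Illustrative definition: fraction of reviews judged positive.
    labels = [review_polarity(r) for r in reviews]
    return labels.count("positive") / len(labels)

reviews = ["great product fast delivery",
           "broken on arrival bad support",
           "good value"]
print(trust_score(reviews))  # 2 of 3 reviews are positive
```

Information filtering would slot in before scoring (dropping spam or off-topic comments), which is why the paper treats the three techniques as one pipeline.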
Pub Date : 2016-02-01DOI: 10.1109/STARTUP.2016.7583969
Jyoti Wagde, Prarthana A. Deshkar
Today, rapid growth in hardware technology provides the means to generate huge volumes of data continuously. In most real-time data stream applications, such as network monitoring, stock-market analysis, and URL filtering, the volume of data is so large that it may be impossible to store it on disk. Furthermore, even if the data can be stored, the volume of incoming data may be so large that it is difficult to process any particular record more than once. These large volumes of data need to be mined for interesting patterns and relevant information. Consequently, we need enhanced techniques for data stream classification that deal with challenges not solved by traditional data mining methods, such as large volume, concept drift, and concept evolution.
{"title":"A review on method of stream data classification through tree based approach","authors":"Jyoti Wagde, Prarthana A. Deshkar","doi":"10.1109/STARTUP.2016.7583969","DOIUrl":"https://doi.org/10.1109/STARTUP.2016.7583969","url":null,"abstract":"Today, rapid growth in hardware technology has provided a means to generate huge volume of data continuously. Most of the real time data stream application such as network monitoring, stock market and URL filtering we found that the volume of data is so large that it may be impossible to store the data on disk. Furthermore, even if the data can be stored on the disk, the volume of the incoming data may be so large that it may be difficult to process any particular record more than once. These large volumes of data need to be mined for getting interesting patterns and relevant information out of it. Consequently, we need further enhanced technique for, data stream classification while dealing with various challenges which are not solved by traditional data mining methods such as large volume, concept drift, and concept evolution.","PeriodicalId":355852,"journal":{"name":"2016 World Conference on Futuristic Trends in Research and Innovation for Social Welfare (Startup Conclave)","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114927639","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
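Tree-based stream classifiers of the kind this review targets (e.g., Hoeffding trees) handle the "process each record at most once" constraint by splitting a node as soon as the Hoeffding bound guarantees, with probability 1 − δ, that the split chosen on the data seen so far matches the one an offline learner would choose. A sketch of the bound (the parameter values are illustrative):

```python
import math

def hoeffding_bound(value_range, delta, n):
    """Error margin epsilon after n observations of a statistic whose
    range is value_range, holding with confidence 1 - delta."""
    return math.sqrt(value_range ** 2 * math.log(1.0 / delta) / (2.0 * n))

# If the best split's information gain beats the runner-up by more than
# epsilon, the node can split now instead of waiting for more records.
eps = hoeffding_bound(value_range=1.0, delta=1e-7, n=1000)
print(round(eps, 4))
```

Because ε shrinks as n grows, the tree defers only the genuinely close decisions, which is what makes a single pass over an unbounded stream feasible.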
Pub Date : 2016-02-01DOI: 10.1109/STARTUP.2016.7583972
Devarpita Sinha, Sanjay Kumar
Software Defined Radio (SDR), or Software Radio, is one of the most important technologies for modern wireless communication systems. The vision of SDR is a single radio that can emulate any radio signal of evolving or already existing wireless standards, simply by updating software without replacing the underlying hardware platform. Different air interfaces require different sample rates for baseband processing, so Sample Rate Conversion (SRC) is an important SDR functionality. SRC includes both sample rate reduction (decimation) and sample rate increase (interpolation). In both cases the Cascaded Integrator-Comb (CIC) filter plays an important role, as an anti-aliasing filter for decimation or an anti-imaging filter for interpolation. This paper describes the basic structure of the CIC filter and illustrates the important parameters that characterize it, then focuses on the implementation of the CIC filter in a decimator and an interpolator. The paper also looks for a technique to improve the filter's characteristics and points out some problems associated with it.
{"title":"CIC filter for sample rate conversion in software defined radio","authors":"Devarpita Sinha, Sanjay Kumar","doi":"10.1109/STARTUP.2016.7583972","DOIUrl":"https://doi.org/10.1109/STARTUP.2016.7583972","url":null,"abstract":"Software Defined Radio (SDR) or Software Radio is one of the most important technologies for the modern wireless communication system. The vision of SDR is implementing a single radio that can emulate any radio signal of evolving or already existing wireless standards. It can be done simply by updating software without replacing the underlying hardware platform. Again, different air interface requires different sample rate for baseband processing. So, Sample Rate Conversion (SRC) is an important functionality of SDR. SRC includes both sample rate reduction or decimation and sample rate increase or interpolation. But in both the cases Comb Integrator Comb (CIC) filter plays an important role as anti-aliasing filter (in case of decimation) or anti-imaging filter (in case of interpolation). This paper describes the basic structure of CIC filter and illustrates important parameters to characterize this filter. Consequently it focuses on implementation of CIC filter in decimator and interpolator. 
This paper also tries to find a technique to improve the characteristics of this filter and point out some problems associated with it.","PeriodicalId":355852,"journal":{"name":"2016 World Conference on Futuristic Trends in Research and Innovation for Social Welfare (Startup Conclave)","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115000850","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
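The decimating CIC structure can be sketched in a few lines: N integrator stages run at the input rate, the signal is downsampled by R, and N comb stages with differential delay M run at the output rate. The stage counts and test signal below are illustrative; a real design would also size the internal word length to absorb the DC gain of (R·M)^N.

```python
def cic_decimate(x, R, N=3, M=1):
    """Toy N-stage CIC decimator over a list of samples x."""
    # N cascaded integrators running at the high input rate.
    for _ in range(N):
        acc, out = 0, []
        for s in x:
            acc += s
            out.append(acc)
        x = out
    x = x[R - 1::R]  # decimate by R
    # N cascaded combs at the low rate: y[n] = x[n] - x[n - M].
    for _ in range(N):
        x = [s - (x[i - M] if i >= M else 0) for i, s in enumerate(x)]
    return x

y = cic_decimate([1] * 64, R=8, N=3, M=1)
print(y[-1])  # settles to the DC gain (R*M)**N = 8**3 = 512
```

Note that the filter needs no multipliers, only adders and delays, which is why it is the standard choice for the first, highest-rate stage of SRC in SDR front ends; its droopy passband is the characteristic the paper's compensation technique targets.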
Nowadays the number of vehicle users is increasing day by day, so vehicle manufacturers are trying to develop higher-end vehicles that reduce the complexity of driving. An Advanced Driver Assistance System (ADAS) is one such feature, providing alerts, warnings, and information while driving. In our proposed method, Gaussian filtering, median filtering, and connected component analysis are used to detect speed bumps. The system works well on roads where bumps are painted properly. Several existing methods need special hardware such as sensors, accelerometers, and GPS to detect speed bumps.
{"title":"Real time speed bump detection using Gaussian filtering and connected component approach","authors":"W. Devapriya, C. Babu, T. Srihari","doi":"10.4236/CS.2016.79188","DOIUrl":"https://doi.org/10.4236/CS.2016.79188","url":null,"abstract":"Nowadays the number of vehicle users increasing day by day, so the vehicle manufacture trying to develop higher end vehicle that reduce the complexity during driving. Advance Driver Assists Sytsem is one of such type that provide alert, warning and information during driving. In our proposed method Gaussian filtering, median filtering and connected component analysis are used to detect speed bump. This system go well with the roads that are constructed with proper painting. Several existing method need special hardware, sensors, accelerometer and GPS for detecting speed bump.","PeriodicalId":355852,"journal":{"name":"2016 World Conference on Futuristic Trends in Research and Innovation for Social Welfare (Startup Conclave)","volume":"43 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125291000","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
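One step of the proposed pipeline, median filtering, can be sketched as follows: a 3×3 window replaces each interior pixel with the median of its neighbourhood, suppressing salt-and-pepper noise before the painted bump markings are isolated. The toy image and the border handling (edge pixels left unchanged) are illustrative choices.

```python
def median_filter3(img):
    """3x3 median filter over a 2-D list of pixel intensities."""
    rows, cols = len(img), len(img[0])
    out = [row[:] for row in img]          # borders copied unchanged
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            window = sorted(img[r + dr][c + dc]
                            for dr in (-1, 0, 1) for dc in (-1, 0, 1))
            out[r][c] = window[4]          # median of the 9 values
    return out

noisy = [[10, 10, 10, 10],
         [10, 255, 10, 10],   # a salt-noise pixel in a uniform patch
         [10, 10, 10, 10]]
print(median_filter3(noisy)[1][1])  # 10: the outlier is replaced
```

Unlike the Gaussian stage, the median filter removes impulse noise without blurring the edges of the painted stripes, which the connected component stage then groups into bump candidates.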
Pub Date : 2016-02-01DOI: 10.1109/STARTUP.2016.7583979
Shweta T. Ukande, R. Dharmik
Cloud computing is an efficient and scalable technology. The cloud is a pool of resources that provides miscellaneous services to different kinds of users. Users send requests to the cloud when they want a service; when a request is received, a resource is allocated to that user and the respective service is provided. Because the cloud is efficient and scalable, the number of cloud users is growing rapidly, and so is the request load on the cloud. This load should be balanced properly to improve system performance. Usually, virtualization is used to reduce the request load on the cloud. The resource allocation strategy, which involves load balancing techniques, has to be good enough to provide maximum resource utilization and throughput. This article proposes an improved load balancing algorithm.
{"title":"Design and analysis of resource allocation techniques in cloud partitions","authors":"Shweta T. Ukande, R. Dharmik","doi":"10.1109/STARTUP.2016.7583979","DOIUrl":"https://doi.org/10.1109/STARTUP.2016.7583979","url":null,"abstract":"Cloud computing is efficient and scalable technology. Cloud is nothing but a pool of resources, which provides miscellaneous services to different kind of users. Users send requests to cloud when they want service from cloud. When request is received at cloud, resource is allocated to that user and respective service is provided. As it is efficient and scalable, there is a rapid growth in cloud users so the load of requests on cloud is increased. This load should be balanced properly, to improve the system performance. Usually, virtualization concept is used to reduce load of requests at cloud. Strategy of allocating resource to user, which involves load balancing techniques, has to be good enough to provide maximum resource utilization and throughput. In this article a better load balancing algorithm is proposed.","PeriodicalId":355852,"journal":{"name":"2016 World Conference on Futuristic Trends in Research and Innovation for Social Welfare (Startup Conclave)","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130110547","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
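The partition-based allocation idea can be sketched minimally: each incoming request goes to the partition with the lowest relative load that still has capacity for it. The partition names, capacities, and least-relative-load rule below are illustrative assumptions, not the proposed algorithm itself.

```python
def allocate(requests, capacity):
    """Place each request (a resource demand) on a cloud partition.

    capacity maps partition name -> total capacity; returns the
    per-request placement and the final load per partition.
    """
    load = {p: 0 for p in capacity}
    placement = []
    for req in requests:
        # Partitions that can still host this request.
        candidates = [p for p in load if load[p] + req <= capacity[p]]
        if not candidates:
            placement.append(None)   # request must wait or be rejected
            continue
        # Least relative load: spreads work instead of piling it up.
        best = min(candidates, key=lambda p: load[p] / capacity[p])
        load[best] += req
        placement.append(best)
    return placement, load

placement, load = allocate([4, 4, 2, 6], {"p1": 10, "p2": 10})
print(placement, load)
```

Even this greedy rule keeps utilization balanced across partitions; a full algorithm would add partition status (idle/normal/overloaded) and re-balancing, which is the part the article refines.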
Pub Date : 2016-02-01DOI: 10.1109/STARTUP.2016.7583977
Neha V. Mankar, A. Khobragade, M. Raghuwanshi
With reference to the literature worldwide, it is clear that the Support Vector Machine (SVM), a machine learning algorithm, has a proven record of excellent results in image classification. Remote sensing images, however, are considered among the most complex to classify. Supervised classification of remote sensing images needs precise machine learning models that are fast and efficient, and SVMs have satisfied researchers around the world in this regard. Basically, the SVM is a non-parametric statistical learning model that acts as a binary classifier. SVMs represent a group of superior machine learning algorithms in which the learning problem is reduced to a quadratic optimization problem. Hence, the SVM locates the optimum boundary between classes, the one that generalizes to unseen samples with the least error among all possible boundaries separating the two classes. Because it avoids explicit density estimation, the SVM's learning parameters can be developed easily and efficiently. Like other supervised algorithms, the SVM goes through training, learning, and testing phases to classify an image. Besides all these parameters, training sample selection and optimization is a crucial part that affects the classification accuracy for remote sensing images. 
{"title":"Classification of remote sensing image using SVM kernels","authors":"Neha V. Mankar, A. Khobragade, M. Raghuwanshi","doi":"10.1109/STARTUP.2016.7583977","DOIUrl":"https://doi.org/10.1109/STARTUP.2016.7583977","url":null,"abstract":"With reference to the literature worldwide, it is obvious that Support Vector Machine (SVM), a machine learning algorithm has proven records for excellent results regarding Classification of Image. But, Remote Sensing Images are considered as most complex in nature as far as classification is concern. Supervised classification of Remote Sensing Images needs more precise machine learning models, which will be fast and efficient. SVM do satisfy researchers all over the world as far as Remote Sensing Images are concern. Basically, SVM is non-parametric statistical learning based model, which acts like binary classifier. SVM represents a group of superior machine learning algorithms, where it decomposes the parameter of the problem into a quadratic optimization technique. Hence, SVM is used to locate optimum boundaries between classes, which in return generalize to unseen samples with least error among all possible boundaries separating two classes. SVM uses density estimation function for developing easy and efficient learning parameters. Like other supervised algorithms, SVM also undergo into Training, Learning and Testing Phase for classifying any image. Besides all parameters, training sample selection and optimization is crucial part that affects the classification accuracy of remote sensing images. 
We need to address this issue in our project so as to devise noble algorithm or approach, which could make SVM, a more robust statistical learning model.","PeriodicalId":355852,"journal":{"name":"2016 World Conference on Futuristic Trends in Research and Innovation for Social Welfare (Startup Conclave)","volume":"67 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126023158","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
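The kernel at the heart of such classifiers can be illustrated with the Gaussian RBF kernel, a common choice for remote sensing data: it compares two spectral feature vectors, returning 1 for identical pixels and decaying toward 0 as they diverge, so a linear boundary in the induced space can separate non-linear classes. The gamma value and sample vectors below are illustrative, not tuned values from the paper.

```python
import math

def rbf_kernel(x, y, gamma=0.5):
    """Gaussian RBF kernel: k(x, y) = exp(-gamma * ||x - y||^2)."""
    sq = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq)

# Two hypothetical spectral signatures (per-band reflectances).
water = [0.10, 0.20, 0.05]
veg   = [0.30, 0.60, 0.40]
print(rbf_kernel(water, water))  # identical pixels -> 1.0
print(rbf_kernel(water, veg))    # dissimilar pixels -> below 1.0
```

Choosing gamma (and the kernel family itself) is exactly the kind of parameter decision that, together with training sample selection, drives the classification accuracy the paper is concerned with.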
Pub Date : 2016-02-01DOI: 10.1109/STARTUP.2016.7583931
D. Kolhatkar, N. Wankhade
In the medical field, blood testing is considered one of the most important clinical examinations. In the clinical laboratory, counting the different types of blood cells is important for the physician to diagnose diseases in a particular patient. Manual microscopic inspection of blood cells is time-consuming and requires considerable technical knowledge, so there is a need for an automated blood cell detection system that helps physicians diagnose diseases quickly and efficiently. Many researchers have studied blood cell counting using different methodologies. This paper reviews the methodologies that have been used for blood cell counting; the objective is to study them and identify future research directions for improving accuracy.
{"title":"Detection and counting of blood cells using image segmentation: A review","authors":"D. Kolhatkar, N. Wankhade","doi":"10.1109/STARTUP.2016.7583931","DOIUrl":"https://doi.org/10.1109/STARTUP.2016.7583931","url":null,"abstract":"In medical field blood testing is considered to be one of the most important clinical examination test. In clinical laboratory counting of different types of blood cells is important for physician to diagnose the diseases in particular patient. Manual microscopic inspection of blood cells is time consuming and requires more technical knowledge. Therefore there is a need to research for an automated blood cell detection system that will help physician to diagnose diseases in fast and efficient way. Many researchers have done their research for counting blood cells using different methodologies. This paper reviews different methodologies that have been used for blood cell counting. The objective is to study these methodologies and identify future research direction in order to get more accuracy.","PeriodicalId":355852,"journal":{"name":"2016 World Conference on Futuristic Trends in Research and Innovation for Social Welfare (Startup Conclave)","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126621381","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
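Most of the reviewed pipelines end with the same counting step: segmentation produces a binary mask, and connected component labelling counts the cells. A sketch with 4-connected flood fill follows; the toy mask stands in for a segmented microscope image and overlapping-cell separation (e.g., by watershed) is deliberately out of scope.

```python
def count_cells(mask):
    """Count 4-connected components of 1-pixels in a binary mask."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                count += 1
                stack = [(r, c)]          # flood-fill one component
                while stack:
                    i, j = stack.pop()
                    if 0 <= i < rows and 0 <= j < cols and mask[i][j] and not seen[i][j]:
                        seen[i][j] = True
                        stack += [(i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)]
    return count

mask = [[1, 1, 0, 0],
        [0, 1, 0, 1],
        [0, 0, 0, 1],
        [1, 0, 0, 0]]
print(count_cells(mask))  # three separate components
```

The accuracy differences surveyed in the paper come almost entirely from how well the segmentation stage produces this mask, not from the counting step itself.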