Analysis and design of low power, high speed comparators in 180nm technology with low supply voltages for ADCs
Pub Date: 2017-07-01 | DOI: 10.1109/ICCCNT.2017.8203994 | pp. 1-5
Sagar Kumar Vinodiya, R. Gamad
This paper presents an analysis of different types of comparators and their simulated results. The growing demand for low-power, high-speed analog-to-digital converters (ADCs) is driving the development of CMOS comparators capable of operating at low supply voltages and high speed with high power efficiency. Simulations were performed in the Cadence Virtuoso Analog Design Environment using SCL 180 nm technology at a 1.2 V supply voltage. The lowest power consumption obtained is 96.5 pW with a delay of 0.56 ns, at a clock frequency of 250 MHz. The authors also analyze the DC and transient responses.
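As a quick consistency check on these figures (a back-of-the-envelope calculation that assumes the 96.5 pW value is the average power drawn while clocked at 250 MHz), the implied energy per comparison and power-delay product are:

E_{\mathrm{comp}} = \frac{P_{\mathrm{avg}}}{f_{\mathrm{clk}}} = \frac{96.5\ \mathrm{pW}}{250\ \mathrm{MHz}} \approx 0.39\ \mathrm{aJ}, \qquad \mathrm{PDP} = P_{\mathrm{avg}} \times t_{d} = 96.5\ \mathrm{pW} \times 0.56\ \mathrm{ns} \approx 5.4 \times 10^{-20}\ \mathrm{J}.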
{"title":"Analysis and design of low power, high speed comparators in 180nm technology with low supply voltages for ADCs","authors":"Sagar Kumar Vinodiya, R. Gamad","doi":"10.1109/ICCCNT.2017.8203994","DOIUrl":"https://doi.org/10.1109/ICCCNT.2017.8203994","url":null,"abstract":"This paper presents the analysis of different types of comparator and their simulated results as the requirement for low-power, high speed analog to digital converters (ADCs)is increasing, this drive towards the CMOS comparators which are capable to work in Low supply voltage at maximum speed with high power efficiency. Simulation is done by Cadence Virtuoso Analog Design Environment using SCL 180nm technology and simulation done at 1.2 Volt supply voltage. The comparator has least power consumption of 96.5pw with a delay of 0.56ns. During this the clock frequency was 250 MHz Also authors have analyzed DC responses and transient responses.","PeriodicalId":6581,"journal":{"name":"2017 8th International Conference on Computing, Communication and Networking Technologies (ICCCNT)","volume":"106 1","pages":"1-5"},"PeriodicalIF":0.0,"publicationDate":"2017-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"88060619","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Asymmetric mixed RF/FSO relaying over α-μ fading channel
Pub Date: 2017-07-01 | DOI: 10.1109/ICCCNT.2017.8204021 | pp. 1-5
Abhijeet Upadhya, V. Dwivedi
The rapid growth in broadband networking demand has drawn researchers toward free-space optical (FSO) systems. In this paper, a relaying technique is proposed for FSO systems to increase link reliability. The present work analyzes a mixed RF/FSO system whose fading is modeled by the α-μ distribution. As suggested in the literature, the α-μ distribution can generalize a wide range of fading scenarios, and its parameters provide fine control over the fading conditions. Finally, closed-form expressions for the outage probability and average bit error rate are derived and verified through computer simulation.
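For reference, the α-μ envelope distribution mentioned here is usually written in the form introduced by Yacoub; this is the standard expression from the fading literature, not a result derived in the paper:

f_R(r) = \frac{\alpha\,\mu^{\mu}\,r^{\alpha\mu-1}}{\hat{r}^{\alpha\mu}\,\Gamma(\mu)}\,\exp\!\left(-\mu\,\frac{r^{\alpha}}{\hat{r}^{\alpha}}\right), \qquad r \ge 0, \qquad \hat{r} = \left(\mathbb{E}\!\left[R^{\alpha}\right]\right)^{1/\alpha},

where α captures the nonlinearity of the propagation medium and μ is related to the number of multipath clusters; the choice (α, μ) = (2, 1) recovers Rayleigh fading and (α, μ) = (2, m) recovers Nakagami-m fading.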
{"title":"Asymmetric mixed RF/FSO relaying over α-μ fading channel","authors":"Abhijeet Upadhya, V. Dwivedi","doi":"10.1109/ICCCNT.2017.8204021","DOIUrl":"https://doi.org/10.1109/ICCCNT.2017.8204021","url":null,"abstract":"Rapid increment in broadband networking needs has attracted researchers towards FSO systems. In this paper, relaying technique is proposed in FSO systems for increasing the link reliability. Present research work aims at analyzing a mixed RF/FSO system modeled as α-μ distributed fading. As suggested in the literature, α-μ distribution has the capability to generalize any fading scenario and distribution parameters provide better control on fading conditions. Finally, outage probability and average bit error rate closed form expressions have been derived and verified via computer based simulation.","PeriodicalId":6581,"journal":{"name":"2017 8th International Conference on Computing, Communication and Networking Technologies (ICCCNT)","volume":"43 1","pages":"1-5"},"PeriodicalIF":0.0,"publicationDate":"2017-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"86064076","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Fuzzy based decision making for detection of Glaucoma
Pub Date: 2017-07-01 | DOI: 10.1109/ICCCNT.2017.8204187 | pp. 1-6
Aseem John, Anurag Sharma, H. Singh, Virat Rehani
As an approach to managing health-care issues, this work proposes a fuzzy decision-making framework for the diagnosis of glaucoma. The fuzzy rule base draws on expert knowledge to interpret a patient's symptoms and produce a precise decision according to the constructed rules. The results are compared with those of an ophthalmologist and are found to be 88% accurate. In addition, the proposed technique is efficient and has a low computational cost.
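To make the rule-based idea concrete, here is a minimal Python sketch of Mamdani-style fuzzy inference for a risk score. The input symptoms (intraocular pressure and cup-to-disc ratio), the membership functions, and the rules are illustrative assumptions; the paper's actual rule base is constructed from ophthalmologist expertise and is not reproduced here.

# Illustrative Mamdani-style fuzzy risk scoring (assumed symptoms and rules).

def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def glaucoma_risk(iop_mmhg, cup_disc_ratio):
    """Return a crisp risk score in [0, 1] from two assumed symptoms."""
    # Fuzzification: membership grades for the linguistic terms.
    iop_high = tri(iop_mmhg, 18, 30, 42)
    iop_normal = tri(iop_mmhg, 8, 15, 22)
    cdr_large = tri(cup_disc_ratio, 0.5, 0.8, 1.1)
    cdr_small = tri(cup_disc_ratio, 0.0, 0.3, 0.6)
    # Illustrative rules: antecedents combined with min, parallel rules with max.
    risk_high = max(min(iop_high, cdr_large), 0.5 * iop_high)
    risk_low = max(min(iop_normal, cdr_small), 0.5 * cdr_small)
    # Weighted-average defuzzification of the "high" (1.0) and "low" (0.0) output terms.
    if risk_high + risk_low == 0.0:
        return 0.5  # no rule fired: return an indifferent score
    return risk_high / (risk_high + risk_low)

if __name__ == "__main__":
    print(glaucoma_risk(iop_mmhg=26, cup_disc_ratio=0.75))  # high score suggests referral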
{"title":"Fuzzy based decision making for detection of Glaucoma","authors":"Aseem John, Anurag Sharma, H. Singh, Virat Rehani","doi":"10.1109/ICCCNT.2017.8204187","DOIUrl":"https://doi.org/10.1109/ICCCNT.2017.8204187","url":null,"abstract":"In an approach to manage health care issues, this work proposed a Fuzzy based decision making framework for diagnosis of Glaucoma. The fuzzy rule-based makes use of expert knowledge to deal with patient's symptom and give a precise decision according to rules constructed. The results obtained are compared with those of the ophthalmologist and are found to have 88% accuracy. Also in addition the proposed technique is efficient having a low computational cost.","PeriodicalId":6581,"journal":{"name":"2017 8th International Conference on Computing, Communication and Networking Technologies (ICCCNT)","volume":"288 1","pages":"1-6"},"PeriodicalIF":0.0,"publicationDate":"2017-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"86421476","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Performance enhancement of solar PV system under partial shaded condition using PSO
Pub Date: 2017-07-01 | DOI: 10.1109/ICCCNT.2017.8204082 | pp. 1-7
M. Dwivedi, Gitanjali Mehta, Asif Iqbal, H. Shekhar
The purpose of this research is to enhance the performance of photovoltaic (PV) systems by improving Maximum Power Point Tracking (MPPT) techniques for sustainable green electricity production. Conventional MPPT algorithms perform well under uniform irradiance but fail to reach the desired maximum power point (MPP) under partial shading conditions (PSC). This necessitates efficient optimization techniques capable of reaching the global maximum power point (GMPP) of a PV system under PSC. Accordingly, this work provides a comprehensive assessment of the tracking performance of the Particle Swarm Optimization (PSO) algorithm under PSC. The optimization technique is compared against two conventional algorithms, Perturb and Observe (P&O) and Incremental Conductance (INC). The results confirm that the PSO algorithm converges quickly to the GMPP and outperforms the conventional methods.
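The GMPP search can be sketched in a few lines of Python. The two-peak P-V curve below is a made-up stand-in for a partially shaded array, and the PSO constants are generic textbook values rather than the ones used in the paper; a real MPPT controller would evaluate power by perturbing the converter duty cycle and measuring the array output.

import math
import random

def pv_power(v):
    """Toy two-peak P-V curve imitating a partially shaded string (assumed model)."""
    return max(0.0, 40.0 * math.exp(-((v - 12.0) ** 2) / 8.0)
                    + 65.0 * math.exp(-((v - 28.0) ** 2) / 6.0))

def pso_mppt(v_min=0.0, v_max=36.0, n_particles=8, iters=40, w=0.6, c1=1.6, c2=1.6):
    """Particle swarm search for the operating voltage that maximises array power."""
    pos = [random.uniform(v_min, v_max) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest, pbest_val = list(pos), [pv_power(p) for p in pos]
    g = max(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = random.random(), random.random()
            vel[i] = w * vel[i] + c1 * r1 * (pbest[i] - pos[i]) + c2 * r2 * (gbest - pos[i])
            pos[i] = min(max(pos[i] + vel[i], v_min), v_max)   # keep the voltage in range
            p = pv_power(pos[i])
            if p > pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i], p
                if p > gbest_val:
                    gbest, gbest_val = pos[i], p
    return gbest, gbest_val

if __name__ == "__main__":
    v, p = pso_mppt()
    print(f"GMPP estimate: V = {v:.2f} V, P = {p:.2f} W")  # should settle near the global peak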
{"title":"Performance enhancement of solar PV system under partial shaded condition using PSO","authors":"M. Dwivedi, Gitanjali Mehta, Asif Iqbal, H. Shekhar","doi":"10.1109/ICCCNT.2017.8204082","DOIUrl":"https://doi.org/10.1109/ICCCNT.2017.8204082","url":null,"abstract":"The purpose of this research is to focus on performance enhancement of Photovoltaic system by improving Maximum Power Point Tracking (MPPT) techniques for sustainable green electricity production for future generation. The conventional MPPT algorithms performs well in uniform irradiance condition but unable to reach at the desired maximum power point (MPP)in partial shading condition (PSC). This demands necessity for development of efficient optimization techniques those are capable of reaching the global maximum power point (GMPP) in a PV system under PSC. Accordingly, this research work provides a comprehensive assessment on tracking performance of Particle Swarm Optimization (PSO) algorithm under PSC. A comparative study of this optimization technique has been performed against two conventional algorithms named Perturb and Observe (P&O) and Incremental Conductance (INC). Results confirm that the PSO algorithm guarantees fast convergence to GMPP and have better performance in comparison with the conventional ones.","PeriodicalId":6581,"journal":{"name":"2017 8th International Conference on Computing, Communication and Networking Technologies (ICCCNT)","volume":"10 1","pages":"1-7"},"PeriodicalIF":0.0,"publicationDate":"2017-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"86575036","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Heterogeneous data acquisition system & time correlation
Pub Date: 2017-07-01 | DOI: 10.1109/ICCCNT.2017.8204037 | pp. 1-3
R. Paul, P. Dhara, P. Maity, P. Roy, Amitava Roy
Modern nuclear physics experiments employ a large number of detector systems and different data acquisition systems. Detectors with different response times and data acquisition systems with different processing times can be handled efficiently with an independent-readout and timestamping scheme. This paper explains the scheme, the hardware development, the software modifications, the experimental verification, and the future design for correlating heterogeneous data acquisition systems using timestamping and independent readout.
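A small Python sketch of the offline time-correlation step: each independently read-out branch produces (timestamp, payload) records, and events are paired when their timestamps fall within a coincidence window. The window width and the record contents are illustrative assumptions, not values from the paper.

def correlate(stream_a, stream_b, window_ns=100):
    """Pair events from two independently read-out streams whose timestamps
    differ by at most window_ns. Both streams must be sorted by timestamp."""
    pairs = []
    j = 0
    for t_a, payload_a in stream_a:
        # Skip second-stream events that are already too old to match t_a.
        while j < len(stream_b) and stream_b[j][0] < t_a - window_ns:
            j += 1
        k = j
        while k < len(stream_b) and stream_b[k][0] <= t_a + window_ns:
            pairs.append(((t_a, payload_a), stream_b[k]))
            k += 1
    return pairs

if __name__ == "__main__":
    ge_branch = [(1000, "E = 1173 keV"), (5000, "E = 1332 keV")]   # (timestamp in ns, payload)
    nai_branch = [(1040, "hit"), (9000, "hit")]
    print(correlate(ge_branch, nai_branch))  # only the 1000/1040 ns pair is within the window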
{"title":"Heterogeneous data acquisition system & time correlation","authors":"R. Paul, P. Dhara, P. Maity, P. Roy, Amitava Roy","doi":"10.1109/ICCCNT.2017.8204037","DOIUrl":"https://doi.org/10.1109/ICCCNT.2017.8204037","url":null,"abstract":"Modern Nuclear physicists employ large number of detector systems and different data acquisition systems for their nuclear physics experiments. Detectors with different response time and data acquisition systems with different processing time are efficiently dealt with independent readout and timestamping scheme. This paper explain the scheme, hardware development, software modification, experimental verification and future design of correlation of heterogeneous Data Acquisition Systems with timestamping and independent readout.","PeriodicalId":6581,"journal":{"name":"2017 8th International Conference on Computing, Communication and Networking Technologies (ICCCNT)","volume":"104 ","pages":"1-3"},"PeriodicalIF":0.0,"publicationDate":"2017-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91520121","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Client requirement based path selection algorithm for Tor network
Pub Date: 2017-07-01 | DOI: 10.1109/ICCCNT.2017.8204018 | pp. 1-6
K. Kiran, B. Vignesh, P. D. Shenoy, K. Venugopal, T. V. Prabhu, M. Prasad
In the Tor network, the relays run by volunteers are non-uniform in nature, mainly with respect to their bandwidths. Browsing with Tor is often slow because, by design, the path from client to server is not the shortest path but one chosen to preserve the user's anonymity. As a result, speed and anonymity are inversely related in the Tor network. Furthermore, since Tor builds circuits from relays in different geographical locations, connections become even more protracted. With so many factors affecting performance, there is a need to improve the current functionality and enable low-latency anonymous browsing. In this paper, we try to enhance client browsing speed by segregating relays based on the bandwidth reported in the consensus file and selecting paths according to each client's requirement. While creating circuits for clients with a high bandwidth requirement, we discard relays that fall below the 12.5th percentile of all valid relays. Based on simulations run on the Shadow simulator, we observe that the proposed algorithm keeps low-bandwidth relays from being congested by connections from clients with high bandwidth requirements. This significantly improves the speed for low-bandwidth clients. Our simulations also show that the proposed algorithm does not compromise anonymity.
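A rough Python sketch of the relay-segregation idea: relays are ranked by the bandwidth listed in the consensus, and circuits for clients that request high throughput are built only from relays above the 12.5th percentile. The consensus data and the bandwidth weighting below are illustrative; the actual algorithm lives inside Tor's path-selection code and was evaluated with the Shadow simulator.

import random

def split_relays(relays, cutoff_percentile=12.5):
    """Partition relays by consensus bandwidth at the given percentile."""
    ordered = sorted(relays, key=lambda r: r["bandwidth"])
    cut = int(len(ordered) * cutoff_percentile / 100.0)
    return ordered[:cut], ordered[cut:]            # (slowest tail, remainder)

def build_circuit(relays, high_bandwidth_client, length=3):
    """Choose a circuit, skipping the slowest relays for demanding clients."""
    slow, fast = split_relays(relays)
    pool = fast if high_bandwidth_client else slow + fast
    weights = [r["bandwidth"] for r in pool]       # bandwidth-weighted choice, roughly as Tor does
    circuit = []
    while len(circuit) < length:
        relay = random.choices(pool, weights=weights, k=1)[0]
        if relay not in circuit:                   # a relay may appear only once per circuit
            circuit.append(relay)
    return circuit

if __name__ == "__main__":
    consensus = [{"nickname": f"relay{i}", "bandwidth": bw}
                 for i, bw in enumerate([50, 120, 300, 800, 1500, 2200, 4000, 9000])]
    print([r["nickname"] for r in build_circuit(consensus, high_bandwidth_client=True)])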
{"title":"Client requirement based path selection algorithm for Tor network","authors":"K. Kiran, B. Vignesh, P. D. Shenoy, K. Venugopal, T. V. Prabhu, M. Prasad","doi":"10.1109/ICCCNT.2017.8204018","DOIUrl":"https://doi.org/10.1109/ICCCNT.2017.8204018","url":null,"abstract":"In Tor network, the relays run by volunteers are non-uniform in nature, mainly with respect to the bandwidths of the relay. Browsing with Tor is often slow due to the design where the path taken from client to server is not the shortest path but ensures that it is secure in such a way that user's anonymity is maintained. Due to this, speed and anonymity is inversely proportional in the Tor network. Furthermore, since Tor uses circuits with relays in different geographical locations, it makes the connection further protracted. With so many factors affecting the performance, there is a need to improve on the current functionalities and contribute to low latency, anonymous browsing. In this paper, we try to enhance the speed of the client's browsing by segregating the relays based on their derived bandwidth as given in the consensus file and selecting path for clients based on their requirement. We discard some of the relays which are below the 12.5 percentile of all the valid relays while creating circuits for clients with high bandwidth requirement. Based on the simulations that we ran on Shadow Simulator, we have observed that, in our proposed algorithm, we are ensuring that low bandwidth relays are not congested with connection from clients with high bandwidth requirement. This has improved the low bandwidth client speed significantly. In our simulations, we also observe that there is no compromise in anonymity in our proposed algorithm.","PeriodicalId":6581,"journal":{"name":"2017 8th International Conference on Computing, Communication and Networking Technologies (ICCCNT)","volume":"32 1","pages":"1-6"},"PeriodicalIF":0.0,"publicationDate":"2017-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"79551954","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Text recognition on PAN card using template matching method
Pub Date: 2017-07-01 | DOI: 10.1109/ICCCNT.2017.8204081 | pp. 1-5
Abhay Ganjoo, Abhishek Dhyani
Object recognition in computer vision refers to detecting a particular object in a video or an image. Computer vision, as a branch of machine learning, lets a machine automate an entire process through a combination of algorithms and paradigms. The PAN (Permanent Account Number) is a 10-character alphanumeric code issued by the Government of India to each of its citizens, particularly taxpayers. The purpose of this research is to identify the PAN code and extract it, thereby automating what would otherwise be manual retyping. We achieve this through a text recognition process based on template matching with the Normalized Cross Correlation (NCC) method.
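A minimal NumPy sketch of template matching by normalized cross-correlation, the matching step used here. It slides a character template over the image and returns the location of the highest NCC score; the arrays are toy data, and a real pipeline would first localise the PAN field and match against binarised character templates.

import numpy as np

def ncc_match(image, template):
    """Slide the template over the image and return the (row, col) with the
    highest normalized cross-correlation score, plus the score itself."""
    ih, iw = image.shape
    th, tw = template.shape
    t_norm = np.sqrt((template ** 2).sum())
    best_score, best_pos = -1.0, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            patch = image[r:r + th, c:c + tw]
            denom = np.sqrt((patch ** 2).sum()) * t_norm
            score = (patch * template).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score

if __name__ == "__main__":
    img = np.zeros((20, 20))
    img[5:10, 8:12] = 1.0                  # toy "character" blob
    tmpl = np.ones((5, 4))                 # toy template of the same blob
    print(ncc_match(img, tmpl))            # expected: ((5, 8), 1.0)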
{"title":"Text recognition on PAN card using template matching method","authors":"Abhay Ganjoo, Abhishek Dhyani","doi":"10.1109/ICCCNT.2017.8204081","DOIUrl":"https://doi.org/10.1109/ICCCNT.2017.8204081","url":null,"abstract":"Object recognition in computer vision refers to detecting a particular object in a video or an image. Computer vision being a branch of Machine Learning helps us to achieve a goal by making the machine automate the entire process through multiple algorithms and paradigms. PAN (Permanent Account Number) is a 10 character alpha-numeric code which is issued by Government of India to each of its citizen, especially those who are tax payee. The purpose of this research is to identify the PAN code and extract them so as to automate the process of retyping it manually. We achieve the above through text recognition process of template matching under Normalized Cross Correlation (NCC) method.","PeriodicalId":6581,"journal":{"name":"2017 8th International Conference on Computing, Communication and Networking Technologies (ICCCNT)","volume":"1 1","pages":"1-5"},"PeriodicalIF":0.0,"publicationDate":"2017-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"83630348","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Comparative analysis of a recommender system based on ant colony optimization and artificial bee colony optimization algorithms
Pub Date: 2017-07-01 | DOI: 10.1109/ICCCNT.2017.8204106 | pp. 1-4
Deepshikha Sethi, Abhishek Singhal
Recommender systems are the backbone of e-commerce sites such as amazon.in, Netflix, and flipkart.com. They not only improve customer satisfaction but also bring products to a customer's attention that would otherwise go unnoticed, thereby increasing the business of such sites. This paper presents a movie recommender system that uses the collaborative filtering technique, applies Ant Colony Optimization and Artificial Bee Colony Optimization, and compares the two algorithms on the basis of CPU time and two standard test functions.
{"title":"Comparative analysis of a recommender system based on ant colony optimization and artificial bee colony optimization algorithms","authors":"Deepshikha Sethi, Abhishek Singhal","doi":"10.1109/ICCCNT.2017.8204106","DOIUrl":"https://doi.org/10.1109/ICCCNT.2017.8204106","url":null,"abstract":"Recommender systems are the backbone of electronic commerce sites like amazon.in, netflix and flipkart.com which not only helps in achieving better customer satisfaction but also helps in bringing those products into the notice of the customer which are not easily seen by the customer but it helps in increasing the business of such e-commerce sites. This paper present a movie recommender system that uses collaborative filtering technique of recommender system and apply Ant Colony Optimization and Artificial Bee Colony Optimization and also compare the two algorithms on the basis of CPU Time and two standard functions.","PeriodicalId":6581,"journal":{"name":"2017 8th International Conference on Computing, Communication and Networking Technologies (ICCCNT)","volume":"46 1","pages":"1-4"},"PeriodicalIF":0.0,"publicationDate":"2017-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91204282","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pattern classification using quadratic neuron: An experimental study
Pub Date: 2017-07-01 | DOI: 10.1109/ICCCNT.2017.8204062 | pp. 1-6
Y. Ganesh, R. P. Singh, G. R. Murthy
In this paper, we present a study of quadratic neurons for solving pattern classification problems. The paper compares the classification results obtained by a quadratic neural network (QUAD) with those of standard single-layer and multilayer perceptrons (MLP). Examples with randomly generated toy datasets are used for understanding and visualization, and standard datasets such as Iris and MNIST are used for extensive comparison and further experiments. Different architectures of the QUAD neural network have also been tested. The obtained results are better than those of the conventional multilayer perceptron. This experimental study motivates the authors to investigate the use of QUAD neurons in deep neural networks.
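A short NumPy sketch contrasting a conventional perceptron unit with a quadratic neuron. The quadratic unit adds a term that is quadratic in the input, so a single unit can already represent non-linear (for example, elliptical) decision boundaries; the exact quadratic form used in the paper may differ from the common formulation shown here.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def linear_neuron(x, w, b):
    """Conventional perceptron unit: affine pre-activation w.x + b."""
    return sigmoid(w @ x + b)

def quadratic_neuron(x, W, w, b):
    """Quadratic unit: pre-activation x'Wx + w.x + b (one common formulation)."""
    return sigmoid(x @ W @ x + w @ x + b)

if __name__ == "__main__":
    # Decision boundary 1 - |x|^2 = 0, i.e. the unit circle: representable by a
    # single quadratic unit, but by no single linear_neuron whatever w and b are.
    W, w, b = -np.eye(2), np.zeros(2), 1.0
    for x in (np.array([0.2, 0.3]), np.array([1.5, 1.5])):
        print(x, quadratic_neuron(x, W, w, b))   # > 0.5 inside the circle, < 0.5 outside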
{"title":"Pattern classification using quadratic neuron: An experimental study","authors":"Y. Ganesh, R. P. Singh, G. R. Murthy","doi":"10.1109/ICCCNT.2017.8204062","DOIUrl":"https://doi.org/10.1109/ICCCNT.2017.8204062","url":null,"abstract":"In this paper, we present a study done on quadratic neurons to solve the pattern classification problems. The paper compares the classification results obtained by quadratic neural network(QUAD) with normal single and multilayer perceptron(MLP). Examples with randomly generated toy datasets are used for understanding and visualization. The standard datasets such as Iris, MNIST and others are used for extensive comparison and interesting experiments. Different architectures of QUAD neural net has also been tested. Obtained results are better in comparison to conventional multilayer perceptron. This experimental study motivates the authors to study usage of QUAD neurons in deep neural networks.","PeriodicalId":6581,"journal":{"name":"2017 8th International Conference on Computing, Communication and Networking Technologies (ICCCNT)","volume":"8 1","pages":"1-6"},"PeriodicalIF":0.0,"publicationDate":"2017-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"89759688","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pattern mining using Linked list (PML) mine the frequent patterns from transaction dataset using Linked list data structure
Pub Date: 2017-07-01 | DOI: 10.1109/ICCCNT.2017.8203954 | pp. 1-6
B. Sandip, A. Apurva
A substantial amount of research has been done in the area of frequent pattern mining over the last few decades, and researchers have developed various algorithms for generating frequent patterns. We propose the Pattern Mining using Linked list (PML) algorithm, which generates frequent patterns using a linked list. It uses both horizontal and vertical data layouts: the horizontal layout to generate 1-itemsets, and the vertical layout for 2-itemsets and larger. An important feature of the vertical layout is that frequencies can be counted quickly using intersection operations on transaction ids (tids), and irrelevant data is pruned automatically. The algorithm's linked-list data structure keeps the execution time for generating frequent patterns low, and it uses memory efficiently. The dataset is scanned only twice. The experimental results of the proposed algorithm are compared with those of other algorithms.
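A short Python sketch of the vertical-layout support counting described above: each item is mapped to the set of transaction ids (tids) containing it, and the support of a 2-itemset is the size of the intersection of the two tid sets. The linked-list bookkeeping of the actual PML algorithm is not reproduced; this only illustrates the tid-intersection idea.

from itertools import combinations

def vertical_layout(transactions):
    """Map each item to the set of transaction ids (tids) that contain it."""
    tidsets = {}
    for tid, items in enumerate(transactions):
        for item in items:
            tidsets.setdefault(item, set()).add(tid)
    return tidsets

def frequent_pairs(transactions, min_support=2):
    """Frequent 2-itemsets via tid-set intersection (support = |tids(a) & tids(b)|)."""
    tidsets = vertical_layout(transactions)
    frequent_items = [i for i, tids in tidsets.items() if len(tids) >= min_support]
    result = {}
    for a, b in combinations(sorted(frequent_items), 2):
        support = len(tidsets[a] & tidsets[b])
        if support >= min_support:
            result[(a, b)] = support
    return result

if __name__ == "__main__":
    db = [["bread", "milk"], ["bread", "butter"], ["bread", "milk", "butter"], ["milk"]]
    print(frequent_pairs(db, min_support=2))   # {('bread', 'butter'): 2, ('bread', 'milk'): 2}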
{"title":"Pattern mining using Linked list (PML) mine the frequent patterns from transaction dataset using Linked list data structure","authors":"B. Sandip, A. Apurva","doi":"10.1109/ICCCNT.2017.8203954","DOIUrl":"https://doi.org/10.1109/ICCCNT.2017.8203954","url":null,"abstract":"The Substantial amount of research has been done in the area of frequent pattern mining in the last few decades. Researchers have developed various algorithms to generate frequent patterns. We propose Pattern Mining using Linked list (PML) algorithm that generates frequent patterns using Linked list. It uses both horizontal and vertical data layout. To generate 1-itemsets, it uses horizontal data layout and for 2-itemsets and more, it uses vertical data layout. The important feature of vertical data layout is that it count the frequency fast using intersection operations on transaction ids (tids). It prunes automatically irrelevant data. The algorithm uses Linked list data structure due to which it takes less execution time to generate frequent patterns. It runs with efficient memory usage. It scans the dataset only two times. The experimental results of proposed algorithm have been compared with other algorithms.","PeriodicalId":6581,"journal":{"name":"2017 8th International Conference on Computing, Communication and Networking Technologies (ICCCNT)","volume":"65 1","pages":"1-6"},"PeriodicalIF":0.0,"publicationDate":"2017-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"76699177","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}