Title: Measuring Cloud Security from risks perspective
Authors: Savy Gupta, S. Gupta, R. Majumdar, Y. Rathore
DOI: 10.1109/CONFLUENCE.2016.7508116
Published in: 2016 6th International Conference - Cloud System and Big Data Engineering (Confluence)
Abstract: Cloud computing is not a new technology but a new way of delivering computing resources. Over the past few years it has emerged as one of the most widely used computing paradigms. It offers services in an adaptable manner and therefore supports efficient computing by centralizing memory, processing, and storage, making it a welcome change for the information technology industry. While it has become a topic of conversation in the industry, there are risks associated with the adoption of cloud services. The objective of this work is to reveal and explore the risks encountered when adopting cloud computing and how they affect the intended user. The work also lays out mitigation strategies that should be followed to avoid such risks.
Title: MCDC: Multicast routing leveraging SDN for Data Center networks
Authors: Sapna Shukla, P. Ranjan, Karan Singh
DOI: 10.1109/CONFLUENCE.2016.7508187
Abstract: Software-defined networking (SDN) can provide solutions to many long-standing networking problems. In this paper we propose an algorithm for multicast routing in data center networks. Data centers aim to minimize link utilization and latency while also maximizing throughput. Multicast not only saves bandwidth but also reduces load on servers, and can therefore improve data center performance in terms of link utilization and latency. Application-layer multicast solutions exist, but network-level multicast has lower latency than its application-level counterparts; IP multicast, however, has faced many challenges since the early days of the Internet. PIM-SM, the prevalent multicast routing protocol, cannot leverage the advantages available in data center networks, so a new algorithm designed specifically for data center networks is needed. We propose MCDC (Multicast routing for Data Centers), an SDN-enabled algorithm that provides congestion-aware multicast routing.
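The abstract above does not detail MCDC itself, but the general idea of congestion-aware routing toward multicast receivers can be sketched with a shortest-path computation whose link cost grows with load. The graph, the cost formula, and the node names below are illustrative assumptions, not the paper's algorithm.

```python
import heapq

# Toy congestion-aware routing step: reach each receiver from the source
# using link cost = base latency * (1 + load), so loaded links cost more.
# This only illustrates the concept; MCDC's actual algorithm is not given.
def shortest_path_tree(graph, source):
    """graph: {u: [(v, latency, load), ...]}; returns {node: predecessor}."""
    dist, prev = {source: 0.0}, {}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, latency, load in graph.get(u, []):
            cost = d + latency * (1 + load)
            if cost < dist.get(v, float("inf")):
                dist[v], prev[v] = cost, u
                heapq.heappush(heap, (cost, v))
    return prev

graph = {"s": [("a", 1, 0.0), ("b", 1, 0.9)],  # s->b is heavily loaded
         "a": [("t", 1, 0.0)],
         "b": [("t", 1, 0.0)]}
tree = shortest_path_tree(graph, "s")
print(tree["t"])  # receiver t is reached via the lightly loaded node "a"
```

Running the union of such per-receiver paths yields a tree that steers multicast traffic away from congested links.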
Title: Analytical review on object segmentation and recognition
Authors: Anshika Sharma, P. Singh, Palak Khurana
DOI: 10.1109/CONFLUENCE.2016.7508176
Abstract: The prime objective of this review is to analyze popular techniques used for object segmentation and recognition. Various existing object segmentation and recognition methodologies are systematically analyzed and presented. Object segmentation is important for identifying objects in video; it is widely used in video surveillance, human activity recognition, and shadow detection for both static and moving objects. Object recognition also has applications in video stabilization, cell counting in bio-imaging, and automated vehicle parking systems; Google's driverless car and Microsoft's Kinect likewise rely on object recognition methodologies. We conclude with the pros and cons of the existing methods and with possibilities for future research in this area.
Title: Review of big data tools for healthcare system with case study on patient database storage methodology
Authors: Purva Grover, R. Johari
DOI: 10.1109/CONFLUENCE.2016.7508208
Abstract: With increasing automation, systems deployed across industries are generating huge amounts of data; the IT industry itself has witnessed phenomenal data growth in recent years. The data generated in the last 5 years far exceeds the data generated cumulatively by all industries together over the preceding 20 years. In this work we focus on ways to handle the data generated by a PHIS (Personal Healthcare Information System). The central question addressed in this paper is the selection of the appropriate tool (a relational MySQL database or a NoSQL MongoDB database) for storing, archiving, and mining patient data; the work concludes with a comparative analysis in terms of space and time.
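The relational-versus-document trade-off the paper studies can be illustrated with a single patient record. The sketch below uses SQLite as a stand-in for MySQL so it stays self-contained, and a plain JSON document as a stand-in for a MongoDB record; the patient fields are hypothetical, not taken from the paper.

```python
import json
import sqlite3  # stand-in for MySQL so the sketch needs no server

# Hypothetical patient record; the field names are illustrative only.
patient = {"patient_id": 101, "name": "A. Kumar", "age": 54,
           "visits": [{"date": "2016-01-12", "diagnosis": "hypertension"}]}

# Relational storage: nested visit data must be normalized into a second table.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE patient (patient_id INTEGER PRIMARY KEY,"
           " name TEXT, age INTEGER)")
db.execute("CREATE TABLE visit (patient_id INTEGER, date TEXT, diagnosis TEXT)")
db.execute("INSERT INTO patient VALUES (?, ?, ?)",
           (patient["patient_id"], patient["name"], patient["age"]))
for v in patient["visits"]:
    db.execute("INSERT INTO visit VALUES (?, ?, ?)",
               (patient["patient_id"], v["date"], v["diagnosis"]))

# Document storage (MongoDB-style): the whole record is one JSON document.
document = json.dumps(patient)

# Reading the relational view requires a join; the document view is one lookup.
rows = db.execute("SELECT name, date FROM patient"
                  " JOIN visit USING (patient_id)").fetchall()
print(rows)
print(json.loads(document)["visits"][0]["diagnosis"])
```

The join-versus-lookup difference is one axis of the space/time comparison the abstract mentions.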
Title: Creating ontology for intelligent web by using web usage mining
Authors: P. Jha, Arunima Jaiswal
DOI: 10.1109/CONFLUENCE.2016.7508133
Abstract: Today, the Web is a vast information repository comprising countless sites. Owing to the unregulated and disorganized information on Web pages, it is a challenging task for researchers and crawlers to perform relevant and efficient searches over such a warehouse of information. Ontology can be a good component for achieving this objective, and Web usage mining can likewise be used to extract significant information from Web records. In this paper, Web usage mining is applied to sample log data using the Web log analysis tool "Web Log Expert", and the results are connected to the development of an ontology for an intelligent Web. Finally, the paper summarizes further research challenges toward an intelligent Web.
Title: The algorithm of the software for the structural and parametric synthesis of communication systems with heterogeneous topology
Authors: A. Sorokin, V. Dmitriev, O. N. Pishhin
DOI: 10.1109/CONFLUENCE.2016.7508150
Abstract: In our research, a mathematical model for evaluating the performance of telecommunication systems with dynamic network topology was developed. It underlies a software algorithm that analyzes the correctness of parameter choices for telecommunication systems with dynamic network topology. The algorithm is implemented as a program. The possibilities of using the algorithm for frequency-territorial planning in cellular communication systems that incorporate a dynamic structural component are considered.
Title: Automatic goal-oriented test data generation using a Genetic algorithm and simulated annealing
Authors: Mukesh Mann, O. Sangwan, P. Tomar, Shivani Singh
DOI: 10.1109/CONFLUENCE.2016.7508052
Abstract: The literature on automatic test case generation has convincingly argued its importance in software testing. Solutions to this undecidable problem can reduce the financial resources spent in testing a software system. In this paper, a genetic algorithm and simulated annealing based approach for automatic test case generation is presented. Fitness toward the target goal is computed by instrumenting the program using a branch-distance approach, and the test cases generated by the genetic algorithm and simulated annealing are evaluated and compared in terms of (1) the number of generations needed to reach the target goal and (2) the time taken to generate the test cases.
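Branch distance, the fitness measure the abstract refers to, is commonly defined per relational operator: it is zero when the branch predicate already holds, and otherwise grows with how far the operands are from satisfying it. The sketch below follows that common (Korel-style) formulation; the constant K and the operator set are generic choices, not the paper's exact instrumentation.

```python
# Korel-style branch distance: fitness a GA or simulated annealing search
# minimizes until the target branch is taken. K penalizes a just-missed branch.
K = 1.0

def branch_distance(op, a, b):
    """How far numeric inputs (a, b) are from making `a op b` true."""
    if op == "==":
        return abs(a - b)
    if op == "!=":
        return 0.0 if a != b else K
    if op == "<":
        return 0.0 if a < b else (a - b) + K
    if op == "<=":
        return 0.0 if a <= b else (a - b) + K
    raise ValueError(f"unsupported operator: {op}")

# Zero distance means the branch is covered; the search stops there.
print(branch_distance("==", 7, 10))  # 3.0 -> inputs are three away from a == b
print(branch_distance("<", 2, 9))    # 0.0 -> branch already taken
```

Comparing how many candidate inputs each metaheuristic evaluates before this distance reaches zero gives exactly the generations/time comparison the abstract describes.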
Title: The BMODS: A powerful 2-D representation scheme for the GA's population
Authors: Hari Mohan Pandey, Ankit Chaudhary, D. Mehrotra, Yudong Zhang
DOI: 10.1109/CONFLUENCE.2016.7508046
Abstract: The genetic algorithm is a widely used search and optimization algorithm that needs no introduction. Various factors, such as population size, population representation, crossover and mutation probabilities, and the selection method, contribute greatly to its success. This paper deals with the representation of the population for the genetic algorithm. The authors present a 2-D representation of the population called the bit masking oriented data structure (BMODS), implemented by Iupsa in 2001. The BMODS is an efficient way to store the individual genomes on which reproduction operations are performed. Recently, the authors incorporated the BMODS into a grammatical inference system and found encouraging results. The aim of this paper is to show the usefulness of the BMODS for representing the GA's population.
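The abstract does not spell out the BMODS layout, but the general idea of a 2-D bit-matrix population with mask-driven reproduction can be sketched generically: one row per individual, one column per gene, and crossover expressed as a bit mask over the columns. Everything below (sizes, the mask, the crossover form) is an illustrative assumption, not the BMODS specification.

```python
import random

random.seed(1)  # deterministic sketch

# Generic 2-D bit-matrix population: rows are individuals, columns are genes.
# This illustrates the 2-D layout only; actual BMODS details are not given.
POP, GENES = 4, 8
population = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]

def mask_crossover(parent_a, parent_b, mask):
    """Uniform crossover driven by a bit mask: take from parent_a where mask=1."""
    return [a if m else b for a, b, m in zip(parent_a, parent_b, mask)]

mask = [1, 1, 1, 1, 0, 0, 0, 0]  # this mask reproduces one-point crossover
child = mask_crossover(population[0], population[1], mask)
print(child)
```

Storing the population as one bit matrix lets reproduction operators like this run column-wise over the whole structure instead of per-individual.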
Title: Carbon NanoTube based logic gates structure for low power consumption at nano-scaled era
Authors: D. Kumar, Ajay Kumar Dadoria, T. Gupta
DOI: 10.1109/CONFLUENCE.2016.7508203
Abstract: As research in nanotechnology continues, the CMOS manufacturing process has been scaled down to nano dimensions, at the cost of severe process variations and high leakage current, which results in large power dissipation. Leakage current and power dissipation have therefore become an increasing focus in VLSI circuit design. The carbon nanotube field-effect transistor (CNFET) is among the best alternatives to conventional CMOS devices. Simulation results show a reduction in process variation, ultra-low-power nano-scaled memory devices, and marked improvements in noise margin, propagation delay, write/read margin, and stability. CNFET-based logic gates are compared with conventional CMOS and FinFET-based logic gates with respect to delay and power consumption.
Title: Design and development of novice conceptual approach for minimum spanning tree
Authors: N. Walter, S. Dubey
DOI: 10.1109/CONFLUENCE.2016.7508212
Abstract: The efficient routing problem has existed for many years, and spanning trees play a very important role in designing efficient routing algorithms. To obtain minimum cost, a minimum spanning tree is formed from the given graph, typically using a greedy technique. Several approaches exist for computing a minimum spanning tree; in this paper a new methodology is designed and developed to find the minimum spanning tree using a subtraction-and-remainder procedure, which also follows the greedy approach. The main objective is to present a new way to find a minimum spanning tree. An example is also given to illustrate the procedure.
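The paper's subtraction-and-remainder procedure is not detailed in the abstract, but the classical greedy baseline it builds on can be sketched as Kruskal's algorithm: sort the edges, then accept each cheapest edge that joins two previously separate trees. The graph below is a made-up example for illustration.

```python
# Classical greedy MST (Kruskal with union-find) as a reference baseline;
# the paper's subtraction-and-remainder variant is not specified here.
def kruskal(n, edges):
    """n vertices 0..n-1; edges: list of (weight, u, v).
    Returns (total_weight, chosen_edges)."""
    parent = list(range(n))

    def find(x):  # union-find root lookup with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    total, chosen = 0, []
    for w, u, v in sorted(edges):      # greedy: cheapest edge first
        ru, rv = find(u), find(v)
        if ru != rv:                   # accept only if it joins two trees
            parent[ru] = rv
            total += w
            chosen.append((u, v))
    return total, chosen

edges = [(1, 0, 1), (4, 0, 2), (3, 1, 2), (2, 2, 3)]
print(kruskal(4, edges))  # the n-1 = 3 accepted edges and their total weight
```

Any correct MST method on this graph must reach the same total weight, which makes such a baseline a natural check for a new construction procedure.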