Pub Date : 2020-04-01. DOI: 10.4018/ijncr.2020040102
D. Sarddar, Raktima Dey, R. Bose, Sandip Roy
As ubiquitous as it is, the Internet has spawned a slew of products that have forever changed the way one thinks of society and politics. This article proposes a model to predict a political party's chances of winning, based on data collected from Twitter, the most popular microblogging platform in the world. Using unsupervised topic modeling and the NRC Emotion Lexicon, the authors demonstrate how results can be predicted by analyzing eight types of emotions expressed by users on Twitter. To validate the model empirically, the authors examine Twitter messages posted during the 2017 election to the 14th Gujarat Legislative Assembly. Implementing two unsupervised clustering methods, K-means and Latent Dirichlet Allocation, this research shows how the proposed model examines and summarizes observations based on the underlying semantic structures of messages posted on Twitter. These two well-known unsupervised clustering methods give the proposed model a firm base for streamlining decision-making processes objectively.
Title: Topic Modeling as a Tool to Gauge Political Sentiments from Twitter Feeds (Int. J. Nat. Comput. Res.)
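The lexicon-based emotion scoring step described in the abstract can be sketched as follows. The word-to-emotion mapping here is a tiny illustrative stand-in for the real NRC Emotion Lexicon (which maps thousands of English words to eight basic emotions), not the authors' actual data:

```python
from collections import Counter

# Toy stand-in for an NRC-style emotion lexicon (illustrative only).
EMOTION_LEXICON = {
    "win": ["joy", "anticipation"],
    "victory": ["joy", "trust"],
    "scam": ["anger", "disgust", "fear"],
    "hope": ["anticipation", "joy", "trust"],
    "loss": ["sadness"],
    "threat": ["fear", "anger"],
}

def emotion_profile(tweets):
    """Aggregate counts of the eight NRC emotions over a list of tweets."""
    counts = Counter()
    for tweet in tweets:
        for word in tweet.lower().split():
            counts.update(EMOTION_LEXICON.get(word.strip(".,!?"), []))
    return counts

tweets = ["Hope for a big win!", "This policy is a scam", "Victory is near"]
profile = emotion_profile(tweets)
```

In the full pipeline, such per-party emotion profiles would feed the clustering stage rather than being read off directly.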
Pub Date : 2019-10-01. DOI: 10.4018/ijncr.2019100103
B. Sekhar, P V G D Prasad Reddy, S. Venkataramana, V. Chakravarthy, P Satish Rama Chowdary
Image communication now pervades many fields, such as medicine, entertainment, gaming, and mail. Denoising the received image is therefore an immediate need, because noise added in the channel during communication alters or degrades the information the image contains. Image denoising techniques operate in either the spatial domain or the transform domain. Much research has been carried out in the spatial domain by modifying the performance of different image filters, such as mean, median, and Laplacian filters. More recently, research has focused on transform-domain techniques combined with evolutionary computing tools (ECTs). ECTs have proven dominant over traditional denoising techniques when combined with wavelets in the transform domain. In this article, the authors apply a novel ECT, the social grouping optimization algorithm (SGOA), to denoising both monochrome and color images, and evaluate denoising performance using several image quality metrics.
Title: Image Denoising Using Novel Social Grouping Optimization Algorithm with Transform Domain Technique (Int. J. Nat. Comput. Res.)
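The evaluation side of such a study is easy to make concrete. A minimal sketch of one standard image quality metric, peak signal-to-noise ratio (PSNR), assuming images given as flat lists of pixel intensities; the abstract does not say which specific metrics the authors used:

```python
import math

def psnr(original, denoised, max_val=255.0):
    """Peak signal-to-noise ratio (dB) between two equal-sized images,
    given here as flat lists of pixel intensities in [0, max_val]."""
    mse = sum((o - d) ** 2 for o, d in zip(original, denoised)) / len(original)
    if mse == 0:
        return float("inf")  # identical images: no distortion at all
    return 10.0 * math.log10(max_val ** 2 / mse)

# Example: a denoised patch that differs from the original by 1 at each pixel.
score = psnr([100, 110, 120, 130], [101, 109, 121, 129])
```

Higher PSNR means the denoised image is closer to the original; around 48 dB, as here, corresponds to near-imperceptible error.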
Pub Date : 2019-10-01. DOI: 10.4018/ijncr.2019100104
Marcelo Arbori Nogueira, P. D. Oliveira
Cellular automata present great variability in their temporal evolutions due to the number of rules and initial configurations. The ability to automatically classify their dynamic behavior would be of great value when studying properties of their dynamics. Focusing on elementary cellular automata, and treating their temporal evolutions as binary images, the authors created a texture descriptor of the images, based on the neighborhood configurations of the cells in the temporal evolutions, so that it could be associated with each dynamic behavior class, following Wolfram's classic classification scheme. It was then possible to predict the rule class of a temporal evolution of an elementary rule more effectively, in terms of precision and computational cost, than other methods in the literature. When the classifier was applied to the larger neighborhood space of 4 cells, accuracy decreased to just over 70%. However, the classifier can still provide some information about the dynamics of an unknown larger space at reduced computational cost.
Title: Automatic Texture Based Classification of the Dynamics of One-Dimensional Binary Cellular Automata (Int. J. Nat. Comput. Res.)
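The descriptor idea can be sketched directly: evolve an elementary rule and histogram the three-cell neighborhood configurations over the whole evolution. Details such as the normalization and boundary handling below are assumptions, not the authors' exact design:

```python
from collections import Counter

def evolve(rule, row, steps):
    """Temporal evolution of an elementary CA (Wolfram rule number,
    periodic boundary); rows are tuples of 0/1 cells."""
    table = [(rule >> i) & 1 for i in range(8)]
    rows = [tuple(row)]
    for _ in range(steps):
        n = len(row)
        row = tuple(
            table[(row[(i - 1) % n] << 2) | (row[i] << 1) | row[(i + 1) % n]]
            for i in range(n)
        )
        rows.append(row)
    return rows

def texture_descriptor(rows):
    """Normalized histogram of the 8 three-cell neighborhood configurations,
    pooled over the whole temporal evolution (the image 'texture')."""
    counts = Counter()
    for row in rows:
        n = len(row)
        for i in range(n):
            counts[(row[(i - 1) % n] << 2) | (row[i] << 1) | row[(i + 1) % n]] += 1
    total = sum(counts.values())
    return [counts.get(k, 0) / total for k in range(8)]

start = tuple(1 if i == 10 else 0 for i in range(21))
desc = texture_descriptor(evolve(30, start, 20))  # rule 30, single-seed start
```

Each rule's evolutions yield a characteristic 8-bin profile, which a classifier can then map to a Wolfram class.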
Pub Date : 2019-10-01. DOI: 10.4018/ijncr.2019100102
Subhadip Mukherjee, Biswapati Jana
Data hiding techniques are very significant in information security research. In this article, the authors propose a new reversible data hiding (RDH) scheme using difference expansion. First, the original image is partitioned into 3 × 3 pixel blocks, and pixels are marked Type-one or Type-two based on their coordinate values. The authors then find correlated pixels by computing correlation coefficients and the median of the Type-one pixels. Next, secret data bits are embedded within the Type-two pixels based on the correlated pixels, and within the Type-one pixels based on the stego Type-two pixels. The data extraction process successfully extracts the secret data and recovers the cover image. The authors evaluated the proposed method experimentally on standard cover images and found significantly better results in terms of data hiding capacity compared with existing data hiding schemes.
Title: A Novel Method for High Capacity Reversible Data Hiding Scheme Using Difference Expansion (Int. J. Nat. Comput. Res.)
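For background, the classic difference-expansion primitive (Tian's pixel-pair embedding) that such schemes build on can be sketched as follows. This is the generic technique, not the authors' Type-one/Type-two block procedure:

```python
def de_embed(a, b, bit):
    """Difference expansion: hide one bit in a pixel pair (a, b).
    The integer mean is preserved; the difference is doubled to carry the bit."""
    l = (a + b) // 2
    h = a - b
    h2 = 2 * h + bit
    return l + (h2 + 1) // 2, l - h2 // 2

def de_extract(a2, b2):
    """Recover the hidden bit and the original pixel pair exactly."""
    l = (a2 + b2) // 2
    h2 = a2 - b2
    bit = h2 & 1
    h = h2 // 2  # floor division also handles negative differences
    return bit, l + (h + 1) // 2, l - h // 2

stego = de_embed(206, 201, 1)
recovered = de_extract(*stego)
```

Reversibility is the key property: extraction returns both the payload bit and the untouched cover pixels (in practice, pairs whose expanded difference would overflow the pixel range must be excluded via a location map).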
Pub Date : 2019-10-01. DOI: 10.4018/ijncr.2019100101
Satyabrata Das, Niva Tripathy
The major difference between underwater sensor networks (UWSNs) and terrestrial sensor networks is the use of acoustic signals, rather than radio signals, as the communication medium; the main reason is the poor performance of radio signals in water. UWSNs have some distinct characteristics that make them attractive research subjects: large propagation delay, high error rate, low bandwidth, and limited energy. UWSNs find application in oceanographic data collection, pollution monitoring, off-shore exploration, disaster prevention, assisted navigation, tactical surveillance, and more. In UWSNs, the main goal of protocol design is reliable and effective data transmission from source to destination. Among the design concerns, energy efficiency plays an especially important role in underwater communication, because the main energy sources of UWSNs are batteries, which are very difficult to replace frequently. Two popular underwater routing protocols are DBR and EEDBR; DBR is a well-known routing technique that does not use full-dimensional location information. In this article, the authors use an efficient area localization scheme for UWSNs to minimize the energy hole created. Rather than finding the exact sensor position, this technique estimates the position of every sensor node within a certain area. In addition, the authors introduce an RF-based location finding and multilevel power transmission scheme. Simulation results show that the proposed scheme produces better results than its counterparts.
Title: Minimization of Energy Hole in Under Water Sensor Networks (UWSNs) (Int. J. Nat. Comput. Res.)
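For context, the core forwarding rule of depth-based routing (DBR), which the article takes as its baseline, can be sketched as below. Node identifiers and the threshold value are illustrative:

```python
def eligible_forwarders(sender_depth, neighbors, depth_threshold=0.0):
    """DBR-style greedy rule: a packet is forwarded only by neighbors that are
    shallower than the sender by at least depth_threshold (meters).
    `neighbors` maps node id -> depth; shallower means a smaller depth value.
    Returns eligible forwarders, shallowest (closest to surface) first."""
    return sorted(
        (nid for nid, d in neighbors.items() if sender_depth - d >= depth_threshold),
        key=lambda nid: neighbors[nid],
    )

# A node at 80 m depth with three acoustic neighbors:
hops = eligible_forwarders(80.0, {"n1": 95.0, "n2": 60.0, "n3": 75.0},
                           depth_threshold=2.0)
```

Because shallow nodes relay traffic for everything below them, they drain first; the energy-hole mitigation and multilevel power transmission in the paper address exactly this imbalance.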
Pub Date : 2019-07-01. DOI: 10.4018/IJNCR.2019070101
S. Vasavi, G. V.N.Priyanka, A. Gokhale
As we move toward digitization, our devices produce a variety of data; this has paved the way for the emergence of NoSQL databases such as Cassandra, MongoDB, and Redis. Big data such as geospatial data enables geospatial analytics in applications such as tourism, marketing, and rural development. Spark frameworks provide operators for the storage and processing of distributed data. This article proposes "GeoRediSpark" to integrate Redis with Spark. Redis is a key-value store with an in-memory engine, so integrating Redis with Spark can extend the real-time processing of geospatial data. The article investigates storage and retrieval with Redis's built-in geospatial queries and adds two new geospatial operators, GeoWithin and GeoIntersect, to enhance Redis's capabilities. Hashed indexing is used to improve processing performance. A comparison of Redis metrics on three benchmark datasets is made. A hashset is used to display geographic data. The output of geospatial queries is visualized by place type and query type using Tableau.
Title: Framework for Visualization of GeoSpatial Query Processing by Integrating Redis With Spark (Int. J. Nat. Comput. Res.)
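The semantics of a GeoWithin-style operator can be sketched with a haversine point-in-radius filter. This is a generic illustration of the operation, not the paper's Redis/Spark implementation:

```python
import math

def haversine_m(lon1, lat1, lon2, lat2):
    """Great-circle distance in meters between two (lon, lat) points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def geo_within(points, center, radius_m):
    """Names of points lying within radius_m meters of center (lon, lat)."""
    return [name for name, (lon, lat) in points.items()
            if haversine_m(lon, lat, *center) <= radius_m]

# 0.001 degrees of latitude is roughly 111 m, so only "A" falls inside 200 m.
inside = geo_within({"A": (0.0, 0.001), "B": (1.0, 1.0)}, (0.0, 0.0), 200.0)
```

In Redis itself the equivalent radius query is served by the built-in geospatial index rather than a linear scan, which is what makes the integration worthwhile at scale.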
Pub Date : 2019-07-01. DOI: 10.4018/IJNCR.2019070103
Sushitha Susan Joseph, D. Aju
Three-dimensional reconstruction is the process of acquiring volumetric information from two dimensions and converting and representing it in three dimensions. The reconstructed images play a vital role in disease diagnosis, treatment, and surgery. Brain surgery is one of the main treatment options following a diagnosis of brain damage, and the risk associated with it is high. Reconstructed brain images help surgeons visualize the exact location of a tumor and plan and perform surgical procedures, from craniotomy to tumor resection, with high precision. This survey provides an overview of three-dimensional reconstruction techniques for the brain and brain tumors in MRI. The triangle generation methods and support vector machine methods are briefly described, and the advantages and disadvantages of each method are discussed. The comparison reveals that the Immune Sphere-Shaped Support Vector Machine is the best choice when execution time is considered, while the triangle mesh generation algorithm is best when visual quality is considered.
Title: A Comparative Objective Assessment on Mesh-Based and SVM-Based 3D Reconstruction of MRI Brain (Int. J. Nat. Comput. Res.)
Pub Date : 2019-07-01. DOI: 10.4018/IJNCR.2019070104
Tejal Upadhyay, Samir B. Patel
This article studies genomic structures and identifies cancer types from them. It is divided into six parts. The first part introduces cancer: types of cancers, how cancer arises, and so on. The second part covers the genomic study, how cancer relates to it, and which features are used for the study. The third part describes the software the authors used to study these genomic structures, which data sets are used, and the final output of the study. The fourth part presents the proposed algorithm. The fifth part covers data preprocessing and clustering, using several preprocessing and clustering algorithms. The sixth part presents the results and conclusion, with future scope. The genomic data used in this article is taken from the freely available Cancer Genome Atlas data portal. Imputation techniques fill in the missing values, and important features are extracted. Different clustering algorithms are applied to the genome dataset and results are generated.
Title: Identifying Subtypes of Cancer Using Genomic Data by Applying Data Mining Techniques (Int. J. Nat. Comput. Res.)
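The imputation step in such a preprocessing pipeline can be sketched with column-mean imputation, one common choice; the abstract does not specify which imputation technique the authors actually applied:

```python
def impute_mean(rows):
    """Column-wise mean imputation: replace None entries in a numeric
    matrix (list of equal-length rows) with the mean of the observed
    values in the same column."""
    cols = list(zip(*rows))
    means = []
    for col in cols:
        observed = [v for v in col if v is not None]
        means.append(sum(observed) / len(observed) if observed else 0.0)
    return [[means[j] if v is None else v for j, v in enumerate(row)]
            for row in rows]

# Two gene-expression features across three samples, with two gaps:
filled = impute_mean([[1.0, None], [3.0, 4.0], [None, 8.0]])
```

After imputation, the complete matrix can be fed to feature extraction and then to the clustering algorithms the article compares.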
Pub Date : 2019-07-01. DOI: 10.4018/IJNCR.2019070102
S. Pattanayak, B. B. Choudhury, S. C. Sahoo, S. Agarwal
Technological advancement in daily life demands upgrades to existing soft computing approaches to enhance accuracy. Accordingly, this article upgrades the existing particle swarm optimization (PSO) into a new approach, adaptive particle swarm optimization (APSO). Designing an effective track that is shorter in length; takes less travel time and computation time; and is smooth, feasible, and free of collision risk with obstacles is always a crucial issue. To address these issues, the APSO approach has been adopted in this work. A static environment containing fifteen obstacles has been implemented for the simulation study. A comparative study between PSO and APSO identifies the fitter approach for track design (smaller track size and travel time) with the shortest computation time. The APSO approach is identified as the best-suited track designing tool for mobile robots.
Title: An Effective Track Designing Approach for a Mobile Robot (Int. J. Nat. Comput. Res.)
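One common way to make PSO "adaptive" is a decaying inertia weight. The sketch below uses that variant on a toy sphere-minimization problem; it illustrates the mechanism only and is not the authors' exact APSO or their path-planning objective:

```python
import random

def apso_minimize(f, dim, n_particles=30, iters=200, seed=1):
    """PSO with a linearly decaying inertia weight (a common 'adaptive'
    variant): w falls from 0.9 to 0.4 over the run; velocities are clamped."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=f)[:]
    for t in range(iters):
        w = 0.9 - 0.5 * t / (iters - 1)  # adaptive (decaying) inertia weight
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pos[i][d]))
                vel[i][d] = max(-4.0, min(4.0, vel[i][d]))  # velocity clamp
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest

def sphere(x):
    return sum(v * v for v in x)

best = apso_minimize(sphere, dim=2)
```

In path planning, `f` would instead score a candidate track by length, travel time, smoothness, and obstacle-collision penalties, with early (high-w) iterations exploring and late (low-w) iterations refining.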
Pub Date : 2019-04-01. DOI: 10.4018/IJNCR.2019040103
M. S. Das, A. Govardhan, D. Vijayalakshmi
With the growth of internet-based applications and the explosion of consumers, cloud-based web service applications have become more common, and minimizing cost, increasing interactivity, and managing and using resources efficiently have become important. Existing methods such as a fixed cost per month no longer match the application maintenance costs of modern app developers. In this article, the authors propose an enhanced model for improving efficiency, maximizing availability, and minimizing the cost of cloud-based web applications. The authors conducted experiments on a grid dataset and analyzed the results using several algorithms on the load balancer with the multilevel optimized shortest-remaining-time scheduling method. The analysis shows that applying a "pay as you go" mechanism substantially reduces cost and improves the efficiency with which resources are utilized. The results clearly suggest improvements in cost minimization and effective utilization of resources, leading to effective utilization of services.
Title: Cost Minimization Through Load Balancing and Effective Resource Utilization in Cloud-Based Web Services (Int. J. Nat. Comput. Res.)
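The shortest-remaining-time policy underlying the scheduling method can be sketched for a single machine as follows; the job data is illustrative, and the paper's multilevel, load-balanced variant adds layers on top of this core rule:

```python
import heapq

def srt_schedule(jobs):
    """Preemptive shortest-remaining-time scheduling on one machine.
    jobs: list of (name, arrival, burst). Returns {name: completion_time}."""
    events = sorted(jobs, key=lambda j: j[1])   # by arrival time
    heap, done, t, i = [], {}, 0, 0
    while i < len(events) or heap:
        if not heap and t < events[i][1]:
            t = events[i][1]                    # idle until the next arrival
        while i < len(events) and events[i][1] <= t:
            name, arr, burst = events[i]
            heapq.heappush(heap, (burst, arr, name))
            i += 1
        remaining, arr, name = heapq.heappop(heap)
        # Run the shortest job until it finishes or the next job arrives,
        # whichever comes first (arrival may preempt it).
        next_arr = events[i][1] if i < len(events) else float("inf")
        run = min(remaining, next_arr - t)
        t += run
        if run == remaining:
            done[name] = t
        else:
            heapq.heappush(heap, (remaining - run, arr, name))
    return done

completion = srt_schedule([("A", 0, 7), ("B", 2, 4), ("C", 4, 1)])
```

Here job A is preempted twice: B arrives with a shorter burst, then C jumps ahead of both, which is exactly the behavior that keeps average turnaround low.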