Pub Date: 2021-09-23 | DOI: 10.1109/CCGE50943.2021.9776371
Title: Lungs Diseases Prediction based on Convolutional Neural Network
Sanika Shirsat, S. Kedar
Lung diseases, including lung cancer, pneumonia and, since 2020, COVID-19, are among the most common diseases in human beings. It is essential that lung diseases be diagnosed in a timely manner. Many machine learning and image processing models have been developed to serve this purpose; existing algorithms include the vanilla neural network, the capsule network, and VGG. Here, a Convolutional Neural Network (CNN) algorithm is used for lung disease prediction based on chest X-ray images. The tools used for implementation are Spyder, Keras, and TensorFlow. A Kaggle repository dataset is used for the proposed model. The model yields a mean accuracy of 93% and predicts whether the disease is lung cancer, pneumonia, COVID-19, or none.
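The abstract does not specify the network architecture, so the following is only a minimal Keras/TensorFlow sketch of a small CNN that classifies chest X-ray images into the four classes mentioned (lung cancer, pneumonia, COVID-19, none); the layer sizes, input resolution, and the `chest_xrays/` directory are illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch: a small Keras CNN for 4-class chest X-ray classification.
# Architecture, input size, and data layout are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 4  # lung cancer, pneumonia, COVID-19, none

def build_cnn(input_shape=(224, 224, 1)):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu"),
        layers.GlobalAveragePooling2D(),
        layers.Dense(64, activation="relu"),
        layers.Dropout(0.3),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    model = build_cnn()
    model.summary()
    # train_ds = tf.keras.utils.image_dataset_from_directory(
    #     "chest_xrays/", image_size=(224, 224), color_mode="grayscale")
    # model.fit(train_ds, epochs=10)
```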
{"title":"Lungs Diseases Prediction based on Convolutional Neural Network","authors":"Sanika Shirsat, S. Kedar","doi":"10.1109/CCGE50943.2021.9776371","DOIUrl":"https://doi.org/10.1109/CCGE50943.2021.9776371","url":null,"abstract":"The most commonly found diseases in humanbeing is Lung diseases, which include Lung Cancer, Pneumonia and from 2020 Covid. It is essential that the lung diseases to be diagnosed timely. There are many machine learning and image processing models that have being developed to serve this purpose. The already existing algorithms serving this purpose are vanilla neural network, capsule network, and VGG. Here, Convolutional Neural Network i.e., CNN algorithm is used for lung diseases prediction based on images of Chest X-Ray. The tools used for implementation areSpyder, Keras and TensorFlow. The Kaggle repository dataset is used for the proposed model. The model yields 93% of mean accuracy. It will predict if the diseases arelung cancer, Pneumonia, covid or non.","PeriodicalId":130452,"journal":{"name":"2021 International Conference on Computing, Communication and Green Engineering (CCGE)","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126832917","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2021-09-23 | DOI: 10.1109/CCGE50943.2021.9776382
Title: Fast Implementation of Digital Signatures Using Parallel Techniques
N. Kishore, Priya Raina, N. Nayar, Mukesh Thakur
Digital signatures are widely used to check the authenticity of the identity of the signatory of a message/document and the integrity of the message sent. They are also used by the receiver to ensure non-repudiation by the sender, and they play an important role in making day-to-day processes electronic and paperless. Digital signatures are based on public key infrastructure (PKI). The message digest (hash) of the file is signed by the sender using a private key and appended to the file. The recipient extracts the signature, decrypts it with the sender's public key, and verifies that the received digest matches its own hash calculation. However, the complex calculations required for secure signatures mean that digital signatures are time-consuming for large files. Hashing is the basic security mechanism in digital signatures; it is performed by all parties and consumes most of the time. This paper presents a solution to this problem by using parallel hashing to achieve fast digital signatures, discussing two possible approaches. The first uses only parallel hashing, keeping the rest of the algorithm the same as the reference algorithm based on RSA. The second parallelizes the entire reference algorithm. Both were implemented using the OpenMP framework, and the experimental results show a significant decline in execution time in both cases.
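The paper's implementation uses OpenMP with an RSA-based reference algorithm, which is not reproduced here; as a rough illustration of the parallel-hashing idea only, the Python sketch below digests fixed-size chunks of a file concurrently and then hashes the concatenated chunk digests (a simple two-level hash tree). The chunk size and the use of `hashlib`/`ThreadPoolExecutor` are assumptions for the sketch, not the authors' code.

```python
# Illustrative sketch (not the paper's OpenMP/RSA code): hash a large file in
# parallel by digesting fixed-size chunks concurrently, then hashing the
# concatenation of the chunk digests (a simple two-level hash tree).
import hashlib
from concurrent.futures import ThreadPoolExecutor

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MiB per chunk (assumed value)

def _hash_chunk(chunk: bytes) -> bytes:
    return hashlib.sha256(chunk).digest()

def parallel_digest(path: str, workers: int = 4) -> bytes:
    with open(path, "rb") as f:
        chunks = iter(lambda: f.read(CHUNK_SIZE), b"")
        # CPython's hashlib releases the GIL for large buffers, so the threads
        # genuinely overlap the hashing work. Note that Executor.map collects
        # the chunks eagerly, which is acceptable for a simple sketch.
        with ThreadPoolExecutor(max_workers=workers) as pool:
            digests = list(pool.map(_hash_chunk, chunks))
    return hashlib.sha256(b"".join(digests)).digest()

# The resulting digest would then be signed with the sender's private key
# (e.g., RSA), exactly as in the sequential scheme.
```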
{"title":"Fast Implementation of Digital Signatures Using Parallel Techniques","authors":"N. Kishore, Priya Raina, N. Nayar, Mukesh Thakur","doi":"10.1109/CCGE50943.2021.9776382","DOIUrl":"https://doi.org/10.1109/CCGE50943.2021.9776382","url":null,"abstract":"Digital signatures are widely used to check the authenticity of the identity of the signatory of the message/document and the integrity of the message sent. They are also used by the receiver for ensuring non-repudiation by the sender. They play an important role in making day-to-day processes electronic and paperless. Digital signatures are based on public key infrastructure (PKI). The message digest (hash) of the file is signed by the sender using a private key and appended to the file. The recipient extracts the signature, decrypting it with the sender's public key, and verifies if the received digest matches its own hash calculations. However, complex calculations for secure signatures imply that digital signatures are time consuming for large files. Hashing is the basic security mechanism used in digital signatures that is performed by all the parties and consumes most of the time. This paper presents a solution to this problem by using parallel hashing to achieve fast digital signatures, discussing two possible approaches. The first one uses only parallel hashing, keeping the rest of the algorithm the same as the reference algorithm based on RSA. The second approach parallelizes the entire reference algorithm. Both were implemented using the OpenMP framework, and the experimental results show a significant decline in the execution time in both the cases.","PeriodicalId":130452,"journal":{"name":"2021 International Conference on Computing, Communication and Green Engineering (CCGE)","volume":"61 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126654742","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2021-09-23 | DOI: 10.1109/CCGE50943.2021.9776354
Title: Efficient Attribute Based Encryption Outsourcing in Cloud Storage with User Revocation
Shweta Sharad Chavan, J. Jayaseeli
Through a function called cloud storage, an organization lets participants outsource their confidential data to a third party and use on-demand services and applications for that data on the organization's cloud storage server. To prevent security breaches of confidential details, businesses have to encrypt their data before transferring it to the cloud framework. Attribute-Based Encryption (ABE) is a cryptosystem used in cloud systems that lets users, software, and programs access managed digital data according to their attributes. Unfortunately, ABE suffers from a performance drawback: decryption is expensive, so it is commonly outsourced. Many suggestions have been put forward for boosting the performance of this step, and this work follows the same line of investigation. A load-balancing strategy is used to minimize the expense of outsourcing the decryption phase to a third-party decryption service provider; load balancing may consider features such as file space, memory, hard disk usage, etc. For group key revocation, we also discuss the problem of a key holder quitting the group: in that case, the group key should be updated and circulated to all remaining key holders. The experimental findings show that the time and memory consumption of the proposed system are comparable to, if not better than, those of the current system.
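The abstract only names the load-balancing criteria (file space, memory, disk usage), so the following is a hypothetical Python sketch of how a third-party decryption server might be chosen by a weighted least-load score; the metric names, weights, and the `ServerLoad` structure are invented for illustration and are not from the paper.

```python
# Hypothetical illustration of the load-balancing idea mentioned in the
# abstract: pick the outsourced-decryption server with the lowest weighted
# load across CPU, memory, and disk usage. Metrics and weights are assumptions.
from dataclasses import dataclass

@dataclass
class ServerLoad:
    name: str
    cpu_pct: float   # 0-100
    mem_pct: float   # 0-100
    disk_pct: float  # 0-100

def load_score(s: ServerLoad, w_cpu=0.5, w_mem=0.3, w_disk=0.2) -> float:
    """Lower score means the server has more spare capacity."""
    return w_cpu * s.cpu_pct + w_mem * s.mem_pct + w_disk * s.disk_pct

def pick_decryption_server(servers: list[ServerLoad]) -> ServerLoad:
    return min(servers, key=load_score)

if __name__ == "__main__":
    servers = [ServerLoad("dec-1", 70, 40, 55),
               ServerLoad("dec-2", 30, 60, 20),
               ServerLoad("dec-3", 50, 50, 50)]
    print(pick_decryption_server(servers).name)  # prints "dec-2"
```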
{"title":"Efficient Attribute Based Encryption Outsourcing in Cloud Storage with User Revocation","authors":"Shweta Sharad Chavan, J. Jayaseeli","doi":"10.1109/CCGE50943.2021.9776354","DOIUrl":"https://doi.org/10.1109/CCGE50943.2021.9776354","url":null,"abstract":"By providing a function called Cloud Storage, the company let participants outsource their confidential data to a third party and use the on-demand services and applications of data on the organization's cloud storage server. With this, researchers would be able to encrypt details, which would will be crucial in preventing security breaches of confidential details. In this exchange conference, businesses have to encrypt their data before they process it to the Cloud framework. Attribute Based Encryption (ABE) system is a symmetric key dependent cryptosystem used in cloud system that lets device users, software, data and programmers access managed digital data. Unfortunately, BAE suffers from a performance downside with outsourcing the operation of decrypting the secret. There has been a lot of suggestions put forward as to how to boost the performance of the method. It would be to the same investigation that was stated in the research study. We take the case of the Robust Paraphrasing and conclude there is a new implementation of the Electronic Referencing tool, even though it depends on the Actuals. The load testing strategy is used to minimize the expense of outsourcing the decryption phase to a third-party data decryption service provider. Load balancing may occur by considering features such as file space, memory, hard drive disc usage, etc. For the intent of revocation of key for a community, we often discuss the problem of the key consumer quitting the group. Therefore, in the case of the key user leaving the group, the latest key to open a group should be modified and circulated to all current key holders. The experimental findings of this proposed method proves that the time and memory consumptions of this proposed system were comparable to, if not higher than, the current system of time consumptions and memory usage.","PeriodicalId":130452,"journal":{"name":"2021 International Conference on Computing, Communication and Green Engineering (CCGE)","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125101895","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2021-09-23 | DOI: 10.1109/CCGE50943.2021.9776385
Title: Methodological Review of Emotion Recognition for Social Media: A Sentiment Analysis Approach
Madhavi S. Darokar, A. D. Raut, V. Thakre
Emotion recognition and its analysis have become very popular topics nowadays, as most of the world uses social media through applications such as Twitter, Facebook, WhatsApp, Instagram and many more. There is also a large number of users who buy everyday products through online shopping websites such as Amazon and Flipkart, where the online behaviour and emotions of the consumer buying the product are of great interest to the e-commerce industry. In line with developments in the field of artificial intelligence, various algorithms have been programmed to analyse user behaviour and capture users' emotions through various tools, in order to analyse market trends and increase profit. Furthermore, a prolific rate of development is observed in the AI field, most visibly in the form of deep learning, where very large amounts of data are available and the decision-making process is crucial. When such tremendous amounts of data are accessible, machine learning algorithms are of utmost importance.
{"title":"Methodological Review of Emotion Recognition for Social Media: A Sentiment Analysis Approach","authors":"Madhavi S. Darokar, A. D. Raut, V. Thakre","doi":"10.1109/CCGE50943.2021.9776385","DOIUrl":"https://doi.org/10.1109/CCGE50943.2021.9776385","url":null,"abstract":"Emotion recognition and their analysis have become a very popular topic nowadays, as most of the world using the social media in the form of various applications such as Twitter, Facebook, Whatsapp, Instagram and many more. Also, there are quite a large number of users, who buy the different daily life products through the online shopping websites like Amazon, Flipkart where the online behaviors and emotions of the consumer buying the product is of great interest to the e-commerce industry. In accordance to, the development in the artificial intelligence field, there exist various algorithms that are programmed to analyze the user behavior and trap their emotions through various tools for analyzing the market trends and to increase the percentage of profit. Furthermore, a prolific rate of development is observed in the AI field. This now can be noticed presently, in the form of ‘Deep learning’ where a very huge amount of data is available and the decision-making process is very crucial. If the tremendous amount of data is accessible, “Machine Learning” algorithms are of utmost importance.","PeriodicalId":130452,"journal":{"name":"2021 International Conference on Computing, Communication and Green Engineering (CCGE)","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125301594","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2021-09-23 | DOI: 10.1109/CCGE50943.2021.9776373
Title: Ensemble Learning for Detection of Types of Melanoma
Rashmi Patil, Sreepathi Bellary
Melanoma is a potentially fatal type of skin cancer in which melanocytes develop uncontrollably. Malignant melanoma is another name for melanoma. Melanoma rates in Australia and New Zealand are the highest in the world: melanoma is anticipated to strike one in every 15 white New Zealanders at some point in their lives, and invasive melanoma was the third most prevalent malignancy in both men and women in 2012. Melanoma can strike adults of any age, but it is extremely uncommon in children. Melanoma is hypothesised to start as an uncontrolled proliferation of genetically transformed melanocytic stem cells. Early diagnosis of melanoma in dermoscopy images boosts the survival percentage substantially. Melanoma detection, on the other hand, is extremely difficult; as a result, automatic identification of skin cancer is extremely beneficial to pathologists' accuracy. This paper offers an ensemble deep learning strategy for accurately classifying the type of melanoma at an early stage. The proposed model distinguishes between lentigo maligna, superficial spreading and nodular melanoma, allowing the disease to be detected early and treated promptly to prevent it from spreading further. The deep layered architecture of the convolutional neural network (CNN) and the shallow structure of the pixel-based multilayer perceptron (MLP) represent, respectively, the deep learning (DL) technique and a classical non-parametric machine learning method. These two methods, which have diverse behaviours, were combined through a rule-based decision fusion methodology into a simple and effective classifier for fine-grained melanoma type detection. The efficiency of the ensemble MLP-CNN classifier was examined on a dataset retrieved from https://dermnetnz.org/. Compared to state-of-the-art approaches, experimental outcomes reveal that the proposed technique is superior in terms of diagnostic accuracy.
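The abstract does not state the exact fusion rule, so the sketch below shows one plausible rule-based decision fusion of CNN and MLP softmax outputs (trust the more confident model, otherwise average); the class list, confidence threshold, and example probabilities are assumptions, not taken from the paper.

```python
# Hypothetical sketch of rule-based decision fusion between a CNN and an MLP.
# The paper's actual fusion rule is not given in the abstract; this illustrates
# one common rule: trust the more confident model, fall back to averaging.
import numpy as np

CLASSES = ["lentigo maligna", "superficial spreading", "nodular"]

def fuse(cnn_probs: np.ndarray, mlp_probs: np.ndarray,
         conf_threshold: float = 0.6) -> str:
    cnn_conf, mlp_conf = cnn_probs.max(), mlp_probs.max()
    if cnn_conf >= conf_threshold or mlp_conf >= conf_threshold:
        # Rule 1: take the prediction of whichever model is more confident.
        winner = cnn_probs if cnn_conf >= mlp_conf else mlp_probs
    else:
        # Rule 2: neither model is confident, so average the probabilities.
        winner = (cnn_probs + mlp_probs) / 2.0
    return CLASSES[int(winner.argmax())]

if __name__ == "__main__":
    cnn_out = np.array([0.10, 0.75, 0.15])   # CNN softmax output (example)
    mlp_out = np.array([0.40, 0.35, 0.25])   # MLP softmax output (example)
    print(fuse(cnn_out, mlp_out))            # "superficial spreading"
```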
{"title":"Ensemble Learning for Detection of Types of Melanoma","authors":"Rashmi Patil, Sreepathi Bellary","doi":"10.1109/CCGE50943.2021.9776373","DOIUrl":"https://doi.org/10.1109/CCGE50943.2021.9776373","url":null,"abstract":"Melanoma is a potentially fatal type of skin cancer in these melanocytes develop uncontrollably. Malignant melanoma is another name for melanoma. Melanoma rates in Australia and New Zealand are the highest in the world. Melanoma is anticipated to strike one in every 15 white New Zealanders at some point in their lives. Invasive melanoma was the third most prevalent malignancy in both men and women in 2012. Melanoma can strike adults of any age, but it is extremely uncommon in youngsters. Melanoma is hypothesised to start as an uncontrolled proliferation of genetically transformed melanocytic stem cells. Early diagnosis of melanoma in Dermoscopy pictures boosts the survival percentage substantially. Melanoma detection, on the other hand, is extremely difficult. As a result, automatic identification of skin cancer is extremely beneficial to pathologists' accuracy. This paper offers an ensemble deep learning strategy for accurately classifying the kind of melanoma at an early stage. The proposed model distinguishes between lentigo maligna, superficial spreading and nodular melanoma, allowing for early detection of the virus and prompt isolation and treatment to prevent the disease from spreading further. The deep layer architectures of the convolutional neural network (CNN) and the shallow structure of the pixel-based multilayer perceptron (MLP) are neural network algorithms that represent deep learning (DL) technique and the classical non-parametric machine learning method. Two methods that have diverse behaviours, were combined in a simple and successful means for the classification of very fine melanoma type detection utilising a rule-based decision fusion methodology. On dataset retrieved from https://dermnetnz.org/, the efficiency of ensemble MLP-CNN classifier was examined. In compared to state-of-the-art approaches, experimental outcomes reveal that the proposed technique is worthier in terms of diagnostic accuracy","PeriodicalId":130452,"journal":{"name":"2021 International Conference on Computing, Communication and Green Engineering (CCGE)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126998602","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2021-09-23 | DOI: 10.1109/CCGE50943.2021.9776417
Title: Embankment Protection - React Native Application Cross-Platform Application for protection of embankments by crowd sourced data
AnshulVarshav Borawake, Minal Shahakar
Developing a mobile application compatible with both Android and iOS calls for a cross-platform development approach, since developers previously faced the challenge of learning a separate development language for each of Android and iOS. Compared to other hybrid mobile application frameworks, React Native has a faster development time, wide market reach and easy third-party integration; it is also time- and cost-efficient owing to its single-codebase nature. Getting to the bottom of the underlying problem, this paper utilizes the React Native framework to create an efficient hybrid mobile application, the “Embankment Protection App”, capable of provisioning crowd-sourced solutions pertaining to embankment surveys. The application has been created for Android and iOS, and the produced results reflect an adequate experience for users on both platforms. The framework develops truly native apps and does not compromise much on user experience regardless of the platform. The programming language used for the solution in this paper is JavaScript.
{"title":"Embankment Protection - React Native Application Cross-Platform Application for protection of embankments by crowd sourced data","authors":"AnshulVarshav Borawake, Minal Shahakar","doi":"10.1109/CCGE50943.2021.9776417","DOIUrl":"https://doi.org/10.1109/CCGE50943.2021.9776417","url":null,"abstract":"Developing mobile application compatible for both Android and iOS, hence a cross-platform development approach, as developers faced a challenge previously learning development specific language for Android and iOS. Compared to other hybrid mobile application frameworks, React Native has faster development time, wide market search and easy third party integration. It is also time and cost efficient for single codebase nature. Getting to the bottom of the solution for the underlying problem, this paper utilizes React Native framework to create an efficient hybrid mobile application “Embankment Protection App” capable of provisioning crowd sourced solutions pertaining to embankment surveys. The framework has been created for Android and iOS, the produced results reflects adequate experience for users on both the platforms. The framework develops truly native apps and does not compromise much with user experiences regardless of the platform. The programming language used for the solution of this research paper is a combination of Javascript.","PeriodicalId":130452,"journal":{"name":"2021 International Conference on Computing, Communication and Green Engineering (CCGE)","volume":"32 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133108920","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2021-09-23 | DOI: 10.1109/CCGE50943.2021.9776420
Title: Spatial Domain Texture Synthesis for Data Embedding
S. Patil, Rane Charushila Vijay
Secret information is embedded in some cover medium through steganography. Both the contents of the information and its existence must be undetectable to attackers. Steganography is normally done by slightly altering the pixel values of a cover image. We have instead used a texture synthesis process for embedding the data, rather than changing pixel values. Correlation at the joining edges of the patches to be stitched is considered for suitable patch selection. The energy of candidate patches is the parameter used to verify the uniqueness of candidate patches and to identify the patch during the data extraction process. Along with energy, other parameters are experimented with: the mean alone, as well as the mean, variance, kurtosis and skewness combined. The data extraction rate in the presence of different stego attacks is observed.
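The abstract names two quantities, correlation at the patch seams for selection and patch energy for identification, without giving formulas; the NumPy sketch below shows one plausible way to compute them (Pearson correlation over an overlap strip, sum of squared pixel values), which may differ from the paper's exact definitions.

```python
# Hypothetical NumPy illustration of the two quantities named in the abstract:
# (1) correlation along the joining edge between a placed patch and a candidate
#     patch, used to pick a patch that stitches smoothly, and
# (2) patch "energy" (here, the sum of squared pixel values), used to tell
#     candidate patches apart during data extraction.
import numpy as np

def seam_correlation(placed: np.ndarray, candidate: np.ndarray,
                     overlap: int = 8) -> float:
    """Pearson correlation between the right edge of the placed patch
    and the left edge of the candidate patch."""
    left_edge = placed[:, -overlap:].ravel().astype(float)
    right_edge = candidate[:, :overlap].ravel().astype(float)
    return float(np.corrcoef(left_edge, right_edge)[0, 1])

def patch_energy(patch: np.ndarray) -> float:
    return float(np.sum(patch.astype(float) ** 2))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    placed = rng.integers(0, 256, (32, 32))
    candidates = [rng.integers(0, 256, (32, 32)) for _ in range(5)]
    # Choose the candidate whose left edge correlates best with the seam.
    best = max(candidates, key=lambda c: seam_correlation(placed, c))
    print(seam_correlation(placed, best), patch_energy(best))
```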
{"title":"Spatial Domain Texture Synthesis for Data Embedding","authors":"S. Patil, Rane Charushila Vijay","doi":"10.1109/CCGE50943.2021.9776420","DOIUrl":"https://doi.org/10.1109/CCGE50943.2021.9776420","url":null,"abstract":"Secrete information is embedded in some cover medium through steganography. The contents of information as well as existence of information must be undetectable to attackers. Steganography is normally done by slightly altering the pixel values of cover image. We have used texture synthesis process for embedding the data instead of changing pixel values. Correlation at joining edges of patches to be stitched is considered for suitable patch selection. Energy of candidate patches is the parameter used to verify uniqueness of candidate patches and to identify patch in data extraction process. Along with energy, other parameters like mean as well as mean, variance, kurtosis and skewness combined are experimented. The data extraction rate in presence of different stego attacks is observed.","PeriodicalId":130452,"journal":{"name":"2021 International Conference on Computing, Communication and Green Engineering (CCGE)","volume":"51 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129019183","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2021-09-23 | DOI: 10.1109/CCGE50943.2021.9776374
Title: A Machine Learning based Model for Disease Prediction
Monali Gulhane, T. Sajana
People now suffer from a variety of diseases as a result of the environment in which they live and their lifestyle choices, so the goal of predicting disease at an earlier stage becomes increasingly critical. However, making an accurate prediction based on symptoms is too tough for doctors to do, and accurately predicting disease is one of the most difficult tasks. Data mining is critical in overcoming this difficulty because it may be used to forecast the sickness. Every year, a great amount of data is generated in the field of medicine. Due to the extreme increase in the rate at which information is collected in the health and medical industries, it has been possible to conduct precise analyses of medical data, which has resulted in better patient outcomes. With disease data as a starting point, data mining can be used to identify hidden patterns in the huge amount of medical data that currently exists. On the basis of the patient's symptoms, we propose a generic disease prediction model. To implement credible illness predictions, we apply machine learning methods such as convolutional neural networks (CNNs) for disease prediction, for which disease symptom datasets are essential. In this general disease prediction model, the individual's lifestyle behaviour as well as examination data are taken into consideration for reliable disease prediction. It has been demonstrated that the accuracy of the generalized predictive model using the CNN algorithm is 98.7 percent, which is better than that of the present technique. In addition, the time and memory requirements of the existing mechanism are higher than those of the CNN. When a general disease is predicted, the method can also determine the risk associated with a more specific disease, which can be stronger or weaker than that of the general disease.
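The abstract does not describe how a CNN is applied to symptom data, so the following Keras sketch treats the patient's binary symptom vector as a 1D signal and classifies it with a small 1D CNN; the symptom count, class count, and layer sizes are assumptions for illustration, not details from the paper.

```python
# Hypothetical sketch: a small 1D CNN over a binary symptom vector, as an
# illustration of CNN-based disease prediction from symptom data.
from tensorflow.keras import layers, models

NUM_SYMPTOMS = 132   # assumed length of the symptom vector
NUM_DISEASES = 41    # assumed number of disease classes

model = models.Sequential([
    layers.Input(shape=(NUM_SYMPTOMS, 1)),        # each symptom as one "step"
    layers.Conv1D(32, kernel_size=3, activation="relu"),
    layers.MaxPooling1D(2),
    layers.Conv1D(64, kernel_size=3, activation="relu"),
    layers.GlobalAveragePooling1D(),
    layers.Dense(64, activation="relu"),
    layers.Dense(NUM_DISEASES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(X.reshape(-1, NUM_SYMPTOMS, 1), y, epochs=20, validation_split=0.2)
```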
{"title":"A Machine Learning based Model for Disease Prediction","authors":"Monali Gulhane, T. Sajana","doi":"10.1109/CCGE50943.2021.9776374","DOIUrl":"https://doi.org/10.1109/CCGE50943.2021.9776374","url":null,"abstract":"People are now suffering from a variety of diseases as a result of the environment in which they live and their lifestyle choices. As a result, the goal of predicting disease at an earlier stage becomes increasingly critical. However, making an accurate prediction based on symptoms becomes too tough for doctors to do. The task of accurately predicting disease is one of the most difficult. Data mining is critical in overcoming this difficulty because it may be used to forecast the sickness. Every year, a great amount of data is generated in the field of medicine. Due to the extreme increase in the rate of information being collected in the health and medical industries, it has been possible to conduct precise analyses of medical data, which now has resulted in better patient outcomes. When disease data is used as a starting point, data mining can be used to identify hidden patterns in the huge number of medical data that currently exists. On the basis of the patient's symptoms, we suggested a generic disease prediction model. In ability to implement credible illness predictions, we apply machine learning methods such as convolutional neural networks (CNNs) for disease prediction. Disease symptom datasets are essential for disease forecasting purposes. In this general disease prediction model, the individual's lifestyle behaviour as well as examination data are taken into consideration for reliable disease prediction. It has been demonstrated that the accuracy of generalized predictive modeling that used the CNN algorithm is 98.7 percent, which really is better than those of the present technique. In addition, the time and memory requirements for existing mechanism are higher than those for CNN. When general disease is expected, this method is qualified to determine the threat related to institutional disease, which can be stronger or weaker than the previously mentioned of general disease.","PeriodicalId":130452,"journal":{"name":"2021 International Conference on Computing, Communication and Green Engineering (CCGE)","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131839561","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2021-09-23 | DOI: 10.1109/CCGE50943.2021.9776390
Title: Application for Multi-Agent System: A Case of Customised eLearning
Monika Patel, P. Sajja
The whole world has been completely upset by the unexpected eruption of a lethal disease called Covid-19. Every region has been shut down because of the effect of Covid, and to prevent the spread of this illness, everybody needs to maintain social distancing. Students are considered the future of the country. To protect students from this infection, academic institutes have begun internet-based teaching and learning. Yet delivering knowledge in online mode has become a challenging task for students as well as tutors, and because of e-learning, customized learning has tended to vanish. To support intelligent teaching and learning systems, an upgraded model is needed to boost academic activities. This paper presents the design of a proposed model utilizing reinforcement learning. The reinforcement learning (RL) approach provides effective pedagogical strategies for educating learners according to their interest in the subject. With the assistance of RL, the presented model chooses the training difficulty level for learners and recommends reading content suited to the student's understanding level. The proposed structure is planned in such a manner that the teacher is not needed to continually monitor the student. Experimental results show that these approaches reduce the amount of attention needed from the teacher and enhance the learning capability of the student. The presented framework enhances personalized learning.
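The abstract names reinforcement learning for choosing the training difficulty level without giving the formulation; as an assumed illustration only, the sketch below uses tabular Q-learning where the state is the learner's understanding level, the action is the difficulty of the next content, and the reward comes from a simulated learner model. States, actions, rewards, and hyperparameters are all invented for the sketch.

```python
# Hypothetical sketch of the RL idea: tabular Q-learning that picks a content
# difficulty level (action) given the learner's understanding level (state).
import random

LEVELS = ["beginner", "intermediate", "advanced"]   # states
DIFFICULTIES = ["easy", "medium", "hard"]           # actions
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2

Q = {(s, a): 0.0 for s in range(len(LEVELS)) for a in range(len(DIFFICULTIES))}

def simulated_reward(state: int, action: int) -> float:
    """Assumed learner model: reward is highest when difficulty matches level."""
    return 1.0 - abs(state - action) * 0.5

def choose_action(state: int) -> int:
    if random.random() < EPSILON:                      # explore
        return random.randrange(len(DIFFICULTIES))
    return max(range(len(DIFFICULTIES)), key=lambda a: Q[(state, a)])  # exploit

for episode in range(5000):
    state = random.randrange(len(LEVELS))
    action = choose_action(state)
    reward = simulated_reward(state, action)
    # Learner advances a level after a successful (high-reward) session.
    next_state = min(len(LEVELS) - 1, state + (1 if reward > 0.75 else 0))
    best_next = max(Q[(next_state, a)] for a in range(len(DIFFICULTIES)))
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])

for s, level in enumerate(LEVELS):
    best = max(range(len(DIFFICULTIES)), key=lambda a: Q[(s, a)])
    print(f"{level:12s} -> recommend {DIFFICULTIES[best]} content")
```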
{"title":"Application for Multi-Agent System: A Case of Customised eLearning","authors":"Monika Patel, P. Sajja","doi":"10.1109/CCGE50943.2021.9776390","DOIUrl":"https://doi.org/10.1109/CCGE50943.2021.9776390","url":null,"abstract":"The whole world is completely upset because of the unexpected ejection of a lethal disease called Covid-19. Every single region is absolutely closed because of the effect of Covid. To prevent the unfold of this unwellness, everybody needs to maintain social distancing. Students are considered as the eventual fate of the country. To save the understudies from this infection the academic institute has begun internet educating and learning. Yet, giving information in online mode has become a testing task for understudies similarly as a tutor. Because of e-learning, customize learning has become vanish. To help intelligent instructing and learning systems an upgraded model is needed to boost the academic activities. This paper presents a style of projected model utilizing Reinforcement learning. The reinforcement learning (RL) approach provides effective pedagogical strategies for educating the learners with their interest in the subject. With the assistance of RL, the introduced model chooses the training difficulty level of scholars and recommends the student's understanding level to access the reading content. The proposed structure is planned in such a manner with the goal that the educator isn't needed to continually screen the understudy. Experimental results show that these approaches scale back the number of attentions needed from the teacher and enhance the training capability of understudy. The presented framework enhances personalized learning.","PeriodicalId":130452,"journal":{"name":"2021 International Conference on Computing, Communication and Green Engineering (CCGE)","volume":"29 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125071467","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2021-09-23 | DOI: 10.1109/CCGE50943.2021.9776424
Title: Optimization of Cogeneration in Sugar industry by Mixed integer linear programming Method
Manjusha A. Kanawade, Mrunmai M. Ranade
Cogeneration plants concurrently produce electricity and heat energy. In the sugar industry, bagasse can be utilized efficiently for the generation of thermal and electrical energy. The present study includes the optimal scheduling of boiler and generator units for the generation of steam and electricity. A mixed integer linear programming (MILP) mathematical formulation is proposed to determine the optimal plan. The existing sugar plant under consideration does not sell electricity to the grid or other utilities. Optimal planning and scheduling of the plant's components in view of power export indicates a reduction in the annual cost of the sugar plant. The proposed MILP model can be helpful for sugar industry planners considering the power export option in an existing plant. The study shows a clear benefit and efficient utilization of the plant's boiler and generator units after the thermal and electrical demands are satisfied.
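The abstract does not include the MILP formulation itself, so the following is a toy PuLP sketch of the kind of model described: binary on/off variables for boiler and turbo-generator units, steam and electricity balance constraints, and an objective that trades export revenue against fuel cost. All capacities, demands, efficiencies, and prices are made-up illustrative numbers, not the paper's data.

```python
# Toy MILP sketch (not the paper's actual model) of scheduling boiler and
# turbo-generator units with a power-export option, using PuLP.
from pulp import LpProblem, LpVariable, LpMaximize, LpBinary, lpSum, value

# Assumed unit data and demands (illustrative only).
boilers = {"B1": 60.0, "B2": 40.0}       # max steam output, t/h
gens = {"G1": 12.0, "G2": 8.0}           # max electrical output, MW
steam_per_mwh = 4.0                      # t of steam consumed per MWh generated
steam_demand = 50.0                      # process steam demand, t/h
power_demand = 8.0                       # plant electrical load, MW
export_price = 50.0                      # revenue per exported MWh
fuel_cost_per_t = 6.0                    # fuel cost per tonne of steam raised

m = LpProblem("cogeneration_schedule", LpMaximize)

on_b = {b: LpVariable(f"on_{b}", cat=LpBinary) for b in boilers}
steam = {b: LpVariable(f"steam_{b}", lowBound=0) for b in boilers}
on_g = {g: LpVariable(f"on_{g}", cat=LpBinary) for g in gens}
power = {g: LpVariable(f"power_{g}", lowBound=0) for g in gens}
export = LpVariable("export_MW", lowBound=0)

# Objective: export revenue minus fuel cost of the steam raised.
m += export_price * export - fuel_cost_per_t * lpSum(steam.values())

# Capacity limits tied to the on/off status of each unit.
for b in boilers:
    m += steam[b] <= boilers[b] * on_b[b]
for g in gens:
    m += power[g] <= gens[g] * on_g[g]

# Steam balance: boilers cover process steam plus turbine steam.
m += lpSum(steam.values()) >= steam_demand + steam_per_mwh * lpSum(power.values())
# Electricity balance: generation covers the plant load plus export.
m += lpSum(power.values()) == power_demand + export

m.solve()
print("exported power (MW):", value(export))
for g in gens:
    print(g, "on" if value(on_g[g]) > 0.5 else "off", "output:", value(power[g]))
```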
{"title":"Optimization of Cogeneration in Sugar industry by Mixed integer linear programming Method","authors":"Manjusha A. Kanawade, Mrunmai M. Ranade","doi":"10.1109/CCGE50943.2021.9776424","DOIUrl":"https://doi.org/10.1109/CCGE50943.2021.9776424","url":null,"abstract":"Cogeneration plants concurrently produce electricity and heat energy. In sugar industry bagasse can be utilized efficiently for generation of thermal and electrical energy. The present study includes optimal scheduling of boiler and generator units for generation of steam and electricity. The mixed integer linear programming (MILP) mathematical formulation is proposed to determine optimal planning. The existing sugar industry under consideration does not sell electricity to grid or other utility. The optimal planning and scheduling of sugar industry components in view of power export indicates reduction in annual cost of sugar industry. The proposed MILP model can be helpful for planner of sugar industry to consider power export option in the existing sugar industry. The study shows clear benefit and efficient utilization of boiler and generator units of the industry after satisfying the thermal and electrical demands.","PeriodicalId":130452,"journal":{"name":"2021 International Conference on Computing, Communication and Green Engineering (CCGE)","volume":"111 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123435839","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}