Patients with chronic liver diseases typically experience lipid profile abnormalities, and mortality from cirrhosis complicated by portal vein thrombosis (PVT) is high. A lipoprotein (Lp) is a biochemical assembly whose main function is to transport hydrophobic fat molecules through water. Lipoproteins are present in all eubacterial cell walls and are of tremendous interest in the study of the pathogenic mechanisms of spirochaetes. Because spirochaete lipobox sequences are more variable than those of other bacteria, it has proven difficult to apply existing prediction methods to new sequence data. The main goal of this work is to present a lipoprotein detection model in which correlation features, enhanced log energy entropy, raw features, and semantic similarity features are extracted. These features are fed to a hybrid model that combines a Gated Recurrent Unit (GRU) and a Long Short-Term Memory (LSTM) network; the outputs of the GRU and LSTM are averaged to obtain the final prediction. The GRU weights are optimized via the Selfish combined Henry Gas Solubility Optimization with cubic map initialization (SHGSO) model.
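The fusion step described above — averaging the GRU and LSTM branch outputs — can be sketched as follows. The probability vectors and two-class layout are hypothetical illustrations, not taken from the paper:

```python
import numpy as np

def fuse_predictions(gru_probs, lstm_probs):
    """Average the class-probability outputs of the GRU and LSTM branches."""
    gru_probs = np.asarray(gru_probs, dtype=float)
    lstm_probs = np.asarray(lstm_probs, dtype=float)
    return (gru_probs + lstm_probs) / 2.0

# Hypothetical outputs for one sequence: P(lipoprotein), P(non-lipoprotein)
gru_out = [0.80, 0.20]
lstm_out = [0.60, 0.40]
fused = fuse_predictions(gru_out, lstm_out)
print(fused)            # [0.7 0.3]
print(fused.argmax())   # index of the predicted class
```

In this ensemble style, each branch remains a full classifier and only the final scores are combined, so either branch can be trained or optimized (e.g. by SHGSO) independently.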
"Lipoprotein detection: Hybrid deep classification model with improved feature set" — P. N. Kathavate, J. Amudhavel. Multiagent and Grid Systems, pp. 345–363, 2023-02-03. doi:10.3233/mgs-220329
Building extraction from remote sensing imagery has been a research hotspot and plays an important role in land planning, disaster assessment, digital city construction, etc. Although many methods have been explored, high-precision automatic extraction remains difficult because of problems in high-resolution remote sensing images such as the same object with different spectra, the same spectrum for different objects, noise, shadow, and occlusion by ground objects. Therefore, this paper proposes an urban building extraction method based on an information fusion-oriented deep encoder-decoder network. First, the deep encoder-decoder network is adopted to extract shallow semantic features of building objects. Second, a polynomial kernel is used to describe the middle feature map of the deep network to improve the discrimination of fuzzy features. Third, the shallow features and high-order features are fused and sent to the end of the encoder-decoder network to obtain the building segmentation results. Finally, we conduct extensive experiments on public data sets; the recall rate, accuracy rate, and F1-score are greatly improved, with the overall F1-score increasing by about 4%. Compared with other state-of-the-art building extraction network structures, the proposed network segments building targets from the background more accurately.
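The polynomial-kernel step can be illustrated in miniature. The degree, bias term, and flattened feature vectors below are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

def polynomial_kernel(x, y, degree=2, coef0=1.0):
    """Polynomial kernel k(x, y) = (x . y + c)^d, applied here to
    flattened feature-map vectors; degree and coef0 are illustrative."""
    return (np.dot(x, y) + coef0) ** degree

# Hypothetical middle feature maps (flattened) from the encoder
f1 = np.array([0.5, 1.0, 0.0])
f2 = np.array([1.0, 0.5, 2.0])
print(polynomial_kernel(f1, f2))  # (1.0 + 1.0)^2 = 4.0
```

The appeal of a polynomial kernel in this role is that it implicitly captures higher-order interactions between feature channels without explicitly computing the expanded feature space.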
"Urban building extraction based on information fusion-oriented deep encoder-decoder network in remote sensing imagery" — Cheng Zhang, Mingzhou Ma, Dan He. Multiagent and Grid Systems, pp. 279–294, 2023-02-03. doi:10.3233/mgs-220339
Cloud computing has gained huge popularity for on-demand services on a pay-per-use basis. However, a single data centre is restricted in offering services, as it does not have unlimited resource capacity, especially at peak demand times. Generally, the number of Virtual Machines (VMs) is larger in a public cloud, but security is not ensured; in contrast, VMs are limited in a private cloud but security is high. Hence, considering security levels in task scheduling remains critical for secure processing. This work proposes optimization strategies for optimal task scheduling with multi-objective constraints in a cloud environment. The proposed optimal task allocation framework considers objectives such as execution time, risk probability, and task priority. For this, a new hybrid optimization algorithm known as Clan Updated Seagull Optimization (CUSO) is introduced, which is a conceptual blending of Elephant Herding Optimization (EHO) and the Seagull Optimization Algorithm (SOA). Finally, the performance of the proposed work is evaluated against conventional models with respect to several performance measures.
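A weighted-sum fitness over the three stated objectives is one common way such a scheduler scores candidate allocations. The weights and scales below are assumptions for illustration, not the paper's actual fitness function:

```python
def fitness(execution_time, risk_probability, priority_violations,
            weights=(0.4, 0.3, 0.3)):
    """Weighted-sum fitness for a candidate schedule (lower is better).
    The weights and the raw scales of each objective are illustrative."""
    w1, w2, w3 = weights
    return w1 * execution_time + w2 * risk_probability + w3 * priority_violations

# Two hypothetical candidate schedules produced by the optimizer
a = fitness(execution_time=10.0, risk_probability=0.2, priority_violations=1)
b = fitness(execution_time=12.0, risk_probability=0.1, priority_violations=0)
print(a, b)          # schedule A scores lower, so it would be preferred
```

In a metaheuristic such as CUSO, a function like this is what each candidate solution in the population is evaluated against at every iteration; in practice the objectives would first be normalised to comparable ranges.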
"Multi objective task scheduling based on hybrid metaheuristic algorithm for cloud environment" — P. Neelakantan, N. Yadav. Multiagent and Grid Systems, pp. 149–169, 2022-08-30. doi:10.3233/mgs-220218
Cloud computing has emerged as one of the hottest topics in technology and has quickly become a widely used information and communication technology model. Performance is a critical concern in the cloud environment with respect to economic, time, and hardware constraints. Various characteristics and conditions must be dealt with in different situations to provide solutions and design strategies that perform well. For example, task scheduling and resource allocation are significant challenges in cloud management, and adopting proper techniques in such conditions leads to performance improvement. This paper surveys existing scheduling algorithms with respect to their macro design idea. We classify these algorithms into four main categories: deterministic algorithms, metaheuristic algorithms, learning algorithms, and algorithms based on game theory. Each category is discussed by citing appropriate studies, and a review of MapReduce scheduling is included as an example.
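As a concrete instance of the "deterministic algorithms" category, the classic Min-Min heuristic can be sketched as follows; this is a textbook example, not drawn from any particular surveyed paper:

```python
def min_min_schedule(task_lengths, machine_speeds):
    """Classic Min-Min heuristic: repeatedly assign the unscheduled task
    with the smallest earliest completion time to the machine achieving it."""
    ready = [0.0] * len(machine_speeds)          # per-machine ready times
    assignment = {}
    remaining = dict(enumerate(task_lengths))
    while remaining:
        # earliest completion time over every (task, machine) pair
        finish, t, m = min(
            (ready[m] + length / machine_speeds[m], t, m)
            for t, length in remaining.items()
            for m in range(len(machine_speeds))
        )
        ready[m] = finish
        assignment[t] = m
        del remaining[t]
    return assignment, max(ready)                # task->machine map, makespan

assignment, makespan = min_min_schedule([4, 2, 8], [1.0, 2.0])
print(assignment, makespan)
```

Min-Min is deterministic in the survey's sense: given the same inputs it always produces the same schedule, unlike the population-based metaheuristics in the second category.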
"A survey on cloud computing scheduling algorithms" — M. Malekimajd, Ali Safarpoor-Dehkordi. Multiagent and Grid Systems, pp. 119–148, 2022-08-30. doi:10.3233/mgs-220217
Modelling and analysis in software system development can be especially challenging in early requirements engineering (RE), where high-level non-functional system requirements are discovered. In this early stage, hard-to-measure non-functional requirements are critical, and understanding the interactions between systems and stakeholders is key to system success. Goal-oriented requirements engineering (GORE) has been successful in dealing with the issues that arise during requirements analysis. Among the many GORE models, the i* goal model is the only framework that emphasises socio-technical elements such as stakeholders/actors/players, goals/objectives, dependencies, and design options/alternatives. Most current approaches to goal-model analysis use quantitative methods or formal information that is hard to gather in early RE, or produce analysis results automatically over models. In real-time competitive applications, the goals of various stakeholders conflict in complex systems. In addition, each system goal has various alternative design options, and the optimal selection of goal-oriented requirements faces several challenges in requirements-based engineering. Hence, effective decision-making frameworks are necessary to capture the real issues and achieve multi-objective optimisation of interdependent actors. To obtain an optimal strategy, interdependent actors in the i* goal model must balance their opposing goals reciprocally. To achieve this, the model needs to go beyond analytical decision-making tools such as sensitivity analysis, cost-effectiveness analysis, game-theoretic concepts, and the analytical hierarchy process. To address these requirements, this paper discusses the design of novel frameworks for agent-based goal model analysis in requirements engineering. The objective of this paper is to provide a brief yet comprehensive review of the major efforts undertaken along this line of research, covering the concepts, terminology, significance, and techniques of goal-oriented requirements engineering in the context of non-functional requirements analysis.
"Challenges and review of goal-oriented requirements engineering based competitive non-functional requirements analysis" — Sreenithya Sumesh, A. Krishna. Multiagent and Grid Systems, pp. 171–191, 2022-08-30. doi:10.3233/mgs-220231
Data integrity verification in the cloud has become a promising research area for several Internet of Things (IoT) applications. Traditional data verification approaches use encryption to preserve data. Fog computing, an extension of cloud computing, is a widely employed virtualized platform that affords various services, including storage as well as computing and networking services between users and data centers; by placing fog servers close to data sources, it effectively decreases latency. In this paper, a novel model for data integrity authentication and protection is designed for an IoT cloud-fog system. The model comprises fog nodes, a cloud server, IoT nodes, and a key distribution center. A dynamic and secure key is produced on request from the key distribution center using hashing, Exclusive OR (XOR), homomorphic encryption, and polynomials. The fog nodes encrypt the data gathered from IoT nodes and allocate nearby nodes using an Artificial Bee Colony-based Fuzzy C-Means (ABC-FCM) partitioning approach. The proposed approach outperformed existing methods, achieving a detection rate of 0.8541, a computational time of 34.25 s, and memory usage of 54.8 MB.
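The hash-and-XOR portion of the key generation can be sketched as a toy derivation. This omits the homomorphic-encryption and polynomial components named above, and the identifiers are hypothetical:

```python
import hashlib

def derive_session_key(request_id: bytes, master_secret: bytes) -> bytes:
    """Toy dynamic-key sketch: hash the request identifier and the master
    secret separately, then XOR the digests. A per-request key changes
    whenever the request changes; this is illustrative, not the paper's
    exact construction."""
    h_req = hashlib.sha256(request_id).digest()
    h_sec = hashlib.sha256(master_secret).digest()
    return bytes(a ^ b for a, b in zip(h_req, h_sec))

key = derive_session_key(b"iot-node-17:request-001", b"kdc-master-secret")
print(len(key))  # 32-byte session key
```

Because SHA-256 digests are fixed-length, the derived key is always 32 bytes, and any change to the request identifier yields an unrelated key.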
"An approach for data integrity authentication and protection in fog computing" — M.N. Babitha, M. Siddappa. Multiagent and Grid Systems, pp. 87–105, 2022-08-30. doi:10.3233/mgs-220210
The Internet of Things (IoT) is characterized by large volumes of data collection. Since IoT devices are themselves resource-constrained, this data is transferred to cloud-based systems for further processing. Data collected over a period of time possesses high utility, as it is useful for multiple analytical, predictive, and prescriptive tasks. It is therefore crucial that IoT devices transfer collected data to network gateways before exhausting their storage; this issue is referred to as the "data offloading problem". This paper proposes a technique for fault-tolerant offloading of data by IoT devices such that the data they collect is transferred to the cloud with minimal loss. The proposed technique employs opportunistic contacts between IoT and mobile fog nodes to provide a fault-tolerant enhancement to the IoT architecture. The effectiveness of the proposed method is verified through simulation experiments assessing the reduction in data loss, and it is demonstrated that the method outperforms a state-of-the-art method.
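The offloading dynamic can be illustrated with a toy discrete-time simulation: a device accumulates samples in a bounded buffer and flushes it whenever an opportunistic contact with a fog node occurs. The buffer size, contact schedule, and one-sample-per-step arrival rate are invented parameters, not the paper's experimental setup:

```python
def simulate_offloading(samples, buffer_capacity, contact_times):
    """Toy simulation: one sample arrives per time step; samples arriving
    to a full buffer are lost; the buffer is flushed to a fog node at
    each contact time. Returns (delivered, lost, still buffered)."""
    contact_times = set(contact_times)
    buffer, delivered, lost = 0, 0, 0
    for t in range(samples):
        if buffer < buffer_capacity:
            buffer += 1
        else:
            lost += 1                  # storage exhausted: sample dropped
        if t in contact_times:         # opportunistic contact with a fog node
            delivered += buffer
            buffer = 0
    return delivered, lost, buffer

print(simulate_offloading(samples=10, buffer_capacity=3, contact_times=[2, 7]))
```

Even this crude model shows the core trade-off: the sparser the contacts relative to buffer capacity, the more data is lost, which is exactly what opportunistic fog contacts are meant to mitigate.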
"Fault tolerant data offloading in opportunistic fog enhanced IoT architecture" — Parmeet Kaur. Multiagent and Grid Systems, pp. 107–118, 2022-08-30. doi:10.3233/mgs-220211
Djamel Douha, A. Mokhtari, Z. Guessoum, Y. M. Berghout
An agent is an autonomous entity that can perform actions to achieve its goals. It acts in a dynamic environment that may induce failures in its behavior. Therefore, a formal testing/verification approach is required to ensure the correctness of an agent's behavior. In this paper, we propose a Default Logic formalism to abstract an agent's behavior as knowledge and reasoning rules and to verify and test the consistency of that behavior. The considered agents are implemented with the JADE framework. The agent abstraction is translated into Answer Set Programming and solved with Clingo to generate dynamic and adaptive test cases for the agent's behavior. The dynamic test cases allow the agent's behavior to be predicted when new information arises in the system.
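A default rule of the form prerequisite : justification / conclusion can be mimicked with a naive fixpoint loop, as sketched below. This is a simplified illustration of default reasoning in general, not the paper's formalism or its ASP translation:

```python
def apply_defaults(facts, defaults):
    """Naively apply default rules (prerequisite, justification, conclusion):
    a rule fires when its prerequisite holds and the negation of its
    justification is not known. A toy fixpoint loop, not a full Default
    Logic extension computation."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for pre, just, concl in defaults:
            if pre in facts and ("not " + just) not in facts and concl not in facts:
                facts.add(concl)
                changed = True
    return facts

# Classic example: birds normally fly, unless known otherwise.
rules = [("bird", "flies", "flies")]
print(apply_defaults({"bird"}, rules))               # 'flies' is concluded
print(apply_defaults({"bird", "not flies"}, rules))  # the default is blocked
```

The second call shows the non-monotonic behavior the abstract relies on: adding new information ("not flies") retracts a previously derivable conclusion, which is why such rules are useful for predicting agent behavior as the environment changes.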
"Using default logic for agent behavior testing" — Djamel Douha, A. Mokhtari, Z. Guessoum, Y. M. Berghout. Multiagent and Grid Systems, pp. 1–20, 2022-05-23. doi:10.3233/mgs-220359
Cloud computing has introduced a new paradigm of computing. Despite its several benefits, ensuring the confidentiality and integrity of sensitive information in the cloud remains a major challenge. To address these challenges without loss of sensitive information or privacy, we present a novel and robust model called 'Enhanced Cloud Security using Hyper Elliptic Curve and Biometric' (ECSHB). ECSHB ensures the preservation of data security, privacy, and authentication of data in a cloud environment. The proposed approach combines biometric and hyperelliptic curve cryptography (HECC) techniques to elevate the security of data access and resource preservation in the cloud. ECSHB provides a high level of security using less processing power, which automatically reduces the overall cost. Its efficacy has been evaluated in terms of recognition rate, biometric similarity score, False Matching Ratio (FMR), and False Non-Matching Ratio (FNMR). ECSHB has been validated using security threat model analysis in terms of confidentiality; resistance to collision and replay attacks, as well as non-repudiation, is also considered. The results are compared with existing work and exhibit better performance in terms of data security and privacy in the cloud environment.
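The FMR and FNMR metrics used in the evaluation have standard definitions that can be computed directly from similarity scores. The score lists and threshold below are hypothetical:

```python
def fmr_fnmr(impostor_scores, genuine_scores, threshold):
    """False Matching Ratio: fraction of impostor comparisons accepted.
    False Non-Matching Ratio: fraction of genuine comparisons rejected.
    Higher score means more similar; the threshold is a design choice."""
    fmr = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    fnmr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return fmr, fnmr

impostors = [0.10, 0.30, 0.55, 0.20]   # hypothetical similarity scores
genuines = [0.90, 0.80, 0.45, 0.95]
print(fmr_fnmr(impostors, genuines, threshold=0.5))  # (0.25, 0.25)
```

Raising the threshold lowers FMR at the cost of a higher FNMR, so a biometric authentication scheme is typically tuned by sweeping the threshold over both score distributions.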
"A novel model to enhance the data security in cloud environment" — G. Verma, Soumen Kanrar. Multiagent and Grid Systems, pp. 45–63, 2022-05-23. doi:10.3233/mgs-220361
The applications received at a cloud data centre are compilations of jobs (tasks) that may be independent of or dependent on one another. These tasks are allocated to diverse virtual machines (VMs) in a scheduled way. For this task allocation, various scheduling policies are deployed with the intention of reducing energy utilization and makespan while increasing cloud resource exploitation. A variety of studies have attained optimal solutions in single-cloud settings; however, such schemes might not operate in multi-cloud environments. This paper therefore introduces a secured task scheduling model for the multi-cloud environment. The developed approach mainly concerns the optimal allocation of tasks via a hybrid optimization theory. The optimal task allotment considers objectives such as makespan, execution time, security parameters (risk evaluation), utilization cost, maximal service level agreement (SLA) adherence, and power usage effectiveness (PUE). To solve this problem, a novel hybrid algorithm termed rock hyraxes updated shark smell with logistic mapping (RHU-SLM) is introduced. Finally, the superiority of the developed approach is demonstrated on varied measures.
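Power usage effectiveness (PUE), one of the objectives listed above, has a standard definition: total facility power divided by IT equipment power, with 1.0 as the ideal. The figures below are hypothetical:

```python
def power_usage_effectiveness(total_facility_kw, it_equipment_kw):
    """PUE = total facility power / IT equipment power. A value of 1.0
    would mean every watt goes to IT equipment; real data centres also
    spend power on cooling, lighting, and distribution losses."""
    return total_facility_kw / it_equipment_kw

print(power_usage_effectiveness(total_facility_kw=1500.0,
                                it_equipment_kw=1000.0))  # 1.5
```

A scheduler that favours data centres with a PUE closer to 1.0 is effectively preferring placements whose energy overhead per unit of useful computation is lower, which is why PUE appears alongside cost and makespan in the objective set.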
"Multi-objective secure task scheduling based on SLA in multi-cloud environment" — P. Jawade, S. Ramachandram. Multiagent and Grid Systems, pp. 65–85, 2022-05-23. doi:10.3233/mgs-220362