Analytic Hierarchy Process-based Fuzzy Measurement to Quantify Vulnerabilities of Web Applications
Mohammad Shojaeshafiei
CompSciRN: Supercomputer Performance (Topic)
Pub Date: 2020-07-31 | DOI: 10.5121/ijcnc.2020.12407

Much research has been conducted to detect vulnerabilities in Web Applications; however, none of it has proposed a methodology for measuring those vulnerabilities, either qualitatively or quantitatively. In this paper, we propose a methodology for quantifying vulnerabilities in Web Applications. We applied the Goal Question Metric (GQM) methodology to determine all possible security factors and sub-factors of Web Applications in the Department of Transportation (DOT) as our proof of concept. We then introduced a Multi-layered Fuzzy Logic (MFL) approach based on the prioritization of the security sub-factors in the Analytic Hierarchy Process (AHP). Using AHP, we weighted each security sub-factor before the quantification process in the Fuzzy Logic, in order to handle the imprecision of crisp number calculations.
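The AHP weighting step described in this abstract can be sketched in a few lines. This is a minimal illustration only: the pairwise comparison matrix and the three sub-factor names below are invented for demonstration (the paper elicits its own sub-factors via GQM), and the row geometric-mean approximation is used in place of the full principal-eigenvector computation.

```python
from math import prod

def ahp_weights(matrix):
    """Approximate AHP priority weights via the row geometric-mean method.

    `matrix` is a reciprocal pairwise-comparison matrix on Saaty's 1-9 scale.
    """
    n = len(matrix)
    gmeans = [prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical pairwise judgments for three illustrative Web Application
# security sub-factors (e.g. authentication vs. input validation vs.
# session management); these numbers are not from the paper.
pairwise = [
    [1.0,     3.0,     5.0],
    [1 / 3.0, 1.0,     2.0],
    [1 / 5.0, 1 / 2.0, 1.0],
]
weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])  # normalized weights summing to 1
```

The resulting weight vector is what would then scale each sub-factor's fuzzy membership score in the multi-layered quantification step.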
Adaptation to Rare Natural Disasters and Global Sensitivity Analysis in a Dynamic Stochastic Economy
Takafumi Usui
CompSciRN: Supercomputer Performance (Topic)
Pub Date: 2019-09-30 | DOI: 10.2139/ssrn.3462011

This paper investigates rare natural disasters and adaptation decisions in a dynamic stochastic economy. We examine the optimal balance between investment in productive capital and in adaptive capital stock, which is inherently non-productive but alleviates the damage caused by a rare natural disaster. We present a modeling approach that incorporates uncertain rare natural disasters in discrete time and solve the model using time-iteration collocation on an adaptive sparse grid. We perform a global sensitivity analysis, using Sobol' indices to screen which uncertain parameters should be calibrated first, and compute the univariate effects to identify the parametric regions in which the model outcomes are most sensitive. To speed up the solution process, our implementation is massively parallelized on a high-performance computing architecture in a distributed-memory fashion. We find that the optimal adaptation to rare natural disasters is initially to advance economic development; once the economy is sufficiently developed, however, the growth rate of the adaptive capital stock exceeds that of the productive capital stock, as a precaution against future uncertainty.
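The Sobol'-index screening mentioned in this abstract can be illustrated on a toy model. Everything here is a hypothetical sketch: the two-parameter linear model, sample size, and seed are invented, and a standard pick-freeze Monte Carlo estimator stands in for the paper's actual analysis of its economic model on sparse grids.

```python
import random

def model(x):
    # Toy stand-in for an expensive economic model: the first input
    # deliberately dominates the output variance (weight 4 vs. 1).
    return 4.0 * x[0] + x[1]

def first_order_sobol(f, dim, n=20000, seed=1):
    """Estimate first-order Sobol' indices with a pick-freeze MC scheme."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    yA = [f(x) for x in A]
    yB = [f(x) for x in B]
    mean = sum(yA) / n
    var = sum((y - mean) ** 2 for y in yA) / n
    indices = []
    for i in range(dim):
        # Evaluate on B with its i-th coordinate "frozen" to the value from A
        yAB = [f(B[k][:i] + [A[k][i]] + B[k][i + 1:]) for k in range(n)]
        # Saltelli-style estimator of the first-order partial variance
        vi = sum(yA[k] * (yAB[k] - yB[k]) for k in range(n)) / n
        indices.append(vi / var)
    return indices

indices = first_order_sobol(model, dim=2)
print([round(s, 2) for s in indices])  # the first input dominates
```

An input with a first-order index near zero contributes little output variance on its own, which is exactly the screening criterion for deciding which uncertain parameters merit careful calibration.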