Enrico Bacis, S. Vimercati, S. Foresti, S. Paraboschi, Marco Rosa, P. Samarati
The distributed shuffle index strengthens the guarantees of access confidentiality provided by the shuffle index through the distribution of data among three cloud providers. In this paper, we analyze architectural and design issues and describe an implementation of the distributed shuffle index integrated with different cloud providers (i.e., Amazon S3, OpenStack Swift, Google Cloud Storage, and EMC Elastic Cloud Storage). The experimental results obtained with our implementation confirm the protection guarantees provided by the distributed shuffle index and its limited performance overhead, demonstrating its practical applicability in cloud scenarios.
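As a rough conceptual sketch of the distribution idea only (not the shuffle index protocol itself), the following Python snippet stores encrypted blocks across three interchangeable providers behind a common put/get interface and re-shuffles the block-to-provider assignment on every access. The `InMemoryProvider` class, the XOR "encryption", and the block layout are hypothetical stand-ins for real S3/Swift/GCS/ECS clients and a real cipher.

```python
import os
import random

class InMemoryProvider:
    """Hypothetical stand-in for a real cloud object store (S3, Swift, GCS, ECS)."""
    def __init__(self, name):
        self.name = name
        self._objects = {}

    def put(self, key, blob):
        self._objects[key] = blob

    def get(self, key):
        return self._objects.pop(key)

class DistributedShuffledStore:
    """Toy three-provider store: every access moves each block to a random provider
    under a fresh random key, so no single provider observes a stable access pattern."""
    def __init__(self, providers):
        assert len(providers) == 3
        self.providers = providers
        self.index = {}  # logical block id -> (provider, physical key)

    def write(self, block_id, plaintext):
        blob = bytes(b ^ 0x5A for b in plaintext)      # placeholder "encryption"
        self._place(block_id, blob)

    def read(self, block_id):
        provider, key = self.index[block_id]
        blob = provider.get(key)
        self._place(block_id, blob)                    # re-shuffle on every access
        return bytes(b ^ 0x5A for b in blob)

    def _place(self, block_id, blob):
        provider = random.choice(self.providers)
        key = os.urandom(8).hex()                      # unlinkable physical name
        provider.put(key, blob)
        self.index[block_id] = (provider, key)

if __name__ == "__main__":
    store = DistributedShuffledStore([InMemoryProvider(n) for n in ("A", "B", "C")])
    store.write("node-17", b"leaf data")
    print(store.read("node-17"))                       # b'leaf data'
```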
{"title":"Distributed Shuffle Index in the Cloud: Implementation and Evaluation","authors":"Enrico Bacis, S. Vimercati, S. Foresti, S. Paraboschi, Marco Rosa, P. Samarati","doi":"10.1109/CSCloud.2017.25","DOIUrl":"https://doi.org/10.1109/CSCloud.2017.25","url":null,"abstract":"The distributed shuffle index strengthens the guarantees of access confidentiality provided by the shuffle index through the distribution of data among three cloud providers. In this paper, we analyze architectural and design issues and describe an implementation of the distributed shuffle index integrated with different cloud providers (i.e., Amazon S3, OpenStack Swift, Google Cloud Storage, and EMC Elastic Cloud Storage). The experimental results obtained with our implementation confirm the protection guarantees provided by the distributed shuffle index and its limited performance overhead, demonstrating its practical applicability in cloud scenarios.","PeriodicalId":436299,"journal":{"name":"2017 IEEE 4th International Conference on Cyber Security and Cloud Computing (CSCloud)","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-06-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125155720","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Abdulaziz Alshammari, Sulaiman Alhaidari, Ali I. Alharbi, M. Zohdy
Cloud computing has emerged as a new computing paradigm that builds on the foundations of Distributed Computing, Grid Computing, and Virtualization. It is an Internet-accessible business model with flexible, on-demand resource allocation and pay-per-use pricing, much like a utility. As cloud computing has grown into a promising business concept for computing infrastructure, concerns about how safe the environment is have grown with it. Security is one of the major issues in the cloud computing environment. In this paper we investigate some of the principal security attacks on clouds and possible solutions: XML Signature Wrapping attacks, Browser Security, and Vendor Lock-in.
{"title":"Security Threats and Challenges in Cloud Computing","authors":"Abdulaziz Alshammari, Sulaiman Alhaidari, Ali I. Alharbi, M. Zohdy","doi":"10.1109/CSCloud.2017.59","DOIUrl":"https://doi.org/10.1109/CSCloud.2017.59","url":null,"abstract":"Cloud Computing has emerged as a new paradigm of computing that builds on the foundations of Distributed Computing, Grid Computing, and Virtualization. Cloud computing is Internet-accessible business model with flexible resource allocation on demand, and computing on a pay-per-use as utilities. Cloud computing has grown to provide a promising business concept for computing infrastructure, where concerns are beginning to grow about how safe an environment is. Security is one of the major issues in the cloud-computing environment. In this paper we investigate some prime security attacks and possible solutions for clouds: XML Signature Wrapping attacks, Browser Security, and Vendor Lock-in.","PeriodicalId":436299,"journal":{"name":"2017 IEEE 4th International Conference on Cyber Security and Cloud Computing (CSCloud)","volume":"84 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-06-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130944632","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The Information Security Risk Management (ISRM) process involves several activities for conducting a risk management (RM) task in an organization. ISRM activities require access to various kinds of information related to the organization, and an organization often needs to share information related to an ISRM process with the stakeholders involved in the activity. It is therefore important to manage the information that is critical to the operations of the organization, and an information classification scheme enables the proper handling of the information involved in the RM task. We selected the ISO/IEC 27005:2011 risk management standard and assessed the various information generated while applying this standard in an organization. The purpose of this study is to propose a framework that identifies the information objects involved in the ISO 27005 risk management standard and classifies that information based on the guidelines provided by the UNINETT scheme. A case scenario of a health clinic is developed to identify ISRM-related information objects using the proposed framework and to classify the information using the UNINETT scheme.
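Purely as an illustration of how such a mapping could be represented in practice, the sketch below pairs ISRM information objects with classification labels and handling rules. Both the objects and the labels here are hypothetical examples, not the actual UNINETT levels or the paper's framework.

```python
# Hypothetical mapping of ISRM information objects to classification labels;
# neither the objects nor the labels are taken from the paper or the UNINETT scheme.
ISRM_CLASSIFICATION = {
    "asset inventory": "internal",
    "threat and vulnerability list": "confidential",
    "risk assessment report": "confidential",
    "risk treatment plan": "internal",
    "residual risk acceptance record": "restricted",
}

HANDLING_RULES = {
    "internal": "share within the organization only",
    "confidential": "share with named ISRM stakeholders under NDA",
    "restricted": "share with top management only, encrypted at rest",
}

def handling_for(information_object: str) -> str:
    """Return the handling rule for an ISRM information object."""
    label = ISRM_CLASSIFICATION.get(information_object, "confidential")  # cautious default
    return f"{information_object}: {label} -> {HANDLING_RULES[label]}"

if __name__ == "__main__":
    for obj in ISRM_CLASSIFICATION:
        print(handling_for(obj))
```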
{"title":"A Framework for the Information Classification in ISO 27005 Standard","authors":"V. Agrawal","doi":"10.1109/CSCloud.2017.13","DOIUrl":"https://doi.org/10.1109/CSCloud.2017.13","url":null,"abstract":"Information Security Risk Management (ISRM) process involves several activities to conduct a risk management (RM) task in an organization. ISRM activities require access to various information related to the organization. An organization often needs to share information related to an ISRM process with the stakeholders involved in the activity. Therefore, it is important to manage the information which is critical to the operations of the organization. The presence of an information classification scheme can enable the proper handling of the information involved in the RM task. We selected ISO/IEC27005:2011 risk management standard to assess various information generated during the process of applying this standard in an organization. The purpose of this study is to propose a framework to show various information objects involved in ISO27005 risk management standard and classify the information based on the guideline provided by UNINETT scheme. A case scenario of a health clinic is developed to identify ISRM related information objects using the proposed framework and classify the information using UNINETT scheme.","PeriodicalId":436299,"journal":{"name":"2017 IEEE 4th International Conference on Cyber Security and Cloud Computing (CSCloud)","volume":"37 2","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-06-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120816585","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Cloud computing and big data technologies are converging to offer a cost-effective delivery model for cloud-based big data analytics. Though the impacts of the size and scaling of big data on the cloud have been extensively studied, the effects of the complexity of the underlying analytic methods on cloud performance have received less attention. This paper develops and evaluates a computationally intensive statistical methodology for performing inference in the presence of both non-Gaussian data and missing data. Two well-established statistical approaches, bootstrap and multiple imputation (MI), are combined to form the methodology. Bootstrap is a computer-based nonparametric resampling procedure that resamples the data many thousands of times to construct an empirical distribution, which is then used to build confidence intervals for significance tests. This technique enables scientists who study data with known non-normality to obtain higher quality significance tests than is possible with a traditional asymptotic, normal-theory based significance test. However, the bootstrapping procedure only works when no data are missing or the data are missing completely at random (MCAR); missing data can lead to biased estimates when the MCAR assumption is violated, and it is unclear how best to implement a bootstrapping procedure in the presence of missing data. The proposed methods provide guidelines and procedures that enable researchers to use the technique in all areas of health, behavioral, and developmental science in which a study has missing data and cannot rely on parametric inference. Either bootstrapping or MI can be computationally expensive, and combining the two can lead to further computation costs in the cloud. Using carefully constructed simulation examples, we demonstrate that it is feasible to implement the proposed methodology on a high-performance Knights Landing platform. However, the computation costs are substantial even with small data sizes, and further work is needed to study the effects of optimizing the implementation and its performance with big data.
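A minimal numpy sketch of one way to combine the two procedures is shown below: bootstrap on the outside, a deliberately simplistic imputation step on the inside, and a percentile confidence interval at the end. This illustrates the general idea under stated assumptions; it is not the paper's R implementation, its imputation model, or its pooling rules.

```python
import numpy as np

rng = np.random.default_rng(0)

def impute(sample, rng):
    """Crude single imputation: draw replacements from the observed values.
    A real MI procedure would use a proper imputation model."""
    filled = sample.copy()
    missing = np.isnan(filled)
    filled[missing] = rng.choice(filled[~missing], size=missing.sum(), replace=True)
    return filled

def bootstrap_mi_ci(data, n_boot=2000, n_imp=5, alpha=0.05, rng=rng):
    """Percentile CI for the mean: resample rows, run n_imp imputations per
    bootstrap replicate, and average the per-imputation estimates."""
    estimates = np.empty(n_boot)
    n = len(data)
    for b in range(n_boot):
        sample = data[rng.integers(0, n, size=n)]          # bootstrap resample
        estimates[b] = np.mean([impute(sample, rng).mean() for _ in range(n_imp)])
    return np.quantile(estimates, [alpha / 2, 1 - alpha / 2])

if __name__ == "__main__":
    x = rng.exponential(scale=2.0, size=500)                # skewed, non-Gaussian data
    x[rng.random(500) < 0.1] = np.nan                       # roughly 10% missing values
    print("95% CI for the mean:", bootstrap_mi_ci(x))
```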
{"title":"Evaluation of Combining Bootstrap with Multiple Imputation Using R on Knights Landing Platform","authors":"Chuan Zhou, Yuxiang Gao, Waylon Howard","doi":"10.1109/CSCLOUD.2017.55","DOIUrl":"https://doi.org/10.1109/CSCLOUD.2017.55","url":null,"abstract":"Cloud computing and big data technologies are converging to offer a cost-effective delivery model for cloud-based big data analytics. Though impacts of size and scaling of big data on cloud have been extensively studied, the effects of complexity of underlying analytic methods on cloud performance have received less attention. This paper will develop and evaluate a computationally intensive statistical methodology to perform inference in the presence of both non-Gaussian data and missing data. Two well-established statistical approaches, bootstrap and multiple imputations (MI), will be combined to form the methodology. Bootstrap is a computer-based nonparametric resampling procedure that involves randomly selecting data many thousands of times to construct an empirical distribution, which is then used to construct confidence intervals for significance tests. This statistical technique enables scientists who conduct studies on data with known non-normality to obtain higher quality significance tests than is possible with a traditional asymptotic, normal-theory based significance test. However, the bootstrapping procedure only works when no data are missing or the data are missing completely at random (MCAR). Missing data can lead to biased estimates when the MCAR assumption is violated. It is unclear how to best implement a bootstrapping procedure in the presence of missing data. The proposed methods will provide guidelines and procedures that will enable researchers to use the technique in all areas of health, behavior and developmental science in which a study has missing data and cannot rely on parametric inference. Either bootstrapping or MI can be computationally expensive, and combining these two can lead to further computation costs in the cloud. Using carefully constructed simulation examples, we demonstrate that it is feasible to implement the proposed methodology in a high performance Knights Landing platform. However, the computation costs are substantial even with small data size. Further studies are needed to study the effects of optimizing the implementation and its performance with big data.","PeriodicalId":436299,"journal":{"name":"2017 IEEE 4th International Conference on Cyber Security and Cloud Computing (CSCloud)","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-06-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126104874","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Storage demands in data centers are growing dramatically for most internet and cloud service providers today. More and more service providers are adopting Software-Defined Storage (SDS) instead of traditional Fibre Channel based storage appliances because of lead time, expense, and flexibility. However, data centers are held back by storage I/O that cannot keep up with ever-increasing demand, preventing systems from reaching their full performance potential. Intel Cache Acceleration Software (Intel CAS), combined with high-performance Solid State Drives (SSDs), increases data center performance via intelligent caching rather than extreme spending. This case study shows the decoupling of compute and storage in an Apache Hadoop cluster so that compute and storage can be expanded independently. By decoupling Hadoop HDFS storage from local hard drives and moving it to external Ceph storage, the study demonstrates, through several benchmarking tasks, how Intel Cache Acceleration Software improves performance under the decoupled architecture.
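The caching idea itself can be illustrated with a toy read-through cache: hot blocks are served from a small fast tier sitting in front of a slower backing store. This is only a conceptual sketch of read caching, not Intel CAS or the Ceph/HDFS configuration used in the case study; the class names and the simulated latency are placeholders.

```python
import time
from collections import OrderedDict

class SlowBackingStore:
    """Stand-in for a remote object store; the sleep models network/disk latency."""
    def __init__(self, latency=0.01):
        self.latency = latency
        self.data = {f"block-{i}": bytes(64) for i in range(1000)}

    def read(self, key):
        time.sleep(self.latency)
        return self.data[key]

class ReadThroughLRUCache:
    """Serve hot blocks from memory (standing in for an SSD cache tier)."""
    def __init__(self, backend, capacity=128):
        self.backend, self.capacity = backend, capacity
        self.cache = OrderedDict()

    def read(self, key):
        if key in self.cache:
            self.cache.move_to_end(key)          # refresh LRU position on a hit
            return self.cache[key]
        value = self.backend.read(key)           # miss: fetch from the slow tier
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)       # evict the least recently used block
        return value

if __name__ == "__main__":
    cache = ReadThroughLRUCache(SlowBackingStore())
    hot = [f"block-{i}" for i in range(32)] * 20          # skewed, cache-friendly workload
    start = time.time()
    for key in hot:
        cache.read(key)
    print(f"{len(hot)} reads in {time.time() - start:.2f}s with caching")
```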
{"title":"Performance Study of Ceph Storage with Intel Cache Acceleration Software: Decoupling Hadoop MapReduce and HDFS over Ceph Storage","authors":"V. Shankar, Roscoe Lin","doi":"10.1109/CSCloud.2017.40","DOIUrl":"https://doi.org/10.1109/CSCloud.2017.40","url":null,"abstract":"Storage demands in the data centers are growing dramatically for most internet and cloud service providers today. More and more service providers are adopting Software-Defined Storage (SDS) instead of traditional fiber channel based storage appliances due to the lead time, expense, and flexibility. However, data centers are held back by storage I/O that cannot keep up with ever-increasing demand, preventing systems from reaching their full performance potential. Intel Cache Acceleration Software (Intel CAS), combined with highperformance Solid State Drives (SSDs), increases data center performance via intelligent caching rather than extreme spending. This case study shows the decoupling of compute and storage in the Apache Hadoop cluster so the compute and storage can be expanded independently. While decoupling Hadoop HDFS storage from local hard drives to external Ceph storage, the study demonstrates how the Intel Cache Acceleration Software helps the increase of the performance under the decoupled architecture by several benchmarking tasks.","PeriodicalId":436299,"journal":{"name":"2017 IEEE 4th International Conference on Cyber Security and Cloud Computing (CSCloud)","volume":"3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-06-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125194536","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
OpenHPC is a collaborative project conducted by the Linux Foundation to lower the barriers to deployment, management, and use of modern HPC systems through a reference collection of open-source HPC software components and best practices. Quanta Cloud Technology (QCT) has customized an HPC cluster software stack that includes system provisioning, core HPC services, development tools, and optimized applications and libraries, which are distributed as pre-built, validated binaries and are meant to layer seamlessly on top of popular Linux distributions following the integration conventions defined by the OpenHPC project. The architecture of the QCT HPC Cluster Software Stack is intentionally modular to allow end users to pick and choose from the provided components, as well as to foster a community of open contribution. This paper presents an overview of the underlying customization vision, system architecture, and software components, together with test runs on the QCT Developer Cloud.
{"title":"Customized HPC Cluster Software Stack on QCT Developer Cloud","authors":"Stephen Chang, A. Pan","doi":"10.1109/CSCloud.2017.56","DOIUrl":"https://doi.org/10.1109/CSCloud.2017.56","url":null,"abstract":"OpenHPC is a collaborative project conducted by Linux Foundation to lower barriers to deployment, management, and use of modern HPC system with reference collection of open-source HPC software components and best practices. Quanta Cloud Technology (QCT) customized HPC cluster software stack including system provisioning, core HPC services, development tools, and optimized applications and libraries, which are distributed as pre-built and validated binaries and are meant to seamlessly layer on top of popular Linux distributions with the integration conventions defined by OpenHPC project. The architecture of QCT HPC Cluster Software Stack is intentionally modular to allow end users to pick and choose from the provided components, as well as to foster a community of open contribution. This paper presents an overview of the underlying customized vision, system architecture, software components and run tests on QCT Developer Cloud.","PeriodicalId":436299,"journal":{"name":"2017 IEEE 4th International Conference on Cyber Security and Cloud Computing (CSCloud)","volume":"72 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-06-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126259632","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Sukun Li, A. Leider, Meikang Qiu, Keke Gai, Meiqin Liu
Virtual Reality (VR) research is accelerating the development of inexpensive real-time Brain Computer Interfaces (BCIs). Hardware improvements that increase the capability of VR displays and wearable brain-computer sensors have made possible several new software frameworks that developers can use to create applications combining BCI and VR. These technologies also enable multiple sensory pathways for communicating larger volumes of data to users' brains. The intersection of these two research paths is accelerating both fields and will drive the need for an energy-aware infrastructure to support the wider local bandwidth demands in the mobile cloud. In this paper, we present a survey of BCI in VR from various perspectives, including Electroencephalogram (EEG)-based BCI models, machine learning, and currently active platforms. Based on our investigations, the main findings of this survey highlight three major development trends for BCI: entertainment, VR, and cloud computing.
{"title":"Brain-Based Computer Interfaces in Virtual Reality","authors":"Sukun Li, A. Leider, Meikang Qiu, Keke Gai, Meiqin Liu","doi":"10.1109/CSCloud.2017.51","DOIUrl":"https://doi.org/10.1109/CSCloud.2017.51","url":null,"abstract":"Virtual Reality (VR) research is accelerating the development of inexpensive real-time Brain Computer Interface (BCI). Hardware improvements that increase the capability of Virtual Reality displays and Brain Computer wearable sensors have made possible several new software frameworks for developers to use and create applications combining BCI and VR. It also enables multiple sensory pathways for communications with a larger sized data to users' brains. The intersections of these two research paths are accelerating both fields and will drive the needs for an energy-aware infrastructure to support the wider local bandwidth demands in the mobile cloud. In this paper, we complete a survey on BCI in VR from various perspectives, including Electroencephalogram (EEG)-based BCI models, machine learning, and current active platforms. Based on our investigations, the main findings of this survey highlights three major development trends of BCI, which are entertainment, VR, and cloud computing.","PeriodicalId":436299,"journal":{"name":"2017 IEEE 4th International Conference on Cyber Security and Cloud Computing (CSCloud)","volume":"213 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-06-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134637256","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Finding the best model to reveal potential relationships in a given set of data is not an easy job and often requires many iterations of trial and error for model selection, feature selection, and parameter tuning. This problem is greatly complicated in the big data era, where I/O bottlenecks significantly increase the time needed to find the best model. In this article, we examine the case of the Box-Cox transformation when the assumptions of a regression model are violated. Specifically, we construct and compute a set of summary statistics and recast the maximum likelihood computation into a per-role operational fashion. The proposed algorithms reduce the big data machine learning problem to a stream-based small data learning problem. Once the Box-Cox information array is obtained, the optimal power transformation as well as the corresponding estimates of the model parameters can be quickly computed. To evaluate the performance, we implemented the proposed Box-Cox algorithms on the QCT Developer Cloud. Our results show that, by leveraging both the algorithms and the QCT cloud technology, finding the best-fitting model from 101 candidate parameters is much faster than with the conventional approach.
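For reference, the conventional search that the paper accelerates can be written directly: evaluate the Box-Cox profile log-likelihood over a grid of candidate lambda values (101 below, matching the number of candidates mentioned in the abstract) and keep the maximizer. The numpy sketch below scans the raw data for every lambda; the paper's contribution is to obtain the same quantities from pre-computed summary statistics so the data need not be rescanned. This sketch uses a simple univariate normal model rather than the paper's regression setting.

```python
import numpy as np

def boxcox_transform(y, lam):
    """Box-Cox power transform; y must be strictly positive."""
    if abs(lam) < 1e-8:
        return np.log(y)
    return (y**lam - 1.0) / lam

def boxcox_loglik(y, lam):
    """Profile log-likelihood of lambda under a normal model for the transformed data."""
    n = len(y)
    z = boxcox_transform(y, lam)
    sigma2 = z.var()                                   # MLE variance of transformed data
    return -0.5 * n * np.log(sigma2) + (lam - 1.0) * np.log(y).sum()

def best_lambda(y, grid=np.linspace(-2.0, 2.0, 101)):  # 101 candidate parameters
    scores = np.array([boxcox_loglik(y, lam) for lam in grid])
    return grid[scores.argmax()]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    y = np.exp(rng.normal(size=10_000))                # lognormal data: lambda near 0 is optimal
    print("best lambda:", best_lambda(y))
```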
{"title":"Finding the Best Box-Cox Transformation in Big Data with Meta-Model Learning: A Case Study on QCT Developer Cloud","authors":"Yuxiang Gao, Tonglin Zhang, B. Yang","doi":"10.1109/CSCloud.2017.53","DOIUrl":"https://doi.org/10.1109/CSCloud.2017.53","url":null,"abstract":"Finding the best model to reveal potential relationships of a given set of data is not an easy job and often requires many iterations of trial and errors for model sections, feature selections and parameters tuning. This problem is greatly complicated in the big data era where the I/O bottlenecks significantly slowed down the time needed to finding the best model. In this article, we examine the case of Box-Cox transformation when assumptions of a regression model are violated. Specifically, we construct and compute a set of summary statistics and transformed the maximum likelihood computation into a per-role operational fashion. The innovative algorithms reduced the big data machine learning problem into a stream based small data learning problem. Once the Box-Cox information array is obtained, the optimal power transformation as well as the corresponding estimates of model parameters can be quickly computed. To evaluate the performance, we implemented the proposed Box-Cox algorithms on QCT developer cloud. Our results showed that by leveraging both the algorithms and the QCT cloud technology, find the fittest model from 101 potential parameters is much faster than the conventional approach.","PeriodicalId":436299,"journal":{"name":"2017 IEEE 4th International Conference on Cyber Security and Cloud Computing (CSCloud)","volume":"100 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-06-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123668951","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
In this paper, we present a vulnerability assessment framework that can be used to assess and prevent cyber threats to wired and wireless networks and computer systems. We have performed vulnerability assessment tests on aviation systems, including data loaders, in order to meet aviation industry requirements for wireless network security. Our contributions include detecting cyber vulnerabilities in these aviation systems by using vulnerability assessment and penetration testing tools such as Metasploit Pro and BackTrack, thereby improving the security and safety of aircraft. Based on the test results, corresponding solutions will be developed to fix the identified vulnerabilities, and new vulnerability assessment tests will be conducted until our solutions are secure and safe to use. Some results of our vulnerability assessment tests against our software-hardware products are presented.
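The authors rely on established tools such as Metasploit Pro and BackTrack; as a self-contained illustration of only the most basic building block of such an assessment, the sketch below probes a handful of TCP ports on a host the tester is authorized to scan. The target address and port list are placeholders, and this is not the authors' framework or any real tool's API.

```python
import socket

def probe_tcp_ports(host, ports, timeout=1.0):
    """Return the subset of ports that accept a TCP connection (a very basic
    reachability check; real assessments use dedicated tools and require authorization)."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            if sock.connect_ex((host, port)) == 0:   # 0 means the connection succeeded
                open_ports.append(port)
    return open_ports

if __name__ == "__main__":
    # Placeholder target: only scan systems you are explicitly authorized to test.
    target = "192.0.2.10"                            # TEST-NET address, not a real host
    print(probe_tcp_ports(target, [22, 80, 443, 8080]))
```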
{"title":"Vulnerability Assessment for Security in Aviation Cyber-Physical Systems","authors":"S. Kumar, Brian Xu","doi":"10.1109/CSCloud.2017.17","DOIUrl":"https://doi.org/10.1109/CSCloud.2017.17","url":null,"abstract":"In this paper, we present a vulnerability assessment framework that could be used to assess and prevent cyber threats related to wired and wireless networks and computer systems. We have performed vulnerability assessment tests for aviation systems including data loaders and in order to meet aviation industry requirements for wireless network security. Our contributions include detecting cyber vulnerabilities in these aviation systems by using vulnerability assessment and penetration testing tools such as Metasploit Pro and BackTrack and improving security and safety of aircraft. Based on our test results of cyber vulnerabilities, the corresponding solutions will be developed to fix these vulnerabilities. New vulnerability assessment tests will be conducted again until our solutions are secure and safe to use. Some results of our vulnerability assessment tests against our software-hardware products are presented","PeriodicalId":436299,"journal":{"name":"2017 IEEE 4th International Conference on Cyber Security and Cloud Computing (CSCloud)","volume":"74 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-06-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126472342","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The broad adoption of cloud computing has led to dramatic growth in the exchange and use of data among multiple parties. The main problem restricting the adoption of cloud computing is that users lack control in cloud systems, so security and privacy concerns have become a major issue for cloud users. Logically, a practical Fully Homomorphic Encryption (FHE) scheme is an effective solution for protecting data throughout the data usage lifecycle in the cloud, since it keeps users in full control of their own data. However, no FHE scheme developed so far meets practical demands, owing either to insufficient accuracy or to intolerable latency. Focusing on this issue, we propose an advanced FHE scheme designed to operate on real numbers, named Fully Homomorphic Encryption over Real Numbers (FHE-RN). Our approach performs well in both accuracy and efficiency, as demonstrated by our experimental evaluations.
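The FHE-RN construction itself is not described in the abstract. As a much weaker but self-contained point of comparison, the sketch below implements a textbook additively homomorphic Paillier cryptosystem over fixed-point encodings of real numbers, with hardcoded demo primes and no security margin. It illustrates only homomorphic addition of encrypted reals, not fully homomorphic evaluation, and is not the paper's scheme.

```python
import math
import random

# Textbook Paillier cryptosystem over fixed-point encodings of real numbers.
# Demo-sized hardcoded primes; a real deployment needs large random primes.
P, Q = 1_000_000_007, 1_000_000_009
N = P * Q
N2 = N * N
LAM = math.lcm(P - 1, Q - 1)
MU = pow(LAM, -1, N)          # valid because the generator is g = N + 1
SCALE = 10**6                 # fixed-point precision for real numbers

def encrypt(x: float) -> int:
    m = int(round(x * SCALE)) % N
    r = random.randrange(1, N)
    while math.gcd(r, N) != 1:
        r = random.randrange(1, N)
    return (pow(N + 1, m, N2) * pow(r, N, N2)) % N2

def decrypt(c: int) -> float:
    m = ((pow(c, LAM, N2) - 1) // N) * MU % N
    if m > N // 2:            # recover negative fixed-point values
        m -= N
    return m / SCALE

def add_encrypted(c1: int, c2: int) -> int:
    """Homomorphic addition: the product of ciphertexts decrypts to the sum."""
    return (c1 * c2) % N2

if __name__ == "__main__":
    a, b = encrypt(3.25), encrypt(-1.75)
    print(decrypt(add_encrypted(a, b)))   # 1.5, computed without decrypting a or b
```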
{"title":"Advanced Fully Homomorphic Encryption Scheme Over Real Numbers","authors":"Keke Gai, Meikang Qiu, Yujun Li, Xiao-Yang Liu","doi":"10.1109/CSCloud.2017.61","DOIUrl":"https://doi.org/10.1109/CSCloud.2017.61","url":null,"abstract":"The broad implementation of cloud computing have led to a dramatically growing in exchanging and using data throughout multiple parties. The main problem restricting the implementation of cloud computing is that users lack controls in cloud systems, from which security and privacy concerns become a major issue for cloud users. Logically, an applicable Fully Homomorphic Encryption (FHE) scheme is an effective solution to protecting data throughout the data usage lifecycle in the cloud system, due to the full control on users' own. However, there is no efficacious FHE scheme developed yet for meeting practical demands by the reason of either unqualified accuracy rate or intolerable latency time. Focus on this issue, we propose an advanced FHE scheme designed for operating real numbers, which is named as Full Homomorphic Encryption over Real Numbers (FHE-RN). Our approach has superb performances in both accuracy and efficiency, which has been proved by our experimental evaluations.","PeriodicalId":436299,"journal":{"name":"2017 IEEE 4th International Conference on Cyber Security and Cloud Computing (CSCloud)","volume":"25 2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-06-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125672900","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}