Can We Securely Outsource Big Data Analytics with Lightweight Cryptography?

Sherman S. M. Chow
{"title":"Can We Securely Outsource Big Data Analytics with Lightweight Cryptography?","authors":"Sherman S. M. Chow","doi":"10.1145/3327962.3331455","DOIUrl":null,"url":null,"abstract":"Advances in cryptography such as secure multiparty computation (SMC) and fully-/somewhat-homomorphic encryption (FHE/SHE) have already provided a generic solution to the problem of processing encrypted data; however, they are still not that efficient if one directly applies them for big data analytics.\n Many cryptographers have recently designed specialized privacy-preserving frameworks for neural networks. While promising, they are still not entirely satisfactory. Gazelle (Usenix Security 2018) supports inference but not training. SecureNN (PoPETS 2019), with the help of non-colluding servers, is still orders of magnitudes slower than plaintext training/inferencing.\n To narrow the gap between theory and practice, we put forward a new paradigm for privacy-preserving big data analytics which leverages both trusted processor such as Intel SGX (Software Guard Extensions) and (untrusted) GPU (Graphics Processing Unit). Note that SGX is not a silver bullet in this scenario. In general, SGX is subject to a memory constraint which can be easily exceeded by a single layer of the (evergrowing) neural networks. Relying on the generic solution such as paging mechanism is, again, inefficient. GPU is an ideal platform for deep learning, yet, we do not want to assume it is trusted. We thus still need cryptographic techniques.\n In this keynote, we will briefly survey the research landscape of privacy-preserving machine learning, point out the obstacles brought by seemingly slight changes of requirements (e.g., a single query from different data sources, multiple model owners, outsourcing a trained model to an untrusted cloud), and highlight a number of settings which aids in ensuring privacy without heavyweight cryptography. We will also discuss two notable recent works, Graviton (OSDI 2018) and Slalom (ICLR 2019), and our ongoing research.","PeriodicalId":284467,"journal":{"name":"IEEE International Conference on Services Computing","volume":"20 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-07-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE International Conference on Services Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3327962.3331455","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 5

Abstract

Advances in cryptography such as secure multiparty computation (SMC) and fully/somewhat homomorphic encryption (FHE/SHE) have already provided a generic solution to the problem of processing encrypted data; however, they are still not efficient enough when applied directly to big data analytics.

Many cryptographers have recently designed specialized privacy-preserving frameworks for neural networks. While promising, they are still not entirely satisfactory. Gazelle (Usenix Security 2018) supports inference but not training. SecureNN (PoPETS 2019), even with the help of non-colluding servers, is still orders of magnitude slower than plaintext training/inference.

To narrow the gap between theory and practice, we put forward a new paradigm for privacy-preserving big data analytics that leverages both a trusted processor such as Intel SGX (Software Guard Extensions) and an (untrusted) GPU (Graphics Processing Unit). Note that SGX is not a silver bullet in this scenario. In general, SGX is subject to a memory constraint that can easily be exceeded by a single layer of today's ever-growing neural networks. Relying on a generic solution such as the paging mechanism is, again, inefficient. The GPU is an ideal platform for deep learning, yet we do not want to assume it is trusted. We thus still need cryptographic techniques.

In this keynote, we will briefly survey the research landscape of privacy-preserving machine learning, point out the obstacles brought by seemingly slight changes in requirements (e.g., a single query from different data sources, multiple model owners, outsourcing a trained model to an untrusted cloud), and highlight a number of settings that help ensure privacy without heavyweight cryptography. We will also discuss two notable recent works, Graviton (OSDI 2018) and Slalom (ICLR 2019), and our ongoing research.
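To make the SGX-plus-untrusted-GPU paradigm concrete, below is a minimal NumPy sketch, not taken from the keynote or from Slalom's implementation, of the blinding-and-verification idea that Slalom popularized: the enclave additively blinds a private input, lets the untrusted GPU evaluate a public linear layer on the blinded value, removes a precomputed offset to recover the true result, and spot-checks the GPU with Freivalds' randomized test. The function names, the tiny modulus, and the toy dimensions are all illustrative assumptions.

```python
# Illustrative sketch (not the keynote's protocol) of Slalom-style outsourcing:
# the enclave hides its private input x behind a random blind r, the untrusted
# GPU multiplies the public weight matrix W with the blinded input, and the
# enclave unblinds and verifies the answer.

import numpy as np

P = 2**13 - 1  # small prime modulus for the sketch; a real system uses a larger field

rng = np.random.default_rng(0)

def enclave_precompute(W, shape):
    """Offline (inside the enclave): draw a random blind r and cache W @ r."""
    r = rng.integers(0, P, size=shape)
    return r, (W @ r) % P

def untrusted_gpu_linear(W, x_blinded):
    """Untrusted device: plain matrix multiply on the blinded input only."""
    return (W @ x_blinded) % P

def enclave_unblind(y_blinded, Wr):
    """Online (inside the enclave): remove the precomputed offset W @ r."""
    return (y_blinded - Wr) % P

def freivalds_check(W, x, y, trials=10):
    """Probabilistically check y == W @ x (mod P) without recomputing W @ x."""
    for _ in range(trials):
        s = rng.integers(0, P, size=y.shape[0])
        if (s @ y) % P != (((s @ W) % P) @ x) % P:
            return False
    return True

# Toy run: one "layer" with public weights W and a private input x.
W = rng.integers(0, P, size=(4, 6))
x = rng.integers(0, P, size=6)

r, Wr = enclave_precompute(W, x.shape)            # offline phase
y_blinded = untrusted_gpu_linear(W, (x + r) % P)  # the GPU only ever sees x + r
y = enclave_unblind(y_blinded, Wr)                # enclave recovers W @ x

assert np.array_equal(y, (W @ x) % P)
assert freivalds_check(W, x, y)
```

Only the linear layers lend themselves to this kind of offloading; non-linear activations and the blinding bookkeeping stay inside the enclave, which is why the SGX memory constraint mentioned above still matters.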