Scalable expectation propagation for generalized linear models

Niccolò Anceschi, Augusto Fasano, Beatrice Franzolini, Giovanni Rebaudo
{"title":"Scalable expectation propagation for generalized linear models","authors":"Niccolò Anceschi, Augusto Fasano, Beatrice Franzolini, Giovanni Rebaudo","doi":"arxiv-2407.02128","DOIUrl":null,"url":null,"abstract":"Generalized linear models (GLMs) arguably represent the standard approach for\nstatistical regression beyond the Gaussian likelihood scenario. When Bayesian\nformulations are employed, the general absence of a tractable posterior\ndistribution has motivated the development of deterministic approximations,\nwhich are generally more scalable than sampling techniques. Among them,\nexpectation propagation (EP) showed extreme accuracy, usually higher than many\nvariational Bayes solutions. However, the higher computational cost of EP posed\nconcerns about its practical feasibility, especially in high-dimensional\nsettings. We address these concerns by deriving a novel efficient formulation\nof EP for GLMs, whose cost scales linearly in the number of covariates p. This\nreduces the state-of-the-art O(p^2 n) per-iteration computational cost of the\nEP routine for GLMs to O(p n min{p,n}), with n being the sample size. We also\nshow that, for binary models and log-linear GLMs approximate predictive means\ncan be obtained at no additional cost. To preserve efficient moment matching\nfor count data, we propose employing a combination of log-normal Laplace\ntransform approximations, avoiding numerical integration. These novel results\nopen the possibility of employing EP in settings that were believed to be\npractically impossible. Improvements over state-of-the-art approaches are\nillustrated both for simulated and real data. The efficient EP implementation\nis available at https://github.com/niccoloanceschi/EPglm.","PeriodicalId":501215,"journal":{"name":"arXiv - STAT - Computation","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-07-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - STAT - Computation","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2407.02128","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Generalized linear models (GLMs) arguably represent the standard approach for statistical regression beyond the Gaussian likelihood scenario. When Bayesian formulations are employed, the general absence of a tractable posterior distribution has motivated the development of deterministic approximations, which are generally more scalable than sampling techniques. Among them, expectation propagation (EP) showed extreme accuracy, usually higher than many variational Bayes solutions. However, the higher computational cost of EP posed concerns about its practical feasibility, especially in high-dimensional settings. We address these concerns by deriving a novel efficient formulation of EP for GLMs, whose cost scales linearly in the number of covariates p. This reduces the state-of-the-art O(p^2 n) per-iteration computational cost of the EP routine for GLMs to O(p n min{p,n}), with n being the sample size. We also show that, for binary models and log-linear GLMs, approximate predictive means can be obtained at no additional cost. To preserve efficient moment matching for count data, we propose employing a combination of log-normal Laplace transform approximations, avoiding numerical integration. These novel results open the possibility of employing EP in settings that were believed to be practically impossible. Improvements over state-of-the-art approaches are illustrated both for simulated and real data. The efficient EP implementation is available at https://github.com/niccoloanceschi/EPglm.
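For a concrete picture of the kind of computation involved, the sketch below runs plain sequential EP for Bayesian probit regression with an isotropic Gaussian prior, keeping the per-site bookkeeping in n-dimensional quantities built from the Gram matrix X X'. It is only a generic illustration of the dual-space idea exploited in the n < p regime and of closed-form probit moment matching; the synthetic data, variable names, and naive O(n^3) refresh are assumptions of this sketch, not the paper's algorithm, which attains the stated O(p n min{p,n}) per-iteration cost through a more careful formulation (see the EPglm repository for the authors' implementation).

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Synthetic high-dimensional probit data with n < p, purely for illustration.
n, p, v0 = 50, 200, 1.0                        # sample size, covariates, prior variance
X = rng.standard_normal((n, p)) / np.sqrt(p)
beta_true = rng.standard_normal(p)
y = np.where(X @ beta_true + rng.standard_normal(n) > 0, 1.0, -1.0)

# The prior beta ~ N(0, v0 I) induces eta = X beta ~ N(0, K0) with K0 = v0 X X'.
# All EP bookkeeping below lives in this n-dimensional space.
K0 = v0 * (X @ X.T)

# Gaussian site parameters (natural form) attached to each linear predictor eta_i.
tau = np.zeros(n)   # site precisions
nu = np.zeros(n)    # site precision-times-mean

def global_moments(tau, nu):
    """Moments of the Gaussian EP approximation of eta:
    S = (K0^{-1} + diag(tau))^{-1} = (I + K0 diag(tau))^{-1} K0,  mu = S nu."""
    S = np.linalg.solve(np.eye(n) + K0 * tau, K0)
    return S @ nu, S

def probit_tilted_moments(m_cav, v_cav, y_i):
    """Closed-form moment matching for one probit site (no numerical integration)."""
    s = np.sqrt(1.0 + v_cav)
    z = y_i * m_cav / s
    r = norm.pdf(z) / norm.cdf(z)
    mu_hat = m_cav + y_i * v_cav * r / s
    v_hat = v_cav - v_cav ** 2 * r * (z + r) / (1.0 + v_cav)
    return mu_hat, v_hat

for sweep in range(20):
    for i in range(n):
        # Naive O(n^3) refresh for clarity; efficient implementations use
        # low-rank updates instead of recomputing S from scratch.
        mu, S = global_moments(tau, nu)
        # Cavity distribution of eta_i: remove site i from its marginal.
        v_cav = 1.0 / (1.0 / S[i, i] - tau[i])
        m_cav = v_cav * (mu[i] / S[i, i] - nu[i])
        mu_hat, v_hat = probit_tilted_moments(m_cav, v_cav, y[i])
        # Reset site i so that cavity times site reproduces the tilted moments.
        tau[i] = 1.0 / v_hat - 1.0 / v_cav
        nu[i] = mu_hat / v_hat - m_cav / v_cav

# Approximate posterior mean of beta, recovered from n-dimensional quantities:
# E[beta] = v0 X' (I + diag(tau) K0)^{-1} nu.
beta_mean = v0 * (X.T @ np.linalg.solve(np.eye(n) + tau[:, None] * K0, nu))
pred = norm.cdf(X @ beta_mean)                 # plug-in predictive probabilities
print("in-sample accuracy:", np.mean((pred > 0.5) == (y > 0)))
```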