{"title":"拟合多层次因子模型","authors":"Tetiana Parshakova, Trevor Hastie, Stephen Boyd","doi":"arxiv-2409.12067","DOIUrl":null,"url":null,"abstract":"We examine a special case of the multilevel factor model, with covariance\ngiven by multilevel low rank (MLR) matrix~\\cite{parshakova2023factor}. We\ndevelop a novel, fast implementation of the expectation-maximization (EM)\nalgorithm, tailored for multilevel factor models, to maximize the likelihood of\nthe observed data. This method accommodates any hierarchical structure and\nmaintains linear time and storage complexities per iteration. This is achieved\nthrough a new efficient technique for computing the inverse of the positive\ndefinite MLR matrix. We show that the inverse of an invertible PSD MLR matrix\nis also an MLR matrix with the same sparsity in factors, and we use the\nrecursive Sherman-Morrison-Woodbury matrix identity to obtain the factors of\nthe inverse. Additionally, we present an algorithm that computes the Cholesky\nfactorization of an expanded matrix with linear time and space complexities,\nyielding the covariance matrix as its Schur complement. This paper is\naccompanied by an open-source package that implements the proposed methods.","PeriodicalId":501340,"journal":{"name":"arXiv - STAT - Machine Learning","volume":"18 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Fitting Multilevel Factor Models\",\"authors\":\"Tetiana Parshakova, Trevor Hastie, Stephen Boyd\",\"doi\":\"arxiv-2409.12067\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We examine a special case of the multilevel factor model, with covariance\\ngiven by multilevel low rank (MLR) matrix~\\\\cite{parshakova2023factor}. We\\ndevelop a novel, fast implementation of the expectation-maximization (EM)\\nalgorithm, tailored for multilevel factor models, to maximize the likelihood of\\nthe observed data. This method accommodates any hierarchical structure and\\nmaintains linear time and storage complexities per iteration. This is achieved\\nthrough a new efficient technique for computing the inverse of the positive\\ndefinite MLR matrix. We show that the inverse of an invertible PSD MLR matrix\\nis also an MLR matrix with the same sparsity in factors, and we use the\\nrecursive Sherman-Morrison-Woodbury matrix identity to obtain the factors of\\nthe inverse. Additionally, we present an algorithm that computes the Cholesky\\nfactorization of an expanded matrix with linear time and space complexities,\\nyielding the covariance matrix as its Schur complement. 
This paper is\\naccompanied by an open-source package that implements the proposed methods.\",\"PeriodicalId\":501340,\"journal\":{\"name\":\"arXiv - STAT - Machine Learning\",\"volume\":\"18 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - STAT - Machine Learning\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2409.12067\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - STAT - Machine Learning","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.12067","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
We examine a special case of the multilevel factor model, in which the covariance is given by a multilevel low rank (MLR) matrix \cite{parshakova2023factor}. We develop a novel, fast implementation of the expectation-maximization (EM) algorithm, tailored to multilevel factor models, that maximizes the likelihood of the observed data. The method accommodates any hierarchical structure and maintains linear time and storage complexity per iteration.
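For orientation, a two-level special case of such a covariance can be written as follows; the symbols F_0, F_{1,j}, D, S, and N are introduced here purely for illustration and are not taken from the paper:

\[
\Sigma \;=\; F_0 F_0^\top
\;+\; \mathbf{blkdiag}\bigl(F_{1,1} F_{1,1}^\top,\ \dots,\ F_{1,k} F_{1,k}^\top\bigr)
\;+\; D,
\]

where F_0 collects loadings shared by all observations, F_{1,j} collects loadings specific to group j of a partition into k groups, and D \succ 0 is diagonal; deeper hierarchies add further block-diagonal low-rank terms. Assuming zero-mean Gaussian data with sample covariance S formed from N observations, EM maximizes the log-likelihood

\[
\ell(\Sigma) \;=\; -\tfrac{N}{2}\bigl(\log\det\Sigma + \mathop{\bf tr}(\Sigma^{-1} S)\bigr) + \mathrm{const}
\]

over the factors and the diagonal term.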
These complexity guarantees rest on a new, efficient technique for computing the inverse of a positive definite MLR matrix. We show that the inverse of an invertible positive semidefinite (PSD) MLR matrix is itself an MLR matrix with the same sparsity pattern in its factors, and we apply the Sherman-Morrison-Woodbury matrix identity recursively to obtain the factors of the inverse.
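The identity behind this step, stated here in its standard form, is

\[
(A + UCV)^{-1} \;=\; A^{-1} - A^{-1} U \bigl(C^{-1} + V A^{-1} U\bigr)^{-1} V A^{-1},
\]

valid whenever A, C, and C^{-1} + V A^{-1} U are invertible. In the single-level, diagonal-plus-low-rank special case \Sigma = D + F F^\top it gives

\[
\Sigma^{-1} \;=\; D^{-1} - D^{-1} F \bigl(I + F^\top D^{-1} F\bigr)^{-1} F^\top D^{-1},
\]

which again has the form of a diagonal matrix plus a term of rank at most \mathrm{rank}(F); applied recursively, level by level, this preserves the factor sparsity of the full MLR inverse.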
Additionally, we present an algorithm that computes the Cholesky factorization of an expanded matrix in linear time and space, yielding the covariance matrix as its Schur complement. This paper is accompanied by an open-source package that implements the proposed methods.
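As a minimal sketch of the expanded-matrix idea, the single-level example below (Python with NumPy; the particular expanded matrix M is an illustrative assumption, not necessarily the paper's construction) checks numerically that the covariance appears as a Schur complement:

import numpy as np

# Single-level illustration: Sigma = D + F F^T, with F tall-and-skinny and D diagonal.
rng = np.random.default_rng(0)
n, r = 6, 2
F = rng.standard_normal((n, r))             # factor loadings (illustrative)
D = np.diag(rng.uniform(0.5, 1.5, size=n))  # diagonal (idiosyncratic) variances
Sigma = D + F @ F.T

# Expanded matrix M = [[-I, F^T], [F, D]]. The Schur complement of the leading
# -I block is D - F (-I)^{-1} F^T = D + F F^T = Sigma.
M = np.block([[-np.eye(r), F.T],
              [F,          D  ]])
schur = M[r:, r:] - M[r:, :r] @ np.linalg.solve(M[:r, :r], M[:r, r:])
print(np.allclose(schur, Sigma))  # True

# The paper's algorithm exploits the sparsity of its multilevel expanded matrix to
# factor it in time and memory linear in the dimension; this snippet only verifies
# the Schur-complement relationship on a small dense toy example.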