{"title":"Symmetric Least Squares Estimates of Functional Relationships","authors":"Michael T. Kane","doi":"10.1002/ets2.12331","DOIUrl":null,"url":null,"abstract":"<p>Ordinary least squares (OLS) regression provides optimal linear predictions of a dependent variable, <i>y</i>, given an independent variable, <i>x</i>, but OLS regressions are not symmetric or reversible. In order to get optimal linear predictions of <i>x</i> given <i>y</i>, a separate OLS regression in that direction would be needed. This report provides a least squares derivation of the geometric mean (GM) regression line, which is symmetric and reversible, as the line that minimizes a weighted sum of the mean squared errors for <i>y</i>, given <i>x</i>, and for <i>x</i>, given <i>y</i>. It is shown that the GM regression line is symmetric and predicts equally well (or poorly, depending on the absolute value of <i>r</i><sub><i>xy</i></sub>) in both directions. The errors of prediction for the GM line are, naturally, larger for the predictions of both <i>x</i> and <i>y</i> than those for the two OLS equations, each of which is specifically optimized for prediction in one direction, but for high values of , the difference is not large. The GM line has previously been derived as a special case of principal-components analysis and gets its name from the fact that its slope is equal to the geometric mean of the slopes of the OLS regressions of <i>y</i> on <i>x</i> and <i>x</i> on <i>y</i>.</p>","PeriodicalId":11972,"journal":{"name":"ETS Research Report Series","volume":"2021 1","pages":"1-14"},"PeriodicalIF":0.0000,"publicationDate":"2021-10-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1002/ets2.12331","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ETS Research Report Series","FirstCategoryId":"1085","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1002/ets2.12331","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"Social Sciences","Score":null,"Total":0}
Abstract
Ordinary least squares (OLS) regression provides optimal linear predictions of a dependent variable, y, given an independent variable, x, but OLS regressions are not symmetric or reversible. To get optimal linear predictions of x given y, a separate OLS regression in that direction would be needed. This report provides a least squares derivation of the geometric mean (GM) regression line, which is symmetric and reversible, as the line that minimizes a weighted sum of the mean squared errors for y, given x, and for x, given y. It is shown that the GM regression line is symmetric and predicts equally well (or poorly, depending on the absolute value of r_xy) in both directions. The errors of prediction for the GM line are, naturally, larger for the predictions of both x and y than those for the two OLS equations, each of which is specifically optimized for prediction in one direction, but for high values of |r_xy|, the difference is not large. The GM line has previously been derived as a special case of principal-components analysis and gets its name from the fact that its slope is equal to the geometric mean of the slopes of the OLS regressions of y on x and x on y.
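As an illustration of the relationship described in the abstract (not code from the report itself), the following Python sketch computes the GM regression slope as the geometric mean of the OLS slope of y on x and the reciprocal of the OLS slope of x on y, which reduces to sign(r_xy) * s_y / s_x. The function name and the simulated data are assumptions made for this example.

```python
import numpy as np

def gm_regression(x, y):
    """Geometric mean (GM) regression of y on x.

    Returns (slope, intercept). The slope is the geometric mean of the
    OLS slope of y on x and the reciprocal of the OLS slope of x on y,
    i.e. sign(r_xy) * s_y / s_x, so the same line is obtained whether
    one starts from y on x or from x on y (inverted).
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    r = np.corrcoef(x, y)[0, 1]
    s_x, s_y = x.std(ddof=1), y.std(ddof=1)

    b_yx = r * s_y / s_x          # OLS slope for predicting y from x
    b_xy = r * s_x / s_y          # OLS slope for predicting x from y
    # GM slope: geometric mean of b_yx and 1/b_xy, carrying the sign of r
    slope = np.sign(r) * np.sqrt(b_yx * (1.0 / b_xy))
    intercept = y.mean() - slope * x.mean()
    return slope, intercept

# Illustrative data (hypothetical): the GM slope lies between the
# y-on-x OLS slope and the inverted x-on-y OLS slope.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 2.0 * x + rng.normal(scale=1.0, size=200)
slope, intercept = gm_regression(x, y)
print(slope, intercept)  # slope equals sign(r) * s_y / s_x
```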