{"title":"Navigating interpretability and alpha control in GF-KCSD testing with measurement error: A Kernel approach","authors":"Elham Afzali, Saman Muthukumarana, Liqun Wang","doi":"10.1016/j.mlwa.2024.100581","DOIUrl":null,"url":null,"abstract":"<div><p>The Gradient-Free Kernel Conditional Stein Discrepancy (GF-KCSD), presented in our prior work, represents a significant advancement in goodness-of-fit testing for conditional distributions. This method offers a robust alternative to previous gradient-based techniques, especially when the gradient calculation is intractable or computationally expensive. In this study, we explore previously unexamined aspects of GF-KCSD, with a particular focus on critical values and test power, both essential components of effective hypothesis testing. We also present a novel investigation of the impact of measurement error on the performance of GF-KCSD relative to established benchmarks, enhancing our understanding of its resilience to these errors. Through controlled experiments using synthetic data, we demonstrate GF-KCSD’s superior ability to control type-I error rates and maintain high statistical power, even in the presence of measurement inaccuracies. Our empirical evaluation extends to real-world datasets, including brain MRI data. The findings confirm that GF-KCSD performs comparably to KCSD in hypothesis-testing effectiveness while requiring significantly less computational time. This demonstrates GF-KCSD’s capability as an efficient tool for analyzing complex data, enhancing its value for scenarios that demand rapid and robust statistical analysis.</p></div>","PeriodicalId":74093,"journal":{"name":"Machine learning with applications","volume":"17 ","pages":"Article 100581"},"PeriodicalIF":0.0000,"publicationDate":"2024-08-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2666827024000574/pdfft?md5=64343827db2919a23c638fc11f2df65c&pid=1-s2.0-S2666827024000574-main.pdf","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Machine learning with applications","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2666827024000574","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
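The abstract's central ingredients are a kernelized Stein discrepancy statistic, a critical value, and type-I error (alpha) control. The GF-KCSD statistic itself is defined in the authors' prior work and is not reproduced here; as a rough illustration of the general workflow — compute a kernel Stein statistic, obtain a critical value by wild bootstrap, reject if the statistic exceeds it — the sketch below implements a generic *unconditional*, gradient-based KSD test for a one-dimensional model with known score function. All names, the RBF bandwidth, and the bootstrap scheme are illustrative assumptions, not the paper's method.

```python
import numpy as np

def stein_kernel_gauss(x, score, sigma=1.0):
    # Pairwise Stein kernel matrix h_p(x_i, x_j) for a 1-D sample x,
    # built from an RBF kernel k and the model score s(x) = d/dx log p(x):
    #   h_p(x, y) = s(x)s(y)k + s(x) d_y k + s(y) d_x k + d_x d_y k
    d = x[:, None] - x[None, :]
    k = np.exp(-d**2 / (2 * sigma**2))
    s = score(x)
    dkx = -d / sigma**2 * k                       # derivative in first argument
    dky = d / sigma**2 * k                        # derivative in second argument
    dkxy = (1 / sigma**2 - d**2 / sigma**4) * k   # mixed second derivative
    return s[:, None] * s[None, :] * k + s[:, None] * dky + s[None, :] * dkx + dkxy

def ksd_test(x, score, alpha=0.05, n_boot=500, rng=None):
    # V-statistic KSD test: the critical value is the (1 - alpha) quantile
    # of wild-bootstrap replicates with Rademacher weights, which is what
    # gives the test its type-I error (alpha) control under the null.
    rng = np.random.default_rng(rng)
    n = len(x)
    h = stein_kernel_gauss(x, score)
    stat = h.mean()
    boots = np.empty(n_boot)
    for b in range(n_boot):
        eps = rng.choice([-1.0, 1.0], size=n)
        boots[b] = (eps[:, None] * eps[None, :] * h).mean()
    crit = np.quantile(boots, 1 - alpha)
    return stat, crit, stat > crit
```

For example, testing samples drawn from N(2, 1) against the standard normal model (score `lambda x: -x`) yields a statistic well above the bootstrap critical value, so the test rejects; the conditional and gradient-free variants studied in the paper follow the same statistic-versus-critical-value logic.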