Helium line emission spectroscopy to measure plasma parameters using modeling and machine learning in low temperature plasmas
S. Kajita, D. Nishijima
Journal of Physics D: Applied Physics, published 2024-07-08
DOI: 10.1088/1361-6463/ad6007
Abstract
Line intensity ratios (LIRs) of helium (He) atoms are known to depend on the electron density, $n_{\rm e}$, and temperature, $T_{\rm e}$, and are therefore widely used to evaluate these parameters, a technique known as the He I LIR method. In this conventional method, measured LIRs are compared with theoretical values calculated using a collisional-radiative (CR) model to find the best-fit $n_{\rm e}$ and $T_{\rm e}$. Basic CR models have been improved to take several effects into account. For instance, radiation trapping can occur to a significant degree in weakly ionized plasmas, leading to major alterations of LIRs; this effect has been incorporated into CR models through optical escape factors. A new approach to evaluating $n_{\rm e}$ and $T_{\rm e}$ from He I LIRs has recently been explored using machine learning (ML). In the ML-aided LIR method, a predictive model is trained on data consisting of inputs (measured LIRs) and known outputs ($n_{\rm e}$ or $T_{\rm e}$ measured by other diagnostics). It has been demonstrated that this new method predicts $n_{\rm e}$ and $T_{\rm e}$ better than the conventional method coupled with a CR model, not only for He but also for other species.
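The conventional He I LIR procedure described above can be sketched in code as a best-fit search over a grid of $(n_{\rm e}, T_{\rm e})$ values. This is a minimal illustration only: `cr_model_ratios` is a hypothetical placeholder with plausible density and temperature dependences, not a real collisional-radiative model (a real CR model solves the rate equations for the He excited-state populations), and the grid ranges and line-pair choices are assumed for the example.

```python
import numpy as np

# Search grid for the fit; a real analysis would use ranges matched to
# the plasma conditions of interest (values here are illustrative).
ne_grid = np.logspace(17, 20, 31)      # electron density, m^-3
te_grid = np.linspace(2.0, 50.0, 25)   # electron temperature, eV

def cr_model_ratios(ne, te):
    """Stand-in for a collisional-radiative (CR) model.

    A real CR model returns theoretical line intensity ratios at
    (n_e, T_e) from the excited-state rate equations; the smooth
    closed forms below are placeholders with qualitatively reasonable
    density/temperature dependences, used only to make the example run.
    """
    r1 = 0.1 * (te / 10.0) ** 0.8 * (ne / 1e18) ** 0.1  # T_e-sensitive pair
    r2 = 0.5 * (ne / 1e18) ** 0.3 / (1.0 + te / 40.0)   # n_e-sensitive pair
    return np.array([r1, r2])

def fit_ne_te(measured_ratios):
    """Return the grid point whose CR-model ratios best match the
    measurement, minimizing squared differences of the log ratios
    (log space, since ratios can span orders of magnitude)."""
    best, best_err = None, np.inf
    for ne in ne_grid:
        for te in te_grid:
            theory = cr_model_ratios(ne, te)
            err = np.sum((np.log(theory) - np.log(measured_ratios)) ** 2)
            if err < best_err:
                best_err, best = err, (ne, te)
    return best

# Synthetic "measurement" generated at a known operating point,
# to check that the search recovers the nearest grid values.
measured = cr_model_ratios(3e18, 12.0)
ne_fit, te_fit = fit_ne_te(measured)
```

The ML-aided variant replaces this model-based lookup: instead of comparing against CR-model predictions, a regressor is trained on pairs of measured LIRs and $n_{\rm e}$ or $T_{\rm e}$ values obtained from independent diagnostics, and then maps new LIR measurements directly to plasma parameters.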