Shivam Pandey, Francois Lanusse, Chirag Modi, Benjamin D. Wandelt
We develop a transformer-based conditional generative model for discrete point objects and their properties. We use it to build a model for populating cosmological simulations with gravitationally collapsed structures called dark matter halos. Specifically, we condition our model on the dark matter distribution obtained from fast, approximate simulations to recover the correct three-dimensional positions and masses of individual halos. This leads to a first model that can recover the statistical properties of halos at small scales to better than the 3% level using an accelerated dark matter simulation. This trained model can then be applied to simulations with significantly larger volumes, which would otherwise be computationally prohibitive with traditional simulations, and also provides a crucial missing link towards end-to-end differentiable cosmological simulations. The code, named GOTHAM (Generative cOnditional Transformer for Halo's Auto-regressive Modeling), is publicly available at https://github.com/shivampcosmo/GOTHAM.
"Teaching dark matter simulations to speak the halo language" (arXiv:2409.11401, 2024-09-17)
Alexis Lau, Élodie Choquet, Lisa Altinier, Iva Laginja, Rémi Soummer, Laurent Pueyo, Nicolas Godoy, Arthur Vigan, David Mary
The Roman Space Telescope will be a critical mission to demonstrate high-contrast imaging technologies allowing for the characterisation of exoplanets in reflected light. It will demonstrate $10^{-7}$ contrast limits or better at 3--9 $\lambda/D$ separations with active wavefront control for the first time in space. The detection limits for the Coronagraph Instrument are expected to be set by wavefront variations between the science target and the reference star observations. We are investigating methods to use the deformable mirrors to methodically probe the impact of such variations on the coronagraphic PSF, generating a PSF library during observations of the reference star to optimise the starlight subtraction at post-processing. We are collaborating with STScI to test and validate these methods in the lab using the HiCAT testbed, a high-contrast imaging lab platform dedicated to system-level developments for future space missions. In this paper, we present the first applications of these methods on HiCAT.
"ESCAPE project: testing active observing strategies for high-contrast imaging in space on the HiCAT testbed" (arXiv:2409.11062, 2024-09-17)
Mario Casado Diez, on behalf of the Gaia4Sustainability team
Light pollution is a growing environmental issue that affects astronomy, ecosystems, and human health. To address this, we introduce the Free Dark Sky Meter (FreeDSM), an affordable IoT-based photometer designed for continuous light pollution monitoring. FreeDSM uses an ESP32 microcontroller with integrated sensors for light, temperature, and humidity, and operates on an open-source platform. Data from multiple devices are centralized and processed using the Gambons model, which leverages Gaia satellite data for accurate real-time assessments of natural light levels. This project is part of the Gaia4Sustainability initiative.
"FreeDSM and the Gaia4Sustainability project: a light pollution meter based on IoT technologies" (arXiv:2409.10298, 2024-09-16)
M. Ramirez, G. Pignata, Francisco Förster, Santiago Gonzáles-Gaitán, Claudia P. Gutiérrez, B. Ayala, Guillermo Cabrera-Vives, Márcio Catelan, A. M. Muñoz Arancibia, J. Pineda-García
This paper introduces a novel method for creating spectral time series, which can be used for generating synthetic light curves for photometric classification as well as for applications like K-corrections and bolometric corrections. This approach is particularly valuable in the era of large astronomical surveys, where it can significantly enhance the analysis and understanding of an increasing number of SNe, even in the absence of extensive spectroscopic data. Methods: By employing interpolations based on optimal transport theory, starting from a spectroscopic sequence, we derive weighted average spectra with high cadence. The weights incorporate an uncertainty factor penalizing interpolations between spectra with significant epoch differences and with a poor match between the synthetic and observed photometry. Results: Our analysis reveals that even with phase differences of up to 40 days between pairs of spectra, optimal transport can generate interpolated spectral time series that closely resemble the original ones. Synthetic photometry extracted from these spectral time series aligns well with observed photometry. The best results are achieved in the V band, with relative residuals less than 10% for 87% and 84% of the data for types Ia and II, respectively. For the B, g, R and r bands, between 65% and 87% of the data fall within the previously mentioned 10% threshold for both classes. The worst results correspond to the i and I bands where, in the case of SNe Ia, the values drop to 53% and 42%, respectively. Conclusions: We introduce a new method to construct spectral time series for individual SNe starting from a sparse spectroscopic sequence, demonstrating its capability to produce reliable light curves that can be used for photometric classification.
"A Novel Optimal Transport-Based Approach for Interpolating Spectral Time Series: Paving the Way for Photometric Classification of Supernovae" (arXiv:2409.10701, 2024-09-16)
T. A. Semenikhin, M. V. Kornilov, M. V. Pruzhinskaya, A. D. Lavrukhina, E. Russeil, E. Gangler, E. E. O. Ishida, V. S. Korolev, K. L. Malanchev, A. A. Volnova, S. Sreejith (The SNAD team)
In the task of anomaly detection in modern time-domain photometric surveys, the primary goal is to identify astrophysically interesting, rare, and unusual objects among a large volume of data. Unfortunately, artifacts -- such as plane or satellite tracks, bad columns on CCDs, and ghosts -- often constitute significant contaminants in results from anomaly detection analysis. In such contexts, the Active Anomaly Discovery (AAD) algorithm allows tailoring the output of anomaly detection pipelines according to what the expert judges to be scientifically interesting. We demonstrate how the introduction of real-bogus scores, obtained from a machine learning classifier, improves the results from AAD. Using labeled data from the SNAD ZTF knowledge database, we train four real-bogus classifiers: XGBoost, CatBoost, Random Forest, and Extremely Randomized Trees. All the models perform real-bogus classification with similar effectiveness, achieving ROC-AUC scores ranging from 0.93 to 0.95. Consequently, we select the Random Forest model as the main model due to its simplicity and interpretability. The Random Forest classifier is applied to 67 million light curves from ZTF DR17. The output real-bogus score is used as an additional feature for two anomaly detection algorithms: static Isolation Forest and AAD. While results from Isolation Forest remained unchanged, the number of artifacts detected by the active approach decreased significantly with the inclusion of the real-bogus score, from 27 to 3 out of 100. We conclude that incorporating the real-bogus classifier result as an additional feature in the active anomaly detection pipeline significantly reduces the number of artifacts in the outputs, thereby increasing the incidence of astrophysically interesting objects presented to human experts.
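The core plumbing described above -- train a Random Forest real-bogus classifier, then append its score as an extra feature before anomaly detection -- can be sketched with scikit-learn on synthetic placeholder features (the real pipeline uses SNAD's light-curve features and the AAD algorithm, which scikit-learn does not ship; a static Isolation Forest stands in here):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, IsolationForest

rng = np.random.default_rng(0)

# Placeholder light-curve features; label 1 = real, 0 = bogus
X = rng.normal(size=(500, 5))
y = (X[:, 0] + 0.1 * rng.normal(size=500) > 0).astype(int)

# Step 1: real-bogus classifier and its per-object score
rb = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
rb_score = rb.predict_proba(X)[:, 1]

# Step 2: append the score as an additional feature, then run
# the anomaly detector on the augmented feature matrix
X_aug = np.hstack([X, rb_score[:, None]])
iso = IsolationForest(random_state=0).fit(X_aug)
anomaly = iso.score_samples(X_aug)        # lower = more anomalous
```

In the paper's active setting, the expert's feedback lets AAD learn to down-weight candidates whose low real-bogus score marks them as likely artifacts.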
"Real-bogus scores for active anomaly detection" (arXiv:2409.10256, 2024-09-16)
Lunar reference systems represent a fundamental aspect of lunar exploration. This paper presents a review of the topic in the context of the ESA lunar programme Moonlight. It describes the current state of the art in the definition of the lunar reference frame and introduces TCL, a lunar time scale based on IAU resolutions. It also proposes several possible implementations of this time scale for orbiting and ground-based clocks. Finally, it provides an assessment of the improvement of the lunar reference frame that would result from the addition of retro-reflectors on the lunar surface and the use of orbiter altimetry. This document is an appendix, dedicated to the definition of lunar reference systems, of a broader document presenting new concepts for orbit determination and time synchronization of a lunar radio navigation system.
"Lunar References Systems, Frames and Time-scales in the context of the ESA Programme Moonlight" by Agnes Fienga, Nicolas Rambaux, Krzysztof Sosnica (arXiv:2409.10043, 2024-09-16)
Aneta Siemiginowska, Douglas Burke, Hans Moritz Günther, Nicholas P. Lee, Warren McLaughlin, David A. Principe, Harlan Cheer, Antonella Fruscione, Omar Laurino, Jonathan McDowell, Marie Terrell
We present an overview of Sherpa, an open-source Python project, and discuss its development history, broad design concepts, and capabilities. Sherpa contains powerful tools for combining parametric models into complex expressions that can be fit to data using a variety of statistics and optimization methods. It is easily extensible to include user-defined models, statistics, and optimization methods. It provides a high-level User Interface for interactive data analysis, such as within a Jupyter notebook, and it can also be used as a library component, providing fitting and modeling capabilities to an application. We include a few examples of Sherpa applications to multiwavelength astronomical data. The code is available on GitHub: https://github.com/sherpa/sherpa
"Sherpa: An Open Source Python Fitting Package" (arXiv:2409.10400, 2024-09-16)
Hyukmo Kang, Kyle Van Gorkom, Meghdoot Biswas, Daewook Kim, Ewan S. Douglas
Continuous wavefront sensing benefits space observatories in on-orbit optical performance maintenance. To measure the phase of a wavefront, phase retrieval is an attractive technique as it uses multiple point spread function (PSF) images that are acquired by the telescope itself, without extra metrology systems or complicated calibration. Focus-diverse phase retrieval utilizes PSFs from predetermined defocused positions to enhance the dynamic range of the algorithm. We describe an updated visible-light active optics testbed with the addition of a linear motorized focus stage. The performance of the phase retrieval algorithm in broadband is tested under various cases. While broadband filters have the advantage of a higher signal-to-noise ratio (SNR), the performance of phase retrieval can be restricted by image blurring caused by diffraction and by increased computing cost. We used multiple bandpass filters (10 nm, 88 nm, and 150 nm) and investigated the effects of bandwidth on the accuracy and on required image acquisition conditions such as SNR, reaching accuracies below 20 nm RMS wavefront error at the widest bandwidth. We also investigated the dynamic range of the phase retrieval algorithm depending on the bandwidth, and the amount of defocus required to expand the dynamic range. Finally, we simulated the continuous wavefront sensing and correction loop with a range of statistically generated, representative telescope disturbance time series to test for edge cases.
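The simplest member of this family of algorithms, single-plane Gerchberg-Saxton iteration, conveys the core idea: alternate between imposing the known pupil amplitude and the measured focal-plane amplitude. This is a monochromatic sketch only; the paper's focus-diverse, broadband algorithm adds analogous constraints at defocused planes and across wavelengths:

```python
import numpy as np

def gerchberg_saxton(pupil_amp, focal_intensity, n_iter=200, seed=0):
    """Recover a pupil phase estimate from one focal-plane intensity
    image by alternating projections (Gerchberg-Saxton)."""
    rng = np.random.default_rng(seed)
    phase = rng.uniform(-np.pi, np.pi, pupil_amp.shape)
    target = np.sqrt(focal_intensity)
    for _ in range(n_iter):
        focal = np.fft.fft2(pupil_amp * np.exp(1j * phase))
        focal = target * np.exp(1j * np.angle(focal))  # impose measured amplitude
        phase = np.angle(np.fft.ifft2(focal))          # impose pupil amplitude
    return phase

# Hypothetical test scene: circular pupil with a smooth aberration
n = 32
y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
pupil = (x**2 + y**2 <= 1).astype(float)
true_phase = 0.5 * x * pupil
intensity = np.abs(np.fft.fft2(pupil * np.exp(1j * true_phase)))**2
est = gerchberg_saxton(pupil, intensity)
```

Because the focal-plane amplitude mismatch is non-increasing under these alternating projections, the iteration steadily reduces the inconsistency between the phase estimate and the measured PSF, although a single plane leaves sign and piston ambiguities that focus diversity helps resolve.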
"Focus diverse phase retrieval test results on broadband continuous wavefront sensing in space telescope applications" (arXiv:2409.10500, 2024-09-16)
Z. Wang, K. W. Bannister, V. Gupta, X. Deng, M. Pilawa, J. Tuthill, J. D. Bunton, C. Flynn, M. Glowacki, A. Jaini, Y. W. J. Lee, E. Lenc, J. Lucero, A. Paek, R. Radhakrishnan, N. Thyagarajan, P. Uttarkar, Y. Wang, N. D. R. Bhat, C. W. James, V. A. Moss, Tara Murphy, J. E. Reynolds, R. M. Shannon, L. G. Spitler, A. Tzioumis, M. Caleb, A. T. Deller, A. C. Gordon, L. Marnoch, S. D. Ryder, S. Simha, C. S. Anderson, L. Ball, D. Brodrick, F. R. Cooray, N. Gupta, D. B. Hayman, A. Ng, S. E. Pearce, C. Phillips, M. A. Voronkov, T. Westmeier
We present the first results from a new backend on the Australian Square Kilometre Array Pathfinder, the Commensal Realtime ASKAP Fast Transient COherent (CRACO) upgrade. CRACO records millisecond time resolution visibility data and searches for dispersed fast transient signals, including fast radio bursts (FRBs), pulsars, and ultra-long period objects (ULPOs). With the visibility data, CRACO can localise transient events to arcsecond-level precision after detection. Here, we describe the CRACO system and report the results from a sky survey carried out by CRACO at 110 ms resolution during its commissioning phase. During the survey, CRACO detected two FRBs (including one discovered solely with CRACO, FRB 20231027A), reported more precise localisations for four pulsars, discovered two new RRATs, and detected one known ULPO, GPM J1839-10, through its sub-pulse structure. We present a sensitivity calibration of CRACO, finding that it achieves the expected sensitivity of 11.6 Jy ms to bursts of 110 ms duration or less. CRACO is currently running at a 13.8 ms time resolution and aims at a 1.7 ms time resolution before the end of 2024. The planned CRACO has an expected sensitivity of 1.5 Jy ms to bursts of 1.7 ms duration or less, and can detect 10x more FRBs than the current CRAFT incoherent sum system (i.e., 0.5-2 localised FRBs per day), enabling us to better constrain FRB emission mechanism models and use FRBs as cosmological probes.
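The dispersed-transient search at the heart of such a backend reduces, per trial dispersion measure (DM), to undoing the frequency-dependent cold-plasma delay and summing channels. A minimal incoherent dedispersion sketch on a synthetic dynamic spectrum (the standard dispersion-delay formula; everything else here is a toy, not CRACO's pipeline):

```python
import numpy as np

DM_CONST = 4.148808e3  # s MHz^2 pc^-1 cm^3: cold-plasma dispersion constant

def dedisperse(dynspec, freqs_mhz, dm, dt):
    """Shift each channel to undo the dispersion delay relative to the
    highest frequency, then sum channels (incoherent dedispersion)."""
    delays = DM_CONST * dm * (freqs_mhz**-2 - freqs_mhz.max()**-2)  # seconds
    shifts = np.round(delays / dt).astype(int)
    out = np.zeros(dynspec.shape[1])
    for ch, s in enumerate(shifts):
        out += np.roll(dynspec[ch], -s)
    return out

# Synthetic dispersed pulse: unit spike per channel, delayed by DM = 100
nchan, nsamp, dt, dm = 16, 512, 1e-3, 100.0
freqs = np.linspace(800.0, 1000.0, nchan)
delays = DM_CONST * dm * (freqs**-2 - freqs.max()**-2)
dynspec = np.zeros((nchan, nsamp))
for ch in range(nchan):
    dynspec[ch, 50 + int(round(delays[ch] / dt))] = 1.0

profile = dedisperse(dynspec, freqs, dm, dt)  # pulse re-aligns at sample 50
```

Searching over a grid of trial DMs and thresholding the resulting profiles is what turns this into a blind transient search; doing it on visibilities is what lets CRACO localise each candidate.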
"The CRAFT Coherent (CRACO) upgrade I: System Description and Results of the 110-ms Radio Transient Pilot Survey" (arXiv:2409.10316, 2024-09-16)
Imdad Mahmud Pathi, John Y. H. Soo, Mao Jie Wee, Sazatul Nadhilah Zakaria, Nur Azwin Ismail, Carlton M. Baugh, Giorgio Manzoni, Enrique Gaztanaga, Francisco J. Castander, Martin Eriksen, Jorge Carretero, Enrique Fernandez, Juan Garcia-Bellido, Ramon Miquel, Cristobal Padilla, Pablo Renard, Eusebio Sanchez, Ignacio Sevilla-Noarbe, Pau Tallada-Crespí
ANNZ is a fast and simple algorithm utilising artificial neural networks (ANNs); it was one of the pioneering machine learning approaches to photometric redshift estimation decades ago. We enhance the algorithm by introducing new activation functions such as tanh, softplus, SiLU, Mish and ReLU variants; its new performance is then rigorously tested on legacy samples such as the Luminous Red Galaxy (LRG) and Stripe-82 samples from SDSS, as well as modern galaxy samples such as the Physics of the Accelerating Universe Survey (PAUS). This work focuses on testing the robustness of activation functions with respect to the choice of ANN architecture, particularly its depth and width, in the context of galaxy photometric redshift estimation. Our upgraded algorithm, which we name ANNZ+, shows that the tanh and Leaky ReLU activation functions provide more consistent and stable results across deeper and wider architectures, with > 1 per cent improvement in root-mean-square error ($\sigma_{\textrm{RMS}}$) and 68th percentile error ($\sigma_{68}$) when tested on SDSS data sets. While assessing its capability to handle high-dimensional inputs, we achieved an improvement of 11 per cent in $\sigma_{\textrm{RMS}}$ and 6 per cent in $\sigma_{68}$ with the tanh activation function when tested on the 40-narrowband PAUS dataset; it even outperformed ANNZ2, its supposed successor, by 44 per cent in $\sigma_{\textrm{RMS}}$. This justifies the effort to upgrade the 20-year-old ANNZ, allowing it to remain viable and competitive within the photo-z community today. The updated algorithm ANNZ+ is publicly available at https://github.com/imdadmpt/ANNzPlus.
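For readers unfamiliar with the activation functions named in the abstract, the conventional definitions are sketched below. This is a minimal illustration using the standard formulas, not ANNZ+'s internal implementation, which may differ in details:

```python
import math

# Conventional definitions of the activations compared in ANNZ+.
# tanh is available directly as math.tanh.

def softplus(x):
    # Smooth approximation to ReLU: log(1 + e^x)
    return math.log1p(math.exp(x))

def silu(x):
    # SiLU ("swish"): x * sigmoid(x)
    return x / (1.0 + math.exp(-x))

def mish(x):
    # Mish: x * tanh(softplus(x))
    return x * math.tanh(softplus(x))

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: small negative slope instead of a hard zero
    return x if x > 0 else alpha * x

# At x = 0, tanh, SiLU, Mish and Leaky ReLU all pass through the
# origin, while softplus has a nonzero offset of ln(2).
print(math.tanh(0.0), softplus(0.0), silu(0.0), mish(0.0), leaky_relu(0.0))
```

The abstract's finding that tanh and Leaky ReLU behave most consistently across deep architectures is plausibly related to their bounded (tanh) or non-vanishing (Leaky ReLU) gradients, though the paper itself should be consulted for the analysis.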
{"title":"ANNZ+: an enhanced photometric redshift estimation algorithm with applications on the PAU Survey","authors":"Imdad Mahmud Pathi, John Y. H. Soo, Mao Jie Wee, Sazatul Nadhilah Zakaria, Nur Azwin Ismail, Carlton M. Baugh, Giorgio Manzoni, Enrique Gaztanaga, Francisco J. Castander, Martin Eriksen, Jorge Carretero, Enrique Fernandez, Juan Garcia-Bellido, Ramon Miquel, Cristobal Padilla, Pablo Renard, Eusebio Sanchez, Ignacio Sevilla-Noarbe, Pau Tallada-Crespí","doi":"arxiv-2409.09981","DOIUrl":"https://doi.org/arxiv-2409.09981","url":null,"abstract":"ANNZ is a fast and simple algorithm which utilises artificial neural networks (ANNs), it was known as one of the pioneers of machine learning approaches to photometric redshift estimation decades ago. We enhanced the algorithm by introducing new activation functions like tanh, softplus, SiLU, Mish and ReLU variants; its new performance is then vigorously tested on legacy samples like the Luminous Red Galaxy (LRG) and Stripe-82 samples from SDSS, as well as modern galaxy samples like the Physics of the Accelerating Universe Survey (PAUS). This work focuses on testing the robustness of activation functions with respect to the choice of ANN architectures, particularly on its depth and width, in the context of galaxy photometric redshift estimation. Our upgraded algorithm, which we named ANNZ+, shows that the tanh and Leaky ReLU activation functions provide more consistent and stable results across deeper and wider architectures with > 1 per cent improvement in root-mean-square error ($sigma_{textrm{RMS}}$) and 68th percentile error ($sigma_{68}$) when tested on SDSS data sets. While assessing its capabilities in handling high dimensional inputs, we achieved an improvement of 11 per cent in $sigma_{textrm{RMS}}$ and 6 per cent in $sigma_{68}$ with the tanh activation function when tested on the 40-narrowband PAUS dataset; it even outperformed ANNZ2, its supposed successor, by 44 per cent in $sigma_{textrm{RMS}}$. This justifies the effort to upgrade the 20-year-old ANNZ, allowing it to remain viable and competitive within the photo-z community today. The updated algorithm ANNZ+ is publicly available at https://github.com/imdadmpt/ANNzPlus.","PeriodicalId":501163,"journal":{"name":"arXiv - PHYS - Instrumentation and Methods for Astrophysics","volume":"210 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-09-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142260609","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}