The Kepler end-to-end data pipeline: From photons to far away worlds

B. Cooke, R. Thompson, S. Standley
{"title":"The Kepler end-to-end data pipeline: From photons to far away worlds","authors":"B. Cooke, R. Thompson, S. Standley","doi":"10.1109/AERO.2012.6187170","DOIUrl":null,"url":null,"abstract":"Launched by NASA on 6 March 2009, the Kepler Mission has been observing more than 100,000 targets in a single patch of sky between the constellations Cygnus and Lyra almost continuously for the last two years looking for planetary systems using the transit method. As of October 2011, the Kepler spacecraft has collected and returned to Earth just over 290 GB of data, identifying 1235 planet candidates with 25 of these candidates confirmed as planets via ground observation. Extracting the telltale signature of a planetary system from stellar photometry where valid signal transients can be as small as a 40 ppm is a difficult and exacting task. The end-to-end process of determining planetary candidates from noisy, raw photometric measurements is discussed. The Kepler mission is described in overview and the Kepler technique for discovering exoplanets is discussed. The design and implementation of the Kepler spacecraft, tracing the data path from photons entering the telescope aperture through raw observation data transmitted to the ground operations team is described. The technical challenges of operating a large aperture photometer with an unprecedented 95 million pixel detector are addressed as well as the onboard technique for processing and reducing the large volume of data produced by the Kepler photometer. The technique and challenge of day-to-day mission operations that result in a very high percentage of time on target is discussed. This includes the day to day process for monitoring and managing the health of the spacecraft, the annual process for maintaining sun on the solar arrays while still keeping the telescope pointed at the fixed science target, the process for safely but rapidly returning to science operations after a spacecraft initiated safing event and the long term anomaly resolution process. The ground data processing pipeline, from the point that science data is received on the ground to the presentation of preliminary planetary candidates and supporting data to the science team for further evaluation is discussed. Ground management, control, exchange and storage of Kepler's large and growing data set is discussed as well as the process and techniques for removing noise sources and applying calibrations to intermediate data products.","PeriodicalId":6421,"journal":{"name":"2012 IEEE Aerospace Conference","volume":"23 1","pages":"1-9"},"PeriodicalIF":0.0000,"publicationDate":"2012-03-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2012 IEEE Aerospace Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/AERO.2012.6187170","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Launched by NASA on 6 March 2009, the Kepler Mission has been observing more than 100,000 targets in a single patch of sky between the constellations Cygnus and Lyra almost continuously for the last two years, looking for planetary systems using the transit method. As of October 2011, the Kepler spacecraft has collected and returned to Earth just over 290 GB of data, identifying 1235 planet candidates, 25 of which have been confirmed as planets via ground-based observation. Extracting the telltale signature of a planetary system from stellar photometry, where valid signal transients can be as small as 40 ppm, is a difficult and exacting task. The end-to-end process of determining planetary candidates from noisy, raw photometric measurements is discussed. The Kepler mission is described in overview and the Kepler technique for discovering exoplanets is discussed. The design and implementation of the Kepler spacecraft are described, tracing the data path from photons entering the telescope aperture to raw observation data transmitted to the ground operations team. The technical challenges of operating a large-aperture photometer with an unprecedented 95-million-pixel detector are addressed, as is the onboard technique for processing and reducing the large volume of data produced by the Kepler photometer. The techniques and challenges of day-to-day mission operations that result in a very high percentage of time on target are discussed. These include the day-to-day process for monitoring and managing the health of the spacecraft, the annual process for keeping the Sun on the solar arrays while still keeping the telescope pointed at the fixed science target, the process for safely but rapidly returning to science operations after a spacecraft-initiated safing event, and the long-term anomaly resolution process. The ground data processing pipeline is discussed, from the point that science data is received on the ground to the presentation of preliminary planetary candidates and supporting data to the science team for further evaluation. Ground management, control, exchange, and storage of Kepler's large and growing data set are discussed, as are the processes and techniques for removing noise sources and applying calibrations to intermediate data products.
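To give a sense of the photometric scale the abstract refers to, the short Python sketch below computes the transit depth of an Earth-size planet crossing a Sun-size star and injects a 40 ppm dip into a synthetic light curve. This is an illustrative sketch only, not the Kepler flight or ground pipeline: the 30-minute cadence, the assumed 100 ppm per-point white noise, and the 6-hour transit duration are placeholder assumptions chosen for the example, not mission parameters.

# Illustrative sketch only -- not the Kepler flight or ground pipeline.
# A transit dims the star by roughly (R_planet / R_star)^2, so Earth-size
# signals are of order tens of ppm, as the abstract notes.

import numpy as np

R_EARTH_KM = 6_371.0      # mean Earth radius
R_SUN_KM   = 695_700.0    # nominal solar radius

# Transit depth for an Earth analog crossing a Sun-like star, in ppm (~84 ppm).
depth_ppm = (R_EARTH_KM / R_SUN_KM) ** 2 * 1e6
print(f"Earth-analog transit depth: {depth_ppm:.0f} ppm")

# Toy light curve: one 90-day stretch of 30-minute cadences with an assumed
# 100 ppm white noise per point, and a single 40 ppm, 6-hour dip injected.
rng = np.random.default_rng(42)
cadence_min = 30
n_points = 90 * 24 * 60 // cadence_min            # points in 90 days
flux = 1.0 + rng.normal(0.0, 100e-6, n_points)    # relative flux

transit_len = 6 * 60 // cadence_min               # points in a 6-hour transit
t0 = n_points // 2
flux[t0:t0 + transit_len] -= 40e-6                # inject the 40 ppm dip

# The average of the in-transit points is only marginally below the noise
# floor of a single transit (~100 ppm / sqrt(12), about 29 ppm), which is why
# many transits must be folded and averaged before a candidate stands out.
single_transit_ppm = (flux[t0:t0 + transit_len].mean() - 1.0) * 1e6
print(f"In-transit flux offset from one transit: {single_transit_ppm:+.0f} ppm")

In this toy setting a single transit is barely distinguishable from the noise; averaging many transits over a multi-year baseline is what makes signals at this level recoverable, which is the statistical motivation for the long, nearly uninterrupted pointing the paper describes.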