
GeoResJ Latest Publications

The process of bringing dark data to light: The rescue of the early Nimbus satellite data
Pub Date : 2015-06-01 DOI: 10.1016/j.grj.2015.02.013
D. Gallaher , G.G. Campbell , W. Meier , J. Moses , D. Wingo

Myriad environmental satellite missions are currently orbiting the earth. The comprehensive monitoring by these sensors provides scientists, policymakers, and the public with critical information on the earth’s weather and climate system. The state-of-the-art technology of our satellite monitoring system is the legacy of the first environmental satellites, the Nimbus systems launched by NASA in the mid-1960s. Such early data can extend our climate record and provide important context for longer-term climate changes. However, the data were stowed away and, over the years, largely forgotten. They were nearly lost before their value was recognized and attempts to recover them were undertaken. This paper covers what it took the authors to recover, navigate, and reprocess the data into modern formats so that they could be used as part of the satellite climate record. The procedures to recover the Nimbus data, from both film and tape, could be used by other data rescue projects; however, the algorithms presented tend to be Nimbus-specific. Data rescue projects are often both difficult and time-consuming, but the data they bring back to the science community make these efforts worthwhile.

Citations: 9
When are Old Data New Data?
Pub Date : 2015-06-01 DOI: 10.1016/j.grj.2015.02.004
R. Elizabeth Griffin, the CODATA Task Group ‘Data At Risk’ (DAR-TG)

What is the value of ‘old’ data when much more sophisticated data are being acquired today in huge quantities with modern equipment and served up in ready-to-use form? Why the hype over delving into the past, when the observers were undoubtedly less well informed than they are today? What can such old records possibly teach us that we don’t already know better from modern electronic data and today’s sophisticated experiments? As this paper demonstrates, the answers to those questions lie in the critical scientific advantages of the long-term date-stamps which only historical data carry.

Citations: 37
Soil survey data rescued by means of user friendly soil identification keys and toposequence models to deliver soil information for improved land management
Pub Date : 2015-06-01 DOI: 10.1016/j.grj.2015.02.006
G.J. Grealish , R.W. Fitzpatrick , J.L. Hutson

In many countries there is a large store of soil survey information that could be used to guide land management decisions. This soil information is commonly undervalued and underused because it is usually not in a user-friendly format that the non-soil specialists who generally make land management decisions can readily apply, nor are soil specialists always immediately available to carry out the interpretation required.

The aim of this work was to develop an approach to convey soil survey information by means of special-purpose soil classifications and conceptual toposequence models in order to improve land management decisions. The approach: (i) salvages and reinterprets valuable soil survey legacy data from the plethora of detailed published soil survey technical reports and their numerous appendices of quantitative and qualitative data, and (ii) delivers complex or intricate soil survey information to non-soil specialists using a vocabulary and diagrams that they can understand and have available to apply when they need it.

To illustrate the wide applicability of this approach, case studies were conducted in three different parts of the world – Kuwait, Brunei, and Australia – each of which exhibits vastly different landscapes, climates, soil types, and land use problems. Pedologists distilled published soil survey information and identified a limited set of soil properties related to landscape position, which enabled non-soil specialists to determine soil types by following a user-friendly approach and format. This provides a wider audience with information about soils, rather than always relying on a limited number of soil specialists to conduct the work.
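The special-purpose identification keys described above can be pictured as a short dichotomous key that a non-specialist walks through by answering observable field questions. A minimal sketch of that idea follows; the questions, branch structure, and soil names here are illustrative inventions, not taken from the case studies.

```python
# Hypothetical sketch of a user-friendly soil identification key:
# each node is a (question, yes-branch, no-branch) tuple and each
# leaf is a soil type name, so identification is a simple walk.
from typing import Union

Key = Union[str, tuple]  # leaf soil type, or (question, yes, no)

KEY: Key = (
    "Is the site in a low-lying landscape position?",
    ("Is the topsoil waterlogged for part of the year?",
     "poorly drained valley soil",
     "imperfectly drained footslope soil"),
    ("Is the surface soil sandy?",
     "well drained sandy upslope soil",
     "well drained loamy upslope soil"),
)

def identify(key: Key, answers: dict) -> str:
    """Walk the key using yes/no answers keyed by question text."""
    while isinstance(key, tuple):
        question, yes_branch, no_branch = key
        key = yes_branch if answers[question] else no_branch
    return key

result = identify(KEY, {
    "Is the site in a low-lying landscape position?": False,
    "Is the surface soil sandy?": True,
})
print(result)  # well drained sandy upslope soil
```

Encoding the key as plain data rather than code is what makes it easy to swap in a different special-purpose classification for each region.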

The details provided in the case studies are applicable to the local areas for which they were prepared. However, the structured approach developed and used here is applicable to other locations throughout the world beyond the three study areas: (i) Brunei, especially in tropical landscapes; (ii) Kuwait, especially in arid and semi-arid landscapes; and (iii) Australia's winter-rainfall region, especially in Mediterranean landscapes – in order to establish similar local classifications and conceptual models.

Citations: 11
A digital seismogram archive of nuclear explosion signals, recorded at the Borovoye Geophysical Observatory, Kazakhstan, from 1966 to 1996
Pub Date : 2015-06-01 DOI: 10.1016/j.grj.2015.02.014
Vadim A. An , Vladimir M. Ovtchinnikov , Pyotr B. Kaazik , Vitaly V. Adushkin , Inna N. Sokolova , Iraida B. Aleschenko , Natalya N. Mikhailova , Won-Young Kim , Paul G. Richards , Howard J. Patton , W. Scott Phillips , George Randall , Diane Baker

Seismologists from Kazakhstan, Russia, and the United States have rescued the Soviet-era archive of nuclear explosion seismograms recorded at Borovoye in northern Kazakhstan during the period 1966–1996. The signals had been stored on about 8000 magnetic tapes, which were held at the recording observatory. After hundreds of man-years of work, these digital waveforms together with significant metadata are now available via the project URL, namely http://www.ldeo.columbia.edu/res/pi/Monitoring/Data/ as a modern open database, of use to diverse communities.

Three different sets of recording systems were operated at Borovoye, each using several different seismometers and different gain levels. For some explosions, more than twenty different channels of data are available. A first data release, in 2001, contained numerous glitches and lacked many instrument responses, but could still be used for measuring accurate arrival times and for comparison of the strengths of different types of seismic waves. The project URL also links to our second major data release, for nuclear explosions in Eurasia recorded in Borovoye, in which the data have been deglitched, all instrument responses have been included, and recording systems are described in detail.

This second dataset consists of more than 3700 waveforms (digital seismograms) from almost 500 nuclear explosions in Eurasia, many of them recorded at regional distances. It is important as a training set for the development and evaluation of seismological methods of discriminating between earthquakes and underground explosions, and can be used for assessment of three-dimensional models of the Earth’s interior structure.

Citations: 10
Guest Editorial: Special issue Rescuing Legacy data for Future Science
Pub Date : 2015-06-01 DOI: 10.1016/j.grj.2015.02.017
Lesley Wyborn, Leslie Hsu, Kerstin Lehnert, Mark A. Parsons
Citations: 9
Data rescue to extend the value of vintage seismic data: The OGS-SNAP experience
Pub Date : 2015-06-01 DOI: 10.1016/j.grj.2015.01.006
Paolo Diviacco, Nigel Wardell, Edy Forlin, Chiara Sauli, Mihai Burca, Alessandro Busato, Jacques Centonze, Claudio Pelos

Large amounts of vintage seismic data were rescued and disseminated in an internal project of the Istituto Nazionale di Oceanografia e di Geofisica Sperimentale (OGS). Such types of data would be very difficult to acquire today because they cover many areas that are currently subject to restrictions in obtaining exploration permits. The datasets extend over large geographical areas, covering large geological structures and would be very expensive to acquire today. Additionally, these data are particularly interesting because they were acquired using a high-energy source (dynamite) that would be difficult to obtain permission to use today. Therefore the recovery of these data could be very interesting for both the scientific and commercial communities. The urgency of rescuing tapes before degradation, and scanning and converting the paper sections into a usable form was the main focus, but, at the same time, the project looked ahead and attempted to address possible future exploitation of these data. To this end, considering how end users are likely to search for and use data, a full processing path that goes beyond recovery to consider other aspects was developed. The other concerns integrated into this process are

  • data enhancement, to overcome data limitations due to the older technology used during acquisition;

  • data integration, to consolidate different data types within the same data space, and

  • data discovery, for which a specific web based framework named SNAP (Seismic data Network Access Point) was developed that allows end users to search, locate and preview the data.
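The "data discovery" concern above – letting end users search and locate rescued surveys before downloading anything – can be sketched as a tiny in-memory metadata index filtered by acquisition year and a geographic bounding box. The field names, records, and query interface below are illustrative assumptions, not the actual SNAP implementation.

```python
# Hypothetical sketch of metadata-driven discovery of rescued
# seismic lines: filter a catalogue by year range and bounding box.
from dataclasses import dataclass

@dataclass
class Survey:
    name: str
    year: int
    lat: float  # representative latitude of the line
    lon: float  # representative longitude of the line

CATALOGUE = [
    Survey("LINE-A", 1974, 42.1, 14.9),
    Survey("LINE-B", 1981, 43.5, 13.2),
    Survey("LINE-C", 1969, 40.8, 17.0),
]

def search(catalogue, year_range, bbox):
    """Return surveys within a (min, max) year range and a
    (lat_min, lat_max, lon_min, lon_max) bounding box."""
    y0, y1 = year_range
    lat0, lat1, lon0, lon1 = bbox
    return [s for s in catalogue
            if y0 <= s.year <= y1
            and lat0 <= s.lat <= lat1
            and lon0 <= s.lon <= lon1]

hits = search(CATALOGUE, (1970, 1985), (41.0, 44.0, 12.0, 16.0))
print([s.name for s in hits])  # ['LINE-A', 'LINE-B']
```

In a real web framework the catalogue would live in a spatial database behind a map interface, but the query shape – attribute filters plus a spatial extent – is the same.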

Citations: 14
Maintaining legacy data: Saving Belfast Harbour (UK) tide-gauge data (1901–2010)
Pub Date : 2015-06-01 DOI: 10.1016/j.grj.2015.02.002
Joanne Murdy , Julian Orford , James Bell

Tide gauge data are identified as legacy data given the radical transition between observation method and required output format associated with tide gauges over the 20th century. Observed water level variation through tide-gauge records is regarded as the only significant basis for determining recent historical variation (decade to century) in mean sea-level and storm surge. Few tide gauge records cover the 20th century, such that the Belfast (UK) Harbour tide gauge would be a strategic long-term (110 years) record, if the full paper-based records (marigrams) were digitally restructured to allow for consistent data analysis. This paper presents the methodology for extracting a consistent time series of observed water levels from the 5 different Belfast Harbour tide gauge positions/machine types in use since late 1901. Tide-gauge data were digitally retrieved from the original analogue (daily) records by scanning the marigrams and then extracting the sequential tidal elevations with graph-line seeking software (Ungraph™). This automation of signal extraction allowed the full Belfast series to be retrieved quickly, relative to any manual x–y digitisation of the signal. Restructuring the variable-length tidal data sets into a consistent daily, monthly, and annual file format was undertaken by project-developed software: Merge&Convert and MergeHYD allow consistent water level sampling at both 60 min (the past standard) and 10 min intervals, the latter enhancing surge measurement. The Belfast tide-gauge data have been rectified, validated, and quality controlled (IOC 2006 standards). The result is a consistent annual-based legacy data series for Belfast Harbour that includes over 2 million tidal-level observations.
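The restructuring step described above – consolidating irregular digitised readings onto a fixed sampling grid – can be sketched in a few lines. This is a minimal illustration in the spirit of the Merge&Convert step, not the project's actual software; the timestamps and levels are invented example data.

```python
# Hypothetical sketch: place irregular digitised tide-gauge readings
# (metres above datum) onto a consistent 60-minute sampling grid by
# averaging all readings that fall within each hour.
import pandas as pd

readings = pd.DataFrame(
    {"level_m": [1.20, 1.35, 1.50, 1.42, 1.10]},
    index=pd.to_datetime([
        "1901-11-01 00:05", "1901-11-01 00:55",
        "1901-11-01 01:10", "1901-11-01 01:40",
        "1901-11-01 02:20",
    ]),
)

# Resample to a fixed 60-minute grid; finer grids (e.g. "10min")
# would support the surge analysis mentioned in the abstract.
hourly = readings.resample("60min").mean()
print(hourly)
```

Quality control of the kind cited (IOC standards) would then run on the gridded series, e.g. flagging spikes and gaps, before the data join the long-term archive.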

Citations: 7
Use of legacy data in geomorphological research
Pub Date : 2015-06-01 DOI: 10.1016/j.grj.2015.02.008
Mike J. Smith , Saskia Keesstra , James Rose

This paper considers legacy data and data rescue within the context of geomorphology. Data rescue may be necessary depending upon the storage medium (is it physically accessible?) and the data format (e.g. digital file type); where either of these is not functional, intervention will be required in order to retrieve the stored data. Within geomorphological research, there are three scenarios that may utilize legacy data: to reinvestigate phenomena, to access information about a landform or process that no longer exists, and to investigate temporal change. Here, we present and discuss three case studies that illustrate these scenarios: striae records of Ireland were used to produce a palaeoglacial reconstruction, geomorphological mapping was used to compile a map of glacial landforms, and aerial photographs were used to analyze temporal change in river channel form and catchment land cover.

引用次数: 10
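The accessibility question the abstract above raises has two halves: can the storage medium still be read, and can the file format still be decoded? The second half can be sketched as a small format probe that inspects a file's magic bytes and flags unrecognized formats as rescue candidates. This is an illustrative sketch, not from the paper; the magic-byte table, labels, and function names are assumptions.

```python
# Illustrative sketch: classify a stored file by its magic bytes to decide
# whether modern tools can still decode it. Unrecognized headers are flagged
# as candidates for data rescue. The format table here is a tiny sample.
MAGIC_NUMBERS = {
    b"\x89PNG\r\n\x1a\n": "PNG image (readable with modern tools)",
    b"II*\x00": "TIFF image, little-endian (readable)",
    b"MM\x00*": "TIFF image, big-endian (readable)",
    b"%PDF": "PDF document (readable)",
}


def classify(path):
    """Return a best-guess format label, or flag the file for rescue."""
    with open(path, "rb") as f:
        header = f.read(8)  # longest magic number above is 8 bytes
    for magic, label in MAGIC_NUMBERS.items():
        if header.startswith(magic):
            return label
    return "unknown format - candidate for data rescue"
```

In a real survey of a legacy archive, a probe like this would be the first pass; files it cannot identify are exactly the ones needing the kind of intervention the paper describes.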
Rescued from the deep: Publishing scientific ocean drilling long tail data
Pub Date : 2015-06-01 DOI: 10.1016/j.grj.2015.01.003
Jamus Collier , Stefanie Schumacher , Cornelia Behrens , Amelie Driemel , Michael Diepenbroek , Hannes Grobe , Taewoon Kim , Uwe Schindler , Rainer Sieger , Hans-Joachim Wallrabe-Adams

Scientific ocean drilling began in 1968 and has ever since generated huge amounts of data, including data from shipboard analysis of cores, in situ borehole measurements, long-term subseafloor hydrogeological observatories, and post-expedition research done on core samples and data at laboratories around the world (Smith et al., 2010). Much of the data collected aboard the drilling vessels is captured in a number of program databases (e.g., Janus) and eventually archived in a long-term data repository. However, data resulting from researchers' analyses of core samples in the post-cruise period are generally confined to journal articles and the scholarly literature or, particularly for raw or processed data sets, to the hard drives of those researchers. Thus, knowledge of and access to long tail research data, which constitutes a significant portion of the overall output of scientific ocean drilling, is at risk of being lost to the multidisciplinary Earth sciences community.

To address the issue of long tail data from scientific ocean drilling, the Integrated Ocean Drilling Program Management International (IODP-MI) partnered with the data publisher PANGAEA, hosted by the Alfred Wegener Institute Helmholtz Centre for Polar and Marine Research (AWI) and the University of Bremen Center for Marine Environmental Sciences (MARUM), to conduct a post-cruise data rescue project. The collaboration began in 2007 and continued until 2013. This report summarizes the goals, methods, results, and lessons learned from the IODP Post-Cruise Data Capture Project.

GeoResJ, Volume 6, Pages 17-20. Citations: 4
Preserving geomorphic data records of flood disturbances
Pub Date : 2015-06-01 DOI: 10.1016/j.grj.2015.02.016
John A. Moody , Deborah A. Martin , Robert H. Meade

No central database or repository is currently available in the USA to preserve long-term, spatially extensive records of fluvial geomorphic data or to provide future accessibility. Yet, because of their length and continuity, these data are valuable for future research. Therefore, we built a publicly accessible website to preserve the data records of two examples of long-term monitoring (40 and 18 years) of the fluvial geomorphic response to natural disturbances. One disturbance was a ∼50-year flood on Powder River in Montana in 1978, and the second was a catastrophic flood on Spring Creek following a ∼100-year rainstorm after a wildfire in Colorado in 1996.

Two critical issues arise in preserving fluvial geomorphic data. The first is preserving the data themselves; the second, just as important, is preserving information about the location of the field research sites where the data were collected, so that the sites can be re-located and re-surveyed in the future. The latter allows long-term datasets to be extended into the future and provides critical background data for interpreting future landscape changes. The data were preserved on a website to allow worldwide accessibility and to let new data be uploaded as they become available. We describe the architecture of the website, lessons learned in developing it, future improvements, and recommendations on how to preserve information about the location of field research sites as well.

GeoResJ, Volume 6, Pages 164-174. Citations: 0
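The second issue the abstract above raises, preserving where data were collected so a site can be re-occupied decades later, amounts to a small self-describing metadata record: coordinates plus the geodetic datum they are referenced to, a ground description, and benchmark monuments tying successive surveys together. The sketch below shows one way such a record might look; all field names and values are hypothetical illustrations, not taken from the authors' website, and the coordinates are placeholders.

```python
# Hypothetical sketch: a minimal, archive-friendly record describing a field
# research site well enough for future re-location and re-survey. Serialized
# to plain-text JSON so it stays readable regardless of future tooling.
from dataclasses import dataclass, asdict
import json


@dataclass
class FieldSite:
    site_id: str
    description: str   # how to find the site on the ground
    latitude: float    # decimal degrees (placeholder values below)
    longitude: float
    datum: str         # geodetic datum, e.g. "NAD83" - essential for re-survey
    established: str   # ISO date of the first survey
    benchmarks: list   # monument descriptions tying repeat surveys together


site = FieldSite(
    site_id="PR-1978-01",
    description="Left bank of Powder River, downstream of bridge (illustrative)",
    latitude=45.0167,
    longitude=-105.4333,
    datum="NAD83",
    established="1978-07-15",
    benchmarks=["steel rebar with cap, 1.2 m above low-water line"],
)

# A plain JSON dump is deliberately low-tech: it survives software churn.
record = json.dumps(asdict(site), indent=2)
```

Recording the datum alongside the coordinates is the detail most often lost; without it, positions from an old survey cannot be reliably reconciled with a modern GPS fix.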