The process of bringing dark data to light: The rescue of the early Nimbus satellite data
D. Gallaher, G.G. Campbell, W. Meier, J. Moses, D. Wingo
Pub Date: 2015-06-01 | DOI: 10.1016/j.grj.2015.02.013 | GeoResJ 6, pp. 124–134
Myriad environmental satellite missions currently orbit the earth. The comprehensive monitoring by these sensors provides scientists, policymakers, and the public with critical information on the earth’s weather and climate system. The state-of-the-art technology of our satellite monitoring system is the legacy of the first environmental satellites, the Nimbus systems launched by NASA in the mid-1960s. Such early data can extend our climate record and provide important context for longer-term climate changes. However, the data were stowed away and, over the years, largely forgotten; they were nearly lost before their value was recognized and recovery attempts were undertaken. This paper covers what it took the authors to recover, navigate and reprocess the data into modern formats so that they could be used as part of the satellite climate record. The procedures used to recover the Nimbus data, from both film and tape, could be applied by other data rescue projects, although the algorithms presented tend to be Nimbus-specific. Data rescue projects are often both difficult and time-consuming, but the data they bring back to the science community make these efforts worthwhile.
When are Old Data New Data?
R. Elizabeth Griffin, the CODATA Task Group ‘Data At Risk’ (DAR-TG)
Pub Date: 2015-06-01 | DOI: 10.1016/j.grj.2015.02.004 | GeoResJ 6, pp. 92–97
What is the value of ‘old’ data when much more sophisticated data are being acquired today in huge quantities with modern equipment and served up in ready-to-use form? Why the fuss over delving into the past, when the observers were undoubtedly less well informed than they are today? What can such old records possibly teach us that we do not already know better from modern electronic data and today’s sophisticated experiments? As this paper demonstrates, the answers to those questions lie in the critical scientific advantage of the long-term date-stamps that only historical data carry.
Soil survey data rescued by means of user friendly soil identification keys and toposequence models to deliver soil information for improved land management
G.J. Grealish, R.W. Fitzpatrick, J.L. Hutson
Pub Date: 2015-06-01 | DOI: 10.1016/j.grj.2015.02.006 | GeoResJ 6, pp. 81–91
In many countries there is a large store of soil survey information that could be used to guide land management decisions. This soil information is commonly undervalued and underused, because it is usually not in a user-friendly format that the non-soil specialists who generally make land management decisions can readily apply, nor are soil specialists always immediately available to carry out the required interpretation.
The aim of this work was to develop an approach that conveys soil survey information through special-purpose soil classifications and conceptual toposequence models, in order to improve land management decisions. The approach: (i) salvages and reinterprets valuable soil survey legacy data from the plethora of detailed published soil survey technical reports and their numerous appendices of quantitative and qualitative data, and (ii) delivers complex or intricate soil survey information to non-soil specialists using a vocabulary and diagrams that they can understand and have available to apply when they need it.
To illustrate the wide applicability of this approach, case studies were conducted in three different parts of the world (Kuwait, Brunei, and Australia), each of which exhibits vastly different landscapes, climates, soil types and land use problems. Pedologists distilled published soil survey information and identified a limited set of soil properties related to landscape position, which enabled non-soil specialists to determine soil types by following a user-friendly approach and format. This provides a wider audience with information about soils, rather than always relying on a limited number of soil specialists to conduct the work.
The details provided in the case studies are applicable to the local areas they were prepared for. However, the structured approach developed here is applicable to other locations throughout the world beyond (i) Brunei, especially in tropical landscapes, (ii) Kuwait, especially in arid and semi-arid landscapes, and (iii) Australia’s winter-rainfall region, especially in Mediterranean landscapes, in order to establish similar local classifications and conceptual models.
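A special-purpose identification key of the kind described above is, in effect, a small decision tree that a non-specialist can walk through using a few field-observable questions. The sketch below is a hypothetical, minimal illustration of that structure; the questions, soil classes, and management notes are invented for the example and are not the keys published for Kuwait, Brunei, or Australia.

```python
# Each node is either a question with answer branches, or a leaf giving a
# local soil type plus a plain-language management note. All content here is
# an invented example of the data structure, not a published key.
KEY = {
    "question": "Landscape position?",
    "answers": {
        "crest": {
            "question": "Is the topsoil sandy to the touch?",
            "answers": {
                "yes": {"soil": "Deep sand",
                        "note": "Low water holding; irrigate lightly and often."},
                "no": {"soil": "Shallow loam over rock",
                       "note": "Limited rooting depth; avoid deep-rooted crops."},
            },
        },
        "slope": {"soil": "Duplex soil",
                  "note": "Water perches on the clay layer; watch for waterlogging."},
        "valley floor": {"soil": "Cracking clay",
                         "note": "Seasonally wet; trafficable only when dry."},
    },
}

def identify(node, field_answers):
    """Walk the key with a dict of {question: answer}; return (soil, note)."""
    while "question" in node:
        node = node["answers"][field_answers[node["question"]]]
    return node["soil"], node["note"]
```

A field user supplies only the answers, e.g. `identify(KEY, {"Landscape position?": "slope"})` returns the duplex-soil class and its management note, without any pedological interpretation being required on the spot.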
A digital seismogram archive of nuclear explosion signals, recorded at the Borovoye Geophysical Observatory, Kazakhstan, from 1966 to 1996
Vadim A. An, Vladimir M. Ovtchinnikov, Pyotr B. Kaazik, Vitaly V. Adushkin, Inna N. Sokolova, Iraida B. Aleschenko, Natalya N. Mikhailova, Won-Young Kim, Paul G. Richards, Howard J. Patton, W. Scott Phillips, George Randall, Diane Baker
Pub Date: 2015-06-01 | DOI: 10.1016/j.grj.2015.02.014 | GeoResJ 6, pp. 141–163
Seismologists from Kazakhstan, Russia, and the United States have rescued the Soviet-era archive of nuclear explosion seismograms recorded at Borovoye in northern Kazakhstan during the period 1966–1996. The signals had been stored on about 8000 magnetic tapes held at the recording observatory. After hundreds of man-years of work, these digital waveforms, together with significant metadata, are now available via the project URL, http://www.ldeo.columbia.edu/res/pi/Monitoring/Data/, as a modern open database of use to diverse communities.
Three different sets of recording systems were operated at Borovoye, each using several different seismometers and different gain levels; for some explosions, more than twenty channels of data are available. A first data release, in 2001, contained numerous glitches and lacked many instrument responses, but could still be used for measuring accurate arrival times and for comparing the strengths of different types of seismic waves. The project URL also links to our second major data release, covering nuclear explosions in Eurasia recorded at Borovoye, in which the data have been deglitched, all instrument responses have been included, and the recording systems are described in detail.
This second dataset consists of more than 3700 waveforms (digital seismograms) from almost 500 nuclear explosions in Eurasia, many of them recorded at regional distances. It is important as a training set for developing and evaluating seismological methods of discriminating between earthquakes and underground explosions, and can be used to assess three-dimensional models of the Earth’s interior structure.
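Deglitching digitised tape recordings of this vintage typically means detecting isolated samples that deviate implausibly from their neighbours and replacing them with a locally plausible value. The paper describes the archive, not a specific algorithm, so the sketch below is a generic running-median despiker, a common baseline for this kind of cleanup; the function name, window size, and threshold are illustrative choices.

```python
import numpy as np

def despike(trace, window=5, threshold=6.0):
    """Replace isolated spikes in a 1-D waveform with the running median.

    A sample is flagged when its deviation from the running median exceeds
    `threshold` times the median absolute deviation (MAD) of those residuals.
    Returns (cleaned_trace, boolean_spike_mask).
    """
    trace = np.asarray(trace, dtype=float)
    pad = window // 2
    padded = np.pad(trace, pad, mode="edge")
    # Running median over a centred window of `window` samples.
    med = np.array([np.median(padded[i:i + window]) for i in range(trace.size)])
    resid = trace - med
    mad = np.median(np.abs(resid)) or 1e-12  # guard against an all-zero MAD
    spikes = np.abs(resid) > threshold * mad
    out = trace.copy()
    out[spikes] = med[spikes]  # substitute the local median at flagged samples
    return out, spikes
```

Because the median is insensitive to a single outlier inside the window, a one-sample tape dropout or glitch is flagged and repaired while genuine seismic arrivals, which persist across several samples, are left untouched.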
Guest Editorial: Special issue Rescuing Legacy data for Future Science
Lesley Wyborn, Leslie Hsu, Kerstin Lehnert, Mark A. Parsons
Pub Date: 2015-06-01 | DOI: 10.1016/j.grj.2015.02.017 | GeoResJ 6, pp. 106–107
Data rescue to extend the value of vintage seismic data: The OGS-SNAP experience
Paolo Diviacco, Nigel Wardell, Edy Forlin, Chiara Sauli, Mihai Burca, Alessandro Busato, Jacques Centonze, Claudio Pelos
Pub Date: 2015-06-01 | DOI: 10.1016/j.grj.2015.01.006 | GeoResJ 6, pp. 44–52
Large amounts of vintage seismic data were rescued and disseminated in an internal project of the Istituto Nazionale di Oceanografia e di Geofisica Sperimentale (OGS). Such data would be very difficult to acquire today because they cover many areas that are now subject to restrictions on exploration permits. The datasets extend over large geographical areas, cover large geological structures, and would be very expensive to acquire today. They are also particularly interesting because they were acquired using a high-energy source (dynamite) for which permission would be difficult to obtain today. The recovery of these data is therefore of real interest to both the scientific and commercial communities. The urgency of rescuing tapes before degradation, and of scanning and converting the paper sections into a usable form, was the main focus; at the same time, the project looked ahead and attempted to address the future exploitation of these data. To this end, considering how end users are likely to search for and use data, a full processing path was developed that goes beyond recovery to address:
• data enhancement, to overcome data limitations due to the older technology used during acquisition;
• data integration, to consolidate different data types within the same data space; and
• data discovery, for which a specific web-based framework named SNAP (Seismic data Network Access Point) was developed that allows end users to search, locate and preview the data.
Maintaining legacy data: Saving Belfast Harbour (UK) tide-gauge data (1901–2010)
Joanne Murdy, Julian Orford, James Bell
Pub Date: 2015-06-01 | DOI: 10.1016/j.grj.2015.02.002 | GeoResJ 6, pp. 65–73
Tide-gauge data are identified as legacy data, given the radical transition in observation method and required output format associated with tide gauges over the 20th century. Observed water-level variation in tide-gauge records is regarded as the only significant basis for determining recent historical variation (decade to century) in mean sea level and storm surge. Few tide-gauge records cover the 20th century, so the Belfast (UK) Harbour tide gauge would constitute a strategic long-term (110-year) record if the full paper-based records (marigrams) were digitally restructured to allow consistent data analysis. This paper presents the methodology for extracting a consistent time series of observed water levels from the 5 different Belfast Harbour tide-gauge positions/machine types, starting in late 1901. Tide-gauge data were digitally retrieved from the original analogue (daily) records by scanning the marigrams and then extracting the sequential tidal elevations with graph-line-seeking software (Ungraph™). This automation of signal extraction allowed the full Belfast series to be retrieved quickly, relative to manual x–y digitisation of the signal. Restructuring tidal data sets of variable length into a consistent daily, monthly and annual file format was undertaken by project-developed software: Merge&Convert and MergeHYD allow consistent water-level sampling at both 60-min (the past standard) and 10-min intervals, the latter enhancing surge measurement. The Belfast tide-gauge data have been rectified, validated and quality controlled (IOC 2006 standards). The result is a consistent annual-based legacy data series for Belfast Harbour that includes over 2 million tidal-level observations.
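Resampling an irregular digitised marigram trace onto a fixed 60-min or 10-min grid can be sketched as time-based interpolation onto a regular index. This is a generic illustration of that restructuring step, not the authors' Merge&Convert or MergeHYD software; the function name and interface are invented for the example.

```python
import pandas as pd

def resample_levels(times, levels, interval="60min"):
    """Interpolate an irregular water-level series onto a fixed-interval grid.

    `times` are timestamps (strings or datetimes), `levels` the corresponding
    water levels; returns a pandas Series sampled every `interval` (e.g.
    "60min" or "10min") across the span of the record.
    """
    s = pd.Series(levels, index=pd.to_datetime(times)).sort_index()
    # Regular grid from the first on-interval timestamp to the end of record.
    grid = pd.date_range(s.index[0].ceil(interval), s.index[-1], freq=interval)
    # Insert the grid points, interpolate linearly in time, keep only the grid.
    return s.reindex(s.index.union(grid)).interpolate(method="time").reindex(grid)
```

Running the same routine with `interval="10min"` yields the denser sampling the paper notes is needed for resolving surge events, without re-digitising the marigrams.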
Use of legacy data in geomorphological research
Mike J. Smith, Saskia Keesstra, James Rose
Pub Date: 2015-06-01 | DOI: 10.1016/j.grj.2015.02.008 | GeoResJ 6, pp. 74–80
This paper considers legacy data and data rescue within the context of geomorphology. Data rescue may be necessary depending upon the storage medium (is it physically accessible?) and the data format (e.g. digital file type); where either of these is not functional, intervention is required in order to retrieve the stored data. Within geomorphological research, there are three scenarios that may utilize legacy data: to reinvestigate phenomena, to access information about a landform or process that no longer exists, and to investigate temporal change. Here, we present three case studies, with discussion, that illustrate these scenarios: striae records of Ireland were used to produce a palaeoglacial reconstruction, geomorphological mapping was used to compile a map of glacial landforms, and aerial photographs were used to analyze temporal change in river channel form and catchment land cover.
Rescued from the deep: Publishing scientific ocean drilling long tail data
Jamus Collier, Stefanie Schumacher, Cornelia Behrens, Amelie Driemel, Michael Diepenbroek, Hannes Grobe, Taewoon Kim, Uwe Schindler, Rainer Sieger, Hans-Joachim Wallrabe-Adams
Pub Date: 2015-06-01 | DOI: 10.1016/j.grj.2015.01.003 | GeoResJ 6, pp. 17–20
Scientific ocean drilling began in 1968 and has been generating huge amounts of data ever since, including data from shipboard analysis of cores, in situ borehole measurements, long-term subseafloor hydrogeological observatories, and post-expedition research done on core samples and data at laboratories around the world (Smith et al., 2010). Much of the data collected aboard the drilling vessels is captured in a number of program databases (e.g., Janus) and eventually archived in a long-term data repository. However, data resulting from researchers’ analyses of core samples in the post-cruise period are generally confined to journal articles and the scholarly literature or, particularly for raw or processed data sets, to the hard drives of those researchers. Thus, knowledge of and access to the long tail research data that constitutes a significant portion of the overall output of scientific ocean drilling is at risk of being lost to the multidisciplinary Earth sciences community.
To address the issue of long tail data from scientific ocean drilling, the Integrated Ocean Drilling Program Management International (IODP-MI) partnered with the data publisher PANGAEA, hosted by the Alfred Wegener Institute Helmholtz Centre for Polar and Marine Research (AWI) and the University of Bremen Center for Marine Environmental Sciences (MARUM), to conduct a post-cruise data rescue project. The collaboration began in 2007 and continued until 2013. This report summarizes the goals, methods, results, and lessons learned from the IODP Post-Cruise Data Capture Project.
Pub Date : 2015-06-01DOI: 10.1016/j.grj.2015.02.016
John A. Moody , Deborah A. Martin , Robert H. Meade
No central database or repository is currently available in the USA to preserve long-term, spatially extensive records of fluvial geomorphic data or to provide future accessibility. Yet, because of their length and continuity, these data are valuable for future research. Therefore, we built a publicly accessible website to preserve the data records of two examples of long-term monitoring (40 and 18 years) of the fluvial geomorphic response to natural disturbances. One disturbance was a ∼50-year flood on the Powder River in Montana in 1978, and the second was a catastrophic flood on Spring Creek following a ∼100-year rainstorm after a wildfire in Colorado in 1996.
Two critical issues arise in preserving fluvial geomorphic data. The first is preserving the data themselves; the second, and just as important, is preserving information about the location of the field research sites where the data were collected, so that the sites can be re-located and re-surveyed in the future. The latter allows long-term datasets to be extended into the future and provides critical background data for interpreting future landscape changes. The data were preserved on a website to allow worldwide accessibility and to let new data be uploaded as they become available. We describe the architecture of the website, lessons learned in developing it, future improvements, and recommendations on how to preserve information about the location of field research sites as well.
{"title":"Preserving geomorphic data records of flood disturbances","authors":"John A. Moody , Deborah A. Martin , Robert H. Meade","doi":"10.1016/j.grj.2015.02.016","DOIUrl":"10.1016/j.grj.2015.02.016","url":null,"abstract":"<div><p>No central database or repository is currently available in the USA to preserve long-term, spatially extensive records of fluvial geomorphic data or to provide future accessibility. Yet, because of their length and continuity these data are valuable for future research. Therefore, we built a public accessible website to preserve data records of two examples of long-term monitoring (40 and 18<!--> <!-->years) of the fluvial geomorphic response to natural disturbances. One disturbance was ∼50-year flood on Powder River in Montana in 1978, and the second disturbance was a catastrophic flood on Spring Creek following a ∼100-year rainstorm after a wildfire in Colorado in 1996.</p><p>Two critical issues arise relative to preserving fluvial geomorphic data. The first is preserving the data themselves, but the second, and just as important, is preserving information about the location of the field research sites where the data were collected so the sites can be re-located and re-surveyed in the future. The latter allows long-term datasets to be extended into the future and to provide critical background data for interpreting future landscape changes. Data were preserved on a website to allow world-wide accessibility and to upload new data to the website as they become available. 
We describe the architecture of the website, lessons learned in developing the website, future improvements, and recommendations on how also to preserve information about the location of field research sites.</p></div>","PeriodicalId":93099,"journal":{"name":"GeoResJ","volume":"6 ","pages":"Pages 164-174"},"PeriodicalIF":0.0,"publicationDate":"2015-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1016/j.grj.2015.02.016","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"54365560","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}