Pub Date: 2015-06-01 | DOI: 10.1016/j.grj.2015.03.001
Peter H. Wiebe, M. Dickson Allison
Data generated by publicly funded research in the USA and other countries are now required to be made available in public data repositories. However, much of the scientific data collected over the past 50+ years dates from a time when the technology for curation, storage, and dissemination was primitive or non-existent, and consequently many of these datasets are not publicly available. These so-called “dark data” sets are essential to understanding how the ocean has changed chemically and biologically in response to the documented shifts in temperature and salinity (i.e., climate change). An effort is underway to bring into the light dark data about zooplankton collected in the 1970s and 1980s as part of the cold-core and warm-core rings multidisciplinary programs and other related projects. Zooplankton biomass and euphausiid species abundance from 306 tows, along with related environmental data, including many depth-specific tows taken on 34 research cruises in the Northwest Atlantic, are online and accessible from the Biological and Chemical Oceanography Data Management Office (BCO-DMO).
Bringing dark data into the light: A case study of the recovery of Northwestern Atlantic zooplankton data collected in the 1970s and 1980s. GeoResJ, Volume 6, Pages 195–201.
Pub Date: 2015-06-01 | DOI: 10.1016/j.grj.2015.02.011
Jens Klump , Damian Ulbricht , Ronald Conze
On timescales beyond the life of a research project, a core task in the curation of digital research data is the migration of data and metadata to new storage media, new hardware, and software systems. These migrations are necessitated by ageing software systems, ageing hardware systems, and the rise of new technologies in data management. Using the example of the German Continental Deep Drilling Program (KTB), we outline steps taken to keep the acquired data accessible to researchers and trace the history of data management in KTB from a project platform in the early 1990s through three migrations up to the current data management platform. The migration steps taken not only preserved the data, but also made data from KTB accessible via the internet and citable through Digital Object Identifiers (DOIs). We also describe measures taken to manage hardware and software obsolescence and minimise the amount of maintenance necessary to keep data accessible beyond the active project phase. At present, data from KTB are stored in an Open Archival Information System (OAIS) compliant repository based on the eSciDoc repository framework. Information packages consist of self-contained packages of binary data files and discovery metadata in Extensible Markup Language (XML) format. The binary data files were created from a relational database used for data management in the previous version of the system, and from websites generated from a content management system. Metadata are provided in DataCite, GCMD-DIF, and ISO19139/INSPIRE schema definitions. Access to the KTB data is provided through download pages, which are produced by XML transformation from the stored metadata.
Curating the web’s deep past – Migration strategies for the German Continental Deep Drilling Program web content. GeoResJ, Volume 6, Pages 98–105.
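The last step the KTB abstract describes, generating download pages by XML transformation of the stored discovery metadata, can be sketched in a few lines. The abstract implies an XSLT pipeline; the sketch below uses Python's standard library instead, and the metadata record, element names, and the `render_download_page` function are illustrative inventions loosely following the DataCite kernel, not the project's actual schema or code.

```python
import xml.etree.ElementTree as ET

# Hypothetical, minimal DataCite-style metadata record for one KTB
# information package. Element names echo the DataCite kernel; the
# values are invented for illustration.
METADATA = """\
<resource>
  <identifier identifierType="DOI">10.1594/EXAMPLE/KTB-001</identifier>
  <titles><title>KTB borehole temperature logs</title></titles>
  <publicationYear>1994</publicationYear>
  <formats><format>text/csv</format></formats>
</resource>
"""

def render_download_page(xml_text: str) -> str:
    """Transform a discovery-metadata record into a simple HTML
    download page -- the XML-to-HTML step the abstract attributes
    to XSLT, done here with ElementTree for self-containment."""
    root = ET.fromstring(xml_text)
    doi = root.findtext("identifier")
    title = root.findtext("titles/title")
    year = root.findtext("publicationYear")
    return (
        "<html><body>"
        f"<h1>{title} ({year})</h1>"
        f'<p>Cite as: <a href="https://doi.org/{doi}">doi:{doi}</a></p>'
        "</body></html>"
    )

page = render_download_page(METADATA)
print(page)
```

Because the pages are derived entirely from the stored metadata, they can be regenerated after every migration without hand-editing, which is what keeps this approach low-maintenance.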
Pub Date: 2015-06-01 | DOI: 10.1016/j.grj.2015.02.007
Denise J. Hills
Adopting standards for data and metadata collection is necessary for the success of data rescue and preservation initiatives. Physical sample data and metadata rescue and preservation can be particularly challenging in that much of the available information may not be readily digitized or machine-readable. Making the legacy data rescue and preservation process as simple as possible through the development of template workflows can enable wider adoption of standards by personnel. Template workflows also simplify the training of additional personnel to assist in the registration process.
Let’s make it easy: A workflow for physical sample metadata rescue. GeoResJ, Volume 6, Pages 1–8.
Pub Date: 2015-06-01 | DOI: 10.1016/j.grj.2015.02.009
John F. Wehmiller , Vincent Pellerito
Amino acid racemization (AAR) dating methods have been used since the mid-1960s. Since that time, information technologies have evolved as AAR laboratories have worked to appropriately catalog sample collections and analyses. The University of Delaware AAR Database (UDAARDB) is a database of AAR and other geochronological data from coastal Quaternary sites in North and South America that has been in development for over 25 years. In that time, database and software platforms have changed, and a concerted effort has been made to digitize legacy data for preservation and to make these data available for future use. To ensure data preservation, all or part of UDAARDB is redundantly hosted at three institutions as data files and maps. Furthermore, the flexibility of access to the data (i.e., as online maps and common-format data files) helps to maintain a public presence and, therefore, assists in their preservation.
An evolving database for Quaternary aminostratigraphy. GeoResJ, Volume 6, Pages 115–123.
Pub Date: 2015-06-01 | DOI: 10.1016/j.grj.2015.02.005
Elizabeth Bradshaw , Lesley Rickards , Thorkild Aarup
The Global Sea Level Observing System (GLOSS) Group of Experts (GE) data archaeology group is collating tools and producing guidelines for historic sea level data. They aim to aid the discovery, scanning, digitising and quality control of analogue tide gauge charts and sea level ledgers. Their goal is to improve the quality, quantity and availability of long-term sea level data series. This paper examines different tools for the automatic digitisation of tide gauge charts, the methods available for transcribing handwritten tide gauge ledgers and possible future developments that might speed up and partially automate these processes.
Sea level data archaeology and the Global Sea Level Observing System (GLOSS). GeoResJ, Volume 6, Pages 9–16.
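One family of tools for the automatic digitisation of tide gauge charts works column by column on a scanned image: each pixel column is one time step, the pen trace is the darkest pixel in that column, and a linear calibration maps pixel row to sea level. The sketch below is a minimal illustration of that idea, not any specific GLOSS tool; the synthetic chart, `digitise_trace`, and its calibration parameters (`calib_cm_per_px`, `zero_row`) are all invented, and in practice the calibration would be read off the chart's own axis annotations.

```python
import numpy as np

# Toy stand-in for a scanned tide gauge chart: light background
# (high pixel values) with a dark pen trace (low values). A real
# chart would be loaded from an image file; we synthesise one so
# the sketch is self-contained.
height, width = 100, 24
chart = np.full((height, width), 255, dtype=np.uint8)
true_rows = (50 + 30 * np.sin(np.linspace(0, 2 * np.pi, width))).astype(int)
chart[true_rows, np.arange(width)] = 0  # draw the trace

def digitise_trace(img, calib_cm_per_px=0.5, zero_row=50):
    """For each column (time step), take the darkest pixel as the
    pen position, then map pixel row to sea level with a linear
    calibration. Rows grow downward, so smaller row = higher level."""
    rows = img.argmin(axis=0)                   # darkest pixel per column
    return (zero_row - rows) * calib_cm_per_px  # cm relative to datum

levels = digitise_trace(chart)  # one sea level value per time step
```

Real charts add the complications the paper discusses: trace crossings, grid lines as dark as the pen, and paper distortion, which is why fully automatic digitisation still needs quality control.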
Pub Date: 2015-06-01 | DOI: 10.1016/j.grj.2015.02.010
Matthew B.J. Purss , Adam Lewis , Simon Oliver , Alex Ip , Joshua Sixsmith , Ben Evans , Roger Edberg , Glenn Frankish , Lachlan Hurst , Tai Chan
Earth Observation data acquired by the Landsat missions are of immense value to the global community and constitute the world’s longest continuous civilian Earth Observation program. However, because of the cost of data storage infrastructure, these data have traditionally been stored in raw form on tape, which introduces a data retrieval and processing overhead that limits the efficiency with which the data can be used. As a consequence, these data have become ‘dark data’, used only in a limited, piecemeal, and labor-intensive manner. The Unlocking the Landsat Archive project was set up in 2011 to address this issue and to help realize the true value and potential of these data.
The key outcome of the project was the migration of the raw Landsat data housed in tape archives at Geoscience Australia to High Performance Data facilities hosted by the National Computational Infrastructure (a supercomputer facility located at the Australian National University). Once this migration was completed, the data were calibrated to produce a living and accessible archive of sensor- and scene-independent data products derived from Landsat-5 and Landsat-7 data for the period 1998–2012. The calibrated data were organized into High Performance Data structures, underpinned by ISO/OGC standards and web services, which have opened up a vast range of opportunities to efficiently apply these data across multiple scientific domains.
Unlocking the Australian Landsat Archive – From dark data to High Performance Data infrastructures. GeoResJ, Volume 6, Pages 135–140.
Pub Date: 2015-06-01 | DOI: 10.1016/j.grj.2015.02.012
Leslie Hsu , Kerstin A. Lehnert , Andrew Goodwillie , John W. Delano , James B. Gill , Maurice A. Tivey , Vicki L. Ferrini , Suzanne M. Carbotte , Robert A. Arko
Over the course of a scientific career, a large fraction of the data collected by scientific investigators turns into data at risk of becoming inaccessible to future science. Although a part of the investigators’ data is made available in manuscripts and databases, other data may remain unpublished, non-digital, on degrading or near-obsolete digital media, or inadequately documented for reuse. In 2013, Integrated Earth Data Applications (IEDA) provided data rescue mini-awards to three Earth science investigators. IEDA’s user communities in geochemistry, petrology, geochronology, and marine geophysics collect long-tail data, defined as data produced by individuals and small teams for specific projects. Such data tend to be of small volume and initially intended for use only by these teams, and are thus less likely to be easily transferred or reused; long-tail data are therefore at greater risk of omission from the scientific record. The topics of the awarded projects were (1) geochemical and geochronological data on volcanic rocks from the Fiji, Izu-Bonin-Mariana arc, and Endeavor segments of the global mid-ocean ridge, (2) high-resolution, near-bottom magnetic field data, and (3) geochemistry of lunar glasses. IEDA worked closely with the awardees to create a plan for the data rescue, resulting in the registration of hundreds of samples and the entry of dozens of data and documentation files into IEDA data systems. The data were made openly accessible and citable by assigning persistent identifiers to samples and files. The mini-award program proved that a relatively small incentive, combined with data facility guidance, can motivate investigators to accomplish significant data rescue.
Rescue of long-tail data from the ocean bottom to the Moon: IEDA Data Rescue Mini-Awards. GeoResJ, Volume 6, Pages 108–114.
Pub Date: 2015-06-01 | DOI: 10.1016/j.grj.2015.01.007
Emile A. Okal
Because seismology is a young science relative to the typical duration of seismic cycles, records of large earthquakes available for processing by modern analytical techniques are relatively scarce. This scarcity makes archived datasets of historical seismograms extremely valuable for enhancing our understanding of the occurrence of large, destructive earthquakes. Unfortunately, the value of these datasets is not always adequately perceived by decision-making administrators, which has resulted in the destruction (or last-minute salvage) of irreplaceable datasets.
We present a brief review of the nature of the datasets in seismological archives, and of specific algorithms that allow their use for modern retrieval of the source characteristics of the relevant earthquakes. We then describe protocols for the transfer of analog datasets to digital media, including by contactless photography when the poor physical state of the records prevents the use of mechanical scanners.
Finally, we give some worldwide examples of existing collections, and of successful programs of digital archiving of these valuable datasets.
Historical seismograms: Preserving an endangered species. GeoResJ, Volume 6, Pages 53–64.
Pub Date: 2015-06-01 | DOI: 10.1016/j.grj.2015.02.015
Angela Riganti, Terence R. Farrell, Margaret J. Ellis, Felicia Irimies, Colin D. Strickland, Sarah K. Martin, Darren J. Wallace
For over a century the Geological Survey of Western Australia has been accumulating an enormous amount of information on the geology, mineral resources, and petroleum fields of Western Australia, either through the activities of State-employed regional mappers or the submission of mineral and petroleum reports mandated by State legislation. Recognizing the importance of this legacy for future exploration and research, in the last 25 years the Survey has been digitally capturing this information into custom-designed systems/databases that collate data on, amongst others, field observations (WAROX, for ‘Western Australia Rocks’), mineral exploration reports (WAMEX), and petroleum exploration information (WAPIMS). Data are made available to the public through the GeoVIEW.WA web application, designed in-house to view and query these integrated geoscientific and related datasets.
125 years of legacy data at the Geological Survey of Western Australia: Capture and delivery. GeoResJ, Volume 6, Pages 175–194.
Pub Date: 2015-03-01 | DOI: 10.1016/j.grj.2014.11.001
Willem Renema
Accurate assessment of the location and timing of speciation is needed to discriminate between macroevolutionary models explaining large-scale biodiversity patterns. In this paper I evaluate fossil evidence of variation in geographical ranges through time, as well as spatio-temporal variation in morphological parameters, to examine geographical aspects of speciation and range variation. Specifically, I test for geographical morphological stability within time slices and for temporal modes of morphological change within lineages.
Past distribution ranges of all species of the larger benthic foraminifera genus Cycloclypeus have been documented on paleogeographic maps. From those samples with sufficiently well-preserved specimens, internal morphological data were measured and analysed.
Within a small sample of six species in a single genus of reef-associated larger benthic foraminifera, evidence was found for heterogeneity in geographic speciation modes, including vicariance, peripheral speciation, and sympatric speciation in the centre of the range. Morphological evolution was found to be either homogeneous over large geographic ranges or spatially restricted. In time, two gradually evolving lineages were found. Furthermore, an evolutionary transition between two species that was previously regarded as gradual is shown to be punctuated, with intermediate populations restricted in both time and space.
I demonstrate the marked heterogeneity of evolutionary processes and the difficulty of making assumptions regarding the tempo and mode of evolution. Furthermore, I introduce the concept of geographically undersampled punctuations. This example exposes some of the pitfalls that arise when conclusions regarding the mode and location of speciation are based on the combination of phylogeny and extant distribution alone.
Spatiotemporal variation in morphological evolution in the Oligocene–Recent larger benthic foraminifera genus Cycloclypeus reveals geographically undersampled speciation. GeoResJ, Volume 5, Pages 12–22.