A computer vision-based approach for estimating carbon fluxes from sinking particles in the ocean
Vinícius J. Amaral, Colleen A. Durkin
Limnology and Oceanography: Methods 23(2): 117–130 (published 2024-12-28). doi:10.1002/lom3.10665

The gravitational settling of organic particles in the ocean drives long-term sequestration of carbon from surface waters to the deep ocean. Quantifying the magnitude of carbon sequestration flux at high spatiotemporal resolution is critical for monitoring the ocean's ability to sequester carbon as ecological conditions change. Here, we propose a computer vision-based method for classifying images of sinking marine particles and using allometric relationships to estimate the amount of carbon that the particles transport to the deep ocean. We show that our method reduces the amount of time required by a human image annotator by at least 90% while producing ecologically informed estimates of carbon flux that are comparable to estimates based on purely manual review and chemical bulk carbon measurements. This method utilizes a human-in-the-loop domain adaptation approach to leverage images collected from previous sampling campaigns in classifying images from novel campaigns in the future. If used in conjunction with autonomous imaging platforms deployed throughout the world's oceans, this method has the potential to provide estimates of carbon sequestration fluxes at high spatiotemporal resolution while facilitating an understanding of the ecological pathways that are most important in driving these fluxes.
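The size-to-carbon conversion at the heart of this kind of method can be sketched as a class-specific allometric power law applied to each imaged particle. This is a minimal illustration of the general approach, not the paper's implementation: the particle classes and coefficients below are hypothetical placeholders, not fitted values.

```python
# Sketch of an allometric carbon estimate for classified particle images.
# C = a * d^b converts particle diameter (mm) to carbon content (ug C).
# The classes and (a, b) pairs are illustrative placeholders only.

ALLOMETRIC_COEFFS = {
    "fecal_pellet": (0.12, 2.0),  # hypothetical coefficients
    "aggregate": (0.08, 1.8),     # hypothetical coefficients
}

def particle_carbon(particle_class, diameter_mm):
    """Estimate the carbon content (ug C) of one particle from its size."""
    a, b = ALLOMETRIC_COEFFS[particle_class]
    return a * diameter_mm ** b

def carbon_flux(particles, collection_area_m2, collection_time_d):
    """Sum per-particle carbon over a sample and normalize by the
    collection area and duration to get a flux (ug C m^-2 d^-1)."""
    total = sum(particle_carbon(cls, d) for cls, d in particles)
    return total / (collection_area_m2 * collection_time_d)
```

Replacing manual annotation with classifier output then only changes where the `(class, diameter)` pairs come from; the flux arithmetic stays the same.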
Otoliths, bones, teeth, and more: Development of a new polishing wheel for calcified structures
Nicholas Strait, David Taylor, Rebecca Forney, Jacob Amos, Jessica Miller
Limnology and Oceanography: Methods 23(2): 131–137 (published 2024-12-28). doi:10.1002/lom3.10662

Biochronological information stored in the calcified structures of organisms provides fundamental organismal, environmental, and ecological data. Bones, teeth, statoliths, corals, and otoliths are widely used to answer a myriad of questions related to trophic position, migration, age and growth, environmental variation, and historical climate. Many calcified structures, particularly the ear stones of fishes (otoliths), are small (50 μm to 5 mm) and require precise preparation methods, which vary depending on the structure and research question but commonly include embedding, sectioning, and polishing prior to structural or chemical analysis. Globally, management agencies rely on the precise polishing of millions of otoliths each year to obtain vital demographic data, such as age and growth. However, this process is time consuming, labor intensive, and ergonomically strenuous. Since the early 1970s, there has been limited advancement in preparation methods, with many still relying on manual approaches or costly, and at times inefficient, equipment. We therefore designed and fabricated an affordable, adjustable-speed, multi-wheel polisher that can be powered with alternating or direct current. Sample preparation time is reduced, and sample consistency is notably improved, compared to manual approaches. While specifically designed for consistent and relatively rapid preparation of otolith thin sections, the polisher is readily adaptable to a variety of applications. Designs and manufacturing for these wheels are publicly available through the iLab at Oregon State University.
Bringing heatwaves into the lab: A low-cost, open-source, and automated system to simulate realistic warming events in an experimental setting
Amelia L. Ritger, Gretchen E. Hofmann
Limnology and Oceanography: Methods 23(2): 87–96 (published 2024-12-21). doi:10.1002/lom3.10663

Aquatic ecosystems face increasing threats from heatwaves driven by anthropogenic climate change, necessitating continued research to understand and manage the ecological consequences. Experimental studies are essential for understanding the impacts of heatwaves in aquatic systems; however, traditional experimental methods often fail to capture real-world complexity. Here, we present a method for simulating aquatic heatwaves that matches the dynamic nature of real-world heatwave events in an experimental setting. Our method allows researchers to re-create heatwaves that have happened in the past or produce entirely new heatwave scenarios based on future projections. A Raspberry Pi serves as the foundation of our autonomous, customizable temperature control system, leveraging a low-cost and open-source platform for adaptability and accessibility. We demonstrate system functionality for laboratory experiments by first simulating a hypothetical marine heatwave scenario with defined temperature parameters and then replicating a real-world marine heatwave that occurred in the Santa Barbara Channel, California, in 2015. The average difference between desired and observed temperatures was 0.023°C for the basic heatwave simulation and less than 0.001°C for the real-world heatwave simulation, with standard deviations of 0.04°C and 0.01°C, respectively. Our novel method facilitates broader access to high-quality and affordable tools to study extreme climate events. By adopting a more realistic experimental approach, scientists can conduct more informative aquatic heatwave studies.
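A dynamic setpoint profile like the ones such a controller follows can be sketched as a simple ramp-hold-ramp generator. This is an illustrative sketch only: the temperatures, durations, and linear-ramp shape are hypothetical, not the parameters of the paper's scenarios or the 2015 Santa Barbara Channel event.

```python
# Sketch of a heatwave setpoint schedule a temperature controller could
# track: linear warming ramp, hold at peak, symmetric cooling ramp.
# All values are hypothetical illustrations.

def heatwave_setpoints(baseline_c, peak_c, ramp_h, hold_h, step_h=1.0):
    """Return temperature setpoints (one per step_h hours) for a
    ramp-up / hold / ramp-down heatwave profile."""
    setpoints = []
    n_ramp = int(ramp_h / step_h)
    n_hold = int(hold_h / step_h)
    for i in range(n_ramp):  # linear warming ramp up to the peak
        setpoints.append(baseline_c + (peak_c - baseline_c) * (i + 1) / n_ramp)
    setpoints += [peak_c] * n_hold  # hold at peak temperature
    for i in range(n_ramp):  # symmetric cooling ramp back to baseline
        setpoints.append(peak_c - (peak_c - baseline_c) * (i + 1) / n_ramp)
    return setpoints
```

Replaying a historical event would amount to substituting an observed temperature time series for this synthetic schedule.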
CoralCT: A platform for transparent and collaborative analyses of growth parameters in coral skeletal cores
Thomas M. DeCarlo, Allyndaire Whelehan, Brighton Hedger, Devyn Perry, Maya Pompel, Oliwia Jasnos, Avi Strange
Limnology and Oceanography: Methods 23(2): 97–116 (published 2024-12-04). doi:10.1002/lom3.10661

We present CoralCT, a software application for analysis of annual extension, density, and calcification in coral skeletal cores. CoralCT can be used to analyze computed tomography (CT) scans or X-ray images of skeletal cores through a process in which observers interact with images of a core to define the locations of annual density bands. The application streamlines this process by organizing the observer-defined banding patterns and automatically measuring growth parameters. Analyses can be conducted in two or three dimensions, and observers have the option to utilize an automatic band-detection feature. CoralCT is linked to a server that stores the raw CT and X-ray image data, as well as output growth rate data for hundreds of cores. Overall, this server-based system enables broad collaborations on coral core analysis with standardized methods and—crucially—creates a pathway for implementing multiobserver analysis. We assess the method by comparing multiple techniques for measuring annual extension and density, including a corallite-tracing approach, medical imaging software, two-dimensional vs. three-dimensional analyses, and comparisons among multiple observers. We recommend that CoralCT be used not only as a measurement tool but also as a platform for data archiving and conducting open, collaborative science.
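The growth-parameter arithmetic that such tools automate reduces to simple products once annual density bands are located. This is a minimal sketch of that standard calculation, not CoralCT's code; the band positions and densities below are made-up illustration values.

```python
# Sketch of standard coral growth-parameter arithmetic: annual
# calcification (g cm^-2 yr^-1) is linear extension (cm yr^-1) times
# skeletal density (g cm^-3) for each interval between annual bands.
# Input values are illustrative only.

def annual_growth(band_positions_cm, band_densities_g_cm3):
    """Compute per-year extension, density, and calcification between
    consecutive annual density bands (positions ordered along the core;
    one mean density per interval between bands)."""
    records = []
    for i in range(1, len(band_positions_cm)):
        extension = band_positions_cm[i] - band_positions_cm[i - 1]
        density = band_densities_g_cm3[i - 1]
        records.append({
            "extension_cm": extension,
            "density_g_cm3": density,
            "calcification_g_cm2": extension * density,
        })
    return records
```

The observer-driven (or automatic) band detection only supplies the `band_positions_cm` input; the downstream growth metrics follow mechanically.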
A membrane inlet laser spectrometer for in situ measurement of triple water isotopologues
Axel Wohleber, Camille Blouzon, Julien Witwicky, Patrick Ginot, Nicolas C. Jourdain, Roberto Grilli
Limnology and Oceanography: Methods 23(1): 26–38 (published 2024-11-27). doi:10.1002/lom3.10660

We describe a novel compact autonomous in situ sensor for semi-continuous measurement of water isotopes (δD, δ18O, and δ17O) in liquid water. The sensor relies on a dual-inlet water vapor injection system based on pervaporation through a semi-permeable membrane, and on analysis of the water vapor composition using a dedicated optical-feedback cavity-enhanced absorption spectrometer. The sensor measures 165 mm in diameter and 550 mm in length, and weighs ∼8 kg. A titanium casing allows deployment to depths of 6000 m, for a total effective weight of 45 (23) kg in air (water). It has a power consumption of ∼40 W and an autonomy of 10–12 h, ensured by a dedicated Li-ion battery pack. The sensor is equipped with single-pair high-speed digital subscriber line communication for telemetry purposes. The instrument provides an accuracy of 0.3‰ (2σ) for all water isotopes with a 9-min integration time. The instrument is suitable for investigating the freshwater cycle in the ocean, and in particular the transformation of ocean water masses related to iceberg and ice shelf melting.
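The δ values reported here follow standard delta notation: a sample's isotope ratio expressed, in per mil, relative to a reference standard (VSMOW for water). A minimal sketch of that conversion, with an illustrative sample ratio rather than instrument data:

```python
# Sketch of delta notation for water isotopes:
#   delta = (R_sample / R_standard - 1) * 1000  [per mil]
# VSMOW_R18 is the accepted 18O/16O ratio of the VSMOW standard;
# the sample ratio used below is illustrative only.

VSMOW_R18 = 0.0020052  # 18O/16O of Vienna Standard Mean Ocean Water

def delta_per_mil(r_sample, r_standard):
    """Convert an isotope ratio to delta notation (per mil)."""
    return (r_sample / r_standard - 1.0) * 1000.0
```

Glacial meltwater is depleted in heavy isotopes, so the negative δ18O excursions this conversion produces for low sample ratios are exactly the signal relevant to tracing ice shelf melt.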
Producing plankton classifiers that are robust to dataset shift
C. Chen, S. P. Kyathanahally, M. Reyes, S. Merkli, E. Merz, E. Francazi, M. Hoege, F. Pomati, M. Baity-Jesi
Limnology and Oceanography: Methods 23(1): 39–66 (published 2024-11-27). doi:10.1002/lom3.10659

Modern plankton high-throughput monitoring relies on deep learning classifiers for species recognition in water ecosystems. Despite satisfactory nominal performance, a significant challenge arises from dataset shift, which causes performance to drop during deployment. In our study, we integrate the ZooLake dataset, which consists of dark-field images of lake plankton (Kyathanahally et al. 2021a), with manually annotated images from 10 independent days of deployment, serving as test cells to benchmark out-of-dataset (OOD) performance. Our analysis reveals instances where classifiers that initially perform well in in-dataset conditions encounter notable failures in practical scenarios. For example, a MobileNet with a 92% nominal test accuracy shows a 77% OOD accuracy. We systematically investigate conditions leading to OOD performance drops, propose a preemptive assessment method to identify potential pitfalls when classifying new data, and pinpoint features in OOD images that adversely impact classification. We present a three-step pipeline: (i) identifying OOD degradation compared to nominal test performance, (ii) conducting a diagnostic analysis of degradation causes, and (iii) providing solutions. We find that an ensemble of BEiT vision transformers, combined with targeted augmentations addressing OOD robustness, geometric ensembling, and rotation-based test-time augmentation, constitutes the most robust model, which we call BEsT. It achieves an 83% OOD accuracy, with errors concentrated on container classes. Moreover, it exhibits lower sensitivity to dataset shift and reproduces plankton abundances well. Our proposed pipeline is applicable to generic plankton classifiers, contingent on the availability of suitable test cells. By identifying critical shortcomings and offering practical procedures to fortify models against dataset shift, our study contributes to the development of more reliable plankton classification technologies.
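Rotation-based test-time augmentation, one of the robustness ingredients named above, can be sketched independently of any particular network: classify several rotated copies of an image and average the predicted class probabilities. The toy classifier below is a stand-in for a trained model, not the paper's BEsT ensemble.

```python
# Sketch of rotation-based test-time augmentation (TTA): average class
# probabilities over 90-degree rotations of the input image. The
# `classify` callable stands in for a trained network.

def rotate90(image):
    """Rotate a 2D list-of-lists image 90 degrees clockwise."""
    return [list(row) for row in zip(*image[::-1])]

def tta_predict(classify, image, n_rotations=4):
    """Average predicted class probabilities over rotated views."""
    views = [image]
    for _ in range(n_rotations - 1):
        views.append(rotate90(views[-1]))
    probs = [classify(v) for v in views]
    n_classes = len(probs[0])
    return [sum(p[c] for p in probs) / len(probs) for c in range(n_classes)]
```

Because plankton have no preferred orientation in dark-field images, averaging over rotations tends to smooth out orientation-sensitive errors without retraining the model.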
K. L. Dickerson, A. T. Fisher, R. N. Harris, M. Hutnak
We present software for processing and interpretation of marine heat-flow data. These data commonly include in situ measurements of the thermal gradient and thermal conductivity as a function of subseafloor depth, and are used to calculate vertical heat flow. New software includes SlugPen, for parsing and correcting datasets for each penetration, and SlugHeat, for calculating equilibrium temperatures (