Count data time series, characterized by non-negative integer values, frequently arise across diverse domains, including finance, public health, economics, epidemiology, and environmental sciences. Such series often exhibit characteristics such as equidispersion, overdispersion, underdispersion, and zero-inflation/deflation. Failure to appropriately account for these features can result in biased parameter estimates and misleading statistical inference. This review presents a comprehensive overview of recent methodological developments in integer-valued autoregressive (INAR) models, with particular emphasis on thinning operators, estimation methods, and model extensions. A systematic literature search was conducted using electronic databases, including Scopus and Google Scholar, to identify relevant studies published between 2010 and 2024. Recent research has primarily focused on the development of novel thinning operators and flexible innovation distributions aimed at constructing unified modeling frameworks capable of accommodating multiple characteristics of count data simultaneously. This review highlights prevailing research trends, identifies existing methodological gaps, and outlines promising directions for future research in count data time series modeling.
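For context on the thinning operators discussed throughout the review, the baseline INAR(1) model with binomial thinning, which the surveyed extensions generalize, can be written as follows (a standard definition shown for background, not quoted from the review itself).

% INAR(1) with binomial thinning (standard definition, shown for context):
\begin{equation}
  X_t = \alpha \circ X_{t-1} + \varepsilon_t,
  \qquad
  \alpha \circ X_{t-1} = \sum_{i=1}^{X_{t-1}} B_i,
\end{equation}
% where the B_i are i.i.d. Bernoulli(alpha) counting variables independent of
% X_{t-1}, and the innovations {varepsilon_t} are i.i.d. non-negative
% integer-valued random variables (e.g., Poisson in the equidispersed case).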
{"title":"Recent advancements in integer-valued autoregressive models for count data time series: A comprehensive review","authors":"Vinitha Serrao, Satyanarayana Poojari, Asha Kamath","doi":"10.1016/j.mex.2026.103805","DOIUrl":"10.1016/j.mex.2026.103805","url":null,"abstract":"<div><div>Count data time series, characterized by non-negative integer values, frequently arise across diverse domains, including finance, public health, economics, epidemiology, and environmental sciences. Such series often exhibit characteristics such as equidispersion, overdispersion, underdispersion, and zero-inflation/deflation. Failure to appropriately account for these features can result in biased parameter estimates and misleading statistical inference. This review presents a comprehensive overview of recent methodological developments in integer-valued autoregressive (INAR) models, with particular emphasis on thinning operators, estimation methods, and model extensions. A systematic literature search was conducted using electronic databases, including Scopus and Google Scholar, to identify relevant studies published between 2010 and 2024. Recent research has primarily focused on the development of novel thinning operators and flexible innovation distributions aimed at constructing unified modeling frameworks capable of accommodating multiple characteristics of count data simultaneously. This review highlights prevailing research trends, identifies existing methodological gaps, and outlines promising directions for future research in count data time series modeling.</div></div>","PeriodicalId":18446,"journal":{"name":"MethodsX","volume":"16 ","pages":"Article 103805"},"PeriodicalIF":1.9,"publicationDate":"2026-01-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146077709","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2026-01-24. DOI: 10.1016/j.mex.2026.103804
Eduardo Zamudio-Huertas , César Augusto García-Ubaque , Nelson Obregón-Neira
Reliable discharge estimation is essential for water resource management, yet many regions lack sufficient hydrological stations. To address this limitation, we propose the Spatial Hydraulic Geometry Interpolation (SHGI) method, which estimates discharge (Q), hydraulic depth (D), and mean velocity (V) from river width (W) obtained via surveys or satellite imagery. SHGI integrates hydraulic geometry theory with multiquadric radial basis interpolation, applied to the Meta and Atrato river basins in Colombia. Parameters of at-a-station hydraulic geometry (coefficients a, c, k and exponents b, f, m) were derived using least squares and transformed into log-ratio space to preserve their compositional constraints. Interpolation along upstream distance ensures spatial continuity, and closure operations guarantee internal consistency. Validation against observed data in basins with contrasting geomorphology and data density confirmed the method’s robustness.
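For readers unfamiliar with the parameter names, the classical at-a-station hydraulic geometry relations and the closure constraints that the log-ratio (compositional) treatment is designed to preserve can be sketched as follows; this is the standard textbook formulation, shown for context rather than quoted from the SHGI paper.

% At-a-station hydraulic geometry (standard form assumed; W = width, D = depth,
% V = velocity, Q = discharge):
\begin{align}
  W = a\,Q^{b}, \qquad D = c\,Q^{f}, \qquad V = k\,Q^{m}.
\end{align}
% Continuity, Q = W D V, imposes the multiplicative and additive closure
% constraints that the interpolation must preserve:
\begin{align}
  a\,c\,k = 1, \qquad b + f + m = 1.
\end{align}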
The principal contributions of SHGI are:
• Longitudinal continuity: explicit incorporation of upstream distance to interpolate parameters consistently along channels and tributaries.
• Compositional integrity: preservation of the multiplicative and additive constraints of hydraulic geometry parameters during interpolation.
• Estimation under data scarcity: enabling calculation of Q, D, and V at ungauged sites using only river width.
{"title":"Method for estimating discharge, hydraulic depth, and mean velocity in rivers through spatial interpolation of at-a-station hydraulic geometry in data- scarce regions","authors":"Eduardo Zamudio-Huertas , César Augusto García-Ubaque , Nelson Obregón-Neira","doi":"10.1016/j.mex.2026.103804","DOIUrl":"10.1016/j.mex.2026.103804","url":null,"abstract":"<div><div>Reliable discharge estimation is essential for water resource management, yet many regions lack sufficient hydrological stations. To address this limitation, we propose the Spatial Hydraulic Geometry Interpolation (SHGI) method, which estimates discharge (Q), hydraulic depth (D), and mean velocity (V) from river width (W) obtained via surveys or satellite imagery. SHGI integrates hydraulic geometry theory with multiquadric radial basis interpolation, applied to the Meta and Atrato river basins in Colombia. Parameters of at‑station hydraulic geometry (coefficients <span><math><mi>a</mi></math></span>, <em>c, k</em> and exponents <em>b, f, m</em>) were derived using least squares and transformed into log‑ratio space to preserve their compositional constraints. Interpolation along upstream distance ensures spatial continuity, and closure operations guarantee internal consistency. Validation against observed data in basins with contrasting geomorphology and data density confirmed the method’s robustness.</div><div>The principal contributions of SHGI are:<ul><li><span>•</span><span><div>Longitudinal continuity: explicit incorporation of upstream distance to interpolate parameters consistently along channels and tributaries.</div></span></li><li><span>•</span><span><div>Compositional integrity: preservation of the multiplicative and additive constraints of hydraulic geometry parameters during interpolation.</div></span></li><li><span>•</span><span><div>Estimation under data scarcity: enabling calculation of Q, D, and V at ungauged sites using only river width.</div></span></li></ul></div></div>","PeriodicalId":18446,"journal":{"name":"MethodsX","volume":"16 ","pages":"Article 103804"},"PeriodicalIF":1.9,"publicationDate":"2026-01-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146077547","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2026-01-16. DOI: 10.1016/j.mex.2026.103800
Letícia da Silva Brito , Sidinei Magela Thomaz , Heliana Teixeira , Ana I. Lillebø
Pontederia crassipes is known for its asexual reproduction and rapid growth. Outside its native range, it has been identified as an environmental threat, while it has also been widely used for ex-situ phytoremediation. To understand both its invasive potential and its phytoremediation capacity, it is necessary to examine the environmental factors that favor its growth beyond those already described in the literature, such as water temperature and nutrient availability. Previous studies also suggest that alkalinity, conductivity, dissolved oxygen, salinity, water depth and pH influence its growth. These variables help define the species' niche and highlight the importance of distinguishing between its fundamental niche (the full set of abiotic conditions that support growth) and its realized niche (which reflects biotic interactions and local constraints). However, the scientific literature does not yet provide sufficient description of the ex-situ experimental conditions required for the successful cultivation of this aquatic plant in controlled settings. This protocol therefore reports the results and lessons learned from a series of mesocosm experiments. By standardizing procedures and documenting growth outcomes, the protocol enhances reproducibility, facilitates comparisons across studies and supports both basic and applied research on P. crassipes.
{"title":"Ex-situ growth protocol for the invasive macrophyte Pontederia crassipes","authors":"Letícia da Silva Brito , Sidinei Magela Thomaz , Heliana Teixeira , Ana I. Lillebø","doi":"10.1016/j.mex.2026.103800","DOIUrl":"10.1016/j.mex.2026.103800","url":null,"abstract":"<div><div><em>Pontederia crassipes</em> is known for its asexual reproduction and rapid growth. Outside its native range, it has been identified as an environmental threat, while it has also been widely used for <em>ex-situ</em> phytoremediation. To understand both its invasive potential and its phytoremediation capacity, it is necessary to examine the environmental factors that favor its growth beyond those already described in the literature, such as water temperature and nutrient availability. Previous studies also suggest that alkalinity, conductivity, dissolved oxygen, salinity, water depth and pH. These variables help define the species niche and highlight the importance of distinguishing between its fundamental niche, the full set of abiotic conditions that support growth and its realized niche, which reflects biotic interactions and local constraints. However, the scientific literature does not yet provide sufficient description of the <em>ex-situ</em> experimental conditions required for the successful cultivation of this aquatic plant in controlled settings. This protocol therefore reports the results and lessons learned from a series of mesocosm experiments. By standardizing procedures and documenting growth outcomes, the protocol enhances reproducibility, facilitates comparisons across studies and supports both basic and applied research on <em>P. crassipes.</em></div></div>","PeriodicalId":18446,"journal":{"name":"MethodsX","volume":"16 ","pages":"Article 103800"},"PeriodicalIF":1.9,"publicationDate":"2026-01-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146077548","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Oceans exhibit complex dynamics influenced by climate change, anthropogenic activities, and natural phenomena. Understanding these dynamics is critical for ensuring the sustainability of marine environments and their optimal utilization. This research aims to study and monitor upwelling phenomena in the South Sea of Java. Upwelling, the upward movement of nutrient-rich, cold water from deeper layers to the surface, enhances marine biological productivity; Sea Surface Temperature (SST) serves as a key indicator for its detection. To achieve these objectives, this study employs both ConvLSTM and 3D-CNN. ConvLSTM, a deep learning architecture that integrates convolutional structures within LSTM units, effectively captures spatiotemporal dependencies in sequential data. 3D-CNN, a deep learning model extending traditional 2D convolutional neural networks, processes volumetric data, enabling the extraction of spatial features across three dimensions. Analysis reveals that ConvLSTM outperforms 3D-CNN in modeling upwelling data in the South Sea of Java, as evidenced by lower Root Mean Square Error (RMSE) and Mean Absolute Error (MAE) values. The ConvLSTM method was then used for forecasting, and the results were validated with data obtained from local fishermen regarding their fishing expeditions. Visual analysis confirms that the upwelling patterns modeled by ConvLSTM for the South Sea of Java align with the fishermen's schedules.
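The following is a minimal Keras sketch of a ConvLSTM set up for gridded SST sequences; the grid size, sequence length, layer widths, and training arrays are illustrative assumptions, not the configuration used by the authors.

# Minimal ConvLSTM sketch for gridded SST sequences (illustrative; layer sizes,
# sequence length, and grid shape are assumptions, not the authors' setup).
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

T, H, W, C = 8, 32, 32, 1   # assumed: 8 time steps of 32x32 single-band SST grids

model = models.Sequential([
    tf.keras.Input(shape=(T, H, W, C)),
    layers.ConvLSTM2D(16, kernel_size=3, padding="same",
                      return_sequences=False, activation="tanh"),
    layers.BatchNormalization(),
    layers.Conv2D(1, kernel_size=3, padding="same"),   # predicted next SST field
])
model.compile(optimizer="adam", loss="mse",
              metrics=[tf.keras.metrics.RootMeanSquaredError(), "mae"])

# toy run with random arrays standing in for preprocessed SST composites
x = np.random.rand(4, T, H, W, C).astype("float32")
y = np.random.rand(4, H, W, 1).astype("float32")
model.fit(x, y, epochs=1, verbose=0)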
ConvLSTM and 3D-CNN methods were comparatively evaluated for modeling Sea Surface Temperature (SST) data, considering wind speed, sea surface salinity, and the El Niño-Southern Oscillation (ENSO) phase as influential factors.
Based on Root Mean Square Error (RMSE) and Mean Absolute Error (MAE) values, the ConvLSTM method exhibited lower values, indicating superior performance compared to the 3D-CNN approach. Specifically, RMSE and MAE values for ConvLSTM were 0.4161 and 0.3017, respectively, while for 3D-CNN, the corresponding values were 0.6095 and 0.4259.
Upwelling data forecasting results were validated against local fishermen's schedules, with data collected in July 2022. Visual inspection confirmed alignment between the forecasted upwelling patterns and the fishermen's activity.
{"title":"Predicting upwelling dynamics in the South Sea of Java, Indonesia: A deep learning approach with ConvLSTM and 3D-CNN","authors":"Dwi Rantini , Rumaisa Kruba , Yudi Haditiar , Muhammad Ikhwan , Yusuf Jati Wijaya , Aris Ismanto , Muhammad Mahdy Yandra , Hafiz Rahman , Arip Ramadan , Fazidah Othman","doi":"10.1016/j.mex.2026.103802","DOIUrl":"10.1016/j.mex.2026.103802","url":null,"abstract":"<div><div>Oceans exhibit complex dynamics influenced by climate change, anthropogenic activities, and natural phenomena. Understanding these dynamics is critical for ensuring the sustainability of marine environments and their optimal utilization. This research aims to study and monitor upwelling phenomena in the South Sea of Java. Upwelling, the exchange of nutrient-rich, cold water from deeper layers to the surface, enhances marine biological productivity; Sea Surface Temperature (SST) serves as a key indicator for its detection. To achieve these objectives, this study employs both ConvLSTM and 3D-CNN. ConvLSTM, a deep learning architecture that integrates convolutional structures within LSTM units, effectively captures spatiotemporal dependencies in sequential data. 3D-CNN, a deep learning model extending traditional 2D convolutional neural networks, processes volumetric data, enabling the extraction of spatial features across three dimensions. Analysis reveals that ConvLSTM outperforms 3D-CNN in modeling upwelling data in the South Sea of Java. This is evidenced by lower Root Mean Square Error (RMSE) and Mean Absolute Error (MAE). The ConvLSTM method was then used for forecasting, and the results were validated with data obtained from local fishermen regarding their fishing expeditions. Visual analysis confirms that the ConvLSTM method accurately models upwelling data in the South Sea of Java with fishermen's schedules.</div><div>ConvLSTM and 3D-CNN methods were comparatively evaluated for modeling Sea Surface Temperature (SST) data, considering wind speed, sea surface salinity, and the El Niño-Southern Oscillation (ENSO) phase as influential factors.</div><div>Based on Root Mean Square Error (RMSE) and Mean Absolute Error (MAE) values, the ConvLSTM method exhibited lower values, indicating superior performance compared to the 3D-CNN approach. Specifically, RMSE and MAE values for ConvLSTM were 0.4161 and 0.3017, respectively, while for 3D-CNN, the corresponding values were 0.6095 and 0.4259.</div><div>Upwelling data forecasting results were validated against local fishermen's schedules, with data collected in July 2022. Visual inspection confirmed alignment between the forecasted upwelling patterns and the fishermen's activity.</div></div>","PeriodicalId":18446,"journal":{"name":"MethodsX","volume":"16 ","pages":"Article 103802"},"PeriodicalIF":1.9,"publicationDate":"2026-01-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146034411","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2026-01-14. DOI: 10.1016/j.mex.2026.103795
Merve Caner , Murat Tunç , Gürsel Sunal , Mehmet Akif Sarıkaya
Inductively Coupled Plasma Mass Spectrometry (ICP-MS), when combined with laser ablation (LA-ICP-MS), is widely used for precise U-Pb zircon geochronology. While the technique is powerful, reliable age determinations depend on careful optimization of laser and plasma parameters of the instrument. In this study, we used a Sector Field ICP-MS (Thermo Scientific Element 2) coupled to a New Wave UP213 laser to present practical, step-by-step measurement and optimization procedures for geological U–Pb dating.
Here, Design of Experiments (DoE)—a statistical approach that evaluates multiple parameters simultaneously and reveals their interactions—was used to efficiently determine optimal analytical conditions with a limited number of experiments. For the first time in LA-ICP-MS research, a DoE framework was applied to systematically optimize instrumental settings. DoE is also compared with conventional manual optimization strategies, providing clear evidence that statistically guided optimization enhances accuracy, reproducibility, and robustness, while reducing operator bias and time required for method development.
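As a concrete illustration of how a two-level factorial DoE separates main effects from interactions, a short self-contained sketch is given below; the factor names, levels, and simulated response are hypothetical stand-ins, not the instrument settings or quality metric optimized in this study.

# Two-level full-factorial DoE sketch (hypothetical factors and a simulated
# response; the authors' actual factors, levels, and response metric are not
# reproduced here). Estimates main effects and two-way interactions.
import itertools
import numpy as np

factors = ["laser_energy", "spot_size", "carrier_gas_flow"]   # assumed names
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))

# simulated response standing in for a measured quality metric;
# replace with real measurements in practice
rng = np.random.default_rng(0)
response = (10 + 2.0*design[:, 0] - 1.5*design[:, 1]
            + 1.0*design[:, 0]*design[:, 1] + rng.normal(0, 0.2, len(design)))

# model matrix: intercept, main effects, two-way interactions
cols = [np.ones(len(design))] + [design[:, i] for i in range(len(factors))] \
       + [design[:, i]*design[:, j] for i, j in itertools.combinations(range(len(factors)), 2)]
X = np.column_stack(cols)
effects, *_ = np.linalg.lstsq(X, response, rcond=None)

names = ["intercept"] + factors + [f"{a} x {b}" for a, b in itertools.combinations(factors, 2)]
for n, e in zip(names, effects):
    print(f"{n:>35s}: {e:+.3f}")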
• The outcome of our method demonstrates that DoE not only identifies the most influential parameters but also accounts for interactions often overlooked during manual refinement, supporting the optimization of U-Pb dating using LA-ICP-MS.
• First systematic application of DoE in LA-ICP-MS studies.
• Comparative evaluation of manual versus DoE-based optimization.
{"title":"Method development with LA-ICP-MS in U-Pb geochronology using design of experiment","authors":"Merve Caner , Murat Tunç , Gürsel Sunal , Mehmet Akif Sarıkaya","doi":"10.1016/j.mex.2026.103795","DOIUrl":"10.1016/j.mex.2026.103795","url":null,"abstract":"<div><div>Inductively Coupled Plasma Mass Spectrometry (ICP-MS), when combined with laser ablation (LA-ICP-MS), is widely used for precise U-Pb zircon geochronology. While the technique is powerful, reliable age determinations depend on careful optimization of laser and plasma parameters of the instrument. In this study, we used a Sector Field ICP-MS (Thermo Scientific Element 2) coupled to a New Wave UP213 laser to present practical, step-by-step measurement and optimization procedures for geological U–Pb dating.</div><div>Here, Design of Experiments (DoE)—a statistical approach that evaluates multiple parameters simultaneously and reveals their interactions—was used to efficiently determine optimal analytical conditions with a limited number of experiments. For the first time in LA-ICP-MS research, a DoE framework was applied to systematically optimize instrumental settings. DoE is also compared with conventional manual optimization strategies, providing clear evidence that statistically guided optimization enhances accuracy, reproducibility, and robustness, while reducing operator bias and time required for method development.<ul><li><span>•</span><span><div>The outcome of our method demonstrate that DoE not only identifies the most influential parameters but also accounts for interactions often overlooked during manual refinement. Optimization of U-Pb dating using LA-ICP-MS.</div></span></li><li><span>•</span><span><div>First systematic application of DoE in LA-ICP-MS studies.</div></span></li><li><span>•</span><span><div>Comparative evaluation of manual versus DoE-based optimization.</div></span></li></ul></div></div>","PeriodicalId":18446,"journal":{"name":"MethodsX","volume":"16 ","pages":"Article 103795"},"PeriodicalIF":1.9,"publicationDate":"2026-01-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146034292","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2026-01-14. DOI: 10.1016/j.mex.2026.103797
Eugene B. Postnikov , Anastasia I. Lavrova
Recording the fluorescence intensity produced by resazurin conversion in the course of the respective microtiter assay, as a response to antibacterial drugs, provides a basis for the quantitative characterisation of their activity. At the same time, this assay operates with relatively small samples, and the studied biochemical process exhibits variability. In this work, we address the question of how to report the minimal inhibitory concentration, arguing in favour of a range rather than a single value. To achieve this goal, we propose a method for the combinatorial enhancement of single microplate-based data, followed by non-parametric statistical processing. The approach is illustrated by the case study of ten first- and second-line anti-tuberculosis drugs acting on the standard laboratory strain H37Rv of Mycobacterium tuberculosis.
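A rough sketch of the general idea, combinatorial enumeration of replicate wells followed by a non-parametric summary of the resulting MIC values, is given below; the plate layout, inhibition threshold, and numbers are invented for illustration and do not reproduce the paper's actual procedure.

# Illustrative sketch: combinatorial resampling of single-plate replicates to
# report an MIC *range* (assumed setup: 3 replicate wells per concentration,
# MIC = lowest concentration with >= 90 % inhibition; NOT the paper's method).
import itertools
import numpy as np

concentrations = np.array([0.03, 0.06, 0.12, 0.25, 0.5, 1.0])   # µg/mL, assumed
# fractional inhibition per (concentration, replicate); assumed example values
inhibition = np.array([
    [0.05, 0.10, 0.02],
    [0.20, 0.35, 0.15],
    [0.55, 0.70, 0.48],
    [0.88, 0.95, 0.91],
    [0.97, 0.99, 0.96],
    [0.99, 1.00, 0.99],
])

def mic(profile, threshold=0.9):
    """Lowest concentration whose inhibition reaches the threshold."""
    idx = np.argmax(profile >= threshold)
    return concentrations[idx] if profile[idx] >= threshold else np.inf

# combinatorial enhancement: one replicate chosen per concentration, all combos
combos = itertools.product(*(range(inhibition.shape[1]),) * len(concentrations))
mics = np.array([mic(inhibition[np.arange(len(concentrations)), list(c)])
                 for c in combos])

# non-parametric summary: median and 2.5-97.5 percentile range of the MIC
print("MIC median:", np.median(mics))
print("MIC range (2.5-97.5 %):", np.percentile(mics, [2.5, 97.5]))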
{"title":"Quantifying uncertainty of tuberculosis drug susceptibility range from single-microplate test","authors":"Eugene B. Postnikov , Anastasia I. Lavrova","doi":"10.1016/j.mex.2026.103797","DOIUrl":"10.1016/j.mex.2026.103797","url":null,"abstract":"<div><div>The registration of fluorescence intensity converted from resazurin in the course of the respective microtiter assay, as a response to the activity of antibacterial drugs, provides a basis for the quantitative characterisation of their activity. At the same time, this assay operates with relatively small samples, and the studied biochemical process exhibits variability. In this work, we address the question of reporting the minimal inhibitory concentration arguing in favour of its range rather than a single value. To achieve this goal, we propose a method for the combinatorial enhancement of single microplate-based data, followed by non-parametric statistical processing. The approach is illustrated by the case study of ten first- and second-line anti-tuberculosis drugs acting on the standard laboratory strain H37Rv of <em>Mycobacterium tuberculosis</em>.</div></div>","PeriodicalId":18446,"journal":{"name":"MethodsX","volume":"16 ","pages":"Article 103797"},"PeriodicalIF":1.9,"publicationDate":"2026-01-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146034290","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Background
Health policies in residential care settings for older adults are essential to align housing and care provision with the needs of ageing populations, ensuring safety, participation, and quality of life. Despite their importance, evidence regarding policy implementation in long-term care (LTC) facilities remains fragmented and inconsistent.
Objective
This protocol describes an original method for synthesizing health policy documents using the Joanna Briggs Institute (JBI) framework for textual evidence. It aims to identify, analyze, and integrate policy evidence related to LTC facilities for older adults.
Methods
Following JBI methodological guidance for systematic reviews of text and opinion, this protocol employs the PICo framework to define inclusion criteria and a three-step search strategy. Searches will be conducted in MEDLINE, CINAHL Complete®, and the Virtual Health Library (BVS). Eligible documents include laws, regulations, policy frameworks, and technical guidelines addressing long-term care within publicly regulated systems. Data extraction and quality appraisal will be independently performed by two reviewers using JBI instruments.
Expected Results
The review will synthesize existing policies, highlighting their characteristics, implementation strategies, and outcomes. By applying a transparent and replicable JBI-based method, this protocol supports the production of high-quality evidence to inform equitable and effective governance in LTC facilities.
{"title":"Health policies in long-term care facilities for older adults: A systematic review of textual evidence","authors":"Florbela Bia , Matheus Kirton dos Anjos , Zaida Charepe , Cristina Marques-Vieira","doi":"10.1016/j.mex.2026.103798","DOIUrl":"10.1016/j.mex.2026.103798","url":null,"abstract":"<div><h3>Background</h3><div>Health polices in residential care settings for older adults are essential to align housing and care provision with the needs of ageing populations, ensuring safety, participation, and quality of life. Despite their importance, evidence regarding policy implementation in long-term care (LTC) facilities remains fragmented and inconsistent.</div></div><div><h3>Objective</h3><div>This protocol describes an original method for synthesizing health policy documents using the Joanna Briggs Institute (JBI) framework for textual evidence. It aims to identify, analyze, and integrate policy evidence related to LTC facilities for older adults.</div></div><div><h3>Methods</h3><div>Following JBI methodological guidance for systematic reviews of text and opinion, this protocol employs the PICo framework to define inclusion criteria and a three-step search strategy. Searches will be conducted in MEDLINE, CINAHL Complete®, and the Virtual Health Library (BVS). Eligible documents include laws, regulations, policy frameworks, and technical guidelines addressing long-term care within publicly regulated systems. Data extraction and quality appraisal will be independently performed by two reviewers using JBI instruments.</div></div><div><h3>Expected Results</h3><div>The review will synthesize existing polices, highlighting their characteristics, implementation strategies, and outcomes. By applying a transparent and replicable JBI-based method, this protocol supports the production of high-quality evidence to inform equitable and effective governance in LTC facilities</div></div>","PeriodicalId":18446,"journal":{"name":"MethodsX","volume":"16 ","pages":"Article 103798"},"PeriodicalIF":1.9,"publicationDate":"2026-01-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146034287","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2026-01-13. DOI: 10.1016/j.mex.2026.103799
Felice Diekel, Rosalie Arendt, Markus Berger
Environmental and climate policies, as well as the knowledge underpinning them, are often developed in isolation. This is evident in offsetting research and policy, which tend to address carbon, biodiversity, and water as separate issues. This paper presents the development of an adapted scoping review methodology to compare these three distinct bodies of literature within a unified framework, which also allows for the introduction of the emerging water offsetting literature. The approach ensures comparability across datasets of relevant literature while addressing the challenge of managing large volumes of literature within time and resource constraints. It provides a practical solution for managing diverse bodies of literature in scoping reviews, enabling a holistic understanding of the interrelationships among carbon, biodiversity, and water offsetting.
Key elements of the method include:
• Applying a consistent approach across all three datasets, while accommodating the specificities of each.
• Utilizing the machine learning tool ASReviewer to streamline the screening process, alongside a pilot screening phase to establish consistent inclusion criteria.
• Combining quantitative bibliometric analysis with qualitative thematic analysis.
{"title":"A comparative scoping review approach: identifying the intersection of carbon, biodiversity, and water offsetting","authors":"Felice Diekel, Rosalie Arendt, Markus Berger","doi":"10.1016/j.mex.2026.103799","DOIUrl":"10.1016/j.mex.2026.103799","url":null,"abstract":"<div><div>Environmental and climate policies, as well as the knowledge underpinning them, are often developed in isolation. This is evident in offsetting research and policy, which tend to address carbon, biodiversity, and water as separate issues. This paper presents the development of an adapted scoping review methodology to compare these three distinct bodies of literature within a unified framework, which also allows for the introduction of the emerging water offsetting literature. The approach ensures comparability across datasets of relevant literature while addressing the challenge of managing large volumes of literature within time and resource constraints. It provides a practical solution for managing diverse bodies of literature in scoping reviews, enabling a holistic understanding of the interrelationships among carbon, biodiversity, and water offsetting.</div><div>Key elements of the method include:<ul><li><span>•</span><span><div>Applying a consistent approach across all three datasets, while accommodating the specificities of each.</div></span></li><li><span>•</span><span><div>Utilizing the machine learning tool ASReviewer to streamline the screening process, alongside a pilot screening phase to establish consistent inclusion criteria.</div></span></li><li><span>•</span><span><div>Combining quantitative bibliometric analysis with qualitative thematic analysis.</div></span></li></ul></div></div>","PeriodicalId":18446,"journal":{"name":"MethodsX","volume":"16 ","pages":"Article 103799"},"PeriodicalIF":1.9,"publicationDate":"2026-01-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146034291","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Diabetic peripheral neuropathy (DPN) is the leading cause of disturbances in reactive balance control. The repeated, external mechanical perturbations in perturbation-based balance training (PBBT) evoke balance recovery strategies, which subsequently improve reactive balance performance. Using the practice schedule concept of motor learning in the design of PBBT is a relatively new approach to balance exercise. This study aims to investigate the effects of blocked and random PBBT on reactive balance control, as well as the persistence of these effects and their transfer to conditions different from those experienced during training.
Individuals with DPN will be recruited and randomly allocated to one of three groups: random, blocked, or control. The random and blocked PBBT groups will receive single-session balance training, including unexpected perturbations of the platform during quiet standing in two directions (anterior and posterior) and at three difficulty levels of platform motion (displacement, velocity, and acceleration). Each balance perturbation in the blocked group will be repeated over blocks of four trials. For the random group, the perturbation sequence will be unpredictable across the four trials in each block. Primary outcomes (i.e., center of pressure variables, reaction time, movement time, and total response time) will be assessed at baseline as well as immediately and one day after the intervention.
{"title":"Random vs. blocked perturbation training on reactive balance control in peripheral neuropathy: A protocol study for a randomized controlled trial","authors":"Razieh Javadian Kootenayi , Razieh Mofateh , Mehrnoosh Zakerkish , Neda Orakifar , Saeideh Monjezi , Mohammad Mehravar , Maryam Seyedtabib","doi":"10.1016/j.mex.2026.103796","DOIUrl":"10.1016/j.mex.2026.103796","url":null,"abstract":"<div><div>Diabetic peripheral neuropathy (DPN) is the leading cause of disturbances in reactive balance control. The repeated, external mechanical perturbations in perturbation-based balance training(PBBT) evoke balance recovery strategies; which subsequently improve reactive balance performance. Using the practice schedule concept of motor learning in the design of PBBT is a relatively new approach related to balance exercises. This study aims to investigate the effects of blocked and random PBBT on reactive balance control and its persistency and transfer to conditions different from those experienced during training.</div><div>Individuals with DPN will be recruited and randomly allocated to one of the three groups: random, blocked, and control group. Random and blocked PBBT groups will receive single-session balance training, including unexpected perturbations of platform during quiet standing in two directions (anterior and posterior), and three difficulty levels of platform motion (displacement, velocity, and acceleration). Each balance perturbation in blocked group will be repeated over blocks of four trials. For the random group, perturbation sequence will be unpredictable for these four trials in each block. Primary outcomes (i.e., center of pressure variables, reaction time, movement time, and total response time variables) will be assessed at baseline as well as immediately and one day after intervention.</div></div>","PeriodicalId":18446,"journal":{"name":"MethodsX","volume":"16 ","pages":"Article 103796"},"PeriodicalIF":1.9,"publicationDate":"2026-01-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146034289","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2026-01-09. DOI: 10.1016/j.mex.2026.103794
Thuan Ha, Kwabena Abrefa Nketia, Hansanee Fernando, Sarah van Steenbergen, Shawn Neudorf, Steve J. Shirtliffe
Accurate field boundary delineation is critical for reliable crop-yield modelling and for precision agriculture (PA), enabling site-specific management to optimize resource use and crop productivity. Traditional boundary mapping methods, such as manual digitization and semi-automated extraction from farm machinery, are labor-intensive and challenging to apply at large scales. Advances in high-resolution land cover data and satellite imagery offer scalable solutions for automated field boundary extraction. In this study, we propose a fully automated workflow that integrates a pre-trained foundation model, the Segment Anything Model (SAM) [1], with time-series Sentinel-2 imagery. Seasonal composites of Red, Green, and Blue bands were generated at different phenological stages to support segmentation. The method was applied across over 32 million hectares (79 million acres) of cultivated land in the Canadian Prairies, achieving an intersection-over-union (IoU) accuracy of 0.86 compared to manual segmentation. The workflow consists of four main steps: (1) setting up the Python working environment; (2) seasonal image acquisition and preprocessing using Google Earth Engine via the Python API; (3) field boundary segmentation using SAM; and (4) post-processing and feature cleaning using ArcGIS Pro. This approach demonstrates a scalable, efficient solution for large-scale field boundary mapping to support PA applications.
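As a rough illustration of the segmentation step (step 3), the sketch below runs Meta's segment-anything automatic mask generator on a seasonal RGB composite; the file path, checkpoint name, and reflectance scaling are assumptions, and the Google Earth Engine preprocessing and ArcGIS Pro cleaning steps are not reproduced here.

# Minimal sketch of the SAM segmentation step on a seasonal RGB composite;
# the input path, checkpoint file, and scaling are assumptions, not the
# authors' exact pipeline.
import numpy as np
import rasterio
from segment_anything import sam_model_registry, SamAutomaticMaskGenerator

# load a Sentinel-2 seasonal RGB composite exported from Google Earth Engine
with rasterio.open("s2_seasonal_rgb.tif") as src:          # hypothetical path
    rgb = src.read([1, 2, 3]).transpose(1, 2, 0).astype(np.float32)

# stretch reflectance to 8-bit, since SAM expects an ordinary RGB image
rgb8 = np.clip(rgb / np.percentile(rgb, 98) * 255, 0, 255).astype(np.uint8)

sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
masks = SamAutomaticMaskGenerator(sam).generate(rgb8)

# each mask is a dict with a boolean 'segmentation' array and an 'area';
# candidate field polygons can then be vectorized and cleaned downstream
print(f"{len(masks)} candidate field segments")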
• Integrates a foundation segmentation model (SAM) with Sentinel-2 seasonal imagery
• Demonstrates high-accuracy, large-scale automated field boundary delineation
• Provides a reproducible workflow adaptable to other regions and datasets
{"title":"Field boundary delineation with seasonal sentinel 2 imagery using Segment Anything Model (SAM)","authors":"Thuan Ha, Kwabena Abrefa Nketia, Hansanee Fernando, Sarah van Steenbergen, Shawn Neudorf, Steve J. Shirtliffe","doi":"10.1016/j.mex.2026.103794","DOIUrl":"10.1016/j.mex.2026.103794","url":null,"abstract":"<div><div>Accurate field boundary delineation is critical for accurate modelling on crop yields and for precision agriculture (PA), enabling site-specific management to optimize resource use and crop productivity. Traditional boundary mapping methods, such as manual digitization and semi-automated extraction from farm machinery, are labor-intensive and challenging to apply at large scales. Advances in high-resolution land cover data and satellite imagery offer scalable solutions for automated field boundary extraction. In this study, we propose a fully automated workflow that integrates a pre-trained foundation model, the Segment Anything Model - SAM [<span><span>1</span></span>] with time-series Sentinel-2 imagery. Seasonal composites of Red, Green, and Blue bands were generated at different phenological stages to support segmentation. The method was applied across over 32 million hectares (79 million acres) of cultivated land in the Canadian Prairies, achieving an intersection-over-union (IoU) accuracy of 0.86 compared to manual segmentation. The workflow consists of four main steps: (1) setting the python working environment, (2) seasonal image acquisition and preprocessing using Google Earth Engine via Python API; (3) field boundary segmentation using SAM; and (4) post-processing and feature cleaning using ArcGIS Pro. This approach demonstrates a scalable, efficient solution for large-scale field boundary mapping to support PA applications.<ul><li><span>•</span><span><div>Integrates a foundation segmentation model (SAM) with Sentinel-2 seasonal imagery</div></span></li><li><span>•</span><span><div>Demonstrates high-accuracy, large-scale automated field boundary delineation</div></span></li><li><span>•</span><span><div>Provides a reproducible workflow adaptable to other regions and datasets</div></span></li></ul></div></div>","PeriodicalId":18446,"journal":{"name":"MethodsX","volume":"16 ","pages":"Article 103794"},"PeriodicalIF":1.9,"publicationDate":"2026-01-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146034293","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}