This article presents the development and validation of a computerized clinical protocol for assessing Central Visual Processing (CVP). The protocol was designed to overcome limitations in current visual assessment tools by integrating sensory, perceptual, and cognitive visual functions within the dorsal and ventral processing streams. It comprises psychophysically controlled tasks measuring contrast sensitivity, texture perception, coherent motion, form integration, visual attention, reading-related eye movements, quantity estimation, and spatial-numerical mapping. Stimuli were developed using high-precision presentation software, and procedures were adapted to ensure both clinical feasibility and psychophysical validity.
Method validation was conducted with 41 healthy adults through test–retest analysis, Cronbach’s alpha, and Spearman–Brown split-half reliability. No significant differences were observed between first and second assessments (p > 0.05), and reliability indices showed strong internal consistency across subtests. These findings confirm the reproducibility and methodological robustness of the protocol.
• A comprehensive computerized battery assessing central visual functions across dorsal and ventral streams
• Psychophysical methods adapted for clinical precision and feasibility
• Strong reliability demonstrated through test–retest, internal consistency, and split-half correlations
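The reliability indices reported above have closed-form definitions. A minimal sketch, assuming assessment scores arranged as one row per subject and one column per item (toy illustration, not the authors' analysis code):

```python
# Sketch of the reliability indices named above, assuming scores arranged as
# a subjects x items matrix. Toy illustration, not the authors' analysis code.
from statistics import pvariance

def _pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def cronbach_alpha(scores):
    """Internal consistency from item variances vs. total-score variance."""
    k = len(scores[0])
    item_vars = [pvariance([row[i] for row in scores]) for i in range(k)]
    total_var = pvariance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

def spearman_brown_split_half(scores):
    """Correlate odd- vs. even-item half scores, then step up to full length
    with the Spearman-Brown prophecy formula."""
    odd = [sum(row[0::2]) for row in scores]
    even = [sum(row[1::2]) for row in scores]
    r = _pearson(odd, even)
    return 2 * r / (1 + r)
```

With perfectly consistent items, both indices reach 1.0; real subtest data would land below that ceiling.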
"Defining a clinical protocol using a computerized central visual processing battery" — Marcelo Fernandes Costa, Leonardo Dutra Henriques, Givago Silva Souza. MethodsX, vol. 16, Article 103789, doi:10.1016/j.mex.2026.103789.
Pub Date: 2026-06-01. Epub Date: 2026-01-02. DOI: 10.1016/j.mex.2026.103785
Iuria Betco , Cláudia M. Viana , Jorge Rocha
Street-level imagery (SLI) is increasingly used in urban analytics for tasks such as estimating greenery, conducting transport audits, and assessing facades. However, inconsistent image quality, uneven spatial coverage, and non-standardized acquisition methods limit reproducibility. We introduce USE-SVI (Urban Sampling & Extraction of Street View Imagery), a reproducible pipeline to sample, acquire, and stitch street-view images for city-wide analysis. The protocol ensures regular spatial coverage by sampling points at fixed intervals, generates four viewing directions per point to capture the main views, acquires images through official Street View APIs or open-licence platforms (e.g., Mapillary or KartaView) with detailed metadata recording, and creates panoramas using OpenCV (e.g., ORB keypoints, FLANN matching, Stitcher). This approach produces evenly spaced images, clear provenance, and ready-to-use outputs (CSV, PNG, XLSX), supporting machine learning and visual checks. By standardizing the key steps (sampling, acquisition, and stitching), USE-SVI enhances transparency and scalability, adheres to platform terms, and enables replication across cities and time periods. Limitations involve variable provider coverage and occasional stitching failures in scenes with few features.
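The sampling step described above (points at a fixed interval, four headings each) can be sketched for a single idealised straight street segment; the function name and tuple layout are illustrative assumptions, not the USE-SVI implementation:

```python
# Hedged sketch of the sampling step: evenly spaced points along one straight
# street segment, each carrying four viewing directions. Illustrative only,
# not the USE-SVI code.
import math

def sample_points(x0, y0, x1, y1, interval_m):
    """Return (x, y, headings) samples spaced interval_m apart."""
    length = math.hypot(x1 - x0, y1 - y0)
    n = int(length // interval_m) + 1
    ux, uy = (x1 - x0) / length, (y1 - y0) / length
    bearing = math.degrees(math.atan2(ux, uy)) % 360  # 0 = north, clockwise
    samples = []
    for i in range(n):
        d = i * interval_m
        headings = [(bearing + a) % 360 for a in (0, 90, 180, 270)]
        samples.append((x0 + ux * d, y0 + uy * d, headings))
    return samples
```

A real pipeline would walk every edge of a street-network graph this way before requesting imagery at each (point, heading) pair.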
"USE-SVI: A reproducible pipeline for sampling, acquiring, and stitching Street View imagery to support urban analytics" — Iuria Betco, Cláudia M. Viana, Jorge Rocha. MethodsX, vol. 16, Article 103785, doi:10.1016/j.mex.2026.103785.
Pub Date: 2026-06-01. Epub Date: 2025-12-25. DOI: 10.1016/j.mex.2025.103782
S Sowmyadevi, Anna Alphy
The mutation-aware test prioritisation system presented in this paper uses Graph Neural Networks (GNNs) to combine static program structure, dynamic execution traces, and mutation coverage into a hybrid graph representation that enhances regression testing. The framework embeds higher-order dependencies among test cases using GCN, GAT, and GraphSAGE variants and ranks them with a multi-objective optimisation function that balances fault detection, execution cost, and mutation coverage. On benchmark datasets such as Defects4J and ManySStuBs4J, the proposed approach consistently outperforms traditional baselines (coverage-based APFD = 72.4 %, cost-based = 74.5 %) and ML baselines (LSTM = 80.1 %, RL = 82.7 %), achieving an average APFD of 88.9 % and a mutation score of 84.6 % with a 16.1-second execution overhead. Statistical tests (Wilcoxon signed-rank, p < 0.05) confirm the robustness of these gains. Ablation experiments show that removing execution traces or mutation characteristics reduces APFD by 5–8 %, underscoring their relevance. Qualitative analysis shows that GNN embeddings cluster fault-related test cases, supporting interpretable prioritisation. The proposed framework offers a scalable, accurate, mutation-driven approach to contemporary regression testing.
• Multi-Tiered Graph-Based Architecture: The method transforms raw program artifacts (codebase, mutants, test traces) into Program Dependence Graphs and Call Graphs, where nodes represent program elements and edges capture dependencies enriched with runtime characteristics.
• GNN-Powered Multi-Objective Optimization: The core innovation uses Graph Neural Networks (GCN, GAT, GraphSAGE) to create enriched embeddings through iterative neighborhood aggregation, feeding into a scoring function that balances fault detection potential, execution cost, and mutation coverage.
• Superior Validated Performance: Achieves 88.9 % APFD compared to 82.7 % for the best baseline method on real-world datasets, with statistical significance confirmed through Wilcoxon signed-rank tests across multiple evaluation metrics.
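APFD, the headline metric above, rewards orderings that reveal faults early. A minimal sketch with toy data (not the paper's benchmark harness); it assumes every fault is detected by at least one executed test:

```python
# Sketch of the standard APFD metric: APFD = 1 - sum(TF_i)/(n*m) + 1/(2n),
# where TF_i is the position of the first test revealing fault i.
# Toy illustration, not the paper's evaluation harness.
def apfd(order, detects):
    """order: test ids in execution order.
    detects: {test_id: set of fault ids that test reveals}.
    Assumes every fault is revealed by at least one test in `order`."""
    faults = set().union(*detects.values())
    n, m = len(order), len(faults)
    first_pos = {}
    for pos, test_id in enumerate(order, start=1):
        for f in detects.get(test_id, ()):
            first_pos.setdefault(f, pos)  # keep earliest revealing position
    return 1 - sum(first_pos[f] for f in faults) / (n * m) + 1 / (2 * n)
```

Running the fault-revealing test first yields the higher score, which is exactly what a prioritiser optimises.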
"Graph neural network-based mutation-aware regression test ordering using code dependency graphs and execution traces" — S Sowmyadevi, Anna Alphy. MethodsX, vol. 16, Article 103782, doi:10.1016/j.mex.2025.103782.
Oceans exhibit complex dynamics influenced by climate change, anthropogenic activities, and natural phenomena. Understanding these dynamics is critical for ensuring the sustainability of marine environments and their optimal utilization. This research aims to study and monitor upwelling phenomena in the South Sea of Java. Upwelling, the upward transport of cold, nutrient-rich water from deeper layers to the surface, enhances marine biological productivity; Sea Surface Temperature (SST) serves as a key indicator for its detection. To achieve these objectives, this study employs both ConvLSTM and 3D-CNN. ConvLSTM, a deep learning architecture that integrates convolutional structures within LSTM units, effectively captures spatiotemporal dependencies in sequential data. 3D-CNN, a deep learning model extending traditional 2D convolutional neural networks, processes volumetric data, enabling the extraction of spatial features across three dimensions. Analysis reveals that ConvLSTM outperforms 3D-CNN in modeling upwelling data in the South Sea of Java, as evidenced by lower Root Mean Square Error (RMSE) and Mean Absolute Error (MAE). The ConvLSTM method was then used for forecasting, and the results were validated against data obtained from local fishermen about their fishing expeditions. Visual analysis confirms that the ConvLSTM forecasts of upwelling in the South Sea of Java align with the fishermen's schedules.
ConvLSTM and 3D-CNN methods were comparatively evaluated for modeling Sea Surface Temperature (SST) data, considering wind speed, sea surface salinity, and the El Niño-Southern Oscillation (ENSO) phase as influential factors.
Based on Root Mean Square Error (RMSE) and Mean Absolute Error (MAE) values, the ConvLSTM method exhibited lower values, indicating superior performance compared to the 3D-CNN approach. Specifically, RMSE and MAE values for ConvLSTM were 0.4161 and 0.3017, respectively, while for 3D-CNN, the corresponding values were 0.6095 and 0.4259.
Upwelling data forecasting results were validated against local fishermen's schedules, with data collected in July 2022. Visual inspection confirmed alignment between the forecasted upwelling patterns and the fishermen's activity.
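The two error metrics behind the comparison above are standard; a minimal sketch applied to toy values (the reported SST figures come from the authors' trained models, not from this code):

```python
# Minimal sketch of the two error metrics used for the model comparison.
import math

def rmse(obs, pred):
    """Root Mean Square Error: penalises large deviations quadratically."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def mae(obs, pred):
    """Mean Absolute Error: average magnitude of the deviations."""
    return sum(abs(o - p) for o, p in zip(obs, pred)) / len(obs)
```

Because RMSE squares residuals, a model with occasional large SST errors is punished more by RMSE than by MAE, which is why reporting both gives a fuller picture.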
"Predicting upwelling dynamics in the South Sea of Java, Indonesia: A deep learning approach with ConvLSTM and 3D-CNN" — Dwi Rantini, Rumaisa Kruba, Yudi Haditiar, Muhammad Ikhwan, Yusuf Jati Wijaya, Aris Ismanto, Muhammad Mahdy Yandra, Hafiz Rahman, Arip Ramadan, Fazidah Othman. MethodsX, vol. 16, Article 103802, doi:10.1016/j.mex.2026.103802. Pub Date: 2026-06-01.
Pub Date: 2026-06-01. Epub Date: 2026-01-06. DOI: 10.1016/j.mex.2026.103787
Alexandra N. Buntin-Nakamura, Daniel Quintana
Head fixation combined with cranial window, GRIN lens, or prism implantation is a common technique in systems neuroscience for real-time imaging of neural activity. A critical step in these experiments is the removal of the implants and subsequent histological processing to assess tissue integrity and verify opsin expression. However, removing implants from the dorsal surface of the skull often causes traumatic damage to the underlying cortex. This study introduces a novel technique for intact brain excision that removes dorsal implants while preserving cortical integrity. Instead of the conventional dorsal approach, which risks cortical damage, the skull is resected from the ventral side, allowing the implant to remain in place for post-mortem analysis.
This approach:
• Enables dorsal implants to remain embedded in the brain during fixation.
• Improves reproducibility by standardizing the extraction method, reducing variability introduced by traditional dorsal implant removal.
• Preserves tissue integrity that would otherwise be compromised by removing the implant from fresh brain tissue.
"Ventral skull based approach to whole-brain extraction in mice" — Alexandra N. Buntin-Nakamura, Daniel Quintana. MethodsX, vol. 16, Article 103787, doi:10.1016/j.mex.2026.103787.
Pub Date: 2026-06-01. Epub Date: 2026-01-07. DOI: 10.1016/j.mex.2026.103792
Ambreen Gul , Abdul Qayyum Rao , Allah Bakhsh
RNA extraction from polyphenolic-rich plants poses significant challenges, demanding precise sample handling and rigorous experimental conditions to ensure high RNA yield and quality. Numerous kits are available for RNA extraction from various sources; however, extracting RNA from phenolic-rich plants requires special attention. Manual extraction methods, on the other hand, are time-consuming and involve harsh chemicals (LiCl, phenol, guanidine thiocyanate). To address this issue, we adapted and compared three strategies for RNA extraction. We found that a CTAB-based extraction buffer followed by ammonium acetate purification produces high-quality RNA from cotton within two hours. A CTAB extraction step followed by column purification (DirectZol RNA purification kit) not only accelerates the process (∼30 min) but also produces high-quality RNA suitable for downstream applications such as real-time quantitative PCR, sequencing, and RNA library preparation. The RNA yield from cotyledonary leaves of cotton ranged between 80 and 100 µg per 100 mg of tissue for CTAB-ammonium acetate, compared to at most 12 µg per 100 mg for guanidinium thiocyanate-ammonium acetate extraction. RNA purification from mature cotton leaves was unsuccessful with guanidinium thiocyanate. Downstream applications (qPCR and sequencing) confirmed that the RNA obtained from phenolic-rich mature cotton leaves was contaminant-free.
• The modified method is rapid: the fully manual procedure can be completed within 2 h.
• For faster purification, the CTAB-buffer extraction can be combined with an RNA purification kit (∼30–40 min).
• The modified method yields high-quality, phenol-free RNA for downstream applications such as real-time quantitative PCR, sequencing, and RNA library preparation.
"Development of a rapid and modified total RNA extraction method from polyphenolic-rich Gossypium hirsutum" — Ambreen Gul, Abdul Qayyum Rao, Allah Bakhsh. MethodsX, vol. 16, Article 103792, doi:10.1016/j.mex.2026.103792.
Pub Date: 2026-06-01. Epub Date: 2025-12-12. DOI: 10.1016/j.mex.2025.103757
Sayed Saber , Abdullah A. Alahmari
This study presents a novel numerical framework for simulating glucose-insulin regulatory dynamics using the Caputo-Fabrizio (CF) fractal-fractional operator with both constant and variable fractional orders. The model incorporates an exponential decay kernel to capture memory and hereditary effects in metabolic regulation. A Newton interpolation-based numerical scheme is developed to approximate the CF-FF derivatives, ensuring computational stability and accuracy. For the variable-order formulation, the fractional order β(t) evolves dynamically with time, reflecting the physiological variability typically observed during intravenous glucose tolerance tests (IVGTT). Numerical experiments reproduce physiologically realistic glucose-insulin oscillations and demonstrate how feedback control stabilizes chaotic metabolic behavior. The results are based entirely on simulation evidence calibrated within clinically reported parameter ranges, providing conceptual validation rather than direct patient-data comparison. The proposed approach bridges fractional calculus with biomedical applications, offering new insights for personalized diabetes management and adaptive glucose control strategies.
• Fractal-fractional model formulation capturing glucose-insulin memory and adaptation
• Stable numerical scheme using Newton interpolation for accurate fractional integration
• Linear feedback control applied to regulate chaotic glucose-insulin dynamics
• Numerical methodology: our investigation of the fractal-fractional glucose-insulin system employs the following analytical framework.
• Model development: a fractal-fractional-order extension of the minimal glucose-insulin model, incorporating an exponential decay-type kernel to capture the system's memory effects and the anomalous diffusion characteristics inherent in metabolic processes. The model accounts for both insulin-dependent and insulin-independent glucose utilization.
• Computational implementation: a novel numerical solver based on Newton's interpolation polynomials, implementing the Atangana-Seda fractal-fractional derivative formulation. This method provides an efficient computational framework for solving the coupled nonlinear fractional differential equations while maintaining numerical stability across fractional orders.
• A mathematical model is defined to study the dynamic behavior of glucose-insulin physiology.
• With the Adams-Bashforth-Moulton numerical scheme, we compute the Lyapunov exponents.
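For orientation only, the state variables being simulated can be illustrated with a classical integer-order minimal glucose-insulin model under forward Euler integration. This is not the paper's Caputo-Fabrizio fractal-fractional scheme, and every parameter value below is hypothetical, not clinically calibrated:

```python
# Illustrative only: an integer-order minimal glucose-insulin model under
# forward Euler, showing the coupled state variables the paper simulates.
# NOT the Caputo-Fabrizio fractal-fractional scheme; parameters hypothetical.
def simulate(g0=250.0, i0=10.0, gb=90.0, ib=7.0,
             p1=0.03, p2=1e-4, gamma=0.005, n_clear=0.09,
             dt=0.1, t_end=180.0):
    """Return a list of (time, glucose, insulin) samples over t_end minutes."""
    g, i = g0, i0
    t = 0.0
    traj = [(t, g, i)]
    while t < t_end - 1e-9:
        dg = -p1 * (g - gb) - p2 * i * g                    # clearance + insulin action
        di = gamma * max(g - gb, 0.0) - n_clear * (i - ib)  # secretion + degradation
        g += dt * dg
        i += dt * di
        t += dt
        traj.append((t, g, i))
    return traj
```

A fractional-order solver replaces the local Euler update with a memory-weighted sum over the whole trajectory history, which is where the Newton-interpolation scheme of the paper comes in.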
{"title":"Existence, Stability, and Control of Glucose-Insulin Dynamics via Caputo-Fabrizio Fractal-Fractional Operators","authors":"Sayed Saber , Abdullah A. Alahmari","doi":"10.1016/j.mex.2025.103757","DOIUrl":"10.1016/j.mex.2025.103757","url":null,"abstract":"<div><div>This study presents a novel numerical framework for simulating glucose-insulin regulatory dynamics using the Caputo-Fabrizio (CF) fractal-fractional operator with both constant and variable fractional orders. The model incorporates an exponential decay kernel to capture memory and hereditary effects in metabolic regulation. A Newton interpolation-based numerical scheme is developed to approximate the CF-FF derivatives, ensuring computational stability and accuracy. For the variable-order formulation, the fractional order <span><math><mrow><mi>β</mi><mo>(</mo><mi>t</mi><mo>)</mo></mrow></math></span> dynamically evolves with time, reflecting physiological variability typically observed during intravenous glucose tolerance tests (IVGTT). Numerical experiments reproduce physiologically realistic glucose-insulin oscillations and demonstrate how feedback control stabilizes chaotic metabolic behavior. The results are based entirely on simulation evidence calibrated within clinically reported parameter ranges, providing conceptual validation rather than direct patient-data comparison. 
The proposed approach bridges mathematical fractional calculus with biomedical applications, offering new insights for personalized diabetes management and adaptive glucose control strategies.<ul><li><span>•</span><span><div>Fractal-fractional model formulation capturing glucose-insulin memory and adaptation</div></span></li><li><span>•</span><span><div>Stable numerical scheme using Newton interpolation for accurate fractional integration</div></span></li><li><span>•</span><span><div>Linear feedback control applied to regulate chaotic glucose-insulin dynamics</div></span></li><li><span>•</span><span><div>Numerical Methodology for glucose-insulin dynamics. Our investigation of the fractal-fractional glucose-insulin system employs the following analytical framework:</div></span></li><li><span>•</span><span><div>Model Development: We formulate a fractal-fractional-order extension of the minimal glucose insulin model, incorporating an exponential decay type kernel to capture the system's memory effects and anomalous diffusion characteristics inherent in metabolic processes. The model accounts for both insulin-dependent and independent glucose utilization dynamics.</div></span></li><li><span>•</span><span><div>Computational Implementation: We develop a novel numerical solver based on Newton's interpolation polynomials, implementing the Atangana-Seda fractal-fractional derivative formulation. 
This method provides an efficient computational framework for solving the coupled nonlinear fractional differential equations while maintaining numerical stability across different fractional orders.</div></span></li><li><span>•</span><span><div>The purpose of this section is to define a mathematical model to study the dynamic behavior of glucose-insulin physiology.</div></span></li><li><span>•</span><span><div>With the Adams-Bashforth-Moulton numerical scheme, we compute the Lyap","PeriodicalId":18446,"journal":{"name":"MethodsX","volume":"16 ","pages":"Article 103757"},"PeriodicalIF":1.9,"publicationDate":"2026-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145798161","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
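The authors' solver (Newton interpolation over the Caputo-Fabrizio fractal-fractional operator) is not reproduced in this listing, but the flavour of such schemes can be sketched with a simpler, published two-step method for constant-order CF derivatives, in the spirit of Atangana-Owolabi: y_{n+1} = y_n + (1-β)(f_n - f_{n-1}) + βh(3f_n/2 - f_{n-1}/2), with normalisation M(β) = 1. Both the scheme choice and the toy two-state glucose-insulin model below are assumptions for illustration, not the paper's formulation.

```python
def toy_glucose_insulin(t, y):
    """Illustrative two-state stand-in (NOT the paper's model): glucose G
    relaxes toward a basal level and is consumed by insulin I, while I
    rises with excess glucose and decays."""
    G, I = y
    dG = -0.03 * (G - 90.0) - 0.002 * G * I
    dI = 0.01 * max(G - 90.0, 0.0) - 0.1 * I
    return [dG, dI]

def cf_ab2_step(f, t, h, y_prev, y_curr, beta):
    """One step of the two-step scheme above for CF_D^beta y = f(t, y),
    taking M(beta) = 1 (an assumed simplification)."""
    fn, fp = f(t, y_curr), f(t - h, y_prev)
    return [yc + (1.0 - beta) * (a - b) + beta * h * (1.5 * a - 0.5 * b)
            for yc, a, b in zip(y_curr, fn, fp)]

def simulate(beta=0.95, h=0.1, steps=2000, y0=(250.0, 0.0)):
    """Integrate the toy model; the first point is bootstrapped with Euler."""
    y_prev = list(y0)
    y_curr = [a + h * d for a, d in zip(y_prev, toy_glucose_insulin(0.0, y_prev))]
    for n in range(1, steps):
        y_prev, y_curr = y_curr, cf_ab2_step(
            toy_glucose_insulin, n * h, h, y_prev, y_curr, beta)
    return y_curr
```

Setting β close to 1 recovers an essentially classical two-step Adams-Bashforth update; the (1-β)(f_n - f_{n-1}) term is the exponential-kernel memory contribution.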
Tamil Nadu conducted the first round of the WHO STEPS survey in 2020 to assess non-communicable disease (NCD) risk factors. Subsequently, in 2021, the state launched the Makkalai Thedi Maruthuvam (MTM) initiative to deliver home-based care for individuals with NCDs. To evaluate the effectiveness of the program, we designed a population-based survey in 2023–2024, using electoral booths as primary sampling units, to assess control rates of hypertension and diabetes among individuals aged 18–69 years with the same WHO STEPS approach. The secondary objective was to estimate the prevalence of behavioural and biological NCD risk factors. A multistage cluster sampling method was used across Tamil Nadu, and a total of 8880 participants were selected from 148 clusters. One eligible adult per household was selected using the Kish method. Data were collected through structured interviews covering sociodemographic characteristics, together with anthropometric and biological measurements. This survey is the first in Tamil Nadu to use updated electoral data as a sampling frame to estimate the care cascade of hypertension and diabetes. Although limited by its cross-sectional design and the absence of biochemical marker tests such as HbA1c and cholesterol, the survey offers a practical and scalable model for NCD surveillance in India.
•
In the absence of recent census data for India, this study used updated electoral booth data as primary sampling units to ensure population representativeness.
•
Comprehensive assessment of the hypertension and diabetes care cascade, including prevalence, awareness, treatment, and control rates among individuals aged 18–69 years.
•
Designed to evaluate the impact of the Makkalai Thedi Maruthuvam (MTM) scheme and to support evidence-based NCD interventions in Tamil Nadu.
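The Kish method used for within-household selection picks one respondent from the eligible adults using pre-assigned selection tables. As a rough illustration, the sketch below implements one widely reproduced version of the Kish (1949) selection tables; the exact table layout and adult listing order used in this survey are assumptions here and should be checked against the field manual.

```python
import itertools

# One widely reproduced version of the Kish (1949) selection tables.
# Value: (weight in the table rotation, rank of adult to select for
# households with 1..6 eligible adults). Adults are pre-listed in a
# fixed order (e.g. males oldest to youngest, then females).
KISH_TABLES = {
    "A":  (2, [1, 1, 1, 1, 1, 1]),
    "B1": (1, [1, 1, 1, 1, 2, 2]),
    "B2": (1, [1, 1, 1, 2, 2, 2]),
    "C":  (2, [1, 1, 2, 2, 3, 3]),
    "D":  (2, [1, 2, 2, 3, 4, 4]),
    "E1": (1, [1, 2, 3, 3, 3, 5]),
    "E2": (1, [1, 2, 3, 4, 5, 5]),
    "F":  (2, [1, 2, 3, 4, 5, 6]),
}

def table_sequence():
    """Yield table versions in proportion to their weights (A, C, D, F
    twice as often as B1, B2, E1, E2), cycling indefinitely so that
    consecutive sampled households get successive versions."""
    expanded = [v for v, (w, _) in KISH_TABLES.items() for _ in range(w)]
    return itertools.cycle(expanded)

def select_adult(table_version, n_eligible):
    """Return the 1-based rank of the listed adult to interview."""
    _, row = KISH_TABLES[table_version]
    return row[min(n_eligible, 6) - 1]
```

In use, the interviewer lists the eligible adults in the fixed order, looks up the household's assigned table version from `table_sequence()`, and interviews the adult at the returned rank.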
{"title":"A cross-sectional study on hypertension and diabetes care cascade and prevalence of noncommunicable diseases risk factors in Tamil Nadu: rationale and methods","authors":"Archana Ramalingam , Joshua Chadwick , Mosoniro Kriina , Manjula Devi Neelavanan , Anusuya Govindan , Devendhiran R , Vettrichelvan Venkatasamy , Emily Devasagayam , Surya Joseph , Swathy Madhusoodanan , Bhoopathy Kangusamy , Sabarinathan Ramaswamy , Elavarasu Govindasamy , Kalaimani I , Ashok Kumar Paparaju , Chokkalingam D , Dinesh Kumar , Daniel Rajasekar , Augustine Duraisamy , Sudharshini Subramaniam , Manoj Murekhar","doi":"10.1016/j.mex.2025.103753","DOIUrl":"10.1016/j.mex.2025.103753","url":null,"abstract":"<div><div>Tamil Nadu conducted the first round of the WHO STEPS survey in 2020 to assess non-communicable disease (NCD) risk factors. Subsequently, in 2021, the state launched the Makkalai Thedi Maruthuvam (MTM) initiative to deliver home-based care for individuals with NCDs. To evaluate the effectiveness of the program, we designed a population-based survey in 2023–2024, using electoral booths as primary sampling units, to assess control rates of hypertension and diabetes among individuals aged 18–69 years with the same WHO STEPS approach. The secondary objective was to estimate the prevalence of behavioural and biological NCD risk factors. A multistage cluster sampling method was used across Tamil Nadu, and a total of 8880 participants were selected from 148 clusters. One eligible adult per household was selected using the Kish method. Data were collected through structured interviews covering sociodemographic characteristics, together with anthropometric and biological measurements. This survey is the first in Tamil Nadu to use updated electoral data as a sampling frame to estimate the care cascade of hypertension and diabetes. Although limited by its cross-sectional design and the absence of biochemical marker tests such as HbA1c and cholesterol, the survey offers a practical and scalable model for NCD surveillance in India.<ul><li><span>•</span><span><div>In the absence of recent census data for India, this study used updated electoral booth data as primary sampling units to ensure population representativeness.</div></span></li><li><span>•</span><span><div>Comprehensive assessment of the hypertension and diabetes care cascade, including prevalence, awareness, treatment, and control rates among individuals aged 18–69 years.</div></span></li><li><span>•</span><span><div>Designed to evaluate the impact of the Makkalai Thedi Maruthuvam (MTM) scheme and to support evidence-based NCD interventions in Tamil Nadu.</div></span></li></ul></div></div>","PeriodicalId":18446,"journal":{"name":"MethodsX","volume":"16 ","pages":"Article 103753"},"PeriodicalIF":1.9,"publicationDate":"2026-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145798168","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2026-06-01 Epub Date: 2025-12-11 DOI: 10.1016/j.mex.2025.103762
Prabhu Shankar B , RajKumar N , Jayavadivel Ravi , Viji C , Gobinath J , Govindharaj I , Dinesh Kumar K , Elango Muthusamy
In cloud computing, it remains difficult to keep data in a cloud service replicated and consistent across multiple data centers. Traditional replication systems work, but they are slow to process updates, generate heavy data transfers, and struggle to reach eventual consistency. This work presents a new method, the Quantum Entanglement-Based Replication Algorithm (QERA), which uses quantum entanglement to achieve fast, high-performance synchronization of cloud data across all nodes. In the proposed approach, QERA encodes data changes at the primary cloud node onto quantum states and distributes entangled qubit pairs to the corresponding replica nodes. As a result, any change is quickly reflected on all replicas without the usual overhead and delay of message broadcasts. Simulations examine how QERA is designed to decrease latency, promote consistency, and make better use of resources in cloud environments. The paper develops a theoretical framework using the IBM Qiskit and Microsoft Quantum Development Kit simulators to compare QERA against classical and quantum baseline algorithms. The results show that QERA may greatly enhance how updates and replication are managed across many cloud systems.
It demonstrates how QERA can provide tightly synchronized replication among remote cloud nodes.
Employs entangled qubit pairs to minimize latency and reduce bandwidth costs during updates.
Combines quantum teleportation with non-invasive verification methods designed to preserve state integrity without disturbing the quantum system.
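The entangled-pair correlation QERA relies on can be illustrated with a tiny statevector simulation. The sketch below is not the authors' Qiskit/Microsoft QDK implementation; it is a self-contained stand-in that prepares a Bell pair and shows that joint measurements on the two qubits always agree (note that teleportation-style protocols still require a classical channel to carry the update payload).

```python
import math
import random

# Minimal two-qubit statevector simulator. Amplitude index encodes |q1 q0>:
# index 0 -> |00>, 1 -> |01>, 2 -> |10>, 3 -> |11>.
def apply_h(state, qubit):
    """Apply a Hadamard gate to `qubit` of a 2-qubit statevector in place."""
    s = list(state)
    for i in range(4):
        if not (i >> qubit) & 1:       # visit each amplitude pair once
            j = i | (1 << qubit)
            state[i] = (s[i] + s[j]) / math.sqrt(2)
            state[j] = (s[i] - s[j]) / math.sqrt(2)
    return state

def apply_cnot(state, control, target):
    """Swap amplitude pairs where the control bit is 1 (a CNOT gate)."""
    for i in range(4):
        if (i >> control) & 1 and not (i >> target) & 1:
            j = i | (1 << target)
            state[i], state[j] = state[j], state[i]
    return state

def bell_pair():
    """Prepare (|00> + |11>)/sqrt(2): H on qubit 0, then CNOT 0 -> 1."""
    state = [1.0, 0.0, 0.0, 0.0]
    return apply_cnot(apply_h(state, 0), 0, 1)

def measure(state, rng):
    """Sample a joint measurement outcome (0..3) from the Born rule."""
    r, acc = rng.random(), 0.0
    for idx, amp in enumerate(state):
        acc += abs(amp) ** 2
        if r < acc:
            return idx
    return len(state) - 1

# Every shot yields 00 or 11: the two "replica" bits always agree.
rng = random.Random(7)
shots = {measure(bell_pair(), rng) for _ in range(200)}
```

The same circuit (H plus CNOT) is the standard Bell-pair preparation in Qiskit, so this sketch maps directly onto the simulators the paper names.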
{"title":"Efficient data replication in distributed clouds via quantum entanglement algorithms","authors":"Prabhu Shankar B , RajKumar N , Jayavadivel Ravi , Viji C , Gobinath J , Govindharaj I , Dinesh Kumar K , Elango Muthusamy","doi":"10.1016/j.mex.2025.103762","DOIUrl":"10.1016/j.mex.2025.103762","url":null,"abstract":"<div><div>In cloud computing, it remains difficult to keep data in a cloud service replicated and consistent across multiple data centers. Traditional replication systems work, but they are slow to process updates, generate heavy data transfers, and struggle to reach eventual consistency. This work presents a new method, the Quantum Entanglement-Based Replication Algorithm (QERA), which uses quantum entanglement to achieve fast, high-performance synchronization of cloud data across all nodes. In the proposed approach, QERA encodes data changes at the primary cloud node onto quantum states and distributes entangled qubit pairs to the corresponding replica nodes. As a result, any change is quickly reflected on all replicas without the usual overhead and delay of message broadcasts. Simulations examine how QERA is designed to decrease latency, promote consistency, and make better use of resources in cloud environments. The paper develops a theoretical framework using the IBM Qiskit and Microsoft Quantum Development Kit simulators to compare QERA against classical and quantum baseline algorithms. The results show that QERA may greatly enhance how updates and replication are managed across many cloud systems.</div><div>It demonstrates how QERA can provide tightly synchronized replication among remote cloud nodes.</div><div>Employs entangled qubit pairs to minimize latency and reduce bandwidth costs during updates.</div><div>Combines quantum teleportation with non-invasive verification methods designed to preserve state integrity without disturbing the quantum system.</div></div>","PeriodicalId":18446,"journal":{"name":"MethodsX","volume":"16 ","pages":"Article 103762"},"PeriodicalIF":1.9,"publicationDate":"2026-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145798570","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2026-06-01 Epub Date: 2025-12-05 DOI: 10.1016/j.mex.2025.103750
Ricvan Dana Nindrea , Linda Rosalina , Milya Novera , Long Chiau Ming , Nissa Prima Sari , Nabil Aresto Avilla , Fanisha Anugrah Rahmadhani Putri , Nailah Putri Rivani
Preterm and low birth weight (LBW) infants face elevated health risks and require specialized care. Maternal postnatal depression (PND) and the quality of mother–infant bonding are critical determinants of caregiving practices and neonatal outcomes. However, practical, validated methods for assessing these constructs remain limited within the Indonesian clinical and research context. This study presents a protocol for assessing PND and bonding among mothers of preterm and LBW infants in Indonesia. A community-based cross-sectional design was implemented across three districts in West Sumatra. A total of 255 mothers of preterm or LBW infants were selected using multistage random sampling. PND was measured using the validated 10-item Edinburgh Postnatal Depression Scale (EPDS), with a cut-off score of 12/13 indicating significant depression. Mother–infant bonding was assessed with a culturally adapted 10-item bonding questionnaire. Maternal practices were evaluated using an 8-item checklist covering breastfeeding, Kangaroo Mother Care, immunization, and use of maternal–child health records. Instruments underwent expert review, translation and back-translation, and pilot testing to ensure validity and reliability (Cronbach’s α: 0.75–0.90). The primary endpoints included the identification of maternal PND, the quality of bonding, and maternal adherence to essential infant care practices. Data collection followed a standardized interviewer protocol, and data were analyzed using Partial Least Squares Structural Equation Modeling (PLS-SEM). The protocol proved feasible in community settings and provides a replicable method to evaluate maternal PND and bonding, with potential to inform interventions that enhance neonatal care outcomes.
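The EPDS scoring rule described above (10 items, each coded 0–3, with a 12/13 cut-off) can be sketched as follows. The helper names are illustrative, and the sketch assumes item responses have already been coded in the scored direction (several EPDS items are reverse-scored on the printed form).

```python
def epds_score(responses):
    """Sum the 10 EPDS item scores. `responses` is a list of 10 integers,
    each already coded 0-3 in the scored direction."""
    if len(responses) != 10 or any(r not in (0, 1, 2, 3) for r in responses):
        raise ValueError("EPDS requires 10 item scores, each coded 0-3")
    return sum(responses)

def flag_significant_pnd(total, cutoff=12):
    """Apply the 12/13 cut-off used in this protocol: totals above 12
    (i.e. 13 or more) are flagged as significant depression."""
    return total > cutoff
```

With this convention, a total of 12 falls below the threshold while 13 or more is flagged, matching the 12/13 cut-off reported in the protocol.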
{"title":"Assessing maternal postnatal depression, bonding and practices in mothers of preterm and low birth weight infants in Indonesia","authors":"Ricvan Dana Nindrea , Linda Rosalina , Milya Novera , Long Chiau Ming , Nissa Prima Sari , Nabil Aresto Avilla , Fanisha Anugrah Rahmadhani Putri , Nailah Putri Rivani","doi":"10.1016/j.mex.2025.103750","DOIUrl":"10.1016/j.mex.2025.103750","url":null,"abstract":"<div><div>Preterm and low birth weight (LBW) infants face elevated health risks and require specialized care. Maternal postnatal depression (PND) and the quality of mother–infant bonding are critical determinants of caregiving practices and neonatal outcomes. However, practical, validated methods for assessing these constructs remain limited within the Indonesian clinical and research context. This study presents a protocol for assessing PND and bonding among mothers of preterm and LBW infants in Indonesia. A community-based cross-sectional design was implemented across three districts in West Sumatra. A total of 255 mothers of preterm or LBW infants were selected using multistage random sampling. PND was measured using the validated 10-item Edinburgh Postnatal Depression Scale (EPDS), with a cut-off score of 12/13 indicating significant depression. Mother–infant bonding was assessed with a culturally adapted 10-item bonding questionnaire. Maternal practices were evaluated using an 8-item checklist covering breastfeeding, Kangaroo Mother Care, immunization, and use of maternal–child health records. Instruments underwent expert review, translation and back-translation, and pilot testing to ensure validity and reliability (Cronbach’s α: 0.75–0.90). The primary endpoints included the identification of maternal PND, the quality of bonding, and maternal adherence to essential infant care practices. Data collection followed a standardized interviewer protocol, and data were analyzed using Partial Least Squares Structural Equation Modeling (PLS-SEM). 
The protocol proved feasible in community settings and provides a replicable method to evaluate maternal PND and bonding, with potential to inform interventions that enhance neonatal care outcomes.</div></div>","PeriodicalId":18446,"journal":{"name":"MethodsX","volume":"16 ","pages":"Article 103750"},"PeriodicalIF":1.9,"publicationDate":"2026-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145749539","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}