Pub Date: 2026-02-01 | Epub Date: 2026-01-11 | DOI: 10.1016/j.softx.2026.102512
Yoel Arroyo, Ana I. Molina, Carmen Lacave, Miguel Á. Redondo
Current ASD-focused app development faces key limitations, such as high technical barriers for non-experts, limited personalization, and scarce involvement of therapists, families and educators in the design process. This paper presents CoGenASD, a framework that integrates co-design principles with a Model-Driven Development (MDD) approach to support the semi-automatic generation of cross-platform applications for individuals with ASD. The tool enables multidisciplinary teams (therapists, families and educators) to collaboratively define and model participant profiles, activities, interaction modes and content, from which accessible, tailored cross-platform applications are generated semi-automatically. CoGenASD lowers technical barriers, promotes inclusive design practices, and accelerates the development of support tools. Its potential impact includes increasing application effectiveness, fostering stakeholder engagement, and enabling new research on customizable interventions for neurodiverse populations.
Title: "CoGenASD: A tool for the co-design and generation of cross-platform applications for people with Autism spectrum disorder". SoftwareX, Volume 33, Article 102512.
Pub Date: 2026-02-01 | Epub Date: 2026-01-17 | DOI: 10.1016/j.softx.2026.102513
Tuğberk Kocatekin, Aziz Kubilay Ovacıklı, Mert Yağcıoğlu
Real-time fault detection in industrial rotating machinery requires both accurate machine learning models and software frameworks capable of handling continuous sensor streams. This study introduces FD-REST, an open-source, Dockerized platform that enables the deployment, execution, and real-time visualization of multi-sensor fault diagnosis models. The system integrates vibration, ultrasound, and temperature features and employs a Deep Neural Network (DNN) to generate continuous fault similarity scores across eight mechanical conditions. All predictions and raw signals are streamed to the frontend via WebSockets and stored in a lightweight SQLite database for reproducibility, session replay, and report generation. The embedded DNN model was validated on a real-world multi-modal dataset and achieved strong predictive performance, including a Mean Squared Error (MSE) of 0.00253, an R² score of 0.8436, and approximately 93% threshold-based classification accuracy. These results demonstrate both the numerical reliability of the model and the effectiveness of FD-REST as a streaming-oriented benchmarking environment. By providing a modular, reproducible, and on-premises-ready framework, FD-REST bridges the gap between offline algorithm development and real-time industrial deployment, offering a practical tool for researchers, engineers, and practitioners in predictive maintenance.
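The reported figures combine a regression view (MSE, R²) and a threshold-based classification view of the same continuous scores. As a minimal sketch of how such metrics relate (the function name and the 0.5 threshold are our assumptions, not part of FD-REST's API):

```python
import numpy as np

def score_metrics(y_true, y_pred, threshold=0.5):
    """MSE, R^2 and threshold-based accuracy for continuous fault scores.
    Illustrative only; FD-REST's actual evaluation code may differ."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mse = float(np.mean((y_true - y_pred) ** 2))
    ss_res = float(np.sum((y_true - y_pred) ** 2))
    ss_tot = float(np.sum((y_true - y_true.mean()) ** 2))
    r2 = 1.0 - ss_res / ss_tot
    # Binarize both signals at the threshold and count label agreement.
    acc = float(np.mean((y_true >= threshold) == (y_pred >= threshold)))
    return mse, r2, acc
```

A low MSE with a lower threshold accuracy (as in the paper's 0.00253 vs. 93%) is expected: small numerical errors near the decision threshold can still flip binary labels.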
Title: "FD-REST: A lightweight RESTful platform for real-time fault detection and diagnosis in industrial systems". SoftwareX, Volume 33, Article 102513.
Pub Date: 2026-02-01 | Epub Date: 2026-02-03 | DOI: 10.1016/j.softx.2026.102548
Kurnia Cahya Febryanto , Izzat Aji Androfaza , Lalu Aldo Wadagraprana , Riyanarto Sarno , Kelly Rossa Sungkono , Yeni Anistyasari , Joko Siswantoro , A Min Tjoa
Heterogeneous Business Process Model and Notation (BPMN) platforms present critical integration challenges, as approximately 75% of large enterprises employ multiple modeling tools lacking unified transformation capabilities. Existing solutions address only single-format conversions or provide limited cross-platform compatibility without comprehensive validation. This paper presents a production-ready multi-format BPMN parser library uniquely integrating intelligent format detection, dual-tier validation, and optimized graph transformation within a unified architecture. The library utilizes specialized parsers for BPMN 2.0 XML, XML Process Definition Language (XPDL) 2.2, native formats, and Microsoft Visio diagrams through a plugin-based architecture. Multi-criteria detection algorithms automatically identify source formats with 99.2% accuracy by analyzing file signatures, XML namespaces, structural patterns, and content heuristics. The dual-tier validation framework ensures structural BPMN 2.0 compliance through rule-based constraints derived from official OMG specifications and semantic consistency through metadata quality assessment based on established process modeling guidelines, surpassing existing tools that perform only syntactic validation. The transformation pipeline generates standardized Cypher queries optimized for process mining workflows. Evaluation across 127 real-world business process models demonstrates 98.7% overall parsing accuracy, with format-specific performance ranging from 97.2% (Visio) to 99.8% (BPMN XML), achieving 85% reduction in transformation time compared to manual approaches. Released as open-source software via the Python Package Index with complete documentation, the library establishes foundational infrastructure for cross-platform business process intelligence, enabling unified graph-based analytics across heterogeneous modeling ecosystems without format-specific preprocessing.
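One of the detection signals the abstract names, XML namespaces, can be illustrated in a few lines. This is a hedged sketch, not the library's code: the BPMN 2.0 namespace URI is from the OMG specification, while the XPDL entry and the function name are our assumptions.

```python
import xml.etree.ElementTree as ET

# Namespace-to-format map. The BPMN 2.0 URI comes from the OMG spec;
# the XPDL entry is illustrative and may differ from the library's table.
NS_SIGNATURES = {
    "http://www.omg.org/spec/BPMN/20100524/MODEL": "bpmn20",
    "http://www.wfmc.org/2009/XPDL2.2": "xpdl22",
}

def detect_format(xml_text):
    """Guess the source format from the root element's XML namespace.
    (The paper's detector also weighs file signatures, structural
    patterns, and content heuristics, which are not shown here.)"""
    root = ET.fromstring(xml_text)
    if root.tag.startswith("{"):  # ElementTree tags look like '{uri}name'
        uri = root.tag[1:].split("}", 1)[0]
        return NS_SIGNATURES.get(uri, "unknown")
    return "unknown"
```

Combining several such weak signals (namespace, signature, structure) is what lets a multi-criteria detector reach high accuracy where any single check would misfire on edge cases.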
Title: "BPMN graph transformation: A unified multi-format parser library for standardized graph-based business process model integration". SoftwareX, Volume 33, Article 102548.
Pub Date: 2026-02-01 | Epub Date: 2026-01-02 | DOI: 10.1016/j.softx.2025.102489
Wojciech Dudek , Daniel Giełdowski , Kamil Młodzikowski , Dominik Belter , Tomasz Winiarski
This article introduces a software framework for benchmarking robot task scheduling algorithms in dynamic and uncertain service environments. The system provides standardised interfaces, configurable scenarios with movable objects, human agents, tools for automated test generation, and performance evaluation. It supports both classical and AI-based methods, enabling repeatable, comparable assessments across diverse tasks and configurations. The framework facilitates diagnosis of algorithm behaviour, identification of implementation flaws, and selection or tuning of strategies for specific applications. It includes a SysML-based domain-specific language for structured scenario modelling and integrates with the ROS-based system for runtime execution. Validated on patrol, fall assistance, and pick-and-place tasks, the open-source framework is suited for researchers and integrators developing and testing scheduling algorithms under real-world-inspired conditions.
Title: "TaBSA – A framework for training and benchmarking algorithms for scheduling tasks for mobile robots working in dynamic environments". SoftwareX, Volume 33, Article 102489.
Pub Date: 2026-02-01 | DOI: 10.1016/j.softx.2026.102508
Rebecca Goldstein, Connor T. Jerzak, Aniket Kamat, Fucheng Warren Zhu
We present fastrerandomize, an R package for fast, scalable rerandomization in experimental design. Rerandomization improves precision by discarding treatment assignments that fail a prespecified covariate-balance criterion, but existing implementations can become computationally prohibitive as the number of units or covariates grows. fastrerandomize introduces three complementary advances: (i) optional GPU/TPU acceleration to parallelize balance checks, (ii) memory-efficient key-only storage that avoids retaining full assignment matrices, and (iii) auto-vectorized, just-in-time compiled kernels for batched candidate generation and inference. This approach enables exact or Monte Carlo rerandomization at previously intractable scales, making it practical to adopt the tighter balance thresholds required in modern high-dimensional experiments while simultaneously quantifying the resulting gains in precision and power for a given covariate set. Our approach also supports randomization-based testing conditioned on acceptance. In controlled benchmarks, we observe order-of-magnitude speedups over baseline workflows, with larger gains as the sample size or dimensionality grows, translating into improved precision of causal estimates. Code: github.com/cjerzak/fastrerandomize-software. Interactive capsule: fastrerandomize.github.io/space.
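At its core, rerandomization is acceptance sampling over candidate assignments. A minimal CPU-only sketch in Python (the package itself is R; the function name, arguments, and the simplified Mahalanobis balance statistic below are ours, not the package's API):

```python
import numpy as np

def accept_reject_rerandomize(X, n_treat, threshold, seed=0, max_draws=100_000):
    """Draw complete randomizations until a (simplified) Mahalanobis
    covariate-balance statistic falls below `threshold`.
    Sketch only: fastrerandomize adds GPU/TPU batching, key-only
    assignment storage, and JIT-compiled kernels on top of this idea."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    for _ in range(max_draws):
        treat = np.zeros(n, dtype=bool)
        treat[rng.choice(n, size=n_treat, replace=False)] = True
        diff = X[treat].mean(axis=0) - X[~treat].mean(axis=0)
        balance = float(diff @ cov_inv @ diff)
        if balance < threshold:
            return treat, balance  # accepted assignment
    raise RuntimeError("no assignment met the balance threshold")
```

The cost driver is visible here: tightening `threshold` shrinks the acceptance region, so the expected number of draws grows, which is exactly where batched, accelerator-backed balance checks pay off.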
Title: "FastRerandomize: Fast rerandomization using accelerated computing". SoftwareX, Volume 33, Article 102508.
Pub Date: 2026-02-01 | Epub Date: 2025-12-05 | DOI: 10.1016/j.softx.2025.102478
Anastasia S. Saridou, Ioannis Kansizoglou, Athanasios P. Vavatsikos
“Spatial-CustSat” is a GIS-based package that includes three models aiming to extend customer satisfaction (CS) analysis to the spatial context using MUlticriteria Satisfaction Analysis (MUSA) methods. The first two models use spatial datasets to apply the k-means algorithm and create homogeneous customer zones (clusters). The distinction between the two lies in the method of declaring the number of clusters. Supported by the MUSA method, CS analysis allows the identification of areas where the company's strengths and weaknesses lie. The third model supports the implementation of CS benchmarking analysis for companies with store networks. Based on Walter's theory that customers shop at the nearest store, it identifies the service area of each store and implements the MUSAplus method. This option enables comparative performance analysis of the stores under evaluation.
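The zoning step described above can be sketched with plain k-means on customer coordinates. This is a minimal stand-in under our own naming (the package works on GIS spatial datasets and lets the user choose how the cluster count is declared; neither is modelled here):

```python
import numpy as np

def kmeans_zones(points, k, iters=100):
    """Cluster (x, y) customer locations into k homogeneous zones.
    Illustrative sketch: uses deterministic farthest-point seeding
    and assumes no cluster ever empties out during iteration."""
    pts = np.asarray(points, dtype=float)
    centroids = [pts[0]]
    for _ in range(1, k):  # farthest-point initialization
        d = np.min([np.linalg.norm(pts - c, axis=1) for c in centroids], axis=0)
        centroids.append(pts[int(d.argmax())])
    centroids = np.array(centroids)
    for _ in range(iters):
        d = np.linalg.norm(pts[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new = np.array([pts[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids
```

Each resulting zone can then be treated as the unit of satisfaction analysis, which is what lets MUSA results be mapped back onto geography.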
Title: "Spatial-CustSat: An opensource package for customer satisfaction analysis in GIS environment". SoftwareX, Volume 33, Article 102478.
Pub Date: 2026-02-01 | Epub Date: 2026-01-16 | DOI: 10.1016/j.softx.2026.102514
Federico Gatti
We present CUBE, an open-source Python framework for generating adaptive, non-singular meshes on the sphere using a cubed-sphere projection. The software maps spherical slices to Cartesian faces of an inscribed cube, avoiding the pole singularities inherent to latitude–longitude grids and producing quasi-uniform element sizes across the globe. A core feature of CUBE is error-driven spatial adaptation: the mesh is refined according to an estimator based on an approximation of the H¹-seminorm of the topography discretization error, which concentrates resolution where terrain gradients are large. The implementation leverages numpy and scipy for efficient array operations, integrates gmsh via its Python API for meshing, and supports standard geospatial input (e.g., GTOPO30 digital elevation models). CUBE is intended as an extensible tool to produce high-quality input meshes for atmospheric and geophysical models, improving accuracy while reducing computational costs through targeted refinement.
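The projection idea can be illustrated in a few lines: a sphere point is assigned to the cube face hit by the ray from the centre, then projected gnomonically onto that face. A minimal sketch under our own naming and face-numbering conventions (CUBE's actual conventions may differ):

```python
import numpy as np

def sphere_to_cube(lon, lat):
    """Map (lon, lat) in radians to (face, xi, eta): the face of the
    inscribed cube hit by the ray from the sphere's centre, plus
    gnomonic face coordinates in [-1, 1]. Illustrative sketch only."""
    v = np.array([np.cos(lat) * np.cos(lon),
                  np.cos(lat) * np.sin(lon),
                  np.sin(lat)])
    axis = int(np.argmax(np.abs(v)))             # dominant Cartesian axis
    face = 2 * axis + (0 if v[axis] > 0 else 1)  # one of the 6 cube faces
    others = [i for i in range(3) if i != axis]
    # Gnomonic projection: divide by the dominant coordinate.
    xi = float(v[others[0]] / abs(v[axis]))
    eta = float(v[others[1]] / abs(v[axis]))
    return face, xi, eta
```

Every point lands on exactly one face with |xi|, |eta| ≤ 1, and the poles map to face interiors like any other point, which is why per-face grids stay quasi-uniform and free of polar clustering.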
Title: "CUBE: Cubed-sphere projection for adaptive mesh generation in spherical coordinates". SoftwareX, Volume 33, Article 102514.
Pub Date: 2026-02-01 | Epub Date: 2025-12-02 | DOI: 10.1016/j.softx.2025.102460
Jing Liu , Saber Elsayed , Daryl Essam , Kyle Harrison , Ruhul Sarker
This paper presents an open-source software tool, PPSSolver, which is developed for academic research and practical decision-making in the Project Portfolio Selection and Scheduling Problem (PPSSP). PPSSolver provides an integrated platform for generating PPSSP instances with different complexities, optimizing PPSSPs using various algorithms, handling dynamic changes, and analyzing results. The software is designed to support researchers in algorithm development by allowing them to integrate customized solvers and evaluate them across benchmark instances. Additionally, PPSSolver provides a graphical user interface for ease of use, thereby lowering the technical barrier for practitioners.
Title: "PPSSolver: An open-source software tool for Project Portfolio Selection and Scheduling Problems". SoftwareX, Volume 33, Article 102460.
Pub Date: 2026-02-01 | Epub Date: 2026-01-29 | DOI: 10.1016/j.softx.2026.102526
O. Agost , F. Aran , J. Rius , P. Fraile , I. Barri , J. Vilaplana , J. Mateo
Simet provides a modular framework designed for the rigorous evaluation of synthetic image datasets. The framework integrates data provisioning, preprocessing, feature extraction, and complementary metrics, including Fréchet Inception Distance (FID), generative Precision/Recall, and classifier two-sample area under the receiver operating characteristic curve (ROC-AUC), within a single GPU-accelerated pipeline. A restraint mechanism enables declarative pass or fail gating. YAML- and command-line (CLI)-driven orchestration, shared feature caches, and structured logs facilitate reproducible, continuous-integration (CI)-ready workflows. Extensible abstractions, including providers, transforms, feature extractors, and metrics, allow practitioners to add new data sources or tests with minimal code. Templates support downstream utility evaluations, such as training on synthetic data and testing on real data (TSTR). Simet is positioned relative to existing toolkits, and protocols are outlined to demonstrate scalable, multidimensional evaluation of synthetic image data.
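Of the metrics listed, FID is the most commonly reimplemented: it models real and synthetic feature sets as Gaussians and measures the Fréchet distance between them. A minimal sketch of the metric itself (function name ours; Simet wraps such computations behind providers, extractors, and feature caches rather than exposing a bare function like this):

```python
import numpy as np
from scipy.linalg import sqrtm

def frechet_distance(feats_a, feats_b):
    """FID between two feature matrices (rows = samples), modelling each
    set as a Gaussian: ||mu_a - mu_b||^2 + Tr(Sa + Sb - 2(Sa Sb)^{1/2})."""
    mu_a, mu_b = feats_a.mean(axis=0), feats_b.mean(axis=0)
    sa = np.cov(feats_a, rowvar=False)
    sb = np.cov(feats_b, rowvar=False)
    covmean = sqrtm(sa @ sb)
    if np.iscomplexobj(covmean):      # sqrtm can pick up tiny imaginary
        covmean = covmean.real        # parts from numerical noise
    return float(np.sum((mu_a - mu_b) ** 2)
                 + np.trace(sa + sb - 2.0 * covmean))
```

Because FID summarizes only the first two moments, complementing it with Precision/Recall and classifier two-sample AUC, as Simet does, guards against distributions that match in mean and covariance but differ elsewhere.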
Title: "Simet: Synthetic image metrics - a synthetic image evaluation framework". SoftwareX, Volume 33, Article 102526.
Pub Date: 2026-02-01 | Epub Date: 2026-01-06 | DOI: 10.1016/j.softx.2025.102480
I. Lukosiunas , D. Gailevicius , K. Staliunas
We present a comprehensive solver implementation of a 2D Rigorous Coupled Wave Analysis (RCWA), tailored specifically for conformal thin multilayer devices and 2D photonic crystals with arbitrary interface profiles. Unlike traditional diffraction efficiency analysis, our approach emphasizes beam-shaping applications. Thus, our solver uniquely incorporates parameter sweeps across both wavelength and angular domains. This enables effective optimization of devices, such as low-pass spatial filters. Our software streamlines the design and analysis of complex photonic structures, broadening the practical application of RCWA methods and enabling the rapid development and optimization of novel photonic components.
Title: "Reducing complexity in photonic simulations: ZenScat — an efficient 2D RCWA solver". SoftwareX, Volume 33, Article 102480.