Excursiona: A collaborative mobile application for excursions in nature
Pub Date: 2024-09-26 | DOI: 10.1016/j.softx.2024.101908
Manuel Ortega Cordovilla, Sergio Garrido Merino, Crescencio Bravo Santos, Ana Isabel Molina Díaz, Manuel Ortega Cantero
This paper presents Excursiona, an application that adds substantial value to group excursions. Excursiona promotes collaboration and awareness during an excursion as the group members navigate a shared map. Users can also share pictures of interesting points they discover and interact in a chat room. The application has great potential in fields that benefit from outdoor collaboration, with application cases such as children with special needs or firefighters. Technologically, Excursiona has been developed in Flutter, making it compatible with both iOS and Android, which, together with other technological tools, has broadened the possibilities of the project. Finally, an evaluation with users allowed the system to be tested and its collaborative and awareness features to be assessed.
{"title":"Excursiona: A collaborative mobile application for excursions in nature","authors":"Manuel Ortega Cordovilla, Sergio Garrido Merino, Crescencio Bravo Santos, Ana Isabel Molina Díaz, Manuel Ortega Cantero","doi":"10.1016/j.softx.2024.101908","DOIUrl":"10.1016/j.softx.2024.101908","url":null,"abstract":"<div><div>This paper presents Excursiona, an application that provides substantial value to group excursions. Excursiona promotes collaboration and awareness during the excursion, as the group members navigate the map. Moreover, users can share pictures of interesting points they discover and interact in the chat room. The application has great potential in fields that benefit from outdoor collaboration, with application cases on children with special needs or firefighters. Regarding the technological approach, Excursiona has been developed in Flutter, making it compatible with iOS and Android operating systems, which along other technological tools has enhanced the possibilities of the project. Finally, an evaluation with users has allowed the testing of the system and the evaluation of the collaborative and awareness features.</div></div>","PeriodicalId":21905,"journal":{"name":"SoftwareX","volume":"28 ","pages":"Article 101908"},"PeriodicalIF":2.4,"publicationDate":"2024-09-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142323938","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
GCBICT: Green Coffee Bean Identification Command-line Tool
Pub Date: 2024-09-26 | DOI: 10.1016/j.softx.2024.101843
Shu-Min Tan, Shih-Hsun Hung, Je-Chiang Tsai
Coffee is one of the most important agricultural commodities in commodity markets. The quality of coffee beverages strongly depends on that of green coffee beans. However, the conventional selection technique relies mainly on visual inspection by personnel, which is subjective and time-consuming. Based on our recently discovered site-specific color characteristics of the seed coat of green coffee beans and on support vector machines (a machine learning classifier), GCBICT, a Python-based bean identification and evaluation scheme, provides an affordable, effective, and user-friendly way to identify qualified beans and their growing sites.
The command-line tool consists of two functions: (1) the Qualified-Defective Separator and (2) the Mixed Separator. The Qualified-Defective Separator distinguishes between qualified and defective green coffee beans. Because of the site-specific property of our color characteristics, the training set can be small. The Mixed Separator can identify qualified beans from different growing sites when coffee distributors mix them to reduce costs. Moreover, this function is unique to our evaluation scheme.
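As a rough illustration of the classification step described above, the sketch below trains a support vector machine on synthetic per-bean color features. The feature layout, values, and scikit-learn usage are assumptions for illustration only and do not reflect GCBICT's actual interface or training data.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Illustration only (not GCBICT's interface): SVM on hypothetical per-bean
# colour features (mean R, G, B of the seed coat region), synthetic data.
rng = np.random.default_rng(0)
n = 200
qualified = rng.normal(loc=[120, 150, 90], scale=8, size=(n, 3))
defective = rng.normal(loc=[100, 120, 70], scale=12, size=(n, 3))

X = np.vstack([qualified, defective])
y = np.r_[np.ones(n), np.zeros(n)]            # 1 = qualified, 0 = defective

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```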
{"title":"GCBICT: Green Coffee Bean Identification Command-line Tool","authors":"Shu-Min Tan , Shih-Hsun Hung , Je-Chiang Tsai","doi":"10.1016/j.softx.2024.101843","DOIUrl":"10.1016/j.softx.2024.101843","url":null,"abstract":"<div><div>Coffee is one of the most important agricultural commodities in commodity markets. The quality of coffee beverages strongly depends on that of green coffee beans. However, the conventional selection technique mainly relies on personnel visual inspection, which is subjective and time-consuming. Based on our recently discovered site-specific color characteristics of the seat coat of green coffee beans and support vector machines (a machine learning classifier), the Python-based identification/evaluation scheme of beans, GCBICT, provides an affordable, effective, and user-friendly way to identify qualified beans and their growing sites.</div><div>The command-line tool consists of two functions: (1) the Qualified-Defective Separator and (2) the Mixed Separator. The Qualified-Defective Separator function is to distinguish between qualified and defective green coffee beans. Due to the site-specific property of our color characteristics of beans, the training set can be small. The Mixed Separator can identify qualified beans from different growing sites if coffee distributors mix them for cost in their business. Moreover, this function is unique to our evaluation scheme.</div></div>","PeriodicalId":21905,"journal":{"name":"SoftwareX","volume":"28 ","pages":"Article 101843"},"PeriodicalIF":2.4,"publicationDate":"2024-09-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142323937","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
VarProDMD: Solving Variable Projection for the Dynamic Mode Decomposition with SciPy’s optimization suite
Pub Date: 2024-09-24 | DOI: 10.1016/j.softx.2024.101896
Gerhard Reinerth, David Messmann, Jean Elsner, Ulrich Walter
The Dynamic Mode Decomposition is a widely used analysis tool in scientific fields ranging from plasma physics to robotics; it decomposes high-dimensional signals into interpretable quantities. It reduces the dimensionality of a dynamic system while preserving its complex behavior, and the identified quantities can then be used to perform simulations (interpolation and extrapolation) efficiently. The traditional Dynamic Mode Decomposition requires data to be sampled at a constant rate; measurements, however, can experience delays or jitter. Due to the structure of the classic Dynamic Mode Decomposition, interpolation and extrapolation at arbitrary continuous time steps become intractable. The Variable Projection method, a nonlinear optimization scheme that separates linear from nonlinear parameters, relaxes the fixed-sampling-rate requirement, so measurements can arrive at any time step. The existing Python library implements a variant of the Levenberg–Marquardt optimizer for the Variable Projection method. The optimization procedure uses a complex residual function, since the measurements can contain complex numbers, whereas Python’s available optimization suites require real, analytic functions. We reformulate the problem to use the available optimizers for Variable Projection within the Dynamic Mode Decomposition framework, achieving faster run times than the existing Python implementation in most cases. A preselection scheme on the measurements can further enhance computational efficiency while maintaining the signal reconstruction capability.
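The reformulation described above, handing a complex residual to a real-valued optimizer, can be sketched as follows. This is a minimal, hypothetical example (a single complex exponential fitted with scipy.optimize.least_squares), not VarProDMD's actual code; the real and imaginary parts of the residual are simply stacked into one real vector before optimization.

```python
import numpy as np
from scipy.optimize import least_squares

# Sketch: SciPy's optimizers expect real-valued residuals, so a complex
# residual r(alpha) is stacked as [Re(r); Im(r)] before optimization.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 1.0, 50))            # irregularly sampled time stamps
y = np.exp((-0.5 + 2j * np.pi) * t) + 0.01 * (rng.standard_normal(50)
                                              + 1j * rng.standard_normal(50))

def complex_residual(alpha):
    """Residual of one complex exponential exp((a + ib) t) against the data."""
    a, b = alpha
    return np.exp((a + 1j * b) * t) - y

def real_residual(alpha):
    """Stack real and imaginary parts so SciPy sees a real residual vector."""
    r = complex_residual(alpha)
    return np.concatenate([r.real, r.imag])

sol = least_squares(real_residual, x0=[-0.3, 6.0], method="lm")
print("estimated continuous-time eigenvalue:", sol.x[0] + 1j * sol.x[1])
```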
{"title":"VarProDMD: Solving Variable Projection for the Dynamic Mode Decomposition with SciPy’s optimization suite","authors":"Gerhard Reinerth , David Messmann , Jean Elsner , Ulrich Walter","doi":"10.1016/j.softx.2024.101896","DOIUrl":"10.1016/j.softx.2024.101896","url":null,"abstract":"<div><div>The Dynamic Mode Decomposition is a widely used tool for analysis in various scientific fields ranging from plasma physics to robotics, which decomposes high-dimensional signals into interpretable quantities. It reduces the dimensionality of a dynamic system while preserving the complex behavior. The identified quantities then can be used to perform a simulation (inter- and extrapolation) efficiently. The traditional Dynamic Mode Decomposition requires data to be sampled at a constant rate. Measurements however can experience delays or jitter. Due to the structure of the classic Dynamic Mode Decomposition, inter- and extrapolation at specific continuous timesteps become intractable. The Variable Projection method, a nonlinear optimization scheme that splits linear from nonlinear parameters for optimization, relaxes the fixed sampling rate requirement. Thus, the measurements can arrive at any time step. The available Python library implements a variant of the Levenberg–Marquardt optimizer for the Variable Projection Method. The optimization procedure uses a complex residual function since the measurements can incorporate complex numbers. Python’s available optimization suites require real, analytic functions. We reformulate the problem to utilize the available optimizers to perform Variable Projection within the Dynamic Mode Decomposition framework, allowing for faster run times w.r.t. the Python implementation in most cases. A preselection scheme on the measurements can enhance overall computational efficiency while maintaining the signal reconstruction capability.</div></div>","PeriodicalId":21905,"journal":{"name":"SoftwareX","volume":"28 ","pages":"Article 101896"},"PeriodicalIF":2.4,"publicationDate":"2024-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2352711024002668/pdfft?md5=9e60f0ecfde1d2b56ee1a2205d3db26d&pid=1-s2.0-S2352711024002668-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142316034","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
MPAT: Modular Petri Net Assembly Toolkit
Pub Date: 2024-09-24 | DOI: 10.1016/j.softx.2024.101913
Stefano Chiaradonna, Petar Jevtić, Beckett Sterner
We present the Modular Petri Net Assembly Toolkit (MPAT), a Python package that lets users easily create large-scale, modular Petri Nets for various spatial configurations, including extensive spatial grids or geometries derived from shapefiles, augmented with heterogeneous information layers. Petri Nets are powerful discrete event system modeling tools in computational biology and engineering. However, their utility for the automated construction of large-scale spatial models has been limited by gaps in existing modeling software packages. MPAT addresses this gap by supporting the development of modular Petri Net models with flexible spatial geometries.
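To make the idea of a modular, spatially assembled Petri net concrete, here is a minimal sketch in which one small place/transition module is stamped onto every cell of a grid and neighbouring cells are coupled by movement transitions. All class, place, and transition names are hypothetical; this is not MPAT's API.

```python
from dataclasses import dataclass, field
from itertools import product

# Illustration only (not MPAT's API): a tiny "modular" Petri net assembled
# over a spatial grid, one SIR-style module per cell.
@dataclass
class PetriNet:
    places: set = field(default_factory=set)
    transitions: dict = field(default_factory=dict)   # name -> (input places, output places)

    def add_module(self, cell):
        """Stamp an S -> I -> R module onto one grid cell."""
        s, i, r = f"S_{cell}", f"I_{cell}", f"R_{cell}"
        self.places |= {s, i, r}
        self.transitions[f"infect_{cell}"] = ({s, i}, {i})
        self.transitions[f"recover_{cell}"] = ({i}, {r})

    def couple(self, a, b):
        """Movement transition carrying infectious tokens from cell a to cell b."""
        self.transitions[f"move_{a}_{b}"] = ({f"I_{a}"}, {f"I_{b}"})

net = PetriNet()
cells = list(product(range(3), range(3)))              # a 3x3 spatial grid
for c in cells:
    net.add_module(c)
for (x, y) in cells:                                   # couple horizontal neighbours
    if (x + 1, y) in cells:
        net.couple((x, y), (x + 1, y))
        net.couple((x + 1, y), (x, y))

print(len(net.places), "places,", len(net.transitions), "transitions")
```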
{"title":"MPAT: Modular Petri Net Assembly Toolkit","authors":"Stefano Chiaradonna , Petar Jevtić , Beckett Sterner","doi":"10.1016/j.softx.2024.101913","DOIUrl":"10.1016/j.softx.2024.101913","url":null,"abstract":"<div><div>We present a Python package called Modular Petri Net Assembly Toolkit (<span>MPAT</span>) that empowers users to easily create large-scale, modular Petri Nets for various spatial configurations, including extensive spatial grids or those derived from shapefiles, augmented with heterogeneous information layers. Petri Nets are powerful discrete event system modeling tools in computational biology and engineering. However, their utility for automated construction of large-scale spatial models has been limited by gaps in existing modeling software packages. <span>MPAT</span> addresses this gap by supporting the development of modular Petri Net models with flexible spatial geometries.</div></div>","PeriodicalId":21905,"journal":{"name":"SoftwareX","volume":"28 ","pages":"Article 101913"},"PeriodicalIF":2.4,"publicationDate":"2024-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2352711024002838/pdfft?md5=567d16c9448beb3fc239109c48a8487d&pid=1-s2.0-S2352711024002838-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142316033","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
LIC: An R package for optimal subset selection for distributed data
Pub Date: 2024-09-24 | DOI: 10.1016/j.softx.2024.101909
Di Chang, Guangbao Guo
The goal of the Length and Information Optimization Criterion (LIC) is to handle datasets containing redundant information, identify and select the most informative subsets, and ensure that a large portion of the information from the dataset is retained. The proposed R package, called LIC, is specifically designed for optimal subset selection in distributed redundant data. It achieves this by minimizing the length of the final interval estimator while maximizing the amount of information retained from the selected data subset. This functionality is highly useful across various fields such as economics, industry, and medicine. For example, in studies involving the prediction of nitrogen oxide emissions from gas turbines, self-noise of airfoils under stochastic wind conditions, and real estate valuation predictions, LIC can be used to explore the performance of random distributed block methods in parallel computing environments.
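As a rough, language-agnostic illustration of the length-versus-information trade-off described above, the Python sketch below scores candidate data blocks by the width of a confidence interval and by the information in the design matrix, then keeps the best block. The scoring rule and all names are assumptions for illustration; LIC is an R package, and its actual criterion and interface differ.

```python
import numpy as np

# Illustration only: score each distributed data block by a shorter slope
# confidence interval (better) and a larger log-det of X'X (more information).
rng = np.random.default_rng(1)

def block_score(X, y):
    beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
    n, p = X.shape
    sigma2 = res[0] / (n - p) if res.size else 1.0
    cov = sigma2 * np.linalg.inv(X.T @ X)
    ci_length = 2 * 1.96 * np.sqrt(cov[1, 1])      # 95% CI width of the slope
    info = np.linalg.slogdet(X.T @ X)[1]           # log-determinant "information"
    return info - ci_length                        # illustrative trade-off to maximise

blocks = []
for _ in range(5):                                 # five distributed data blocks
    x = rng.normal(size=200)
    X = np.column_stack([np.ones_like(x), x])
    y = 1.0 + 2.0 * x + rng.normal(scale=rng.uniform(0.5, 3.0), size=200)
    blocks.append((X, y))

best = max(range(len(blocks)), key=lambda i: block_score(*blocks[i]))
print("selected block:", best)
```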
{"title":"LIC: An R package for optimal subset selection for distributed data","authors":"Di Chang, Guangbao Guo","doi":"10.1016/j.softx.2024.101909","DOIUrl":"10.1016/j.softx.2024.101909","url":null,"abstract":"<div><div>The goal of the Length and Information Optimization Criterion (LIC) is to handle datasets containing redundant information, identify and select the most informative subsets, and ensure that a large portion of the information from the dataset is retained. The proposed R package, called LIC, is specifically designed for optimal subset selection in distributed redundant data. It achieves this by minimizing the length of the final interval estimator while maximizing the amount of information retained from the selected data subset. This functionality is highly useful across various fields such as economics, industry, and medicine. For example, in studies involving the prediction of nitrogen oxide emissions from gas turbines, self-noise of airfoils under stochastic wind conditions, and real estate valuation predictions, LIC can be used to explore the performance of random distributed block methods in parallel computing environments.</div></div>","PeriodicalId":21905,"journal":{"name":"SoftwareX","volume":"28 ","pages":"Article 101909"},"PeriodicalIF":2.4,"publicationDate":"2024-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2352711024002796/pdfft?md5=1f04a7aaa4c1a7120a9a3fe12e7bd8a6&pid=1-s2.0-S2352711024002796-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142316119","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
ReModels: Quantile Regression Averaging models
Pub Date: 2024-09-24 | DOI: 10.1016/j.softx.2024.101905
Grzegorz Zakrzewski, Kacper Skonieczka, Mikołaj Małkiński, Jacek Mańdziuk
Electricity price forecasts are essential for making informed business decisions within the electricity markets. Probabilistic forecasts, which provide a range of possible future prices rather than a single estimate, are particularly valuable for capturing market uncertainties. The Quantile Regression Averaging (QRA) method is a leading approach to generating these probabilistic forecasts. In this paper, we introduce ReModels, a comprehensive Python package that implements QRA and its various modifications from recent literature. This package not only offers tools for QRA but also includes features for data acquisition, preparation, and variance stabilizing transformations (VSTs). To the best of our knowledge, there is no publicly available implementation of QRA and its variants. Our package aims to fill this gap, providing researchers and practitioners with the tools to generate accurate and reliable probabilistic forecasts in the field of electricity price forecasting.
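The core QRA idea, regressing observed prices on a pool of point forecasts once per quantile, can be sketched as below with statsmodels. The data are synthetic and the code is an illustration of the technique, not the ReModels API.

```python
import numpy as np
import statsmodels.api as sm

# Illustration only (not the ReModels API): Quantile Regression Averaging fits
# one quantile regression per quantile, using several point forecasts as
# regressors, to turn point forecasts into a probabilistic forecast.
rng = np.random.default_rng(2)
n = 500
price = 50 + 10 * rng.standard_normal(n)                  # observed prices
forecasts = np.column_stack([price + rng.normal(0, s, n)  # three imperfect
                             for s in (2.0, 4.0, 6.0)])   # point forecasters

X = sm.add_constant(forecasts)
quantiles = [0.05, 0.5, 0.95]
qra_models = {q: sm.QuantReg(price, X).fit(q=q) for q in quantiles}

# Probabilistic forecast for a new set of point forecasts:
x_new = sm.add_constant(np.array([[52.0, 49.0, 55.0]]), has_constant="add")
for q, model in qra_models.items():
    print(f"q={q:.2f}: {model.predict(x_new)[0]:.2f}")
```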
{"title":"ReModels: Quantile Regression Averaging models","authors":"Grzegorz Zakrzewski , Kacper Skonieczka , Mikołaj Małkiński , Jacek Mańdziuk","doi":"10.1016/j.softx.2024.101905","DOIUrl":"10.1016/j.softx.2024.101905","url":null,"abstract":"<div><div>Electricity price forecasts are essential for making informed business decisions within the electricity markets. Probabilistic forecasts, which provide a range of possible future prices rather than a single estimate, are particularly valuable for capturing market uncertainties. The Quantile Regression Averaging (QRA) method is a leading approach to generating these probabilistic forecasts. In this paper, we introduce ReModels, a comprehensive Python package that implements QRA and its various modifications from recent literature. This package not only offers tools for QRA but also includes features for data acquisition, preparation, and variance stabilizing transformations (VSTs). To the best of our knowledge, there is no publicly available implementation of QRA and its variants. Our package aims to fill this gap, providing researchers and practitioners with the tools to generate accurate and reliable probabilistic forecasts in the field of electricity price forecasting.</div></div>","PeriodicalId":21905,"journal":{"name":"SoftwareX","volume":"28 ","pages":"Article 101905"},"PeriodicalIF":2.4,"publicationDate":"2024-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2352711024002759/pdfft?md5=37e3ccc5cef7a2ba21273114a979dbf4&pid=1-s2.0-S2352711024002759-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142316035","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
EcgScorer: An open source MATLAB toolbox for ECG signal quality assessment
Pub Date: 2024-09-23 | DOI: 10.1016/j.softx.2024.101900
Noura Alexendre, Fotsing Kuetche, Ntsama Eloundou Pascal, Simo Thierry
Cardiovascular diseases (CVDs) claim over 17 million lives annually. Prevention involves adopting healthy habits and having regular check-ups, ideally outside hospitals and supported by telemedicine tools, in order to reduce healthcare costs. However, diagnosing CVDs outside hospitals can be challenging because of noise interference in electrocardiograms (ECGs), which makes Signal Quality Assessment (SQA) systems necessary. This paper presents a MATLAB toolbox for automated ECG signal quality assessment, featuring a novel assessment method. Furthermore, the toolbox can extract up to 37 Signal Quality Indices (SQIs), commonly used as features in machine learning-based SQA. Our software therefore has the potential to streamline the healthcare process, resulting in efficient and cost-effective cardiovascular care.
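For readers unfamiliar with SQIs, the sketch below computes two classic indices (kurtosis and skewness) plus a crude flat-line ratio on a synthetic ECG-like signal. It is written in Python purely for illustration; EcgScorer is a MATLAB toolbox, and its 37 SQIs and function names are not reproduced here.

```python
import numpy as np
from scipy.stats import kurtosis, skew

# Illustration only: a few simple signal quality indices of the kind used as
# features in machine-learning-based ECG signal quality assessment.
def simple_sqis(ecg):
    return {
        "kSQI": kurtosis(ecg),                                  # clean ECG is heavy-tailed
        "sSQI": skew(ecg),
        "flatline_ratio": float(np.mean(np.abs(np.diff(ecg)) < 1e-4)),
    }

fs = 250                                                        # sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t) ** 63                         # crude ECG-like spike train
noisy = ecg + 0.5 * np.random.default_rng(3).standard_normal(t.size)

print("clean:", simple_sqis(ecg))
print("noisy:", simple_sqis(noisy))
```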
{"title":"EcgScorer: An open source MATLAB toolbox for ECG signal quality assessment","authors":"Noura Alexendre , Fotsing Kuetche , Ntsama Eloundou Pascal , Simo Thierry","doi":"10.1016/j.softx.2024.101900","DOIUrl":"10.1016/j.softx.2024.101900","url":null,"abstract":"<div><div>Cardiovascular diseases claim over 17 million lives annually. Prevention involves adopting healthy habits and regular check-ups, ideally outside hospitals to reduce healthcare costs, leveraging telemedicine tools. However, diagnosing CVDs outside hospitals can be challenging due to noise interference in electrocardiograms (ECGs), necessitating the use of Signal Quality Assessment (SQA) systems. This paper presents a MATLAB toolbox for automated ECG Signal Quality Assessment, featuring a novel method. Furthermore, the toolbox can extract up to 37 Signal Quality Indices (SQIs), commonly used as features in machine learning-based SQA. Therefore, our software has the potential to facilitate the healthcare process, resulting in efficient and cost-effective cardiovascular care.</div></div>","PeriodicalId":21905,"journal":{"name":"SoftwareX","volume":"28 ","pages":"Article 101900"},"PeriodicalIF":2.4,"publicationDate":"2024-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S235271102400270X/pdfft?md5=ce3ba818258ed3d32c231a2f0aec98c6&pid=1-s2.0-S235271102400270X-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142310591","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
E²SCAPy: Electric and electronic symbolic circuit analysis in Python
Pub Date: 2024-09-21 | DOI: 10.1016/j.softx.2024.101910
Luis Cortés Ramírez, Luis A. Sánchez-Gaspariano, Israel Vivaldo-de-la-Cruz, Carlos Muñiz-Montero, Alejandro I. Bautista-Castillo
Recently, Python has become relevant for many tasks across a variety of disciplines, leading to the development of numerous open-source libraries. Our contribution to that cluster of tools is E²SCAPy, a program for the symbolic computation of analog circuits. The most appealing feature of E²SCAPy is its ability to solve large circuits with many nodes in a few milliseconds thanks to its DDD algorithm, which yields a fast solution of the circuit's system of equations. To demonstrate E²SCAPy's performance, three nonclassical circuit examples are reported: a WTA/LTA filter, a memristor, and a fractional integrator.
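To illustrate what symbolic circuit analysis produces (though not how E²SCAPy or its DDD algorithm is implemented), the short sketch below solves the nodal equation of a first-order RC low-pass filter symbolically with SymPy and recovers its transfer function in closed form.

```python
import sympy as sp

# Illustration only (not E²SCAPy's API): symbolic nodal analysis of an RC
# low-pass filter, solving the circuit equation for V_out/V_in.
s, R, C, Vin = sp.symbols("s R C V_in")
V1 = sp.symbols("V_1")                       # single unknown node voltage

# KCL at node 1: (V1 - Vin)/R + V1*s*C = 0
eq = sp.Eq((V1 - Vin) / R + V1 * s * C, 0)
sol = sp.solve(eq, V1)[0]

H = sp.simplify(sol / Vin)                   # transfer function H(s)
print(H)                                     # -> 1/(C*R*s + 1)
```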
{"title":"E2SCAPy: Electric and electronic symbolic circuit analysis in python","authors":"Luis Cortés Ramírez , Luis A. Sánchez-Gaspariano , Israel Vivaldo-de-la-Cruz , Carlos Muñiz-Montero , Alejandro I. Bautista-Castillo","doi":"10.1016/j.softx.2024.101910","DOIUrl":"10.1016/j.softx.2024.101910","url":null,"abstract":"<div><div>Recently Python has become relevant for many tasks in a variety of disciplines leading to the development of various open source libraries. Our contribution to that cluster of tools is <span><math><msup><mrow><mi>E</mi></mrow><mrow><mn>2</mn></mrow></msup></math></span>SCAPy, a useful program for the symbolic computation of analog circuits. The most appealing feature of <span><math><msup><mrow><mi>E</mi></mrow><mrow><mn>2</mn></mrow></msup></math></span>SCAPy lies in its ability to solve large circuits with several nodes in few milliseconds due to its DDD algorithm, which drives to the fast solution of the system of equations of the circuit. To show the <span><math><msup><mrow><mi>E</mi></mrow><mrow><mn>2</mn></mrow></msup></math></span>SCAPy performance, three nonclassical circuit examples are reported: a WTA/LTA filter, a Memristor and a Fractional Integrator.</div></div>","PeriodicalId":21905,"journal":{"name":"SoftwareX","volume":"28 ","pages":"Article 101910"},"PeriodicalIF":2.4,"publicationDate":"2024-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2352711024002802/pdfft?md5=8278592a9cef40ab7f05b6e33803fdd0&pid=1-s2.0-S2352711024002802-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142310590","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Coastal dynamics analyzer (CDA): A QGIS plugin for transect based analysis of coastal erosion
Pub Date: 2024-09-20 | DOI: 10.1016/j.softx.2024.101894
Pietro Scala, Giorgio Manno, Giuseppe Ciraolo
Coastal erosion is a critical issue affecting shorelines worldwide, demanding effective monitoring and management strategies. We present the Coastal Dynamics Analyzer (CDA), a newly developed QGIS plugin designed for transect-based analysis of shoreline changes that enhances both the accuracy and efficiency of coastal erosion studies. CDA integrates seamlessly into QGIS, providing an open-source, user-friendly tool that automates the calculation of key shoreline change metrics, including End Point Rate (EPR), Net Shoreline Movement (NSM), Shoreline Change Envelope (SCE), and Linear Regression Rate (LRR). This paper presents the motivation behind CDA's development, its role in addressing the limitations of existing tools such as the Digital Shoreline Analysis System (DSAS) and Analyzing Moving Boundaries Using R (AMBUR), and the details of its implementation. The plugin's functionalities are demonstrated through a case study in the Mediterranean Sea, showing its ability to generate accurate and reliable data for coastal management. By providing high-quality results with considerable speed, CDA promises to become a resource for researchers, coastal engineers, and policy makers involved in coastal erosion management and climate change adaptation planning.
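The four metrics named above have simple definitions once each shoreline's intersection with a transect is reduced to a signed distance and a date. The sketch below computes them for one transect with synthetic numbers; it is a plain-Python illustration of the formulas, not the CDA plugin's code.

```python
import numpy as np

# Illustration only: shoreline-change metrics for a single transect, from the
# signed distances (in metres) at which dated shorelines cross the transect.
def shoreline_metrics(dates_years, distances_m):
    """dates_years: decimal years; distances_m: shoreline position along transect."""
    t = np.asarray(dates_years, dtype=float)
    d = np.asarray(distances_m, dtype=float)
    order = np.argsort(t)
    t, d = t[order], d[order]
    nsm = d[-1] - d[0]                       # Net Shoreline Movement (m)
    epr = nsm / (t[-1] - t[0])               # End Point Rate (m/yr)
    sce = d.max() - d.min()                  # Shoreline Change Envelope (m)
    lrr = np.polyfit(t, d, 1)[0]             # Linear Regression Rate (m/yr)
    return {"NSM": nsm, "EPR": epr, "SCE": sce, "LRR": lrr}

# One transect crossed by four historical shorelines:
print(shoreline_metrics([1994.5, 2004.3, 2014.6, 2023.9],
                        [12.0, 7.5, 4.1, -1.2]))
```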
{"title":"Coastal dynamics analyzer (CDA): A QGIS plugin for transect based analysis of coastal erosion","authors":"Pietro Scala, Giorgio Manno, Giuseppe Ciraolo","doi":"10.1016/j.softx.2024.101894","DOIUrl":"10.1016/j.softx.2024.101894","url":null,"abstract":"<div><div>Coastal erosion is a critical issue affecting shorelines worldwide, imposing effective monitoring and management strategies. We present the Coastal Dynamics Analyzer (CDA), a newly developed QGIS plugin designed for transect-based analysis of shoreline changes, enhancing both the accuracy and efficiency of coastal erosion studies. CDA seamlessly integrates into QGIS, providing an open-source, user-friendly tool that automates the calculation of key shoreline change metrics, including End Point Rate (EPR), Net Shoreline Movement (NSM), Shoreline Change Envelope (SCE), and Linear Regression Rate (LRR). This paper presents the motivation behind the CDA's development, its importance in addressing the limitations of existing tools such as the Digital Shoreline Analysis System (DSAS) and Analyzing Moving Boundaries Using R (AMBUR) and details its implementation. The plugin's functionalities are demonstrated through a case study in the Mediterranean Sea, showing its ability to generate accurate and reliable data for coastal management. By providing high quality results with considerable speed, CDA is promising to become a resource for researchers, coastal engineers, and policy makers involved in coastal erosion management and climate change adaptation planning.</div></div>","PeriodicalId":21905,"journal":{"name":"SoftwareX","volume":"28 ","pages":"Article 101894"},"PeriodicalIF":2.4,"publicationDate":"2024-09-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2352711024002644/pdfft?md5=749400015d4540df975d910059e0ab47&pid=1-s2.0-S2352711024002644-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142310589","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The Forest Conservation Evaluation Tool: Accessible impact evaluation for Latin America
Pub Date: 2024-09-17 | DOI: 10.1016/j.softx.2024.101892
Allen Blackman
The Forest Conservation Evaluation Tool is a user-friendly webtool that measures the effect of place-based conservation policies, such as protected areas and payments for environmental services, on tree cover loss. It allows nontechnical users to conduct impact evaluations using high-spatial-resolution satellite data on tree cover loss together with statistical techniques that control for confounding factors. Because it has all requisite data on board and features a map- and menu-based interface, most users can generate intuitive results in a single short session.
{"title":"The Forest Conservation Evaluation Tool: Accessible impact evaluation for Latin America","authors":"Allen Blackman","doi":"10.1016/j.softx.2024.101892","DOIUrl":"10.1016/j.softx.2024.101892","url":null,"abstract":"<div><p>The Forest Conservation Evaluation Tool is a user-friendly webtool that measures the effect on tree cover loss of place-based conservation policies such as protected areas and payments for environmental services. It allows nontechnical users to conduct impact evaluations using high spatial resolution satellite data on tree cover loss along with statistical techniques that control for confounding factors. Because it has all requisite data on board and features a map- and menu-based interface, most users can generate intuitive results in single short session.</p></div>","PeriodicalId":21905,"journal":{"name":"SoftwareX","volume":"28 ","pages":"Article 101892"},"PeriodicalIF":2.4,"publicationDate":"2024-09-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2352711024002620/pdfft?md5=d9d67ae67b04b8b2cd4defc2effb3cc8&pid=1-s2.0-S2352711024002620-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142240059","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}