LLM based QA chatbot builder: A generative AI-based chatbot builder for question answering
Pub Date: 2025-02-01 | DOI: 10.1016/j.softx.2024.102029 | SoftwareX, Volume 29, Article 102029
Md. Shahidul Salim, Sk Imran Hossain, Tanim Jalal, Dhiman Kumer Bose, Mohammad Jahid Ibna Basher
Large language model (LLM) based interactive chatbots have been gaining popularity as a tool for serving organizational information to people. Building such a tool involves several development phases: (a) data collection and preprocessing, (b) LLM fine-tuning, testing, and inference, and (c) chat interface development. To streamline this development process, in this paper we present the LLM Question–Answer (QA) builder, a web application that assembles all of these steps and makes it easy for both technical and non-technical users to develop an LLM QA chatbot. The system supports instruction fine-tuning of the following LLMs for organization-specific information retrieval (IR): Zephyr, Mistral, Llama-3, Phi, Flan-T5, and user-provided models; retrieval can be further enhanced with Retrieval-Augmented Generation (RAG) techniques. We have also added an automatic web-crawling-based scraper for collecting RAG data. In addition, the system includes a human evaluation feature and RAG metrics for assessing model quality.
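The fine-tuning phase described above can be pictured with a minimal, hedged sketch: instruction-style QA pairs are rendered into prompts and a causal LLM is adapted with LoRA via Hugging Face transformers/peft. The model name, prompt template, hyperparameters, and toy dataset below are illustrative assumptions, not the builder's actual defaults or API.

```python
# Hedged sketch of instruction fine-tuning on organization QA pairs with LoRA.
# Base model, prompt format, and hyperparameters are illustrative assumptions.
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)
from peft import LoraConfig, get_peft_model

base = "mistralai/Mistral-7B-v0.1"              # any supported causal LM would do
tok = AutoTokenizer.from_pretrained(base)
tok.pad_token = tok.eos_token

model = AutoModelForCausalLM.from_pretrained(base, device_map="auto")  # needs accelerate + a GPU
model = get_peft_model(model, LoraConfig(r=16, lora_alpha=32, task_type="CAUSAL_LM"))

# Organization-specific QA pairs rendered as instruction prompts (toy example).
qa = [{"question": "What are the library opening hours?",
       "answer": "The library is open 8am-8pm on weekdays."}]

def render(ex):
    text = f"### Question:\n{ex['question']}\n### Answer:\n{ex['answer']}{tok.eos_token}"
    return tok(text, truncation=True, max_length=512)

ds = Dataset.from_list(qa).map(render, remove_columns=["question", "answer"])

Trainer(model=model,
        args=TrainingArguments(output_dir="qa-lora", num_train_epochs=3,
                               per_device_train_batch_size=1, learning_rate=2e-4),
        train_dataset=ds,
        data_collator=DataCollatorForLanguageModeling(tok, mlm=False)).train()
```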
{"title":"LLM based QA chatbot builder: A generative AI-based chatbot builder for question answering","authors":"Md. Shahidul Salim , Sk Imran Hossain , Tanim Jalal , Dhiman Kumer Bose , Mohammad Jahid Ibna Basher","doi":"10.1016/j.softx.2024.102029","DOIUrl":"10.1016/j.softx.2024.102029","url":null,"abstract":"<div><div>Large language model (LLM) based interactive chatbots have been gaining popularity as a tool to serve organizational information among people. Building such a tool goes through several development phases i.e. (a) Data collection and preprocessing, (b) LLM fine-tuning, testing, and inference, and (c) Chat interface development. To streamline this development process, in this paper, we present the LLM Question–Answer (QA) builder, a web application, which assembles all the steps and makes it easy for technical and non-technical users to develop the LLM QA chatbot. The system allows the instruction fine-tuning of following LLMs: Zepyhr, Mistral, Llama-3, Phi, Flan-T5, and user provided model for organization-specific information retrieval (IR), which can be further enhanced by Retrieval Augmented Generation (RAG) techniques. We have added an automatic web crawling based RAG data scrapper. Also, our system contains a human evaluation feature and RAG metrics for assessing model quality.</div></div>","PeriodicalId":21905,"journal":{"name":"SoftwareX","volume":"29 ","pages":"Article 102029"},"PeriodicalIF":2.4,"publicationDate":"2025-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143127772","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
LayerFold: A Python library to reduce the depth of neural networks
Pub Date: 2025-02-01 | DOI: 10.1016/j.softx.2024.102030 | SoftwareX, Volume 29, Article 102030
Giommaria Pilo, Nour Hezbri, André Pereira e Ferreira, Victor Quétu, Enzo Tartaglione
Large-scale models are the backbone of Computer Vision and Natural Language Processing, and their generalizability allows for transfer learning and deployment in different scenarios. However, their large size means that reducing their computational and memory demands remains a challenge. Recent research proposes to achieve “layer collapse”, a condition where multiple layers can be combined because the non-linearities between them collapse to linear operators. While this is an important discovery, most studies remain theoretical, often replacing non-linearities with simple identity functions without providing a real implementation of the more compact architecture. Our contribution is LayerFold, a library that studies and implements the merging of collapsed layers. We address typical cases, from fully connected to convolutional layers, discussing constraints and prospective challenges. Our tests on edge devices reveal that merely reducing network depth does not always result in faster computation, even on GPU-equipped hardware. This work raises important warnings and opens the door to further advances in efficient model deployment.
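To make the idea concrete, here is a small, self-contained sketch (not LayerFold's API) of the simplest case: when the non-linearity between two fully connected layers collapses to the identity, y = W2(W1 x + b1) + b2 is exactly one linear layer with weight W2·W1 and bias W2·b1 + b2.

```python
# Folding two consecutive Linear layers whose intermediate non-linearity is the identity.
import torch
import torch.nn as nn

def fold_linear(l1: nn.Linear, l2: nn.Linear) -> nn.Linear:
    merged = nn.Linear(l1.in_features, l2.out_features)
    with torch.no_grad():
        merged.weight.copy_(l2.weight @ l1.weight)          # W = W2 W1
        merged.bias.copy_(l2.weight @ l1.bias + l2.bias)    # b = W2 b1 + b2
    return merged

l1, l2 = nn.Linear(128, 64), nn.Linear(64, 10)
x = torch.randn(4, 128)
assert torch.allclose(fold_linear(l1, l2)(x), l2(l1(x)), atol=1e-5)
```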
{"title":"LayerFold: A Python library to reduce the depth of neural networks","authors":"Giommaria Pilo, Nour Hezbri, André Pereira e Ferreira, Victor Quétu, Enzo Tartaglione","doi":"10.1016/j.softx.2024.102030","DOIUrl":"10.1016/j.softx.2024.102030","url":null,"abstract":"<div><div>Large-scale models are the backbone of Computer Vision and Natural Language Processing, and their generalizability allows for transfer learning and deployment in different scenarios. However, their large size means that reducing their computational and memory demands remains a challenge. Recent research proposes to achieve “layer collapse”, a condition where multiple layers can be combined due to the collapse of non-linearities to linear operators. While this is an important discovery, most studies remain theoretical, often replacing non-linearities with simple identity functions and not providing a real implementation of the more compact architecture. Our contribution is <span>LayerFold</span>, a library that studies and implements the merging of collapsed layers. We address typical cases, from fully connected to convolutional layers, discussing constraints and prospective challenges. Our tests on edge devices reveal that merely reducing network depth does not always result in faster computation, even when GPU-equipped. This work raises important warnings and opens the door to further advances in efficient model deployment.</div></div>","PeriodicalId":21905,"journal":{"name":"SoftwareX","volume":"29 ","pages":"Article 102030"},"PeriodicalIF":2.4,"publicationDate":"2025-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143127776","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Dynasmile: Video-based smile analysis software in orthodontics
Pub Date: 2025-02-01 | DOI: 10.1016/j.softx.2024.102004 | SoftwareX, Volume 29, Article 102004
Ke Chen, Lingling Qiu, Xianju Xie, Yuxing Bai
Smile analysis is essential for diagnosing and planning treatments in esthetic rehabilitation dentistry. Compared with static images, smile videos provide valuable data but are more complex to analyze. Traditionally, this process has relied on multiple software programs and involved tedious manual configuration and operation, which are labor-intensive and inconsistent. This study introduces Dynasmile, a universal software solution that integrates advanced artificial intelligence (AI) algorithms to streamline the analysis process. Dynasmile offers a comprehensive approach to video-based smile analysis, significantly improving both efficiency and accuracy, representing advancements in dental smile research.
{"title":"Dynasmile: Video-based smile analysis software in orthodontics","authors":"Ke Chen, Lingling Qiu, Xianju Xie, Yuxing Bai","doi":"10.1016/j.softx.2024.102004","DOIUrl":"10.1016/j.softx.2024.102004","url":null,"abstract":"<div><div>Smile analysis is essential for diagnosing and planning treatments in esthetic rehabilitation dentistry. Compared with static images, smile videos provide valuable data but are more complex to analyze. Traditionally, this process has relied on multiple software programs and involved tedious manual configuration and operation, which are labor-intensive and inconsistent. This study introduces Dynasmile, a universal software solution that integrates advanced artificial intelligence (AI) algorithms to streamline the analysis process. Dynasmile offers a comprehensive approach to video-based smile analysis, significantly improving both efficiency and accuracy, representing advancements in dental smile research.</div></div>","PeriodicalId":21905,"journal":{"name":"SoftwareX","volume":"29 ","pages":"Article 102004"},"PeriodicalIF":2.4,"publicationDate":"2025-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143092972","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
OpenVR: Teleoperation for manipulation
Pub Date: 2025-02-01 | DOI: 10.1016/j.softx.2025.102054 | SoftwareX, Volume 29, Article 102054
Abraham George, Alison Bartsch, Amir Barati Farimani
Across the robotics field, quality demonstrations are an integral part of many control pipelines. However, collecting high-quality demonstration trajectories remains time-consuming and difficult, often resulting in the number of demonstrations being the performance bottleneck. To address this issue, we present a method of Virtual Reality (VR) Teleoperation that uses an Oculus VR headset to teleoperate a Franka Emika Panda robot. Although other VR teleoperation methods exist, our code is open source, designed for readily available consumer hardware, easy to modify, agnostic to experimental setup, and simple to use.
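A hedged sketch of the core teleoperation mapping such a system performs: the controller's displacement since an anchor pose was latched is scaled and applied to the robot end-effector target. Function and variable names are illustrative, orientation handling is omitted, and this is not the package's actual interface.

```python
# Illustrative only: map VR controller displacement to an end-effector position target.
import numpy as np

def teleop_step(controller_pos, anchor_pos, ee_anchor_pos, scale=1.0):
    """Apply the controller's motion since the anchor was latched to the end-effector."""
    return ee_anchor_pos + scale * (np.asarray(controller_pos) - np.asarray(anchor_pos))

# When the user squeezes the grip, record both anchors; afterwards, on every frame:
target = teleop_step(controller_pos=[0.12, 0.05, -0.30],
                     anchor_pos=[0.10, 0.05, -0.30],
                     ee_anchor_pos=np.array([0.45, 0.00, 0.30]))
print(target)  # end-effector target moves +2 cm along x
```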
{"title":"OpenVR: Teleoperation for manipulation","authors":"Abraham George, Alison Bartsch, Amir Barati Farimani","doi":"10.1016/j.softx.2025.102054","DOIUrl":"10.1016/j.softx.2025.102054","url":null,"abstract":"<div><div>Across the robotics field, quality demonstrations are an integral part of many control pipelines. However, collecting high-quality demonstration trajectories remains time-consuming and difficult, often resulting in the number of demonstrations being the performance bottleneck. To address this issue, we present a method of Virtual Reality (VR) Teleoperation that uses an Oculus VR headset to teleoperate a Franka Emika Panda robot. Although other VR teleoperation methods exist, our code is open source, designed for readily available consumer hardware, easy to modify, agnostic to experimental setup, and simple to use.</div></div>","PeriodicalId":21905,"journal":{"name":"SoftwareX","volume":"29 ","pages":"Article 102054"},"PeriodicalIF":2.4,"publicationDate":"2025-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143092185","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The IDL tool suite: Specifying and analyzing inter-parameter dependencies in web APIs
Pub Date: 2025-02-01 | DOI: 10.1016/j.softx.2024.101998 | SoftwareX, Volume 29, Article 101998
Saman Barakat, Alberto Martin-Lopez, Carlos Müller, Sergio Segura, Antonio Ruiz-Cortés
Web APIs may include inter-parameter dependencies that limit how input parameters can be combined to call services correctly. These dependencies are extremely common, appearing in 4 out of every 5 APIs. This paper presents the IDL tool suite, a set of software tools for managing inter-parameter dependencies in web APIs. The suite includes a specification language (IDL), an OpenAPI Specification extension (IDL4OAS), an analysis engine (IDLReasoner), a web API, a playground, an AI chatbot, and a website. We also highlight several contributions by different groups of authors where the IDL tool suite has proven useful in the domains of automated testing, code generation, and API gateways. To date, the IDL tool suite has contributed to the detection of more than 200 bugs in industrial APIs, including GitHub, Spotify, and YouTube, among others. Also, IDL has been used to boost automated code generation, generating up to 10 times more code than state-of-the-art generators for web APIs.
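As a purely illustrative sketch (not IDL syntax or the suite's API), the kind of inter-parameter dependency the tool suite specifies declaratively can be checked imperatively before a request is sent; the constraints below are modelled on common search-API parameters and are assumptions for illustration.

```python
# Illustrative only: two typical inter-parameter dependencies checked before a call
# to a hypothetical search endpoint.
def check_dependencies(params: dict) -> list[str]:
    errors = []
    # Requires-style dependency: videoDefinition is only valid when type == 'video'.
    if "videoDefinition" in params and params.get("type") != "video":
        errors.append("videoDefinition can only be used when type == 'video'")
    # Or-style dependency: at least one of 'q' or 'type' must be present.
    if not ({"q", "type"} & params.keys()):
        errors.append("at least one of 'q' or 'type' is required")
    return errors

print(check_dependencies({"videoDefinition": "high"}))
```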
{"title":"The IDL tool suite: Specifying and analyzing inter-parameter dependencies in web APIs","authors":"Saman Barakat , Alberto Martin-Lopez , Carlos Müller , Sergio Segura , Antonio Ruiz-Cortés","doi":"10.1016/j.softx.2024.101998","DOIUrl":"10.1016/j.softx.2024.101998","url":null,"abstract":"<div><div>Web APIs may include inter-parameter dependencies that limit how input parameters can be combined to call services correctly. These dependencies are extremely common, appearing in 4 out of every 5 APIs. This paper presents the IDL tool suite, a set of software tools for managing inter-parameter dependencies in web APIs. The suite includes a specification language (IDL), an OpenAPI Specification extension (IDL4OAS), an analysis engine (IDLReasoner), a web API, a playground, an AI chatbot, and a website. We also highlight several contributions by different groups of authors where the IDL tool suite has proven useful in the domains of automated testing, code generation, and API gateways. To date, the IDL tool suite has contributed to the detection of more than 200 bugs in industrial APIs, including GitHub, Spotify, and YouTube, among others. Also, IDL has been used to boost automated code generation, generating up to 10 times more code than state-of-the-art generators for web APIs.</div></div>","PeriodicalId":21905,"journal":{"name":"SoftwareX","volume":"29 ","pages":"Article 101998"},"PeriodicalIF":2.4,"publicationDate":"2025-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143092973","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
ASAHM: A Python module for hybrid FFF (Fused Filament Fabrication)/CNC (computer numerically controlled) manufacturing
Pub Date: 2025-02-01 | DOI: 10.1016/j.softx.2024.102027 | SoftwareX, Volume 29, Article 102027
Luis Vincent Tejada Martinez, Ibrahim Coulibaly, Jean-François Witz, Antoine Weisrock, François Lesaffre, Xavier Boidin, Denis Najjar
In this article we introduce a Python module named 'ASAHM' (Automated Subtractive Additive Hybrid Manufacturing) that generates G-code files for hybrid FFF (Fused Filament Fabrication)/CNC (Computer Numerical Control) manufacturing. These files can be used on multi-tool 3D printers and are produced from files generated by slicers such as Cura, PrusaSlicer, or Simplify3D. The module is based on the Trimesh library, which provides common 3D mesh manipulations, and the Shapely library, used for the manipulation and analysis of 2D geometric shapes. By integrating contouring and surfacing operations that enable machining of entire 3D-printed geometries, ASAHM represents a first step towards the large-scale adoption of a hybrid FFF/CNC process.
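A minimal, hedged sketch of the kind of pipeline this implies, using only public Trimesh and Shapely calls (not ASAHM's own code): slice a mesh at a given height, offset the resulting contour by the tool radius, and emit straight-line G-code moves. The part, tool radius, and feed rate are illustrative assumptions.

```python
# Illustrative slicing/contouring sketch with Trimesh + Shapely (not ASAHM itself).
import trimesh

mesh = trimesh.creation.box(extents=[40.0, 20.0, 10.0])    # stand-in for a printed part
section = mesh.section(plane_origin=[0, 0, 0.0], plane_normal=[0, 0, 1])
planar, _ = section.to_planar()                             # 2D slice in the cutting plane

tool_radius = 1.5
gcode = ["G21 ; millimetres", "G90 ; absolute coordinates"]
for polygon in planar.polygons_full:                        # Shapely polygons of the slice
    toolpath = polygon.buffer(tool_radius)                  # offset contour by tool radius
    x0, y0 = toolpath.exterior.coords[0]
    gcode.append(f"G0 X{x0:.3f} Y{y0:.3f}")
    gcode += [f"G1 X{x:.3f} Y{y:.3f} F600" for x, y in toolpath.exterior.coords[1:]]
print("\n".join(gcode[:8]))
```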
{"title":"ASAHM: A Python module for hybrid FFF (Fused Filament Fabrication)/CNC (computer numerically controlled) manufacturing","authors":"Luis Vincent Tejada Martinez , Ibrahim Coulibaly , Jean-François Witz , Antoine Weisrock , François Lesaffre , Xavier Boidin , Denis Najjar","doi":"10.1016/j.softx.2024.102027","DOIUrl":"10.1016/j.softx.2024.102027","url":null,"abstract":"<div><div>In this article we introduce a Python module named’ ASAHM’ (Automated Subtractive Additive Hybrid Manufacturing) that generates G-code files for hybrid FFF (Fused Filament Fabrication)/CNC (Computer Numerical Control) manufacturing, which can be used on multi-tool 3D printers from files generated by slicers such as Cura, Prusa Slicer, or Simplify3D. The module is based on the Trimesh library, which allows for common 3D mesh manipulations, and the Shapely library, used for the manipulation and analysis of 2D geometric shapes. By integrating contouring and surfacing operations that enable the machining of the entire 3D-printed geometries, ASAHM represents a first step towards the large-scale adoption of a hybrid FFF/CNC process.</div></div>","PeriodicalId":21905,"journal":{"name":"SoftwareX","volume":"29 ","pages":"Article 102027"},"PeriodicalIF":2.4,"publicationDate":"2025-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143093070","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
BetaGPU: Harnessing GPU power for parallelized beta distribution functions
Pub Date: 2025-02-01 | DOI: 10.1016/j.softx.2024.102009 | SoftwareX, Volume 29, Article 102009
Alejandro Fernández-Fraga, Jorge González-Domínguez, María J. Martín
The efficient computation of Beta distribution functions, particularly the Probability Density Function (PDF) and Cumulative Distribution Function (CDF), is critical in various scientific fields, including bioinformatics and data analysis. This work presents BetaGPU, a high-performance software package written in C++ and CUDA that leverages the parallel processing capabilities of Graphics Processing Units (GPUs) to significantly accelerate these computations. It also provides an OpenMP version for multi-CPU systems and integrates seamlessly with the popular statistical programming languages R and Python. This open-source package provides an accessible, accurate, and scalable solution for researchers and practitioners. By offloading intensive calculations to the GPU, the software is significantly faster than traditional single-core CPU-based methods, facilitating faster data analysis and enabling real-time applications. Its high performance and ease of use make it an invaluable tool for users in bioinformatics and other data-intensive domains.
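For reference, the quantities being parallelized are the standard Beta PDF and CDF; the single-core SciPy baseline below (an assumption for illustration, not part of BetaGPU) is the kind of reference such a GPU implementation would be validated against.

```python
# Single-core reference for the Beta distribution functions, using their standard definitions.
import numpy as np
from scipy import special, stats

a, b = 2.5, 4.0
x = np.linspace(0.0, 1.0, 5)

# PDF: f(x; a, b) = x^(a-1) (1-x)^(b-1) / B(a, b)
pdf_manual = x**(a - 1) * (1 - x)**(b - 1) / special.beta(a, b)
# CDF: F(x; a, b) = I_x(a, b), the regularized incomplete beta function
cdf_manual = special.betainc(a, b, x)

assert np.allclose(pdf_manual, stats.beta.pdf(x, a, b))
assert np.allclose(cdf_manual, stats.beta.cdf(x, a, b))
print(pdf_manual, cdf_manual)
```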
{"title":"BetaGPU: Harnessing GPU power for parallelized beta distribution functions","authors":"Alejandro Fernández-Fraga, Jorge González-Domínguez, María J. Martín","doi":"10.1016/j.softx.2024.102009","DOIUrl":"10.1016/j.softx.2024.102009","url":null,"abstract":"<div><div>The efficient computation of Beta distribution functions, particularly the Probability Density Function (PDF) and Cumulative Distribution Function (CDF), is critical in various scientific fields, including bioinformatics and data analysis. This work presents BetaGPU, a high-performance software package written in C++ and CUDA that leverages the parallel processing capabilities of Graphics Processing Units (GPUs) to significantly accelerate these computations, with an OpenMP version for multiCPU systems, and integrated seamlessly with popular statistical programming languages R and Python. This open-source package provides an accessible, accurate, and scalable solution for researchers and practitioners. By offloading intensive calculations to the GPU, this software is significantly faster than traditional single-core CPU-based methods, facilitating faster data analysis and enabling real-time applications. The software’s high performance and ease of use make it an invaluable tool for users in bioinformatics and other data-intensive domains.</div></div>","PeriodicalId":21905,"journal":{"name":"SoftwareX","volume":"29 ","pages":"Article 102009"},"PeriodicalIF":2.4,"publicationDate":"2025-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143127868","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
CleanAI: Deep neural network model quality evaluation tool
Pub Date: 2025-02-01 | DOI: 10.1016/j.softx.2024.102015 | SoftwareX, Volume 29, Article 102015
Osman Caglar, Cem Baglum, Ugur Yayan
The growing deployment of AI systems in high-risk environments, along with the increasing necessity of integrating AI into portable devices, emphasizes the need to rigorously assess their quality and reliability. Existing tools for analyzing the strength, safety, and quality of Deep Neural Network (DNN) models are limited. CleanAI addresses this gap, serving as an advanced testing system to evaluate the robustness, quality, and dependability of DNN models. It incorporates eleven coverage testing methods, providing developers with insights into DNN quality, enabling analysis of model performance, and generating comprehensive output reports. This study compares various ResNet models using activation, boundary, and interaction metrics, revealing qualitative differences. This comparative analysis informs developers and sets a benchmark for tailoring AI solutions that adhere to stringent quality standards. Ultimately, it encourages reconsideration of model complexity and memory footprint for optimized designs, enhancing overall performance and efficiency. Additionally, by simplifying models and reducing their size, CleanAI facilitates the acceleration of AI models, resulting in significant time and cost savings. The findings from the comparative analysis also demonstrate the potential for substantial optimization of model complexity and size. By leveraging CleanAI's comprehensive coverage metrics, developers can identify areas for refinement, leading to streamlined models with reduced memory requirements. This approach not only enhances computational efficiency but also supports the growing demand for lightweight AI systems suitable for deployment on portable devices. CleanAI's role in bridging the gap between robustness and efficiency makes it a crucial tool for advancing AI development while maintaining high standards of quality and reliability.
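As a hedged illustration of one classic coverage metric of the kind such a tool reports (neuron activation coverage), the sketch below counts, over a batch of test inputs, how many ReLU units fire at least once; the toy model and the zero threshold are assumptions, not CleanAI's implementation.

```python
# Neuron activation coverage for a toy model, tracked with forward hooks.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 8), nn.ReLU())
activated = {}

def track(name):
    def hook(_module, _inputs, output):
        fired = (output > 0.0).any(dim=0)           # neuron fired for at least one input
        activated[name] = activated.get(name, torch.zeros_like(fired)) | fired
    return hook

for name, module in model.named_modules():
    if isinstance(module, nn.ReLU):
        module.register_forward_hook(track(name))

with torch.no_grad():
    model(torch.randn(256, 16))                     # a batch of test inputs

covered = sum(int(v.sum()) for v in activated.values())
total = sum(v.numel() for v in activated.values())
print(f"neuron coverage: {covered}/{total} = {covered / total:.2%}")
```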
{"title":"CleanAI: Deep neural network model quality evaluation tool","authors":"Osman Caglar , Cem Baglum , Ugur Yayan","doi":"10.1016/j.softx.2024.102015","DOIUrl":"10.1016/j.softx.2024.102015","url":null,"abstract":"<div><div>The growing deployment of AI systems in high-risk environments, along with the increasing necessity of integrating AI into portable devices, emphasizes the need to rigorously assess their quality and reliability. Existing tools for analyzing Deep Neural Network (DNN) models' strength, safety, and quality are limited. CleanAI addresses this gap, serving as an advanced testing system to evaluate the robustness, quality, and dependability of DNN models. It incorporates eleven coverage testing methods, providing developers with insights into DNN quality, enabling analysis of model performance, and generating comprehensive output reports. This study compares various ResNet models using activation metrics, boundary metrics, and interaction metrics, revealing qualitative differences. This comparative analysis informs developers, setting a critical benchmark to tailor AI solutions adhering to stringent quality standards. Ultimately, it encourages reconsideration of model complexity and memory footprint for optimized designs, enhancing overall performance and efficiency. Additionally, by simplifying models and reducing their size, CleanAI facilitates the acceleration of AI models, resulting in significant time and cost savings. The findings from the comparative analysis also demonstrate the potential for substantial optimization in model complexity and size. By leveraging CleanAI's comprehensive coverage metrics, developers can identify areas for refinement, leading to streamlined models with reduced memory requirements. This approach not only enhances computational efficiency but also supports the growing demand for lightweight AI systems suitable for deployment on portable devices. CleanAI's role in bridging the gap between robustness and efficiency makes it a crucial tool for advancing AI development while maintaining high standards of quality and reliability.</div></div>","PeriodicalId":21905,"journal":{"name":"SoftwareX","volume":"29 ","pages":"Article 102015"},"PeriodicalIF":2.4,"publicationDate":"2025-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143127872","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
TreeEyed: A QGIS plugin for tree monitoring in silvopastoral systems using state of the art AI models
Pub Date: 2025-02-01 | DOI: 10.1016/j.softx.2025.102071 | SoftwareX, Volume 29, Article 102071
Andres Felipe Ruiz-Hurtado, Juliana Perez Bolaños, Darwin Alexis Arrechea-Castillo, Juan Andres Cardoso
Tree monitoring is a challenging task due to the labour-intensive and time-consuming data collection methods required. We present TreeEyed, a QGIS plugin designed to facilitate the monitoring of trees using remote-sensing RGB imagery and artificial intelligence models. The plugin offers several tools, including an inference process for tree segmentation and detection. It was implemented to facilitate the manipulation and processing of Geographical Information System (GIS) data from different sources, supporting multiple resolutions and variable extents and generating results in standard GIS formats (georeferenced raster and vector). Additional options such as postprocessing, dataset generation, and data validation are also incorporated.
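A hedged sketch of the final step implied above (not the plugin's code): converting a binary tree segmentation mask into a georeferenced vector layer with rasterio and GeoPandas. The affine transform, CRS, and output path are illustrative assumptions.

```python
# Illustrative only: polygonize a segmentation mask into a georeferenced vector layer.
import numpy as np
import geopandas as gpd
from rasterio import features, transform
from shapely.geometry import shape

mask = np.zeros((100, 100), dtype="uint8")
mask[20:40, 30:55] = 1                                   # one detected tree crown

# Pixel grid -> map coordinates: 0.5 m pixels anchored at an (illustrative) UTM origin.
affine = transform.from_origin(500000.0, 4650000.0, 0.5, 0.5)

polygons = [shape(geom) for geom, value in
            features.shapes(mask, transform=affine) if value == 1]

crowns = gpd.GeoDataFrame({"class": ["tree"] * len(polygons)},
                          geometry=polygons, crs="EPSG:32618")
crowns.to_file("tree_crowns.gpkg", driver="GPKG")        # standard GIS vector output
print(crowns.area)                                        # crown areas in square metres
```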
{"title":"TreeEyed: A QGIS plugin for tree monitoring in silvopastoral systems using state of the art AI models","authors":"Andres Felipe Ruiz-Hurtado , Juliana Perez Bolaños , Darwin Alexis Arrechea-Castillo , Juan Andres Cardoso","doi":"10.1016/j.softx.2025.102071","DOIUrl":"10.1016/j.softx.2025.102071","url":null,"abstract":"<div><div>Tree monitoring is a challenging task due to the labour-intensive and time-consuming data collection methods required. We present TreeEyed, a QGIS plugin designed to facilitate the monitoring of trees using remote sensing RGB imagery and artificial intelligence models. The plugin offers several tools including tree inference process for tree segmentation and detection. This tool was implemented to facilitate the manipulation and processing of Geographical Information System (GIS) data from different sources, allowing multi resolution, variable extent, and generating results in a standard GIS format (georeferenced raster and vector). Additional options like postprocessing, dataset generation, and data validation are also incorporated.</div></div>","PeriodicalId":21905,"journal":{"name":"SoftwareX","volume":"29 ","pages":"Article 102071"},"PeriodicalIF":2.4,"publicationDate":"2025-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143127999","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}