Towards a Universal Privacy Model for Electronic Health Record Systems: An Ontology and Machine Learning Approach
Pub Date: 2023-07-11 | DOI: 10.3390/informatics10030060
Raza Nowrozy, K. Ahmed, Hua Wang, Timothy Mcintosh
This paper proposes a novel privacy model for Electronic Health Record (EHR) systems utilizing a conceptual privacy ontology and Machine Learning (ML) methodologies. It underscores the challenges currently faced by EHR systems, such as balancing privacy and accessibility, user-friendliness, and legal compliance. To address these challenges, the study developed a universal privacy model designed to efficiently manage and share patients’ personal and sensitive data across different platforms, such as the MHR and NHS systems. The research employed various BERT techniques to differentiate between legitimate and illegitimate privacy policies. Among them, DistilBERT emerged as the most accurate, demonstrating the potential of our ML-based approach to effectively identify inadequate privacy policies. The paper outlines future research directions, emphasizing the need for comprehensive evaluations, testing in real-world case studies, investigation of adaptive frameworks, consideration of ethical implications, and fostering of stakeholder collaboration. This research offers a pioneering approach to enhancing healthcare information privacy, providing an innovative foundation for future work in this field.
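As an illustration of the classification step described above, the following is a minimal sketch (not the authors' pipeline) of fine-tuning DistilBERT to label privacy-policy text as adequate or inadequate; the checkpoint, example texts, labels, and training settings are all assumptions for demonstration purposes.

```python
# Minimal sketch: binary classification of privacy-policy text with DistilBERT.
# Checkpoint, toy examples, and hyperparameters are illustrative assumptions.
import torch
from transformers import (DistilBertTokenizerFast,
                          DistilBertForSequenceClassification,
                          Trainer, TrainingArguments)

tokenizer = DistilBertTokenizerFast.from_pretrained("distilbert-base-uncased")
model = DistilBertForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2)  # 0 = inadequate policy, 1 = adequate policy

texts = ["We share your health data with third parties without consent.",
         "Patient records are encrypted and released only with explicit consent."]
labels = [0, 1]
enc = tokenizer(texts, truncation=True, padding=True, return_tensors="pt")

class PolicyDataset(torch.utils.data.Dataset):
    """Wraps tokenized policy snippets and their labels for the Trainer."""
    def __init__(self, enc, labels):
        self.enc, self.labels = enc, labels
    def __len__(self):
        return len(self.labels)
    def __getitem__(self, i):
        item = {k: v[i] for k, v in self.enc.items()}
        item["labels"] = torch.tensor(self.labels[i])
        return item

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=PolicyDataset(enc, labels))
trainer.train()
```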
{"title":"Towards a Universal Privacy Model for Electronic Health Record Systems: An Ontology and Machine Learning Approach","authors":"Raza Nowrozy, K. Ahmed, Hua Wang, Timothy Mcintosh","doi":"10.3390/informatics10030060","DOIUrl":"https://doi.org/10.3390/informatics10030060","url":null,"abstract":"This paper proposed a novel privacy model for Electronic Health Records (EHR) systems utilizing a conceptual privacy ontology and Machine Learning (ML) methodologies. It underscores the challenges currently faced by EHR systems such as balancing privacy and accessibility, user-friendliness, and legal compliance. To address these challenges, the study developed a universal privacy model designed to efficiently manage and share patients’ personal and sensitive data across different platforms, such as MHR and NHS systems. The research employed various BERT techniques to differentiate between legitimate and illegitimate privacy policies. Among them, Distil BERT emerged as the most accurate, demonstrating the potential of our ML-based approach to effectively identify inadequate privacy policies. This paper outlines future research directions, emphasizing the need for comprehensive evaluations, testing in real-world case studies, the investigation of adaptive frameworks, ethical implications, and fostering stakeholder collaboration. This research offers a pioneering approach towards enhancing healthcare information privacy, providing an innovative foundation for future work in this field.","PeriodicalId":37100,"journal":{"name":"Informatics","volume":" ","pages":""},"PeriodicalIF":3.1,"publicationDate":"2023-07-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45932148","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A Machine-Learning-Based Motor and Cognitive Assessment Tool Using In-Game Data from the GAME2AWE Platform
Pub Date: 2023-07-09 | DOI: 10.3390/informatics10030059
Michail Danousis, C. Goumopoulos
With age, a decline in motor and cognitive functionality is inevitable, and it greatly affects the quality of life of the elderly and their ability to live independently. Early detection of these types of decline can enable timely interventions and support for maintaining functional independence and improving overall well-being. This paper explores the potential of the GAME2AWE platform for assessing the motor and cognitive condition of seniors based on their in-game performance data. The proposed methodology involves developing machine learning models to explore the predictive power of features derived from data collected during gameplay on the GAME2AWE platform. Through a study involving fifteen elderly participants, we demonstrate that utilizing in-game data can achieve high classification performance when predicting motor and cognitive states. Various machine learning techniques were evaluated, and Random Forest outperformed the other models, achieving classification accuracies ranging from 93.6% for cognitive screening to 95.6% for motor assessment. These results highlight the potential of using exergames within a technology-rich environment as an effective means of capturing the health status of seniors. This approach opens up new possibilities for objective and non-invasive health assessment, facilitating early detection and intervention to improve the well-being of seniors.
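To illustrate the kind of model described above, here is a minimal sketch of training and cross-validating a Random Forest classifier on a tabular file of in-game features; the file name, feature columns, and label are hypothetical placeholders, not the GAME2AWE schema.

```python
# Minimal sketch: cross-validated Random Forest on in-game features.
# CSV layout and column names are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

df = pd.read_csv("game2awe_features.csv")         # hypothetical per-session feature export
X = df.drop(columns=["participant_id", "motor_impaired"])
y = df["motor_impaired"]                           # binary label from a clinical reference test

clf = RandomForestClassifier(n_estimators=200, random_state=0)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv, scoring="accuracy")
print(f"Cross-validated accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```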
{"title":"A Machine-Learning-Based Motor and Cognitive Assessment Tool Using In-Game Data from the GAME2AWE Platform","authors":"Michail Danousis, C. Goumopoulos","doi":"10.3390/informatics10030059","DOIUrl":"https://doi.org/10.3390/informatics10030059","url":null,"abstract":"With age, a decline in motor and cognitive functionality is inevitable, and it greatly affects the quality of life of the elderly and their ability to live independently. Early detection of these types of decline can enable timely interventions and support for maintaining functional independence and improving overall well-being. This paper explores the potential of the GAME2AWE platform in assessing the motor and cognitive condition of seniors based on their in-game performance data. The proposed methodology involves developing machine learning models to explore the predictive power of features that are derived from the data collected during gameplay on the GAME2AWE platform. Through a study involving fifteen elderly participants, we demonstrate that utilizing in-game data can achieve a high classification performance when predicting the motor and cognitive states. Various machine learning techniques were used but Random Forest outperformed the other models, achieving a classification accuracy ranging from 93.6% for cognitive screening to 95.6% for motor assessment. These results highlight the potential of using exergames within a technology-rich environment as an effective means of capturing the health status of seniors. This approach opens up new possibilities for objective and non-invasive health assessment, facilitating early detections and interventions to improve the well-being of seniors.","PeriodicalId":37100,"journal":{"name":"Informatics","volume":" ","pages":""},"PeriodicalIF":3.1,"publicationDate":"2023-07-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"46639281","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Digital Citizenship and the Big Five Personality Traits
Pub Date: 2023-07-07 | DOI: 10.3390/informatics10030058
M. Roberts, Randy W. Connolly, Joel Conley, Janet Miller
Over the past two decades, the internet has become an increasingly important venue for political expression, community building, and social activism. Scholars in a wide range of disciplines have endeavored to understand and measure how these transformations have affected individuals’ civic attitudes and behaviors. The Digital Citizenship Scale (in its original and revised forms) has become one of the most widely used instruments for measuring and evaluating these changes, but to date, no study has investigated how digital citizenship behaviors relate to exogenous variables. Using the classic Big Five factor model of personality (Openness to experience, Conscientiousness, Extroversion, Agreeableness, and Neuroticism), this study investigated how personality traits relate to the key components of digital citizenship. Survey results were gathered across three countries (n = 1820), and the analysis revealed that personality traits map onto digital citizenship differently than onto traditional forms of civic engagement. The implications of these findings are discussed.
{"title":"Digital Citizenship and the Big Five Personality Traits","authors":"M. Roberts, Randy W. Connolly, Joel Conley, Janet Miller","doi":"10.3390/informatics10030058","DOIUrl":"https://doi.org/10.3390/informatics10030058","url":null,"abstract":"Over the past two decades, the internet has become an increasingly important venue for political expression, community building, and social activism. Scholars in a wide range of disciplines have endeavored to understand and measure how these transformations have affected individuals’ civic attitudes and behaviors. The Digital Citizenship Scale (original and revised form) has become one of the most widely used instruments for measuring and evaluating these changes, but to date, no study has investigated how digital citizenship behaviors relate to exogenous variables. Using the classic Big Five Factor model of personality (Openness to experience, Conscientiousness, Extroversion, Agreeableness, and Neuroticism), this study investigated how personality traits relate to the key components of digital citizenship. Survey results were gathered across three countries (n = 1820), and analysis revealed that personality traits map uniquely on to digital citizenship in comparison to traditional forms of civic engagement. The implications of these findings are discussed.","PeriodicalId":37100,"journal":{"name":"Informatics","volume":" ","pages":""},"PeriodicalIF":3.1,"publicationDate":"2023-07-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44331366","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Information and Communication Technologies in Primary Education: Teachers’ Perceptions in Greece
Pub Date: 2023-07-07 | DOI: 10.3390/informatics10030057
Marina Aivazidi, C. Michalakelis
Innovative learning methods, including the increasing use of Information and Communication Technologies (ICT) applications, are transforming the contemporary educational process. Teachers’ perceptions of ICT, computer self-efficacy, and demographics are among the factors that have been found to influence the use of ICT in education. The aim of the present research is to analyze primary school teachers’ perceptions of ICT and how these perceptions affect ICT use in the educational process, using Greece as a case study. To this end, primary research was carried out. Data from 285 valid questionnaires were statistically analyzed using descriptive statistics, principal components analysis, and correlation and regression analysis. The main results were in accordance with the relevant literature, indicating the impact of teachers’ self-efficacy, perceptions, and demographics on ICT use in the educational process. These results provide useful insights for the successful implementation of ICT in education.
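As a rough illustration of the reported analysis chain (descriptive statistics, principal components analysis, regression), the following sketch assumes a hypothetical questionnaire export; the column names and model specification are placeholders, not the authors' instrument.

```python
# Minimal sketch: PCA on Likert perception items, then regression of ICT use
# on the extracted components plus a demographic covariate. All names are hypothetical.
import pandas as pd
import statsmodels.api as sm
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("teacher_survey.csv")             # hypothetical: Likert items + ICT-use score
items = [c for c in df.columns if c.startswith("perception_")]

# Principal components of the standardized perception items
Z = StandardScaler().fit_transform(df[items])
pca = PCA(n_components=3)
components = pca.fit_transform(Z)
print("Explained variance ratios:", pca.explained_variance_ratio_)

# Regress ICT use on the components and a demographic variable
X = pd.DataFrame(components, columns=["pc1", "pc2", "pc3"])
X["years_teaching"] = df["years_teaching"]
X = sm.add_constant(X)
print(sm.OLS(df["ict_use"], X).fit().summary())
```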
{"title":"Information and Communication Technologies in Primary Education: Teachers’ Perceptions in Greece","authors":"Marina Aivazidi, C. Michalakelis","doi":"10.3390/informatics10030057","DOIUrl":"https://doi.org/10.3390/informatics10030057","url":null,"abstract":"Innovative learning methods including the increasing use of Information and Communication Technologies (ICT) applications are transforming the contemporary educational process. Teachers’ perceptions of ICT, self-efficacy on computers and demographics are some of the factors that have been found to impact the use of ICT in the educational process. The aim of the present research is to analyze the perceptions of primary school teachers about ICT and how they affect their use in the educational process, through the case of Greece. To do so, primary research was carried out. Data from 285 valid questionnaires were statistically analyzed using descriptive statistics, principal components analysis, correlation and regression analysis. The main results were in accordance with the relevant literature, indicating the impact of teachers’ self-efficacy, perceptions and demographics on ICT use in the educational process. These results provide useful insights for the achievement of a successful implementation of ICT in education.","PeriodicalId":37100,"journal":{"name":"Informatics","volume":" ","pages":""},"PeriodicalIF":3.1,"publicationDate":"2023-07-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"48874838","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
FOXS-GSC—Fast Offset Xpath Service with HexagonS Communication
Pub Date: 2023-07-04 | DOI: 10.3390/informatics10030056
Celso A. R. L. Brennand, R. Meneguette, Geraldo P. Rocha Filho
Congestion in large cities is widely recognized as a problem that impacts various aspects of society, including the economy and public health. To support the urban traffic system and to mitigate traffic congestion and the damage it causes, in this article we propose an assistive Intelligent Transport Systems (ITS) service for traffic management in Vehicular Networks (VANET), which we name FOXS-GSC (Fast Offset Xpath Service with hexaGonS Communication). FOXS-GSC uses VANET communication and the fog computing paradigm to detect traffic jams and recommend alternative vehicle routes that avoid them. Unlike previous solutions in the literature, the proposed service offers a versatile approach in which road traffic classification and route suggestions can be made either by the infrastructure or by the vehicle itself without compromising the quality of the routing service. To achieve this, the service operates in a decentralized way, and its components (vehicles and infrastructure) exchange messages containing vehicle information and regional traffic information. For communication, the proposed approach uses a new dedicated multi-hop protocol designed specifically around the characteristics and requirements of a vehicle routing service. By adapting to the inherent characteristics of such a service, such as the density of regions, the proposed communication protocol both enhances reliability and improves the overall efficiency of the vehicle routing service. Simulation results comparing FOXS-GSC with baseline solutions and other proposals from the literature demonstrate its significant impact, reducing network congestion by up to 95% while maintaining coverage of 97% across various scenario characteristics. Concerning road traffic efficiency, traffic quality improves by 29% and carbon emissions are reduced by 10%.
{"title":"FOXS-GSC—Fast Offset Xpath Service with HexagonS Communication","authors":"Celso A. R. L. Brennand, R. Meneguette, Geraldo P. Rocha Filho","doi":"10.3390/informatics10030056","DOIUrl":"https://doi.org/10.3390/informatics10030056","url":null,"abstract":"Congestion in large cities is widely recognized as a problem that impacts various aspects of society, including the economy and public health. To support the urban traffic system and to mitigate traffic congestion and the damage it causes, in this article we propose an assistant Intelligent Transport Systems (ITS) service for traffic management in Vehicular Networks (VANET), which we name FOXS-GSC, for Fast Offset Xpath Service with hexaGonS Communication. FOXS-GSC uses a VANET communication and fog computing paradigm to detect and recommend an alternative vehicle route to avoid traffic jams. Unlike the previous solutions in the literature, the proposed service offers a versatile approach in which traffic road classification and route suggestions can be made by infrastructure or by the vehicle itself without compromising the quality of the route service. To achieve this, the service operates in a decentralized way, and the components of the service (vehicles/infrastructure) exchange messages containing vehicle information and regional traffic information. For communication, the proposed approach uses a new dedicated multi-hop protocol that has been specifically designed based on the characteristics and requirements of a vehicle routing service. Therefore, by adapting to the inherent characteristics of a vehicle routing service, such as the density of regions, the proposed communication protocol both enhances reliability and improves the overall efficiency of the vehicle routing service. Simulation results comparing FOXS-GSC with baseline solutions and other proposals from the literature demonstrate its significant impact, reducing network congestion by up to 95% while maintaining a coverage of 97% across various scenery characteristics. Concerning road traffic efficiency, the traffic quality is increasing by 29%, for a reduction in carbon emissions of 10%.","PeriodicalId":37100,"journal":{"name":"Informatics","volume":" ","pages":""},"PeriodicalIF":3.1,"publicationDate":"2023-07-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"43469500","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Classification of Benign and Malignant Renal Tumors Based on CT Scans and Clinical Data Using Machine Learning Methods
Pub Date: 2023-07-03 | DOI: 10.3390/informatics10030055
Jie Xu, Xing He, Wei Shao, Jiang Bian, R. Terry
Up to 20% of renal masses ≤4 cm are found to be benign at the time of surgical excision, raising concern for overtreatment. However, the risk of malignancy currently cannot be accurately predicted prior to surgery using imaging alone. The objective of this study is to propose a machine learning (ML) framework for pre-operative renal tumor classification using readily available clinical and CT imaging data. We tested both traditional ML methods (i.e., XGBoost and random forest (RF)) and deep learning (DL) methods (i.e., multilayer perceptron (MLP) and 3D convolutional neural network (3DCNN)) to build the classification model. We found that the combination of clinical and radiomics features produced the best results (an AUC [95% CI] of 0.719 [0.712–0.726], a precision [95% CI] of 0.976 [0.975–0.978], a recall [95% CI] of 0.683 [0.675–0.691], and a specificity [95% CI] of 0.827 [0.817–0.837]). Our analysis revealed that employing ML models with CT scans and clinical data holds promise for classifying the risk of renal malignancy. Future work should focus on externally validating the proposed model and features to better support clinical decision-making in renal cancer diagnosis.
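To illustrate the feature-combination idea, the sketch below joins hypothetical clinical and radiomics tables and scores a random forest with AUC, precision, recall, and specificity; the file names, label column, and split are assumptions, not the study's dataset or protocol.

```python
# Minimal sketch: combined clinical + radiomics features, Random Forest,
# and the four metrics reported in the abstract. Files and columns are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, precision_score, recall_score, confusion_matrix

clinical = pd.read_csv("clinical.csv", index_col="patient_id")     # hypothetical tables
radiomics = pd.read_csv("radiomics.csv", index_col="patient_id")
X = clinical.join(radiomics)
y = X.pop("malignant")                                             # 1 = malignant, 0 = benign

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)

prob = clf.predict_proba(X_te)[:, 1]
pred = clf.predict(X_te)
tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
print("AUC:        ", roc_auc_score(y_te, prob))
print("Precision:  ", precision_score(y_te, pred))
print("Recall:     ", recall_score(y_te, pred))
print("Specificity:", tn / (tn + fp))
```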
{"title":"Classification of Benign and Malignant Renal Tumors Based on CT Scans and Clinical Data Using Machine Learning Methods","authors":"Jie Xu, Xing He, Wei Shao, Jiang Bian, R. Terry","doi":"10.3390/informatics10030055","DOIUrl":"https://doi.org/10.3390/informatics10030055","url":null,"abstract":"Up to 20% of renal masses ≤4 cm is found to be benign at the time of surgical excision, raising concern for overtreatment. However, the risk of malignancy is currently unable to be accurately predicted prior to surgery using imaging alone. The objective of this study is to propose a machine learning (ML) framework for pre-operative renal tumor classification using readily available clinical and CT imaging data. We tested both traditional ML methods (i.e., XGBoost, random forest (RF)) and deep learning (DL) methods (i.e., multilayer perceptron (MLP), 3D convolutional neural network (3DCNN)) to build the classification model. We discovered that the combination of clinical and radiomics features produced the best results (i.e., AUC [95% CI] of 0.719 [0.712–0.726], a precision [95% CI] of 0.976 [0.975–0.978], a recall [95% CI] of 0.683 [0.675–0.691], and a specificity [95% CI] of 0.827 [0.817–0.837]). Our analysis revealed that employing ML models with CT scans and clinical data holds promise for classifying the risk of renal malignancy. Future work should focus on externally validating the proposed model and features to better support clinical decision-making in renal cancer diagnosis.","PeriodicalId":37100,"journal":{"name":"Informatics","volume":" ","pages":""},"PeriodicalIF":3.1,"publicationDate":"2023-07-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"42900932","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Papillary thyroid carcinoma whole-slide images as a basis for deep learning
Pub Date: 2023-06-29 | DOI: 10.37661/1816-0301-2023-20-2-28-38
M. V. Fridman, A. A. Kosareva, E. Snezhko, P. V. Kamlach, V. Kovalev
Objectives. Morphological analysis of papillary thyroid cancer is a cornerstone of further treatment planning. Traditional and neural network methods for extracting parts of images are used to automate this analysis, and a dataset must be prepared for training neural networks to recognize the relevant anatomical regions in histopathological images. The authors discuss the selection of features for annotating histological images, methodological approaches to dissecting whole-slide images, and how to prepare raw data for subsequent analysis. The influence of the representative size of a whole-slide image fragment of papillary thyroid cancer on the classification accuracy of the trained EfficientNetB0 neural network is studied. The results are analyzed, and the weaknesses of using image fragments of different representative sizes, as well as the causes of unsatisfactory classification accuracy at high magnification, are assessed. Materials and methods. Histopathological whole-slide images of 129 patients were used. Histological micropreparations containing tumor elements and surrounding tissue were scanned on an Aperio AT2 scanner (Leica Biosystems, Germany) at maximum resolution. Annotation was carried out in the ASAP software package. To choose the optimal representative fragment size, a classification problem was solved using the pretrained EfficientNetB0 neural network. Results. A methodology for preparing a database of histopathological images of papillary thyroid cancer was proposed. Experiments were conducted to determine the optimal representative size of the image fragment. The best classification accuracy on the test sample was obtained with a representative fragment size of 394.32 × 394.32 microns. Conclusion. The analysis of the influence of the representative sizes of histopathological image fragments revealed difficulties in solving the classification task caused by the specifics of cutting and staining the images and by morphological and textural differences within images of the same class. It was also determined that preparing a dataset for training a neural network to detect vascular invasion in a histopathological image is not a trivial task and requires additional stages of data preparation.
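As a rough illustration of the patch-level classification step, the following Keras sketch fine-tunes an ImageNet-pretrained EfficientNetB0 on a hypothetical folder of whole-slide image patches; the directory layout, patch size, and training settings are assumptions, not the authors' exact setup.

```python
# Minimal sketch: EfficientNetB0 as a frozen feature extractor with a binary head
# (tumor vs. surrounding tissue) on a hypothetical folder of WSI patches.
import tensorflow as tf

IMG_SIZE = (224, 224)  # EfficientNetB0's default input resolution
train_ds = tf.keras.utils.image_dataset_from_directory(
    "wsi_patches/train", image_size=IMG_SIZE, batch_size=32)  # hypothetical patch folders per class

base = tf.keras.applications.EfficientNetB0(
    include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,), pooling="avg")
base.trainable = False  # start from frozen ImageNet features

model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, epochs=5)
```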
{"title":"Papillary thyroid carcinoma whole-slide images as a basis for deep learning","authors":"M. V. Fridman, A. A. Kosareva, E. Snezhko, P. V. Kamlach, V. Kovalev","doi":"10.37661/1816-0301-2023-20-2-28-38","DOIUrl":"https://doi.org/10.37661/1816-0301-2023-20-2-28-38","url":null,"abstract":"Objectives. Morphological analysis of papillary thyroid cancer is a cornerstone for further treatment planning. Traditional and neural network methods of extracting parts of images are used to automate the analysis. It is necessary to prepare a set of data for teaching neural networks to develop a system of similar anatomical region in the histopathological image. Authors discuss the second selection of signs for the marking of histological images, methodological approaches to dissect whole-slide images, how to prepare raw data for a future analysis. The influence of the representative size of the fragment of the full-to-suction image of papillary thyroid cancer on the accuracy of the classification of trained neural network EfficientNetB0 is conducted. The analysis of the resulting results is carried out, the weaknesses of the use of fragments of images of different representative size and the cause of the unsatisfactory accuracy of the classification on large increase are evaluated.Materials and methods. Histopathological whole-slide imaged of 129 patients were used. Histological micropreparations containing elements of a tumor and surrounding tissue were scanned in the Aperio AT2 (Leica Biosystems, Germany) apparatus with maximum resolution. The marking was carried out in the ASAP software package. To choose the optimal representative size of the fragment the problem of classification was solved using the pre-study neural network EfficientNetB0.Results. A methodology for preparing a database of histopathological images of papillary thyroid cancer was proposed. Experiments were conducted to determine the optimal representative size of the image fragment. The best result of the accuracy of determining the class of test sample showed the size of a representative fragment as 394.32×394.32 microns.Conclusion. The analysis of the influence of the representative sizes of fragments of histopathological images showed the problems in solving the classification tasks because of cutting and staining images specifics, morphological complex and textured differences in the images of the same class. At the same time, it was determined that the task of preparing a set of data for training neural network to solve the problem of finding invasion of vessels in a histopathological image is not trivial and it requires additional stages of data preparation.","PeriodicalId":37100,"journal":{"name":"Informatics","volume":" ","pages":""},"PeriodicalIF":3.1,"publicationDate":"2023-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"47795205","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Solution of the mixed boundary problem for the Poisson equation on two-dimensional irregular domains
Pub Date: 2023-06-29 | DOI: 10.37661/1816-0301-2023-20-2-111-120
M. Chuiko, O. M. Korolyova
Objectives. A finite-difference computational algorithm is proposed for solving a mixed boundary-value problem for the Poisson equation defined on two-dimensional irregular domains. Methods. To solve the problem, generalized curvilinear coordinates are used. The physical domain is mapped to the computational domain (the unit square) in the space of generalized coordinates. The original problem is written in curvilinear coordinates and approximated on a uniform grid in the computational domain. The obtained results are mapped onto a non-uniform boundary-fitted difference grid in the physical domain. Results. Second-order approximations of the mixed Neumann-Dirichlet boundary conditions for the Poisson equation in the space of generalized curvilinear coordinates are constructed. To increase the order of the Neumann condition approximations, an approximation of the Poisson equation on the boundary of the domain is used. Conclusions. A computational algorithm of second-order accuracy is constructed for solving a mixed boundary-value problem for the Poisson equation on two-dimensional irregular domains using generalized curvilinear coordinates. The results of numerical experiments, which confirm the second-order accuracy of the algorithm, are presented.
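For reference, the standard form of the Poisson equation after mapping the physical domain (x, y) to the computational domain (ξ, η), on which such boundary-fitted schemes are typically built, is given below; this is the general curvilinear-coordinate transformation, not necessarily the exact discretization used in the paper.

```latex
% Poisson equation \nabla^2 u = f in generalized curvilinear coordinates (\xi, \eta);
% subscripts denote partial derivatives, J is the Jacobian of the mapping.
\[
\frac{1}{J}\left[
\frac{\partial}{\partial \xi}\!\left(\frac{\alpha\, u_\xi - \beta\, u_\eta}{J}\right) +
\frac{\partial}{\partial \eta}\!\left(\frac{\gamma\, u_\eta - \beta\, u_\xi}{J}\right)
\right] = f,
\]
\[
\alpha = x_\eta^{2} + y_\eta^{2}, \qquad
\beta  = x_\xi x_\eta + y_\xi y_\eta, \qquad
\gamma = x_\xi^{2} + y_\xi^{2}, \qquad
J = x_\xi y_\eta - x_\eta y_\xi .
\]
```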
{"title":"Solution of the mixed boundary problem for the Poisson equation on two-dimensional irregular domains","authors":"M. Chuiko, O. M. Korolyova","doi":"10.37661/1816-0301-2023-20-2-111-120","DOIUrl":"https://doi.org/10.37661/1816-0301-2023-20-2-111-120","url":null,"abstract":"Objectives. A finite-difference computational algorithm is proposed for solving a mixed boundary-value problem for the Poisson equation given in two-dimensional irregular domains.Methods. To solve the problem, generalized curvilinear coordinates are used. The physical domain is mapped to the computational domain (unit square) in the space of generalized coordinates. The original problem is written in curvilinear coordinates and approximated on a uniform grid in the computational domain.The obtained results are mapped on non-uniform boundary-fitted difference grid in the physical domain.Results. The second order approximations of mixed Neumann-Dirichlet boundary conditions for the Poisson equation in the space of generalized curvilinear coordinate are constructed. To increase the order of Neumann condition approximations, an approximation of the Poisson equation on the boundary of the domain is used.Conclusions. To solve a mixed boundary value problem for the Poisson equation in two-dimensional irregular domains, the computational algorithm of second-order accuracy is constructed. The generalized curvilinear coordinates are used. The results of numerical experiments, which confirm the second order accuracy of the computational algorithm, are presented.","PeriodicalId":37100,"journal":{"name":"Informatics","volume":" ","pages":""},"PeriodicalIF":3.1,"publicationDate":"2023-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"48337879","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A method for estimating the total electron content in the ionosphere based on the retransmission of signals from the global navigation satellite system GPS
Pub Date: 2023-06-29 | DOI: 10.37661/1816-0301-2023-20-2-7-27
I. Belokonov, A. Krot, S. V. Kozlov, Y. A. Kapliarchuk, I. E. Savinykh, А. S. Shapkin
Objectives. The problem of developing a hardware-efficient method for estimating the total electron content (TEC) in the ionosphere, based on the retransmission of the L1 and L2 signals of the global navigation satellite system GPS by a repeater nanosatellite, is solved. Methods. It is shown that retransmitting the L1 and L2 signals at the 150/400 MHz frequencies allocated for geophysical research forms a coherent multi-position radar system comprising navigation satellites (NS) as signal sources, a repeater nanosatellite (SR), and ground receiving points (RP). The delay times and phases of the four received signals contain information about the TEC on the NS-SR and SR-RP propagation paths. It is shown that, owing to retransmission and subsequent processing, it is possible to isolate the TEC on each of the propagation paths and to determine the coordinates of the SR. Results. The content of the method, the procedure for estimating TEC from the processed relayed signals, and the technical requirements for the relay equipment are determined. The accuracy characteristics of the proposed method are obtained, and simulation results are given. Conclusion. The information presented in the article may be useful for specialists and researchers interested in radio tomographic studies of the ionosphere and in forecasting hazardous natural phenomena.
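For context, the widely used dual-frequency relation for slant TEC from the differential group delay of two coherent carriers is shown below; it is the standard textbook formula, not the multi-position retransmission processing developed in the article.

```latex
% Slant TEC from the differential group path length of two coherent carriers f_1 > f_2
\[
\mathrm{TEC} \;=\; \frac{1}{40.3}\,\frac{f_1^{2} f_2^{2}}{f_1^{2}-f_2^{2}}\,\bigl(P_2 - P_1\bigr),
\]
```

where P1 and P2 are the measured group path lengths at frequencies f1 and f2, and the constant 40.3 m³·s⁻² converts the differential delay into electron content in electrons per square metre.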
{"title":"A method for estimating the total electron content in the ionosphere based on the retransmission of signals from the global navigation satellite system GPS","authors":"I. Belokonov, A. Krot, S. V. Kozlov, Y. A. Kapliarchuk, I. E. Savinykh, А. S. Shapkin","doi":"10.37661/1816-0301-2023-20-2-7-27","DOIUrl":"https://doi.org/10.37661/1816-0301-2023-20-2-7-27","url":null,"abstract":"Objectives. The problem of developing hardware effective method for estimating the total electron content in the ionosphere based on retransmission of the L1, L2 signals of the global navigation satellite system GPS using a repeater nanosatellite is solved.Methods. It is shown that with the retransmission of L1, L2 signals at frequencies of 150/400 MHz allocated for geophysical research, a coherent multi-position radar system is formed, including navigation satellites (NS) – signal sources, repeater nanosatellite (SR) and ground receiving points (RP). The delay time and phase of the four received signals contain the information about the total TEC on the propagation paths NS – SR and SR – RP. It is shown that due to retransmission and subsequent processing, it is possible to isolate TECs on each of the propagation paths as well as determination of the coordinates of the SR.Results. The content of the method, the procedure for evaluating TEC based on the results of processing the relayed signals, and the technical requirements for the relay equipment are determined. The accuracy characteristics of the proposed method are obtained. Simulation results are given.Conclusion. The information presented in the article may be useful for specialists and researchers who interested in the issues of radio tomographic research of the ionosphere and forecasting hazardous natural phenomena.","PeriodicalId":37100,"journal":{"name":"Informatics","volume":" ","pages":""},"PeriodicalIF":3.1,"publicationDate":"2023-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"48928014","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Methods and software for anomalies searching in the telemetry data of a solar power plant based on the normalized power analysis
Pub Date: 2023-06-29 | DOI: 10.37661/1816-0301-2023-20-2-96-110
S. V. Vаlevich, K. S. Dzick, I. I. Pilecki, I. Kruse, R. Asimov, V. Asipovich
Objectives. With the growing number of solar power plants, automating the monitoring of their performance has become an urgent task, and the search for anomalies in plant operation is one of its main components. The purpose of the study is to develop new methods and software algorithms for finding anomalies in the operation of solar panels based on the output of a digital twin created and trained on the telemetry data of a solar power plant. Methods. The developed technique is based on statistical analysis of the deviations of power values at the maximum-efficiency operating point of a solar panel, as calculated by the digital twin. In addition, a normalized value of the power at the maximum-efficiency operating point was introduced for more accurate clustering and anomaly detection. Results. Using the developed statistical search method on half a year of observations, 18 anomalies were detected in the operation of the plant's solar panels. All cases were analyzed to determine the causes of the anomalous panel behavior. Conclusion. It has been established that, by analyzing deviations of the normalized values at the maximum power point PN, abnormal operation of individual panels can be detected. The level of deviation of the normalized values at the maximum power point that indicates the presence of an anomaly in the operation of a solar panel was calculated.
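As a rough illustration of deviation-based screening on normalized power, the sketch below flags panels whose normalized maximum-power values deviate strongly from the fleet average; the column names, the digital-twin prediction column, and the 3-sigma threshold are assumptions, not the method's calibrated threshold.

```python
# Minimal sketch: normalize measured maximum-power output by a digital-twin
# prediction and flag panels with large fleet-level deviations. All names are hypothetical.
import pandas as pd

df = pd.read_csv("pv_telemetry.csv")   # hypothetical: panel_id, timestamp, p_max, p_max_twin

# Normalize each measurement by the twin's prediction for the same conditions.
df["p_norm"] = df["p_max"] / df["p_max_twin"]

# Aggregate per panel and flag panels whose mean normalized power is an outlier.
stats = df.groupby("panel_id")["p_norm"].mean()
z = (stats - stats.mean()) / stats.std()
anomalous = stats[z.abs() > 3]          # 3-sigma threshold is an illustrative assumption
print("Panels flagged as anomalous:")
print(anomalous)
```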
{"title":"Methods and software for anomalies searching in the telemetry data of a solar power plant based on the normalized power analysis","authors":"S. V. Vаlevich, K. S. Dzick, I. I. Pilecki, I. Kruse, R. Asimov, V. Asipovich","doi":"10.37661/1816-0301-2023-20-2-96-110","DOIUrl":"https://doi.org/10.37661/1816-0301-2023-20-2-96-110","url":null,"abstract":"Objectives. In connection with the increase in the number of solar power plants, the automation of monitoring their performance becomes an urgent task. The search for anomalies in the operation of solar power plants is one of the main components of monitoring. The purpose of the study is to develop new methods and software algorithms for finding anomalies in the operation of solar panels based on the results of a digital twin created and trained according to the telemetry data of a solar power plant.Methods. The developed technique is based on statistical studies of deviations of power values at the point of maximum efficient operation of the solar panel calculated by the digital twin. In addition, a normalized value of the power in the maximum efficient operation of the solar panel was introduced for more accurate clustering and anomaly search.Results. Using the developed method of static search for half a year of observations, 18 anomalies were detected in the operation of the solar panels of the power plant. All cases are analyzed for the causes of anomalies in the operation of solar panels.Conclusion. It has been established that when using normalized power values in the analysis of deviations at the point of maximum power PN, it is possible to detect abnormal operation of individual panels. The level of deviation of the normalized values at the point of maximum power was calculated, indicating the presence of an anomaly in the operation of solar panel.","PeriodicalId":37100,"journal":{"name":"Informatics","volume":" ","pages":""},"PeriodicalIF":3.1,"publicationDate":"2023-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44278818","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}