A. Yurchenko, A. Rozumenko, A. Rozumenko, Roman Momot, Olena Semenikhina
The paper considers the use of cloud technologies in education through the prism of bibliographic analysis. The article characterizes the current state of cloud technologies in education, summarizes the trends, and forecasts the directions of upcoming scientific research. The leading research methods were bibliographic (visual and quantitative) analysis of keyword networks and qualitative discussion. The bibliographic analysis is based on publications indexed by the scientometric database Web of Science over the past 20 years. The sample for analysis was formed by searching for the words "cloud technology", "education", "learning", and "teaching". The results of the study showed: a significant increase in the popularity of cloud technologies in education in recent years; an increase in the number of studies related to various aspects of educational activities under the influence of Industry 4.0; a gradual increase in the number of studies on the virtualization of the educational process and the use of artificial intelligence in education; the spread of research on the effectiveness of various types of training that use cloud services and of teaching methods based on artificial intelligence; and the relevance of the trend toward visualization of educational material and visual analysis in education. The qualitative discussion provided grounds to identify general trends regarding future research directions: the development of massive open online courses and learning technologies (immersive technologies; virtual, augmented, and mixed reality; game-based learning; the BYOD approach); further virtualization of universities; the development of inclusive education, educational analytics, and assessment (formative and adaptive computer assessment); early training of teachers to use cloud technologies and specialized services in subject teaching; research related to visualization (big data, design, modeling and simulation of various processes, etc.) and the design of corresponding new academic disciplines; and research on STEM and STEAM education.
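The abstract does not name the tooling behind the keyword-network analysis, but the quantitative core of such a bibliographic method is a keyword co-occurrence count. A minimal sketch, with toy records standing in for an exported Web of Science keyword field:

```python
from itertools import combinations
from collections import Counter

# Toy sample of author-keyword lists, standing in for records exported
# from Web of Science (the real sample spans 20 years of publications).
records = [
    ["cloud technology", "education", "e-learning"],
    ["cloud technology", "teaching", "artificial intelligence"],
    ["education", "artificial intelligence", "visualization"],
]

# Count keyword co-occurrences: every unordered pair of keywords within
# one record contributes one link to the keyword network.
pair_counts = Counter()
for keywords in records:
    for a, b in combinations(sorted(set(keywords)), 2):
        pair_counts[(a, b)] += 1

# The weighted edge list is what network visualization tools render.
for (a, b), weight in pair_counts.most_common():
    print(f"{a} -- {b}: {weight}")
```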
{"title":"CLOUD TECHNOLOGIES IN EDUCATION: THE BIBLIOGRAPHIC REVIEW","authors":"A. Yurchenko, A. Rozumenko, A. Rozumenko, Roman Momot, Olena Semenikhina","doi":"10.35784/iapgos.4421","DOIUrl":"https://doi.org/10.35784/iapgos.4421","url":null,"abstract":"The paper considers the use of cloud technologies in education through the prism of bibliographic analysis. The article characterizes the current state of cloud technologies in education, summarizes the trends, and forecasts the directions of recent scientific research. The leading research methods were bibliographic (visual and quantitative) analysis of keyword networks and qualitative discussion. The bibliographic analysis is based on publications indexed by the scientometric database Web Of Science over the past 20 years. The sample for analysis was formed by searching for the words cloud technology, education, learning, and teaching. The results of the study showed: a significant increase in the popularity of cloud technologies in education in recent years; an increase in the number of studies related to various aspects of educational activities under the influence of Industry 4.0; a gradual increase in the number of studies on the virtualization of the educational process and the use of artificial intelligence in education; dissemination of research on the effectiveness of various types of training using cloud services and teaching methods based on artificial intelligence; the relevance of the trend of visualization of educational material and visual analysis in education. The qualitative discussion provided grounds to identify general trends regarding future research directions.: development of mass online courses and learning technologies (immersive, the use of virtual, augmented, and mixed reality, gaming learning technologies, BYOD approach); further virtualization of universities; development of inclusive education, educational analytics, and assessment (formative and adaptive computer assessment); early training of teachers to use cloud technologies and specialized services in subject learning; research related to visualization (big data, design, simulation, simulation of various processes, etc.) and the designing of relevant new academic disciplines; research of STEM and STEAM education.","PeriodicalId":504633,"journal":{"name":"Informatyka, Automatyka, Pomiary w Gospodarce i Ochronie Środowiska","volume":"1085 ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2023-12-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139169987","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Petro Loboda, I. Starovit, O. Shushura, Yevhen Havrylko, M. Saveliev, Natalia Sachaniuk-Kavets’ka, Oleksandr Neprytskyi, Dina Oralbekova, D. Mussayeva
The accident at the Chornobyl Nuclear Power Plant (ChNPP) in Ukraine in 1986 became one of the largest technological disasters in human history. During the accident cleanup, a special protective structure called the Shelter Object was built to isolate the destroyed reactor from the environment. However, the planned operational lifespan of the Shelter Object was only 30 years. Therefore, with the assistance of the international community, a new protective structure called the New Safe Confinement (NSC) was constructed and put into operation in 2019. The NSC is a large and complex system that relies on a significant number of tools and subsystems to function. Due to temperature fluctuations and the influence of wind, hydraulic processes occur within the NSC that can lead to the release of radioactive aerosols into the environment. The NSC personnel prevent these leaks, in part through ventilation management. Considering the long planned operational term of the NSC, the development and improvement of information technologies for its process automation is a relevant task. The purpose of this paper is to develop a method for managing the ventilation system of the NSC based on neuro-fuzzy networks. The current state of ventilation control in the NSC has been investigated, and automation tools for the process have been proposed. Using an adaptive neuro-fuzzy inference system (ANFIS) and statistical data on the NSC's operation, neuro-fuzzy models have been constructed that allow the expenses of the ventilation system to be calculated using the Takagi-Sugeno method. Verification of the proposed approaches on a test data sample demonstrated sufficiently high accuracy of the calculations, confirming their potential practical utility for decision-making in NSC ventilation management. The results of this paper can be useful in the development of digital twins of the NSC for process management and personnel training.
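As a rough illustration of the Takagi-Sugeno inference the abstract refers to, the sketch below evaluates a tiny first-order rule base. The input variables, membership-function parameters, and consequent coefficients are invented for the example; they are not the trained ANFIS models from the paper:

```python
import numpy as np

def gauss(x, c, s):
    """Gaussian membership function with center c and width s."""
    return np.exp(-0.5 * ((x - c) / s) ** 2)

# Illustrative first-order Takagi-Sugeno rule base with two inputs:
# temperature difference dT and wind speed v. Each rule is
# (c_dT, s_dT, c_v, s_v, [p, q, r]) with consequent p*dT + q*v + r.
rules = [
    (2.0, 1.5, 1.0, 0.8, [0.4, 1.2, 5.0]),
    (6.0, 2.0, 4.0, 1.5, [0.9, 2.0, 8.0]),
    (10.0, 2.5, 8.0, 2.0, [1.5, 3.1, 12.0]),
]

def ts_inference(dT, v):
    """Weighted-average Takagi-Sugeno output (e.g. a ventilation set-point)."""
    weights, outputs = [], []
    for c1, s1, c2, s2, (p, q, r) in rules:
        w = gauss(dT, c1, s1) * gauss(v, c2, s2)  # rule firing strength
        weights.append(w)
        outputs.append(p * dT + q * v + r)        # linear consequent
    weights = np.array(weights)
    return float(np.dot(weights, outputs) / weights.sum())

print(ts_inference(5.0, 3.0))
```

In an ANFIS workflow, the membership parameters and consequent coefficients above would be fitted to the operational statistics rather than set by hand.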
{"title":"VENTILATION CONTROL OF THE NEW SAFE CONFINEMENT OF THE CHORNOBYL NUCLEAR POWER PLANT BASED ON NEURO-FUZZY NETWORKS","authors":"Petro Loboda, I. Starovit, O. Shushura, Yevhen Havrylko, M. Saveliev, Natalia Sachaniuk-Kavets’ka, Oleksandr Neprytskyi, Dina Oralbekova, D. Mussayeva","doi":"10.35784/iapgos.5375","DOIUrl":"https://doi.org/10.35784/iapgos.5375","url":null,"abstract":"The accident at the Chornobyl Nuclear Power Plant (ChNPP) in Ukraine in 1986 became one of the largest technological disasters in human history. During the accident cleanup, a special protective structure called the Shelter Object was built to isolate the destroyed reactor from the environment. However, the planned operational lifespan of the Shelter Object was only 30 years. Therefore, with the assistance of the international community, a new protective structure called the New Safe Confinement (NSC) was constructed and put into operation in 2019. The NSC is a large and complex system that relies on a significant number of various tools and subsystems to function. Due to temperature fluctuations and the influence of wind, hydraulic processes occur within the NSC, which can lead to the release of radioactive aerosols into the environment. The personnel of the NSC prevents these leaks, including through ventilation management. Considering the long planned operational term of the NSC, the development and improvement of information technologies for its process automation is a relevant task. The purpose of this paper is to develop a method for managing the ventilation system of the NSC based on neuro-fuzzy networks. An investigation of the current state of ventilation control in the NSC has been conducted, and automation tools for the process have been proposed. Using an adaptive neuro-fuzzy inference system (ANFIS) and statistical data on the NSC's operation, neuro-fuzzy models have been formed, which allows to calculate the expenses of the ventilation system using the Takagi-Sugeno method. The verification of the proposed approaches on a test data sample demonstrated sufficiently high accuracy of the calculations, confirming the potential practical utility in decision-making regarding NSC’s ventilation management. The results of this paper can be useful in the development of digital twins of the NSC for process management and personnel training.","PeriodicalId":504633,"journal":{"name":"Informatyka, Automatyka, Pomiary w Gospodarce i Ochronie Środowiska","volume":"606 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2023-12-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139170054","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
L. Matviichuk, O. Liutak, Yuliia Y. Dashchuk, Mykhailo Lepkiy, Svitlana Sidoruk
The purpose of this article is to study the main problems and prospects of ensuring the competitiveness of the hospitality industry in the regions of Ukraine under modern conditions, taking into account international experience in the context of deepening integration ties. The study diagnoses the level of competitiveness of the regional hospitality industry on the basis of a developed information system of indicators for assessing the conditions of its competitiveness and a constructed competitiveness matrix for the industry. The conditions for ensuring competitiveness, the level of competitive advantages, and the level of competitiveness of the regional hospitality industry were compared for the pre-war period, and the competitiveness of the industry across the regions was diagnosed. The positions of the regions in the pre-war and war periods were determined in terms of competitiveness and available tourism potential. A matrix is proposed for selecting target indicators of the integration strategy of those regions of Ukraine that have preserved their tourism potential. The results reveal the main problems of ensuring the competitiveness of the hospitality industry in the regions of Ukraine and highlight the prospects of the studied processes in the context of European integration.
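As an illustration of how an indicator system can feed a competitiveness matrix of the kind described above, the sketch below normalizes a few hypothetical regional indicators, aggregates them into a composite index, and assigns each region a matrix quadrant. The indicator names, values, and thresholds are assumptions for the sketch, not the paper's actual indicator system:

```python
def minmax(values):
    """Min-max normalization of one indicator column to [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

# Hypothetical raw indicators per region:
# (hotel capacity, tourist arrivals in thousands, service revenue).
raw = {
    "Region A": (120, 450, 8.2),
    "Region B": (80, 300, 5.1),
    "Region C": (200, 900, 12.4),
}
names = list(raw)
columns = list(zip(*raw.values()))
normalized = list(zip(*(minmax(col) for col in columns)))

for name, scores in zip(names, normalized):
    index = sum(scores) / len(scores)   # composite competitiveness index
    potential = scores[1]               # arrivals as a tourism-potential proxy
    quadrant = (("high" if index >= 0.5 else "low") + " competitiveness / "
                + ("high" if potential >= 0.5 else "low") + " potential")
    print(f"{name}: index={index:.2f} -> {quadrant}")
```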
{"title":"INFORMATION SYSTEM FOR DIAGNOSTIC COMPETITIVENESS OF THE HOSPITALITY INDUSTRY OF THE REGIONS OF UKRAINE","authors":"L. Matviichuk, O. Liutak, Yuliia Y. Dashchuk, Mykhailo Lepkiy, Svitlana Sidoruk","doi":"10.35784/iapgos.5394","DOIUrl":"https://doi.org/10.35784/iapgos.5394","url":null,"abstract":"The purpose of this article is to study the main problems and prospects of ensuring the competitiveness of the hospitality industry of the regions of Ukraine in modern conditions, taking into account international experience in the context of deepening integration ties. The work carried out a diagnosis of the level of competitiveness of the hospitality industry of the regions of Ukraine, based on the developed information system of indicators for assessing the conditions of the competitiveness of the hospitality industry of the region and the formed matrix of the competitiveness of the hospitality industry of the region. A comparison was made of the conditions for ensuring competitiveness, the level of competitive advantages and the level of competitiveness of the hospitality industry of the regions of the state in the pre-war period, as well as the diagnosis of the competitiveness of the hospitality industry of the regions of the state was carried out. The positions of the regions in the pre-war and war periods in terms of the level of competitiveness and availability of tourism potential were determined. A matrix for the selection of target indicators of the integration strategy of regions of Ukraine that have preserved their tourist potential is proposed. The results of the study revealed the main problems of ensuring the competitiveness of the hospitality industry in the regions of Ukraine, and highlighted the potential prospects of the studied processes taking into account the conditions of European integration.","PeriodicalId":504633,"journal":{"name":"Informatyka, Automatyka, Pomiary w Gospodarce i Ochronie Środowiska","volume":"1082 ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2023-12-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139169988","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Valerii Kozlovskiy, V. Kozlovskiy, Oleksii Nimych, Lyudmila Klobukova, Natalia Yakymchuk
To protect the antenna systems of modern aircraft, radio-transparent dielectric fairings are widely used. At low flight speeds, when designing and evaluating the characteristics of the fairing-antenna system, the dielectric constant is assumed to be constant and independent of the aircraft's flight speed. As the flight speed increases, aerodynamic heating of the fairing changes its dielectric permittivity, which leads to errors in the processing of received signals. Currently, to take the heating of dielectric coatings into account when designing antenna systems, the temperature of the fairing wall is averaged over its thickness. During maneuvering and at high flight speeds, this method leads to large errors in determining the characteristics of the fairing-antenna system, since the temperature distribution across the thickness of the fairing wall is not taken into account. A new approach to the analysis of dielectric layers under nonuniform heating across their thickness is proposed. The obtained results make it possible to adjust signal processing algorithms with analog and digital matrices by taking into account the heat flows affecting the fairing of the aviation antenna, which improves the characteristics of the antenna systems.
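The contrast between the thickness-averaged treatment and a layered one can be illustrated numerically. The sketch below compares a permittivity evaluated at the mean wall temperature with a series (layered) combination across an assumed temperature profile; the profile, the linear eps(T) model, and the quasi-static series formula are illustrative assumptions, not the paper's full electromagnetic model:

```python
import numpy as np

# Assumed temperature profile across the fairing wall: hot outer surface
# from aerodynamic heating, cooler inner surface.
x = np.linspace(0.0, 1.0, 101)                    # normalized wall thickness
T = 900.0 - 600.0 * x                             # temperature profile, K
eps = lambda t: 3.5 * (1.0 + 2e-4 * (t - 300.0))  # assumed linear eps(T)

# Conventional approach: one permittivity at the thickness-averaged temperature.
eps_avg_T = eps(T.mean())

# Layered approach: treat the wall as thin sublayers in series; for a
# quasi-static field normal to the layers the reciprocals average.
eps_layered = 1.0 / np.mean(1.0 / eps(T))

print(f"avg-T model: {eps_avg_T:.4f}, layered model: {eps_layered:.4f}")
```

The small but systematic difference between the two values is what accumulates into signal-processing errors at high speeds, where the temperature gradient across the wall is steep.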
{"title":"MODEL OF THE FLAT FAIRING ANTENNA DIELECTRIC LAYER WITH AERODYNAMIC HEATING","authors":"Valerii Kozlovskiy, V. Kozlovskiy, Oleksii Nimych, Lyudmila Klobukova, Natalia Yakymchuk","doi":"10.35784/iapgos.5302","DOIUrl":"https://doi.org/10.35784/iapgos.5302","url":null,"abstract":"To protect the antenna systems of modern aircraft, radio-transparent dielectric fairings are widely used. At low flight speeds, when designing and evaluating the characteristics of the fairing-antenna, it is assumed that the dielectric constant is a constant value and does not depend on the aircraft's flight speed. As the flight speed increases, as a result of aerodynamic heating of the fairing, its dielectric permeability changes, which leads to errors in the processing of received signals. Currently, to take into account the effect of dielectric coatings heating when designing antenna systems, the temperature of the fairing wall is averaged over its thickness. This method during maneuvering and at high flight speeds leads to large errors in determining the characteristics of the fairing antenna since the nature of the temperature distribution along the thickness of the fairing wall is not taken into account. A new approach to the analysis of dielectric layers with their uneven heating along the thickness is proposed. The obtained results make it possible to adjust the signal processing algorithms with analog and digital matrices, as a result of taking into account the emerging heat flows affecting the fairing of the aviation antenna, which leads to the improvement of the characteristics of the antenna systems.","PeriodicalId":504633,"journal":{"name":"Informatyka, Automatyka, Pomiary w Gospodarce i Ochronie Środowiska","volume":"35 6","pages":""},"PeriodicalIF":0.0,"publicationDate":"2023-12-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139168134","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The rapid growth and development of AI-based applications has introduced a wide range of deep and transfer learning model architectures. Selecting an optimal optimizer to improve the performance efficiency and accuracy of any classification task remains challenging. This paper proposes an intelligent optimizer selection technique using a new search algorithm to overcome this difficulty. The dataset used in this work was collected and customized for controlling and monitoring roads, especially when emergency vehicles are approaching. Several deep and transfer learning models were compared for accurate detection and classification. Furthermore, the DenseNet201 layers are frozen to choose the best optimizer. The main goal is to improve the accuracy of emergency vehicle classification by testing various optimization methods, including Adam, Adamax, Nadam, and RMSprop. The evaluation metrics used to compare the model with other deep learning techniques are classification accuracy, precision, recall, and F1-score. Test results show that the proposed selection-based optimizer increased classification accuracy, reaching 98.84%.
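A minimal sketch of this optimizer-selection setup in Keras, assuming a frozen ImageNet-pretrained DenseNet201 backbone with a small classification head; the head architecture, class count, and training schedule are assumptions, and dataset loading is omitted:

```python
import tensorflow as tf
from tensorflow.keras import layers, models, optimizers

NUM_CLASSES = 2  # e.g. emergency vs. non-emergency vehicle (assumed)

def build_model(optimizer):
    # DenseNet201 backbone pre-trained on ImageNet with all layers frozen,
    # so only the classification head is trained while optimizers compete.
    base = tf.keras.applications.DenseNet201(
        include_top=False, weights="imagenet", input_shape=(224, 224, 3))
    base.trainable = False
    model = models.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dense(256, activation="relu"),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer=optimizer,
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Selection step: train briefly with each candidate optimizer and keep the
# one with the best validation accuracy (train_ds/val_ds not shown here).
candidates = {
    "adam": optimizers.Adam(),
    "adamax": optimizers.Adamax(),
    "nadam": optimizers.Nadam(),
    "rmsprop": optimizers.RMSprop(),
}
# for name, opt in candidates.items():
#     history = build_model(opt).fit(train_ds, validation_data=val_ds, epochs=5)
```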
{"title":"SMART OPTIMIZER SELECTION TECHNIQUE: A COMPARATIVE STUDY OF MODIFIED DENSNET201 WITH OTHER DEEP LEARNING MODELS","authors":"Kamaran H. Manguri, Aree A. Mohammed","doi":"10.35784/iapgos.5332","DOIUrl":"https://doi.org/10.35784/iapgos.5332","url":null,"abstract":"The rapid growth and development of AI-based applications introduce a wide range of deep and transfer learning model architectures. Selecting an optimal optimizer is still challenging to improve any classification type's performance efficiency and accuracy. This paper proposes an intelligent optimizer selection technique using a new search algorithm to overcome this difficulty. A dataset used in this work was collected and customized for controlling and monitoring roads, especially when emergency vehicles are approaching. In this regard, several deep and transfer learning models have been compared for accurate detection and classification. Furthermore, DenseNet201 layers are frizzed to choose the perfect optimizer. The main goal is to improve the performance accuracy of emergency car classification by performing the test of various optimization methods, including (Adam, Adamax, Nadam, and RMSprob). The evaluation metrics utilized for the model’s comparison with other deep learning techniques are based on classification accuracy, precision, recall, and F1-Score. Test results show that the proposed selection-based optimizer increased classification accuracy and reached 98.84%.","PeriodicalId":504633,"journal":{"name":"Informatyka, Automatyka, Pomiary w Gospodarce i Ochronie Środowiska","volume":"133 ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2023-12-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139171036","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Reliability is one of the key factors used to gauge software quality, and software defect prediction (SDP) is one of the most important factors affecting the measurement of software reliability. In addition, the high dimensionality of the features has a direct effect on the accuracy of SDP models. The objective of this paper is to propose a hybrid binary whale optimization algorithm (BWOA) based on taper-shaped transfer functions, combined with a K-Nearest Neighbor (KNN) classifier, for solving the feature selection and dimension reduction problem as a new software defect prediction method. The values of the real vector that represents the individual encoding are converted to a binary vector using four types of taper-shaped transfer functions, which enhances the ability of BWOA to reduce the dimension of the search space. The performance of the suggested method (T-BWOA-KNN) was evaluated on eleven standard software defect prediction datasets from the PROMISE and NASA repositories, using seven evaluation metrics to assess its effectiveness. The experimental results show that T-BWOA-KNN produced promising results compared to other methods, including ten methods from the literature and the four T-BWOA variants with the KNN classifier. In addition, the obtained results were compared and analyzed against other methods from the literature in terms of the average number of selected features (SF) and the accuracy rate (ACC) using the Kendall W test. For most datasets, T-BWOA-KNN delivered promising performance compared with the other methods.
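The binarization step can be sketched as follows: a whale's continuous position vector is mapped through a taper-shaped transfer function to selection probabilities, and the resulting feature mask is scored by KNN accuracy. The particular transfer function shown, the omission of the surrounding BWOA update loop (encircling, spiral, and search phases), and the random stand-in data are all assumptions of the sketch:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def taper_tf(x, x_max=1.0):
    """One illustrative taper-shaped transfer function, T(x) = sqrt(|x|/x_max);
    the paper evaluates four variants, whose exact forms are not repeated here."""
    return np.sqrt(np.abs(x) / x_max)

def binarize(position, rng):
    """Map a whale's continuous position to a binary feature-selection mask."""
    probs = taper_tf(np.clip(position, -1.0, 1.0))
    return (rng.random(position.shape) < probs).astype(bool)

def fitness(mask, X, y):
    """Classification-driven fitness: KNN accuracy on the selected features."""
    if not mask.any():
        return 0.0
    knn = KNeighborsClassifier(n_neighbors=5)
    return cross_val_score(knn, X[:, mask], y, cv=5).mean()

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))        # stand-in for a PROMISE/NASA dataset
y = rng.integers(0, 2, size=100)      # stand-in defect labels
mask = binarize(rng.uniform(-1.0, 1.0, size=20), rng)
print(mask.sum(), fitness(mask, X, y))
```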
{"title":"HYBRID BINARY WHALE OPTIMIZATION ALGORITHM BASED ON TAPER SHAPED TRANSFER FUNCTION FOR SOFTWARE DEFECT PREDICTION","authors":"Zakaria A. Hamed Alnaish, Safwan O. Hasoon","doi":"10.35784/iapgos.4569","DOIUrl":"https://doi.org/10.35784/iapgos.4569","url":null,"abstract":"Reliability is one of the key factors used to gauge software quality. Software defect prediction (SDP) is one of the most important factors which affects measuring software's reliability. Additionally, the high dimensionality of the features has a direct effect on the accuracy of SDP models. The objective of this paper is to propose a hybrid binary whale optimization algorithm (BWOA) based on taper-shape transfer functions for solving feature selection problems and dimension reduction with a KNN classifier as a new software defect prediction method. In this paper, the values of a real vector that represents the individual encoding have been converted to binary vector by using the four types of Taper-shaped transfer functions to enhance the performance of BWOA to reduce the dimension of the search space. The performance of the suggested method (T-BWOA-KNN) was evaluated using eleven standard software defect prediction datasets from the PROMISE and NASA repositories depending on the K-Nearest Neighbor (KNN) classifier. Seven evaluation metrics have been used to assess the effectiveness of the suggested method. The experimental results have shown that the performance of T-BWOA-KNN produced promising results compared to other methods including ten methods from the literature, four types of T-BWOA with the KNN classifier. In addition, the obtained results are compared and analyzed with other methods from the literature in terms of the average number of selected features (SF) and accuracy rate (ACC) using the Kendall W test. In this paper, a new hybrid software defect prediction method called T-BWOA-KNN has been proposed which is concerned with the feature selection problem. The experimental results have proved that T-BWOA-KNN produced promising performance compared with other methods for most datasets.","PeriodicalId":504633,"journal":{"name":"Informatyka, Automatyka, Pomiary w Gospodarce i Ochronie Środowiska","volume":"8 7","pages":""},"PeriodicalIF":0.0,"publicationDate":"2023-12-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139169636","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
New engineering technologies allow the creation of diagnostic devices for predicting the development of acute tissue ischemia of the extremities and determining the residual time until a tourniquet must be removed; solving these tasks is particularly relevant during military operations. Acute limb ischemia is a sudden critical decrease in perfusion that threatens the viability of the limb. The incidence of this condition is 1.5 cases per 10 000 people per year. Acute ischemia occurs due to the blockage of blood flow in major arteries (embolism, thrombosis, trauma), leading to the cessation of adequate blood supply to the metabolically active tissues of the limb, including the skin, muscles, and nerve endings. To address these issues, the article analyzes changes in the impedance of biological tissue. The introduction and use of the coefficient of relative electrical conductivity, denoted as k, as a diagnostic criterion is justified. Experimental studies of changes in the coefficient k were conducted, confirming that the transition from an exponential to a linear dependence of the coefficient establishes the degree of viability of the biological cells (tissue) and the moment of onset of reperfusion syndrome. It has been established that a deviation of k by 10–15% from its unit value diagnoses the initial stage of blood perfusion impairment and the development of ischemic tissue disease. The rate of change of k serves as a criterion for predicting the progression of the disease and as a corrective factor for therapeutic treatment.
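A minimal sketch of the diagnostic rule described above, assuming k is computed as the ratio of baseline to measured tissue impedance and monitored against the 10–15% deviation band; the impedance values and helper names are illustrative:

```python
def relative_conductivity(z_measured, z_baseline):
    """Coefficient k as the ratio of baseline to measured tissue impedance
    (an assumed formulation; conductivity is inversely related to impedance)."""
    return z_baseline / z_measured

def assess(k, threshold=0.10):
    """Flag perfusion impairment when k drifts beyond the deviation band."""
    deviation = abs(k - 1.0)
    return "perfusion impairment suspected" if deviation >= threshold else "normal"

z0 = 480.0                                 # baseline tissue impedance, ohm (assumed)
for z in [482.0, 470.0, 430.0, 400.0]:     # successive measurements over time
    k = relative_conductivity(z, z0)
    print(f"Z = {z:.0f} ohm, k = {k:.3f}: {assess(k)}")
```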
{"title":"A USAGE OF THE IMPEDANCE METHOD FOR DETECTING CIRCULATORY DISORDERS TO DETERMINE THE DEGREE OF LIMB ISCHEMIA","authors":"Valerіi Kryvonosov, Oleg Avrunin, Serhii Sander, Volodymyr Pavlov, Liliia Martyniuk, B. Zhumazhanov","doi":"10.35784/iapgos.5393","DOIUrl":"https://doi.org/10.35784/iapgos.5393","url":null,"abstract":"New engineering technologies allow the creation of diagnostic devices for predicting the development of acute tissue ischemia of the extremities and determining the residual time until the removal of the tourniquet, and solving these tasks is particularly relevant during military actions. Acute limb ischemia is a sudden critical decrease in perfusion that threatens the viability of the limb. The incidence of this condition is 1.5 cases per 10 000 people per year. Acute ischemia occurs due to the blockage of blood flow in major arteries (embolism, thrombosis, trauma), leading to the cessation of adequate blood supply to metabolically active tissues of the limb, including the skin, muscles, and nerve endings. To address these issues, the article analyzes the changes in the impedance of biological tissue. The introduction and use of the coefficient of relative electrical conductivity, denoted as k, as a diagnostic criterion parameter, are justified. Experimental studies of changes in the coefficient of relative electrical conductivity k were conducted, confirming that the transition from exponential to linear dependencies of the coefficient establishes the degree of viability of the biological cell (tissue) and the moment of occurrence of reperfusion syndrome. It has been established that a deviation of the value of k by 10–15% from its unit value diagnoses the initial process of blood perfusion impairment and the development of ischemic tissue disease. The rate of change of k serves as a criterion for predicting the progression of the disease and as a corrective factor for therapeutic treatment.","PeriodicalId":504633,"journal":{"name":"Informatyka, Automatyka, Pomiary w Gospodarce i Ochronie Środowiska","volume":"106 ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2023-12-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139171154","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The websites of higher education institutions, because they are addressed to multiple stakeholder groups, not only need an appropriately designed information structure but must also be usable. Additionally, in the case of public universities, their services are expected to be accessible to the widest possible audience, especially people with disabilities. The accessibility tools used on websites should be quick to locate, easily identifiable and user-friendly. So far, no standards have been developed regarding these issues, and therefore various solutions exist on the web. The objective of this study is to analyze various implementations of accessibility tools on university websites in terms of their location, form of presentation and the ways that enable access to them. A study was conducted in which web interfaces were evaluated with the participation of users. The experiment consisted of two parts: the first used the eye-tracking technique, whereas in the second a survey was conducted. The research material consisted of prototypes of websites from four different universities. Each website had two versions differing in the implementation of accessibility tools. In the study, 35 participants were divided into two groups. Each group was shown one of the two sets of website prototypes, and the users were tasked with finding and activating a specific accessibility tool. After exploring the websites, each participant completed a questionnaire concerning their opinions on aspects such as the appearance, placement and way of accessing tools dedicated to people with disabilities. The obtained data, processed into heatmaps and fixation maps, were subjected to a qualitative analysis. The survey results and eye-tracking data were analyzed quantitatively. On the basis of the performed analyses, it can be concluded that the following factors reduce user efficiency and productivity: placement of accessibility tools anywhere other than the upper right corner of university websites, indirect access to these tools, and their non-standard appearance.
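As an illustration of the quantitative side of such an eye-tracking analysis, the sketch below bins fixation points into a coarse grid over the page, which is the basic step behind a heatmap; the screen dimensions, cell size, and sample fixations are assumptions:

```python
import numpy as np

# Grid over an assumed 1920x1080 page, with square cells of 120 px.
WIDTH, HEIGHT, CELL = 1920, 1080, 120
grid = np.zeros((HEIGHT // CELL, WIDTH // CELL))

# Hypothetical fixation records: (x, y, duration in ms).
fixations = [(1700, 90, 310), (1750, 120, 450), (960, 540, 180)]
for x, y, duration_ms in fixations:
    grid[y // CELL, x // CELL] += duration_ms  # weight cells by fixation time

# Cells with the largest accumulated fixation time show where users searched
# for the accessibility tools (here, near the upper right corner).
hottest = np.unravel_index(grid.argmax(), grid.shape)
print(f"hottest cell (row, col): {hottest}")
```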
{"title":"AN ANALYSIS OF THE IMPLEMENTATION OF ACCESSIBILITY TOOLS ON WEBSITES","authors":"Marcin Cieśla, M. Dzieńkowski","doi":"10.35784/iapgos.4459","DOIUrl":"https://doi.org/10.35784/iapgos.4459","url":null,"abstract":"The websites of higher education institutions, due to the fact that they are addressed to multiple stakeholder groups, not only need to have an appropriately designed information structure but must also be useful. Additionally, in the case of public universities, their services are expected to be accessible to the widest possible audience, especially for people with disabilities. The accessibility tools used on websites should be quickly located, easily identifiable and user-friendly. So far, no standards have been developed regarding these issues, and therefore, there are various solutions on the web. The objective of this study is to analyze various implementations of accessibility tools on university websites in terms of their location, form of presentation and ways that enable access to them. A study was conducted in which web interfaces were evaluated with the participation of users. The experiment consisted of two parts: the first one used the eye tracking technique, whereas in the second one, a survey was conducted. The research material was prototypes of websites from four different universities. Each website had two versions differing in implementation of accessibility tools. In the study, 35 participants were divided into two groups of people. Each group was shown one of the two sets of website prototypes and the users were tasked with finding and activating a specific accessibility tool. After exploring the websites, each participant completed a questionnaire that pertained to their opinions regarding aspects such as appearance, placement and a way to access tools dedicated to people with disabilities. The obtained data, processed to the form of heatmaps and fixation maps, were subjected to a qualitative analysis. The survey results and eye tracking data were analyzed quantitatively. On the basis of performed analyzes it can be concluded that the following factors have an impact on the reduction in efficiency and productivity of users: placement of accessibility tools on university websites in a place other than the upper right corner, an indirect access to these tools or their non-standard appearance.","PeriodicalId":504633,"journal":{"name":"Informatyka, Automatyka, Pomiary w Gospodarce i Ochronie Środowiska","volume":"323 ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2023-12-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139170634","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Mohamed Bal-Ghaoui, My Hachem El Yousfi Alaoui, A. Jilbab, Abdennaser Bourouhou
Transfer Learning (TL) is a popular deep learning technique used in medical image analysis, especially when data is limited. It leverages pre-trained knowledge from State-Of-The-Art (SOTA) models and applies it to specific applications through Fine-Tuning (FT). However, fine-tuning large models can be time-consuming, and determining which layers to use can be challenging. This study explores different fine-tuning strategies for five SOTA models (VGG16, VGG19, ResNet50, ResNet101, and InceptionV3) pre-trained on ImageNet. It also investigates the impact of the classifier by using a linear SVM for classification. The experiments are performed on four open-access ultrasound datasets related to breast cancer, thyroid nodule cancer, and salivary gland cancer. Results are evaluated using five-fold stratified cross-validation, and metrics such as accuracy, precision, and recall are computed. The findings show that fine-tuning the last 15% of layers in ResNet50 and InceptionV3 achieves good results. Using an SVM for classification further improves overall performance by 6% for the two best-performing models. This research provides insights into fine-tuning strategies and the importance of the classifier in transfer learning for ultrasound image classification.
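A minimal sketch of the best-performing recipe reported above, assuming a ResNet50 backbone: freeze the first 85% of layers, leave the last 15% trainable for fine-tuning, then train a linear SVM on the pooled features. The data shapes are placeholders, the fine-tuning loop itself is omitted, and preprocessing is simplified:

```python
import numpy as np
import tensorflow as tf
from sklearn.svm import LinearSVC

# ImageNet-pretrained ResNet50 with global average pooling as the output,
# so the model doubles as a 2048-d feature extractor.
base = tf.keras.applications.ResNet50(
    include_top=False, weights="imagenet",
    input_shape=(224, 224, 3), pooling="avg")

cut = int(len(base.layers) * 0.85)
for layer in base.layers[:cut]:
    layer.trainable = False   # frozen first 85% of layers
for layer in base.layers[cut:]:
    layer.trainable = True    # last 15% left trainable for fine-tuning

# After fine-tuning (not shown), use the backbone as a feature extractor
# and fit a linear SVM on the pooled features, as in the study's setup.
X_img = np.random.rand(8, 224, 224, 3).astype("float32")  # stand-in images
y = np.array([0, 1] * 4)                                  # stand-in labels
features = base.predict(X_img, verbose=0)
svm = LinearSVC().fit(features, y)
print(svm.score(features, y))
```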
{"title":"OPTIMIZING ULTRASOUND IMAGE CLASSIFICATION THROUGH TRANSFER LEARNING: FINE-TUNING STRATEGIES AND CLASSIFIER IMPACT ON PRE-TRAINED INNER-LAYERS","authors":"Mohamed Bal-Ghaoui, My Hachem El Yousfi Alaoui, A. Jilbab, Abdennaser Bourouhou","doi":"10.35784/iapgos.4464","DOIUrl":"https://doi.org/10.35784/iapgos.4464","url":null,"abstract":"Transfer Learning (TL) is a popular deep learning technique used in medical image analysis, especially when data is limited. It leverages pre-trained knowledge from State-Of-The-Art (SOTA) models and applies it to specific applications through Fine-Tuning (FT). However, fine-tuning large models can be time-consuming, and determining which layers to use can be challenging. This study explores different fine-tuning strategies for five SOTA models (VGG16, VGG19, ResNet50, ResNet101, and InceptionV3) pre-trained on ImageNet. It also investigates the impact of the classifier by using a linear SVM for classification. The experiments are performed on four open-access ultrasound datasets related to breast cancer, thyroid nodules cancer, and salivary glands cancer. Results are evaluated using a five-fold stratified cross-validation technique, and metrics like accuracy, precision, and recall are computed. The findings show that fine-tuning 15% of the last layers in ResNet50 and InceptionV3 achieves good results. Using SVM for classification further improves overall performance by 6% for the two best-performing models. This research provides insights into fine-tuning strategies and the importance of the classifier in transfer learning for ultrasound image classification.","PeriodicalId":504633,"journal":{"name":"Informatyka, Automatyka, Pomiary w Gospodarce i Ochronie Środowiska","volume":"45 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2023-12-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139168076","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Swarajya Madhuri Rayavarapu, Tammineni Shanmukha Prasanthi, G. S. Kumar, G. Sasibhushana Rao, Gottapu Prashanti
In order to diagnose a range of cardiac conditions, it is important to conduct an accurate evaluation of phonocardiogram (PCG) and electrocardiogram (ECG) data. Artificial intelligence and machine learning-based computer-assisted diagnostics are becoming increasingly commonplace in modern medicine, assisting clinicians in making life-or-death decisions. The requirement for an enormous amount of training data to establish the framework for a deep learning-based technique is an empirical challenge in the field of medicine, and it increases the risk of personal information being misused. As a direct result of this issue, there has been an explosion in the study of methods for creating synthetic patient data, and researchers have attempted to generate synthetic ECG or PCG readings. To balance the dataset, synthetic ECG data were first generated from the MIT-BIH arrhythmia database using LSGAN and CycleGAN. Next, using VGGNet, studies were conducted to classify arrhythmias in the synthesized ECG signals. The synthesized signals performed well and resembled the original signals, and the classification achieved a precision of 91.20%, a recall of 89.52% and an F1 score of 90.35%.
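A minimal sketch of the LSGAN objective used for signal synthesis: least-squares targets replace the usual sigmoid cross-entropy, with a linear discriminator output. The tiny dense networks, noise dimension, and signal length are illustrative, not the architectures from the study:

```python
import tensorflow as tf

SIG_LEN, NOISE_DIM = 216, 64  # assumed ECG beat length and latent size

generator = tf.keras.Sequential([
    tf.keras.Input(shape=(NOISE_DIM,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(SIG_LEN, activation="tanh"),
])
discriminator = tf.keras.Sequential([
    tf.keras.Input(shape=(SIG_LEN,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(1),  # linear output, as LSGAN requires
])
mse = tf.keras.losses.MeanSquaredError()

def lsgan_losses(real_batch, noise):
    """Least-squares GAN losses: D pushes real scores to 1 and fake scores
    to 0; G pushes fake scores toward 1."""
    fake = generator(noise)
    d_real, d_fake = discriminator(real_batch), discriminator(fake)
    d_loss = mse(tf.ones_like(d_real), d_real) + mse(tf.zeros_like(d_fake), d_fake)
    g_loss = mse(tf.ones_like(d_fake), d_fake)
    return d_loss, g_loss

noise = tf.random.normal((4, NOISE_DIM))
real = tf.random.normal((4, SIG_LEN))  # stand-in for real MIT-BIH beats
print([float(loss) for loss in lsgan_losses(real, noise)])
```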
{"title":"A GENERATIVE MODEL FOR DEEP FAKE AUGMENTATION OF PHONOCARDIOGRAM AND ELECTROCARDIOGRAM SIGNALS USING LSGAN AND CYCLE GAN","authors":"Swarajya Madhuri Rayavarapu, Tammineni Shanmukha Prasanthi, G. S. Kumar, G. Sasibhushana Rao, Gottapu Prashanti","doi":"10.35784/iapgos.3783","DOIUrl":"https://doi.org/10.35784/iapgos.3783","url":null,"abstract":"In order to diagnose a range of cardiac conditions, it is important to conduct an accurate evaluation of either phonocardiogram (PCG) and electrocardiogram (ECG) data. Artificial intelligence and machine learning-based computer-assisted diagnostics are becoming increasingly commonplace in modern medicine, assisting clinicians in making life-or-death decisions. The requirement for an enormous amount of information for training to establish the framework for a deep learning-based technique is an empirical challenge in the field of medicine. This increases the risk of personal information being misused. As a direct result of this issue, there has been an explosion in the study of methods for creating synthetic patient data. Researchers have attempted to generate synthetic ECG or PCG readings. To balance the dataset, ECG data were first created on the MIT-BIH arrhythmia database using LS GAN and Cycle GAN. Next, using VGGNet, studies were conducted to classify arrhythmias for the synthesized ECG signals. The synthesized signals performed well and resembled the original signal and the obtained precision of 91.20%, recall of 89.52% and an F1 score of 90.35%.","PeriodicalId":504633,"journal":{"name":"Informatyka, Automatyka, Pomiary w Gospodarce i Ochronie Środowiska","volume":"18 3","pages":""},"PeriodicalIF":0.0,"publicationDate":"2023-12-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139168208","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}