Correction of requests in the information system decision support
Pub Date: 2023-06-29 | DOI: 10.37661/1816-0301-2023-20-2-85-95
S. F. Lipnitsky
Objectives. The problem of mathematical modeling and algorithmization of query correction processes in an information system for decision support is solved. Three main goals are pursued: building a generalized information retrieval model, developing algorithms for pre-search query correction, and developing algorithms for post-search query correction.
Methods. Methods of set theory and probability theory are used.
Results. A generalized information retrieval model has been developed. Within the framework of the model, the concepts of search function, issuance criterion, and relevance and pertinence of search results are formalized. Algorithms for pre-search and post-search correction of queries in the information decision support system are proposed.
Conclusion. A mathematical model for correcting user queries in the information decision support system has been developed. Within the framework of the model, the efficiency of search processes in terms of the relevance and pertinence of the information found has been studied. Necessary and sufficient conditions for the optimality of search functions are proved.
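The abstract gives no code; the following is a minimal sketch of what pre-search query correction can look like under one simple assumption: the query is expanded with thesaurus synonyms before a search function scores documents. The SYNONYMS table, the term sets, and the overlap-based relevance score are all illustrative, not taken from the paper.

```python
# Illustrative pre-search query correction: expand each query term with
# thesaurus synonyms so the search function matches more relevant documents.
# The synonym table and scoring scheme are hypothetical examples.

SYNONYMS = {
    "request": {"query", "inquiry"},
    "correction": {"reformulation", "refinement"},
}

def expand_query(terms):
    """Return the query terms plus their synonyms (pre-search correction)."""
    expanded = set(terms)
    for term in terms:
        expanded |= SYNONYMS.get(term, set())
    return expanded

def relevance(doc_terms, query_terms):
    """Simple overlap-based relevance score between a document and a query."""
    return len(doc_terms & query_terms) / max(len(query_terms), 1)

query = expand_query({"request", "correction"})
doc = {"query", "reformulation", "decision", "support"}
print(relevance(doc, query))  # higher than with the unexpanded query
```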
{"title":"Correction of requests in the information system decision support","authors":"S. F. Lipnitsky","doi":"10.37661/1816-0301-2023-20-2-85-95","DOIUrl":"https://doi.org/10.37661/1816-0301-2023-20-2-85-95","url":null,"abstract":"Objectives. The problem of mathematical modeling and algorithmization of the processes of correction of requests in the system of information support for decision-making is being solved. At the same time, three main goals are pursued: building of a generalized information retrieval model, development of algorithms for pre-search query correction and development of algorithms for post-search query correction.Methods. Methods of set theory and probability theory are used.Results. A generalized information retrieval model has been developed. Within the framework of the model, the concepts of search function, issuance criterion, relevance and pertinence of search results are formalized. Algorithms for pre-search and post-search correction of queries in the information decision support system are proposed.Conclusion. A mathematical model for correcting user requests in the information decision support system has been developed. Within the framework of the model, the efficiency of search processes in terms of the relevance and pertinence of the information found has been studied. Necessary and sufficient optimality of search functions are proved.","PeriodicalId":37100,"journal":{"name":"Informatics","volume":" ","pages":""},"PeriodicalIF":3.1,"publicationDate":"2023-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"49359789","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Application of decision diagrams of incompletely specified of k-valued logic functions in the synthesis of logical circuits
Pub Date: 2023-06-29 | DOI: 10.37661/1816-0301-2023-20-2-39-64
P. Bibilo
Objectives. The problem of circuit implementation of incompletely specified (partial) k-valued logic functions given by tabular representations is considered. The stage of technology-independent optimization is studied, at which minimized representations of systems of completely specified Boolean functions are obtained from tabular representations of partial k-valued logic functions. From these representations of Boolean functions, technology mapping is performed at the second stage of logic circuit synthesis.
Methods. The use of completions of multi-valued decision diagrams (MDDs), which represent partial k-valued logic functions, and binary decision diagrams (BDDs), which represent systems of partial Boolean functions, at the stage of technology-independent optimization is proposed. The completion of an MDD is aimed at reducing the number of vertices of the MDD graph that correspond to cofactors of the Shannon expansion of a multi-valued function.
Results. The MDD minimization problem is reduced to coloring undirected graphs of cofactor incompatibility with a minimum number of colors. Encoding the multi-valued argument and function values of k-valued logic with binary codes leads to systems of partial Boolean functions, which are in turn completed so as to minimize their multi-level BDD representations.
Conclusion. The proposed approach makes it possible to extend partial multi-valued functions to completely specified Boolean functions in two stages. At the second stage, well-known and effective methods are used to complete the BDDs representing systems of partial Boolean functions. This two-step approach yields minimized BDD representations of systems of completely specified functions, from which technology mapping into a given library of logic elements is performed, i.e., the optimized descriptions of the Boolean function systems are covered with descriptions of the library elements.
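As a hedged illustration of the coloring step the abstract describes, here is a minimal greedy graph-coloring sketch in Python. The example graph is an invented stand-in for a cofactor-incompatibility graph, and greedy coloring only gives an upper bound on the minimum number of colors the paper seeks.

```python
# Greedy coloring of an incompatibility graph: vertices that receive the same
# color are pairwise compatible and can be merged into one MDD vertex.
# The example graph below is a made-up cofactor-incompatibility instance.

def greedy_coloring(adjacency):
    """Assign each vertex the smallest color unused by its neighbors."""
    colors = {}
    for v in sorted(adjacency, key=lambda u: -len(adjacency[u])):  # high degree first
        taken = {colors[u] for u in adjacency[v] if u in colors}
        colors[v] = next(c for c in range(len(adjacency)) if c not in taken)
    return colors

# Cofactors c0..c3; an edge means "incompatible, must get different colors".
incompatibility = {
    "c0": {"c1"},
    "c1": {"c0", "c2"},
    "c2": {"c1"},
    "c3": set(),
}
colors = greedy_coloring(incompatibility)
print(colors)  # {'c1': 0, 'c0': 1, 'c2': 1, 'c3': 0}
print(len(set(colors.values())), "merged vertices instead of", len(colors))
```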
{"title":"Application of decision diagrams of incompletely specified of k-valued logic functions in the synthesis of logical circuits","authors":"P. Bibilo","doi":"10.37661/1816-0301-2023-20-2-39-64","DOIUrl":"https://doi.org/10.37661/1816-0301-2023-20-2-39-64","url":null,"abstract":"Objectives. The problem of circuit implementation of incompletely specified (partial) k-valued logic functions given by tabular representations is considered. The stage of technologically independent optimization is studied to obtain minimized representations of systems of completely specified Boolean functions from tabular representations of partial functions of k-valued logic. According to these representations of Boolean functions, technological mapping is performed at the second stage of the synthesis of logic circuits.Methods. Using additional definitions of Multi-valued Decision Diagrams (MDD) representing partial functions of k-valued logic, and Binary Decision Diagrams (BDD) representing partial systems of Boolean functions at the stage of technologically independent optimization is proposed. The task of additional definition of MDD is oriented to reducing the number of vertices of the MDD graph that correspond to the cofactors of the Shannon expansion of a multi-valued function.Results. The MDD minimization problem is reduced to solving the problems of coloring undirected graphs of incompatibility of cofactors by minimum number of colors. Encoding of multi-valued values of arguments and values of functions of k-valued logic by binary codes leads to systems of partial Boolean functions, which are also further defined in order to minimize their multi-level BDD representations.Conclusion. The proposed approach makes it possible to define partial multi-valued functions to fully defined Boolean functions in two stages. At the second stage, well-known and effective methods are used to redefine BDD representing systems of partial Boolean functions. As a result of this two-step approach, minimized BDD representations of systems of completely defined functions are obtained. According to completely defined Boolean functions, a technological mapping into a given library of logical elements is performed, i.e. the optimized descriptions of Boolean function systems are covered with descriptions of logical elements","PeriodicalId":37100,"journal":{"name":"Informatics","volume":" ","pages":""},"PeriodicalIF":3.1,"publicationDate":"2023-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45418697","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Generation of shortest path search dataflow networks of actors for parallel multi-core implementation
Pub Date: 2023-06-29 | DOI: 10.37661/1816-0301-2023-20-2-65-84
A. Prihozhy
Objectives. The problem of parallelizing computations on multicore systems is considered. Two types of parallelism, fork-join and dataflow networks, are compared on the blocked Floyd–Warshall algorithm for shortest-path search in large dense graphs. Using the CAL programming language, a method of developing actors and an algorithm for generating parallel dataflow networks are proposed. The objective is to improve the performance of parallel implementations of algorithms that have the property of a partial order of computations on multicore processors.
Methods. Methods of graph theory, algorithm theory, parallelization theory, and formal language theory are used.
Results. Claims about the possibility of reordering calculations in the blocked Floyd–Warshall algorithm are proved, making it possible to achieve a higher load of cores during algorithm execution. Based on these claims, a method of constructing actors in the CAL language is developed, and an algorithm for the automatic generation of dataflow CAL networks for various configurations of the block matrices describing the lengths of the shortest paths is proposed. It is proved that the networks have the properties of rate consistency, boundedness, and liveness. In actors running in parallel, the order of execution of actions with asynchronous behavior can change dynamically, resulting in efficient use of caches and increased core load. To implement the new features of the actors and networks and the method of their generation, a tunable multi-threaded CAL engine has been developed that implements a static dataflow model of computation with bounded buffer sizes. The experimental results obtained on four types of multicore processors show that there is an optimal size of the network's matrix of actors for which performance is maximal, and that this size depends on the number of cores and the size of the graph.
Conclusion. Dataflow networks of actors have been shown to be an effective means of parallelizing computationally intensive algorithms that describe a partial order of computations over decomposed data. The results obtained on the blocked shortest-path search algorithm show that the parallelism of dataflow networks yields higher performance of software implementations on multicore processors than the fork-join parallelism of OpenMP.
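For reference, below is a sequential sketch of the standard blocked Floyd–Warshall algorithm that the paper parallelizes, written in plain NumPy rather than the authors' CAL actors. The comments mark the three phases of each round; the mutual independence of blocks inside phases 2 and 3 is the partial order that both the fork-join and the dataflow implementations exploit.

```python
import numpy as np

def blocked_floyd_warshall(dist, b):
    """Blocked Floyd-Warshall on an n x n distance matrix (assumes n % b == 0).

    Each round k updates (1) the diagonal block, (2) the blocks in the pivot
    row and column, and (3) all remaining peripheral blocks. Blocks within
    phases (2) and (3) are mutually independent and can run in parallel.
    """
    n = dist.shape[0]
    blocks = n // b

    def update(i, j, k):  # relax block (i, j) through pivot block k
        bi, bj, bk = (slice(x * b, (x + 1) * b) for x in (i, j, k))
        for p in range(b):
            dist[bi, bj] = np.minimum(
                dist[bi, bj], dist[bi, bk][:, [p]] + dist[bk, bj][[p], :])

    for k in range(blocks):
        update(k, k, k)                       # phase 1: diagonal block
        for m in range(blocks):
            if m != k:
                update(k, m, k)               # phase 2: pivot row
                update(m, k, k)               #          pivot column
        for i in range(blocks):
            for j in range(blocks):
                if i != k and j != k:
                    update(i, j, k)           # phase 3: peripheral blocks
    return dist

d = np.array([[0, 3, np.inf, 7],
              [8, 0, 2, np.inf],
              [5, np.inf, 0, 1],
              [2, np.inf, np.inf, 0]], dtype=float)
print(blocked_floyd_warshall(d, 2))  # matches the unblocked algorithm's result
```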
{"title":"Generation of shortest path search dataflow networks of actors for parallel multi-core implementation","authors":"A. Prihozhy","doi":"10.37661/1816-0301-2023-20-2-65-84","DOIUrl":"https://doi.org/10.37661/1816-0301-2023-20-2-65-84","url":null,"abstract":"Objectives. The problem of parallelizing computations on multicore systems is considered. On the Floyd – Warshall blocked algorithm of shortest paths search in dense graphs of large size, two types of parallelism are compared: fork-join and network dataflow. Using the CAL programming language, a method of developing actors and an algorithm of generating parallel dataflow networks are proposed. The objective is to improve performance of parallel implementations of algorithms which have the property of partial order of computations on multicore processors.Methods. Methods of graph theory, algorithm theory, parallelization theory and formal language theory are used.Results. Claims about the possibility of reordering calculations in the blocked Floyd – Warshall algorithm are proved, which make it possible to achieve a greater load of cores during algorithm execution. Based on the claims, a method of constructing actors in the CAL language is developed and an algorithm for automatic generation of dataflow CAL networks for various configurations of block matrices describing the lengths of the shortest paths is proposed. It is proved that the networks have the properties of rate consistency, boundedness, and liveness. In actors running in parallel, the order of execution of actions with asynchronous behavior can change dynamically, resulting in efficient use of caches and increased core load. To implement the new features of actors, networks and the method of their generation, a tunable multi-threaded CAL engine has been developed that implements a static dataflow model of computation with bounded sizes of buffers. From the experimental results obtained on four types of multi-core processors it follows that there is an optimal size of the network matrix of actors for which the performance is maximum, and the size depends on the number of cores and the size of graph.Conclusion. It has been shown that dataflow networks of actors are an effective means to parallelize computationally intensive algorithms that describe a partial order of computations over decomposed data. The results obtained on the blocked algorithm of shortest paths search prove that the parallelism of dataflow networks gives higher performance of software implementations on multicore processors in comparison with the fork-join parallelism of OpenMP.","PeriodicalId":37100,"journal":{"name":"Informatics","volume":" ","pages":""},"PeriodicalIF":3.1,"publicationDate":"2023-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"47347375","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Risk-Based Approach for Selecting Company Key Performance Indicator in an Example of Financial Services
Pub Date: 2023-06-19 | DOI: 10.3390/informatics10020054
Olegs Cernisevs, Yelena Popova, Dmitrijs Cernisevs
Risk management is a highly important issue for Fintech companies; moreover, it is very specific and puts forward serious requirements for the top management of any financial institution. This study specifies the risk factors affecting the finance and capital adequacy of financial institutions. The authors considered the different types of risks in combination, whereas other scholars usually analyze risks in isolation; the authors believe it is necessary to consider their mutual impact. The risks were estimated using the PLS-SEM method in SmartPLS 4 software. The quality of the obtained model is very high according to all indicators. Five hypotheses related to finance and five hypotheses related to capital adequacy were considered. The impact of AML, cyber, and governance risks on capital adequacy was confirmed; the effect of governance and operational risks on finance was also confirmed. The other risks showed no impact on finance or capital adequacy; notably, risks associated with staff affect neither. The findings of this study can readily be applied by any financial institution for risk analysis. Moreover, this study can foster better collaboration between scholars investigating Fintech activities and practitioners working in this sphere. The authors present a novel approach to enhancing key performance indicators (KPIs) for Fintech companies, proposing metrics derived from the company's specific risks and thereby introducing an innovative method for selecting KPIs based on the risks inherent in the Fintech's business model. This model aligns the KPIs with the unique risk profile of the company, fostering a fresh perspective on performance measurement within the Fintech industry.
{"title":"Risk-Based Approach for Selecting Company Key Performance Indicator in an Example of Financial Services","authors":"Olegs Cernisevs, Yelena Popova, Dmitrijs Cernisevs","doi":"10.3390/informatics10020054","DOIUrl":"https://doi.org/10.3390/informatics10020054","url":null,"abstract":"Risk management is a highly important issue for Fintech companies; moreover, it is very specific and puts forward the serious requirements toward the top management of any financial institution. This study was devoted to specifying the risk factors affecting the finance and capital adequacy of financial institutions. The authors considered the different types of risks in combination, whereas other scholars usually analyze risks in isolation; however, the authors believe that it is necessary to consider their mutual impact. The risks were estimated using the PLS-SEM method in Smart PLS-4 software. The quality of the obtained model is very high according to all indicators. Five hypotheses related to finance and five hypotheses related to capital adequacy were considered. The impact of AML, cyber, and governance risks on capital adequacy was confirmed; the effect of governance and operational risks on finance was also confirmed. Other risks have no impact on finance and capital adequacy. It is interesting that risks associated with staff have no impact on finance and capital adequacy. The findings of this study can be easily applied by any financial institution for risk analysis. Moreover, this study can serve toward a better collaboration of scholars investigating the Fintech activities and practitioners working in this sphere. The authors present a novel approach for enhancing key performance indicators (KPIs) for Fintech companies, proposing utilizing metrics that are derived from the company’s specific risks, thereby introducing an innovative method for selecting KPIs based on the inherent risks associated with the Fintech’s business model. This model aligns the KPIs with the unique risk profile of the company, fostering a fresh perspective on performance measurement within the Fintech industry.","PeriodicalId":37100,"journal":{"name":"Informatics","volume":"10 1","pages":"54"},"PeriodicalIF":3.1,"publicationDate":"2023-06-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44186613","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The Smart Governance Framework and Enterprise System's Capability for Improving Bio-Business Licensing Services
Pub Date: 2023-06-16 | DOI: 10.3390/informatics10020053
Muhammad Mahreza Maulana, A. Suroso, Yani Nurhadryani, K. Seminar
One way to improve Indonesia’s ranking in terms of ease of doing business is to take a closer look at the business licensing process. This study carries out an assessment using a smart governance framework and the recommendation capabilities of an Enterprise System (ES). As a result, recommendations for improvement, each with an expected priority, are generated. The stages of this research are observing the process of issuing bio-business permits, conducting interviews related to several Enterprise Architecture (EA) capabilities, and providing recommendations based on the measured maturity level of IT governance. These recommendations are then mapped into an impact-effort matrix for program prioritization. The recommendations for bio-business licenses can also be used to improve the process for other business licenses. Implementing the EA framework has been shown to align technology, organization, and processes so that it can support continuous improvement.
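The abstract does not detail how the impact-effort matrix is constructed; the following is a minimal hedged sketch of the usual quadrant logic. The recommendation names, scores, and thresholds are invented for illustration and are not from the study.

```python
# Classify improvement recommendations into impact-effort quadrants for
# prioritization. Scores (1-10) and recommendation names are illustrative.

def quadrant(impact, effort, threshold=5):
    if impact >= threshold:
        return "quick win" if effort < threshold else "major project"
    return "fill-in" if effort < threshold else "thankless task"

recommendations = {            # name: (impact, effort)
    "online permit tracking": (8, 3),
    "integrate licensing databases": (9, 8),
    "update form wording": (3, 2),
}
for name, (impact, effort) in sorted(recommendations.items(),
                                     key=lambda kv: -kv[1][0]):
    print(f"{name}: {quadrant(impact, effort)}")
```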
{"title":"The Smart Governance Framework and Enterprise System's Capability for Improving Bio-Business Licensing Services","authors":"Muhammad Mahreza Maulana, A. Suroso, Yani Nurhadryani, K. Seminar","doi":"10.3390/informatics10020053","DOIUrl":"https://doi.org/10.3390/informatics10020053","url":null,"abstract":"One way to improve Indonesia’s ranking in terms of ease of conducting business is by taking a closer look at the business licensing process. This study aims to carry out an assessment using a smart governance framework and recommendation capabilities from the Enterprise System (ES). As a result, the recommendations for improvement with the expected priority are generated. The stages of this research are observing the process of making bio-business permits, followed by interviews related to several Enterprise Architecture (EA) capabilities, and providing recommendations based on the results of the maturity level of IT governance. These recommendations are then mapped into an impact—effort matrix for program prioritization. The recommendations for bio-business licenses can also be used to improve the process for other business licenses. Implementation of the EA framework has been proven to align technology, organization, and processes so that it can support continuous improvement processes.","PeriodicalId":37100,"journal":{"name":"Informatics","volume":"10 1","pages":"53"},"PeriodicalIF":3.1,"publicationDate":"2023-06-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"42061808","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Detection of Abnormal Patterns in Children's Handwriting by Using an Artificial-Intelligence-Based Method
Pub Date: 2023-06-14 | DOI: 10.3390/informatics10020052
W. Villegas-Ch., Isabel Urbina-Camacho, J. Garcia-Ortiz
Using camera-based algorithms to detect abnormal patterns in children’s handwriting has become a promising tool in education and occupational therapy. This study analyzes the performance of a camera- and tablet-based handwriting verification algorithm in detecting abnormal patterns in handwriting samples from 71 students of different grades. The results revealed that the algorithm detected abnormal patterns in 20% of the handwriting samples processed, including slow writing speed, excessive pen pressure, irregular slant, and lack of word spacing. In addition, the detection accuracy of the algorithm was 95% when the camera data were compared with the abnormal patterns detected, indicating high reliability of the results. A highlight of the study was the feedback provided to children and teachers on the camera data and any abnormal patterns detected. Real-time feedback of this kind can significantly improve students’ awareness of their writing and their writing skills by allowing them to make adjustments that correct the detected abnormal patterns.
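The study's model is not published as code; as a hedged sketch of the flagging step, here is a simple rule-based detector over per-sample handwriting features. The feature names, units, and thresholds are invented for illustration only.

```python
# Rule-based flagging of abnormal handwriting patterns from extracted
# features. Feature names, units, and thresholds are hypothetical examples.

SAMPLE = {"speed_chars_per_min": 18, "pen_pressure": 0.92,
          "slant_std_deg": 14.0, "word_spacing_mm": 0.8}

RULES = {
    "slow writing speed": lambda f: f["speed_chars_per_min"] < 25,
    "excessive pen pressure": lambda f: f["pen_pressure"] > 0.85,
    "irregular slant": lambda f: f["slant_std_deg"] > 10.0,
    "lack of word spacing": lambda f: f["word_spacing_mm"] < 1.5,
}

def abnormal_patterns(features):
    """Return the names of all rules the sample violates."""
    return [name for name, rule in RULES.items() if rule(features)]

print(abnormal_patterns(SAMPLE))
```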
{"title":"Detection of Abnormal Patterns in Children's Handwriting by Using an Artificial-Intelligence-Based Method","authors":"W. Villegas-Ch., Isabel Urbina-Camacho, J. Garcia-Ortiz","doi":"10.3390/informatics10020052","DOIUrl":"https://doi.org/10.3390/informatics10020052","url":null,"abstract":"Using camera-based algorithms to detect abnormal patterns in children’s handwriting has become a promising tool in education and occupational therapy. This study analyzes the performance of a camera- and tablet-based handwriting verification algorithm to detect abnormal patterns in handwriting samples processed from 71 students of different grades. The study results revealed that the algorithm saw abnormal patterns in 20% of the handwriting samples processed, which included practices such as delayed typing speed, excessive pen pressure, irregular slant, and lack of word spacing. In addition, it was observed that the detection accuracy of the algorithm was 95% when comparing the camera data with the abnormal patterns detected, which indicates a high reliability in the results obtained. The highlight of the study was the feedback provided to children and teachers on the camera data and any abnormal patterns detected. This can significantly impact students’ awareness and improvement of writing skills by providing real-time feedback on their writing and allowing them to adjust to correct detected abnormal patterns.","PeriodicalId":37100,"journal":{"name":"Informatics","volume":"10 1","pages":"52"},"PeriodicalIF":3.1,"publicationDate":"2023-06-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44859880","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Low-Code Machine Learning Platforms: A Fastlane to Digitalization
Pub Date: 2023-06-12 | DOI: 10.3390/informatics10020050
K. Raghavendran, Ahmed Elragal
When developing machine learning models, unless we have the required data engineering and machine learning development competencies, as well as the time to train and test different machine learning models and tune their hyperparameters, it is worth trying out the automatic machine learning (AutoML) features provided by several cloud-based and cloud-agnostic platforms. This paper explores the possibility of generating machine learning models automatically with a low-code experience. We developed criteria for comparing different machine learning platforms in generating automatic machine learning models and presenting their results. Thereafter, we elucidated the lessons learned by developing automatic machine learning models from a sample dataset across four different machine learning platforms. We also interviewed machine learning experts to conceptualize domain-specific problems that automatic machine learning platforms can address. The results showed that automatic machine learning platforms can provide a fast track for organizations seeking the digitalization of their businesses, and that they help produce results quickly, especially for time-constrained projects where resources are lacking. The contribution of this paper takes the form of a lab experiment in which we demonstrate how low-code platforms can provide a viable option for many business cases, offering a lane that is faster than the usual hiring and training of already scarce data scientists and helping analytics projects that suffer from overruns.
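To make concrete what such platforms automate, here is a miniature sketch of automated model selection: try several model families, tune hyperparameters by cross-validation, and keep the best pipeline. Plain scikit-learn serves as a stand-in; no specific platform's API is implied, and the candidate grids are illustrative.

```python
# What AutoML platforms automate, in miniature: search over model families
# and hyperparameters with cross-validation, then keep the best estimator.

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

X, y = load_breast_cancer(return_X_y=True)

candidates = [
    (LogisticRegression(max_iter=5000), {"C": [0.1, 1.0, 10.0]}),
    (RandomForestClassifier(random_state=0), {"n_estimators": [100, 300]}),
]

best = max(
    (GridSearchCV(model, grid, cv=5).fit(X, y) for model, grid in candidates),
    key=lambda search: search.best_score_,
)
print(best.best_estimator_, best.best_score_)
```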
{"title":"Low-Code Machine Learning Platforms: A Fastlane to Digitalization","authors":"K. Raghavendran, Ahmed Elragal","doi":"10.3390/informatics10020050","DOIUrl":"https://doi.org/10.3390/informatics10020050","url":null,"abstract":"In the context of developing machine learning models, until and unless we have the required data engineering and machine learning development competencies as well as the time to train and test different machine learning models and tune their hyperparameters, it is worth trying out the automatic machine learning features provided by several cloud-based and cloud-agnostic platforms. This paper explores the possibility of generating automatic machine learning models with low-code experience. We developed criteria to compare different machine learning platforms for generating automatic machine learning models and presenting their results. Thereafter, lessons learned by developing automatic machine learning models from a sample dataset across four different machine learning platforms were elucidated. We also interviewed machine learning experts to conceptualize their domain-specific problems that automatic machine learning platforms can address. Results showed that automatic machine learning platforms can provide a fast track for organizations seeking the digitalization of their businesses. Automatic machine learning platforms help produce results, especially for time-constrained projects where resources are lacking. The contribution of this paper is in the form of a lab experiment in which we demonstrate how low-code platforms can provide a viable option to many business cases and, henceforth, provide a lane that is faster than the usual hiring and training of already scarce data scientists and to analytics projects that suffer from overruns.","PeriodicalId":37100,"journal":{"name":"Informatics","volume":"10 1","pages":"50"},"PeriodicalIF":3.1,"publicationDate":"2023-06-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44180556","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"AI Chatbots: Threat or Opportunity?","authors":"A. Bryant","doi":"10.3390/informatics10020049","DOIUrl":"https://doi.org/10.3390/informatics10020049","url":null,"abstract":"In November 2022, OpenAI launched ChatGPT, an AI chatbot that gained over 100 million users by February 2023 [...]","PeriodicalId":37100,"journal":{"name":"Informatics","volume":"10 1","pages":"49"},"PeriodicalIF":3.1,"publicationDate":"2023-06-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"48008978","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Compact-Fusion Feature Framework for Ethnicity Classification
Pub Date: 2023-06-12 | DOI: 10.3390/informatics10020051
T. A. B. Wirayuda, R. Munir, A. I. Kistijantoro
In computer vision, ethnicity classification tasks use images containing human faces to extract ethnicity labels. Ethnicity is one of the soft biometric feature categories useful in data analysis for the commercial, public, and health sectors. Ethnicity classification begins with face detection as a preprocessing step to determine a human’s presence; the feature representation is then extracted from the isolated facial image to predict the ethnicity class. This study used four handcrafted features (multi-local binary pattern (MLBP), histogram of oriented gradients (HOG), color histogram, and speeded-up-robust-features-based (SURF-based)) as the basis for generating a compact-fusion feature. The compact-fusion framework involves optimal feature selection, compact feature extraction, and compact-fusion feature representation. The final feature representation was trained and tested with an SVM one-versus-all classifier for ethnicity classification. Evaluated on two large datasets, the proposed framework achieved accuracies of 89.14% and 82.19% on UTKFace with four and five classes, respectively, and 73.87% on FairFace with four classes. Furthermore, the compact-fusion feature, constructed from conventional handcrafted features with a compact length of 4790 features, achieved results competitive with state-of-the-art deep-learning-based methods.
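As a hedged sketch of the fusion idea, the code below concatenates LBP, HOG, and color-histogram descriptors and trains a one-vs-all SVM with scikit-image and scikit-learn. The parameters are generic defaults and the data are random stand-ins; this shows the pipeline shape, not the paper's tuned compact-fusion configuration or its feature-selection step.

```python
# Handcrafted-feature fusion: concatenate LBP, HOG, and color-histogram
# descriptors, then train a one-vs-all SVM. Parameters are generic defaults.

import numpy as np
from skimage.color import rgb2gray
from skimage.feature import hog, local_binary_pattern
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

def fused_descriptor(image_rgb):
    gray = rgb2gray(image_rgb)
    lbp = local_binary_pattern(gray, P=8, R=1, method="uniform")
    lbp_hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
    hog_vec = hog(gray, orientations=9, pixels_per_cell=(16, 16),
                  cells_per_block=(2, 2))
    color_hist, _ = np.histogramdd(image_rgb.reshape(-1, 3),
                                   bins=(4, 4, 4), range=[(0, 1)] * 3)
    return np.concatenate([lbp_hist, hog_vec, color_hist.ravel()])

# Toy data: random "images" with fake labels, just to exercise the pipeline.
rng = np.random.default_rng(0)
X = np.array([fused_descriptor(rng.random((64, 64, 3))) for _ in range(20)])
y = rng.integers(0, 4, size=20)  # four ethnicity classes, as in the paper
clf = OneVsRestClassifier(SVC(kernel="linear")).fit(X, y)
print(clf.predict(X[:5]))
```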
{"title":"Compact-Fusion Feature Framework for Ethnicity Classification","authors":"T. A. B. Wirayuda, R. Munir, A. I. Kistijantoro","doi":"10.3390/informatics10020051","DOIUrl":"https://doi.org/10.3390/informatics10020051","url":null,"abstract":"In computer vision, ethnicity classification tasks utilize images containing human faces to extract ethnicity labels. Ethnicity is one of the soft biometric feature categories useful in data analysis for commercial, public, and health sectors. Ethnicity classification begins with face detection as a preprocessing process to determine a human’s presence; then, the feature representation is extracted from the isolated facial image to predict the ethnicity class. This study utilized four handcrafted features (multi-local binary pattern (MLBP), histogram of gradient (HOG), color histogram, and speeded-up-robust-features-based (SURF-based)) as the basis for the generation of a compact-fusion feature. The compact-fusion framework involves optimal feature selection, compact feature extraction, and compact-fusion feature representation. The final feature representation was trained and tested with the SVM One Versus All classifier for ethnicity classification. When it was evaluated in two large datasets, UTKFace and Fair Face, the proposed framework achieved accuracy levels of 89.14%, 82.19%, and 73.87%, respectively, for the UTKFace dataset with four or five classes and the Fair Face dataset with four classes. Furthermore, the compact-fusion feature with a small number of features at 4790, constructed based on conventional handcrafted features, achieved competitive results compared with state-of-the-art methods using a deep-learning-based approach.","PeriodicalId":37100,"journal":{"name":"Informatics","volume":"10 1","pages":"51"},"PeriodicalIF":3.1,"publicationDate":"2023-06-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"46185480","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The Design and Development of a Foot-Detection Approach Based on Seven-Foot Dimensions: A Case Study of a Virtual Try-On Shoe System Using Augmented Reality Techniques
Pub Date: 2023-06-05 | DOI: 10.3390/informatics10020048
Charlee Kaewrat, P. Boonbrahm, Bukhoree Sahoh
Unsuitable shoe shapes and sizes are a critical cause of unhealthy feet and can severely contribute to chronic injuries such as foot ulcers in susceptible people (e.g., diabetes patients); accurate measurements in the manner of expert-based procedures are therefore needed. However, manually measuring such accurate shapes and sizes is labor-intensive, time-consuming, and impractical for a real-time system. This research proposes a foot-detection approach using expert-like measurements to address this concern. It combines a seven-foot-dimension model with a light detection and ranging (LiDAR) sensor to encode foot shapes and sizes and to detect the dimension surfaces. Graph-based algorithms are developed to present the seven foot dimensions and to visualize the shoe model using augmented reality (AR) techniques. The results show that our approach detects shapes and sizes more effectively than the traditional approach, helps the system imitate expert-like measurements accurately, and can be employed in intelligent foot-measurement applications for susceptible people.
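As a hedged sketch of the geometric idea, the code below estimates two of the seven foot dimensions (foot length and ball width) from a LiDAR-style point cloud using axis-aligned extents. The synthetic point cloud, the 70%-of-length position of the ball, and the tolerances are illustrative assumptions; the actual system detects seven dimension surfaces with graph-based algorithms.

```python
# Estimate foot length and ball width from a 2D point cloud (meters).
# The synthetic cloud and the ball-of-foot heuristic are illustrative only.

import numpy as np

rng = np.random.default_rng(1)
# Synthetic foot-shaped cloud: 0.26 m long (x), tapering width (y).
x = rng.uniform(0.0, 0.26, 2000)
y = rng.uniform(-0.05, 0.05, 2000) * np.sin(np.pi * x / 0.26)
cloud = np.column_stack([x, y])

foot_length = cloud[:, 0].max() - cloud[:, 0].min()
# Ball width: widest cross-section, assumed ~70% of foot length from the heel.
ball_zone = cloud[np.abs(cloud[:, 0] - 0.7 * foot_length) < 0.01]
ball_width = ball_zone[:, 1].max() - ball_zone[:, 1].min()
print(f"length = {foot_length:.3f} m, ball width = {ball_width:.3f} m")
```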
{"title":"The Design and Development of a Foot-Detection Approach Based on Seven-Foot Dimensions: A Case Study of a Virtual Try-On Shoe System Using Augmented Reality Techniques","authors":"Charlee Kaewrat, P. Boonbrahm, Bukhoree Sahoh","doi":"10.3390/informatics10020048","DOIUrl":"https://doi.org/10.3390/informatics10020048","url":null,"abstract":"Unsuitable shoe shapes and sizes are a critical reason for unhealthy feet, may severely contribute to chronic injuries such as foot ulcers in susceptible people (e.g., diabetes patients), and thus need accurate measurements in the manner of expert-based procedures. However, the manual measure of such accurate shapes and sizes is labor-intensive, time-consuming, and impractical to apply in a real-time system. This research proposes a foot-detection approach using expert-like measurements to address this concern. It combines the seven-foot dimensions model and the light detection and ranging sensor to encode foot shapes and sizes and detect the dimension surfaces. The graph-based algorithms are developed to present seven-foot dimensions and visualize the shoe’s model based on the augmented reality (AR) technique. The results show that our approach can detect shapes and sizes more effectively than the traditional approach, helps the system imitate expert-like measurements accurately, and can be employed in intelligent applications for susceptible people-based feet measurements.","PeriodicalId":37100,"journal":{"name":"Informatics","volume":"10 1","pages":"48"},"PeriodicalIF":3.1,"publicationDate":"2023-06-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"49222031","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}