Expert system of food sensory evaluation for mobile and tablet
Pub Date: 2019-01-01 | DOI: 10.18287/1613-0073-2019-2416-332-339
M. Nikitina, Y. Ivashkin
One of the main directions of statistics in sensory evaluation is assessing the dependence between experimental variables and measured characteristics. Statistical criteria are used to assess the degree of interaction between variables and the size of experimental effects, and they support accepting or rejecting the proposed hypotheses. In sensory evaluation, people act as measurement instruments, so variation associated with the human factor arises; this is why statistical methods are necessary. This article presents a networked computer system for collecting and evaluating food sensory indicators in real time, based on the methods of rank correlation and multifactorial analysis of variance. It describes an information technology for expert sensory evaluation of food quality by individual panelists and sensory panels, covering indicators that cannot be measured by technical means of control, built on a client-server architecture. The software implementation of the system for collecting and statistically processing sensory data, based on the principles of multifactorial analysis of variance in real time, makes it possible to evaluate the influence of the human factor on the objectivity and reliability of sensory evaluation results, and to visualize expert scores across different expert panels.
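A minimal sketch of the two statistical tools named above, under an assumed data layout: Kendall's coefficient of concordance for panelist rankings and a two-factor analysis of variance of the scores. The score matrix, panel size, and the statsmodels-based ANOVA are illustrative assumptions, not the paper's networked real-time implementation.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# scores[i, j] = score given by panelist i to food sample j (assumed layout)
scores = np.array([[7, 5, 8, 6],
                   [6, 5, 9, 7],
                   [7, 4, 8, 6]], dtype=float)
m, n = scores.shape                                  # m panelists, n samples

# Kendall's W: agreement between panelists, computed from within-panelist ranks
ranks = scores.argsort(axis=1).argsort(axis=1) + 1   # rank each panelist's scores
rank_sums = ranks.sum(axis=0)
s = ((rank_sums - rank_sums.mean()) ** 2).sum()
w = 12.0 * s / (m ** 2 * (n ** 3 - n))
print(f"Kendall's W = {w:.3f}")                      # 1.0 = perfect concordance

# Two-factor ANOVA: sample effect vs. panelist (human factor) effect
df = pd.DataFrame(
    [(i, j, scores[i, j]) for i in range(m) for j in range(n)],
    columns=["panelist", "sample", "score"],
)
model = ols("score ~ C(panelist) + C(sample)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```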
{"title":"Expert system of food sensory evaluation for mobile and tablet","authors":"M. Nikitina, Y. Ivashkin","doi":"10.18287/1613-0073-2019-2416-332-339","DOIUrl":"https://doi.org/10.18287/1613-0073-2019-2416-332-339","url":null,"abstract":"One of the main directions of statistics in sensory evaluation is an assessment of the dependence between experimental variables and measured characteristics. Statistical criteria are used to assess a degree of interaction between variables, a level of experimental effects, and allow accepting or rejecting hypothesis proposed. In sensory evaluation, people act as measurement instruments, and a variation associated with the human factor arises. This proves that the use of statistical methods is necessary. This article represents a network computer system for collection and evaluation of food sensory indicators based on the methods of rank correlation and multifactorial analysis of variance in real time. The article describes information technology of expert sensory evaluation of food quality by individual panelists and sensory panels regarding the indicators that are not measured by technical means of control, based on client-server network architecture. The software implementation of system for collecting and statistical processing of sensory data based on the principles of multifactorial analysis of variance in real-time mode makes it possible to evaluate the influence of the human factor on objectiveness and reliability of sensory evaluation results, as well as to visualize the data of expert scores by various expert panels.","PeriodicalId":10486,"journal":{"name":"Collection of selected papers of the III International Conference on Information Technology and Nanotechnology","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2019-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"77328879","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Application of convolution neural networks in eye fundus image analysis
Pub Date: 2019-01-01 | DOI: 10.18287/1613-0073-2019-2416-74-79
N. Ilyasova, A. Shirokanev, I. Klimov
In this work, we propose a new approach to analyzing eye fundus images based on a convolutional neural network (CNN). The CNN architecture was constructed and trained on a balanced dataset of four image classes: thick blood vessels, thin blood vessels, healthy areas, and exudate areas. Training was conducted on 12x12 images, which an experimental study showed to be optimal for this purpose. The test error was no higher than 4% for all sample sizes. Segmentation of eye fundus images was then performed using the CNN. Since exudates are the primary target of laser coagulation surgery, the segmentation error was calculated on the exudate class and amounted to 5%. In the course of this research, the HSL color system was found to be the most informative; using it, the segmentation error was reduced to 3%.
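A hedged sketch of patch-level classification of the kind described above: a small PyTorch CNN over 12x12 patches with four output classes. The channel counts and layer arrangement are assumptions, not the authors' architecture.

```python
import torch
import torch.nn as nn

class PatchCNN(nn.Module):
    """Classify 12x12 fundus patches into four assumed classes."""
    def __init__(self, in_channels: int = 3, n_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1),  # 12x12
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),                                       # 6x6
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),                                       # 3x3
        )
        self.classifier = nn.Linear(32 * 3 * 3, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

# Segmentation by patch classification: slide a 12x12 window over the fundus
# image and label each patch with the predicted class.
model = PatchCNN()
patches = torch.randn(8, 3, 12, 12)        # a batch of patches (placeholder data)
print(model(patches).argmax(dim=1))        # predicted class per patch
```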
{"title":"Application of convolution neural networks in eye fundus image analysis","authors":"N. Ilyasova, A. Shirokanev, I. Klimov","doi":"10.18287/1613-0073-2019-2416-74-79","DOIUrl":"https://doi.org/10.18287/1613-0073-2019-2416-74-79","url":null,"abstract":"In this work, we proposed a new approach to analyzing eye fundus images that relies upon the use of a convolutional neural network (CNN). The CNN architecture was constructed, followed by network learning on a balanced dataset composed of four classes of images, composed of thick and thin blood vessels, healthy areas, and exudate areas. The learning was conducted on 12x12 images because an experimental study showed them to be optimal for the purpose. The test error was no higher than 4% for all sizes of the samples. Segmentation of eye fundus images was performed using the CNN. Considering that exudates are a primary target of laser coagulation surgery, the segmentation error was calculated on the exudate class, amounting to 5%. In the course of this research, the HSL color system was found to be most informative, using which the segmentation error was reduced to 3%.","PeriodicalId":10486,"journal":{"name":"Collection of selected papers of the III International Conference on Information Technology and Nanotechnology","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2019-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"73348964","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Surface recognition of machine parts based on the results of optical scanning
Pub Date: 2019-01-01 | DOI: 10.18287/1613-0073-2019-2391-342-349
M. Bolotov, V. Pechenin, N. V. Ruzanov, E. Kolchina
To predict the quality parameters of products (in particular, assembly parameters), mathematical models are implemented in the form of computer models. To ensure the adequacy of the calculations, information about the actual geometry of the parts is needed, which can be obtained through noncontact measurement of the assembly's parts. Measuring parts and components with an optical or laser scanner produces a large array of measured points. After standard processing (e.g., noise removal, combining the scans, smoothing, creating a triangulation mesh), the individual surfaces of the parts need to be recognized. This paper presents a neural network model that recognizes surface elements from an array of measured points obtained by scanning.
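One plausible way to set up such recognition, sketched under stated assumptions: describe each scanned point by the eigenvalues of its local neighbourhood covariance (flat, cylindrical, and edge regions differ in this spectrum) and classify the descriptors with a small neural network, here via scikit-learn. The feature choice and network size are illustrative, not the paper's model.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.neural_network import MLPClassifier

def local_shape_features(points: np.ndarray, k: int = 20) -> np.ndarray:
    """Sorted covariance eigenvalues of each point's k-neighbourhood."""
    nn = NearestNeighbors(n_neighbors=k).fit(points)
    _, idx = nn.kneighbors(points)
    feats = np.empty((len(points), 3))
    for i, neigh in enumerate(idx):
        cov = np.cov(points[neigh].T)                  # 3x3 covariance of the neighbourhood
        feats[i] = np.sort(np.linalg.eigvalsh(cov))[::-1]
    return feats

# points: Nx3 array from the scanner; labels: surface class per point
points = np.random.rand(500, 3)                        # placeholder point cloud
labels = np.random.randint(0, 3, size=500)             # placeholder surface classes

clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500)
clf.fit(local_shape_features(points), labels)
```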
{"title":"Surface recognition of machine parts based on the results of optical scanning","authors":"M. Bolotov, V. Pechenin, N. V. Ruzanov, E. Kolchina","doi":"10.18287/1613-0073-2019-2391-342-349","DOIUrl":"https://doi.org/10.18287/1613-0073-2019-2391-342-349","url":null,"abstract":"To predict the quality parameters of products (in particular, the assembly parameters) mathematical models were implemented in the form of computer models. To ensure the adequacy of calculations, it is necessary to have information about the actual geometry of the parts, which can be obtained using noncontact measurements of parts of the assembly. As a result of measuring parts and components using optical or laser scanner, a large dimension array of measured points is formed. After standard processing (e.g. noise removal, combining the scans, smoothing, creating triangulation mesh), the recognition of individual surfaces of parts becomes necessary. This paper presents a neural network model that allows the recognition of elements based on an array of measured points obtained by scanning.","PeriodicalId":10486,"journal":{"name":"Collection of selected papers of the III International Conference on Information Technology and Nanotechnology","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2019-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"75328643","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A technique for detecting concealed objects in terahertz images based on information measure
Pub Date: 2019-01-01 | DOI: 10.18287/1613-0073-2019-2391-269-274
D. Murashov, A. Morozov, F. D. Murashov
In this paper, a new technique for detecting concealed objects in images acquired by a passive THz imaging system is proposed. The technique is based on a mutual information maximization method successfully used for image matching. To reduce computational expense, we propose analyzing the mutual information only at local maxima of the cross-correlation function computed in the Fourier domain. The proposed technique does not require parameter tuning. A computational experiment confirmed the efficiency of the proposed technique and the possibility of implementing it in security systems.
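A hedged numpy sketch of the matching idea: compute the cross-correlation of a template with the THz image via the FFT, keep a few of its largest maxima as candidate positions, and evaluate the mutual information only there. Peak selection and histogram settings are assumptions made for the illustration.

```python
import numpy as np

def mutual_information(a: np.ndarray, b: np.ndarray, bins: int = 32) -> float:
    """Mutual information of two equally sized image patches via a joint histogram."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = hist / hist.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz])).sum())

def best_shift(image: np.ndarray, template: np.ndarray, n_peaks: int = 5):
    h, w = template.shape
    # Cross-correlation via FFT (template zero-padded to the image size)
    corr = np.real(np.fft.ifft2(np.fft.fft2(image) *
                                np.conj(np.fft.fft2(template, s=image.shape))))
    candidates = np.argsort(corr.ravel())[-n_peaks:]       # largest correlation peaks
    best, best_mi = None, -np.inf
    for flat in candidates:
        dy, dx = np.unravel_index(flat, corr.shape)
        patch = image[dy:dy + h, dx:dx + w]
        if patch.shape != template.shape:                  # peak too close to the border
            continue
        mi = mutual_information(patch, template)
        if mi > best_mi:
            best, best_mi = (dy, dx), mi
    return best, best_mi
```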
{"title":"A technique for detecting concealed objects in terahertz images based on information measure","authors":"D. Murashov, A. Morozov, F. D. Murashov","doi":"10.18287/1613-0073-2019-2391-269-274","DOIUrl":"https://doi.org/10.18287/1613-0073-2019-2391-269-274","url":null,"abstract":"In this paper, a new technique for detecting concealed objects in the images acquired by a passive THz imaging system is proposed. The technique is based on a method for mutual information maximization successfully used for image matching. For reducing computational expenses, we propose to analyze the mutual information at local maxima of the crosscorrelation function computed in the Fourier domain. The proposed technique does not require parameter tuning. A computing experiment approved the efficiency of the proposed technique and the possibility of its implementation in security systems.","PeriodicalId":10486,"journal":{"name":"Collection of selected papers of the III International Conference on Information Technology and Nanotechnology","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2019-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"74853128","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Trap method in ensuring data security
Pub Date: 2019-01-01 | DOI: 10.18287/1613-0073-2019-2416-189-198
D. A. Shkirdov, E. Sagatov, P. S. Dmitrenko
This paper presents the results of analyzing data from a geographically distributed honeypot network. The honeypot servers were deployed in Samara, Rostov-on-Don, Crimea and the USA two years ago. Methods for processing the statistics are discussed in detail for the SSH secure remote access service. Lists of attacking addresses are extracted and their geographical affiliation is determined. Rank distributions were used as the basis for the statistical analysis, and the intensity of requests to each of the 10 installed services was then calculated.
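A hedged sketch of the rank-distribution bookkeeping: count events per attacking address and per service and list them in rank order. The record layout (a source IP and a service name per event) is an assumption; a real deployment would parse these from the honeypot logs.

```python
from collections import Counter

# (src_ip, service) pairs standing in for parsed honeypot log entries
events = [
    ("203.0.113.5", "ssh"), ("203.0.113.5", "ssh"), ("198.51.100.7", "ssh"),
    ("203.0.113.5", "ftp"), ("192.0.2.44", "telnet"), ("198.51.100.7", "ssh"),
]

ip_counts = Counter(ip for ip, _ in events)
service_counts = Counter(service for _, service in events)

# Rank distribution: the r-th most active attacking address and its hit count
for rank, (ip, hits) in enumerate(ip_counts.most_common(), start=1):
    print(f"{rank:2d}  {ip:15s}  {hits}")

# Request intensity per deployed service
for service, hits in service_counts.most_common():
    print(f"{service:8s}  {hits}")
```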
{"title":"Trap method in ensuring data security","authors":"D. A. Shkirdov, E. Sagatov, P. S. Dmitrenko","doi":"10.18287/1613-0073-2019-2416-189-198","DOIUrl":"https://doi.org/10.18287/1613-0073-2019-2416-189-198","url":null,"abstract":"This paper presents the results of data analysis from a geographically distributed honeypot network. Such honeypot servers were deployed in Samara, Rostov on Don, Crimea and the USA two years ago. Methods for processing statistics are discussed in detail for secure remote access SSH. Lists of attacking addresses are highlighted, and their geographical affiliation is determined. Rank distributions were used as the basis for statistical analysis. The intensity of requests to each of the 10 installed services was then calculated.","PeriodicalId":10486,"journal":{"name":"Collection of selected papers of the III International Conference on Information Technology and Nanotechnology","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2019-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"73482728","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Selection of aggregated classifiers for the prediction of the state of technical objects
Pub Date: 2019-01-01 | DOI: 10.18287/1613-0073-2019-2416-361-367
D. A. Zhukov, V. Klyachkin, V. Krasheninnikov, Yu E Kuvayskova
The input data for predicting the health of a technical object from known indicators of its operation are the results of estimating the object's state from information about its previous service. The problem may be solved using machine learning methods and reduces to binary classification of the object's states. The research was conducted in the Matlab environment using ten different basic machine learning methods: the naive Bayes classifier, neural networks, bagging of decision trees and others. To improve the quality of healthy-state identification, it has been suggested that aggregated methods combining several basic classifiers be used. This paper addresses the selection of the best aggregated classifier. The effectiveness of this approach has been confirmed by numerous tests on real-world objects.
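The paper works in Matlab; the same idea, sketched here with scikit-learn under assumed settings, is to combine several base classifiers into an aggregated (voting) classifier and compare it with the individual learners on a binary task. The dataset below is synthetic.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary "healthy / faulty" data standing in for operation indicators
X, y = make_classification(n_samples=400, n_features=10, random_state=0)

base = {
    "naive_bayes": GaussianNB(),
    "neural_net": MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000),
    "bagged_trees": BaggingClassifier(DecisionTreeClassifier(), n_estimators=50),
}
for name, clf in base.items():
    print(name, cross_val_score(clf, X, y, cv=5).mean())

# Aggregated classifier: majority voting over the base learners
ensemble = VotingClassifier(estimators=list(base.items()), voting="hard")
print("aggregated", cross_val_score(ensemble, X, y, cv=5).mean())
```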
{"title":"Selection of aggregated classifiers for the prediction of the state of technical objects","authors":"D. A. Zhukov, V. Klyachkin, V. Krasheninnikov, Yu E Kuvayskova","doi":"10.18287/1613-0073-2019-2416-361-367","DOIUrl":"https://doi.org/10.18287/1613-0073-2019-2416-361-367","url":null,"abstract":"The basic data in the problem of the prediction of technical object’s state of health based on the known indicators of its operation are the known results of the object state estimation by information about previous service. The problem may be solved using the machine learning methods, it reduces to binary classification of states of the object. The research was conducted in the Matlab environment, ten various basic methods of machine learning were used: naive Bayes classifier, neural networks, bagging of decision trees and others. In order to improve quality of healthy state identification, it has been suggested that aggregated methods combining several basic classifiers should be used. This paper addresses the issue of selection of the best aggregated classifier. The effectiveness of such approach has been confirmed by numerous tests of real-world objects.","PeriodicalId":10486,"journal":{"name":"Collection of selected papers of the III International Conference on Information Technology and Nanotechnology","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2019-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"72887353","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Using high-performance deep learning platform to accelerate object detection
Pub Date: 2019-01-01 | DOI: 10.18287/1613-0073-2019-2416-354-360
S. Stepanenko, P. Yakimov
Object classification with neural networks is highly relevant today. YOLO is one of the most frequently used frameworks for object classification. It produces high accuracy, but its processing speed is not high enough, especially on computers with limited performance. This article investigates the use of the NVIDIA TensorRT framework to optimize YOLO with the aim of increasing image processing speed. While preserving the efficiency and quality of the neural network, TensorRT increases processing speed by optimizing the network architecture and the calculations performed on the GPU.
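A hedged sketch of the general workflow rather than the paper's exact pipeline: export a detector from PyTorch to ONNX, then build a TensorRT engine from the ONNX graph (shown via the trtexec utility shipped with TensorRT). The placeholder model, input resolution, and FP16 choice are assumptions for illustration.

```python
import torch
import torch.nn as nn

# Placeholder detector standing in for a real YOLO network (assumption:
# any traceable PyTorch module can be exported the same way).
model = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                      nn.Conv2d(16, 255, 1)).eval()

dummy = torch.randn(1, 3, 416, 416)           # assumed input resolution
torch.onnx.export(model, dummy, "yolo_like.onnx", opset_version=11,
                  input_names=["images"], output_names=["predictions"])

# The ONNX graph can then be turned into a TensorRT engine, e.g.:
#   trtexec --onnx=yolo_like.onnx --saveEngine=yolo_like_fp16.engine --fp16
# FP16 mode trades a little numerical head-room for a large GPU speed-up.
```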
{"title":"Using high-performance deep learning platform to accelerate object detection","authors":"S. Stepanenko, P. Yakimov","doi":"10.18287/1613-0073-2019-2416-354-360","DOIUrl":"https://doi.org/10.18287/1613-0073-2019-2416-354-360","url":null,"abstract":"Object classification with use of neural networks is extremely current today. YOLO is one of the most often used frameworks for object classification. It produces high accuracy but the processing speed is not high enough especially in conditions of limited performance of a computer. This article researches use of a framework called NVIDIA TensorRT to optimize YOLO with the aim of increasing the image processing speed. Saving efficiency and quality of the neural network work TensorRT allows us to increase the processing speed using an optimization of the architecture and an optimization of calculations on a GPU.","PeriodicalId":10486,"journal":{"name":"Collection of selected papers of the III International Conference on Information Technology and Nanotechnology","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2019-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"84658063","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The image series forgery detection algorithm based on the camera pattern noise analysis
Pub Date: 2019-01-01 | DOI: 10.18287/1613-0073-2019-2391-258-263
N. Evdokimova, V. Myasnikov
In this paper, an image series forgery detection algorithm based on the analysis of camera pattern noise is proposed. The distribution characteristics of the camera pattern noise are obtained by extracting the noise component of images from the non-tampered image series. The noise residual of a suspected forgery is then compared with the camera pattern noise. We compare various noise filtering algorithms to choose the one that achieves the best performance of the proposed method. The proposed algorithm is tested both on copy-move forgeries and on forged fragments inserted from an image not included in the series.
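A hedged sketch of the pattern-noise comparison: average noise residuals over the non-tampered series to estimate the camera pattern noise, then correlate a suspect image's residual with it block by block. A Wiener filter stands in for the denoiser; the paper compares several filtering algorithms and may use a different one.

```python
import numpy as np
from scipy.signal import wiener

def noise_residual(image: np.ndarray) -> np.ndarray:
    """Noise component of a grayscale image: original minus its denoised version."""
    return image - wiener(image, mysize=3)

def pattern_noise(series: list[np.ndarray]) -> np.ndarray:
    """Estimate the camera pattern noise by averaging residuals of the clean series."""
    return np.mean([noise_residual(img) for img in series], axis=0)

def correlation_map(residual: np.ndarray, pattern: np.ndarray, block: int = 32):
    """Normalised block-wise correlation; low values flag possible forged regions."""
    h, w = residual.shape
    out = np.zeros((h // block, w // block))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            r = residual[i * block:(i + 1) * block, j * block:(j + 1) * block]
            p = pattern[i * block:(i + 1) * block, j * block:(j + 1) * block]
            r, p = r - r.mean(), p - p.mean()
            out[i, j] = (r * p).sum() / (np.linalg.norm(r) * np.linalg.norm(p) + 1e-12)
    return out
```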
{"title":"The image series forgery detection algorithm based on the camera pattern noise analysis","authors":"N. Evdokimova, V. Myasnikov","doi":"10.18287/1613-0073-2019-2391-258-263","DOIUrl":"https://doi.org/10.18287/1613-0073-2019-2391-258-263","url":null,"abstract":"In the paper, the image series forgery detection algorithm based on the analysis of camera pattern noise is proposed. Distribution characteristics of the camera pattern noise are obtained by extracting the noise component of images from the non-tampered image series. A noise residual of a forgery image is compared with the camera pattern noise. We compare various noise filtering algorithms to choose the one that achieves the best performance of the proposed method. The proposed algorithm is tested both on examples of copy-move forgeries and forgery fragments which were inserted from an image not included in the image series.","PeriodicalId":10486,"journal":{"name":"Collection of selected papers of the III International Conference on Information Technology and Nanotechnology","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2019-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"87635059","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Defuzzification of the initial context in Formal Concept Analysis
Pub Date: 2019-01-01 | DOI: 10.18287/1613-0073-2019-2416-1-9
D. Samoilov, V. A. Semenova, S. Smirnov, Y. Mezentsev, D. Zhukov, E. Zentsova, Y. Goshin, K. Pugachev, A. Korobeynikov, A. Menlitdinov, V. Lyuminarskiy, Yu Kuzelin, O. A. Kuznetsova, A. Yumaganov
The research field is the problem of extracting, from the initial empirical material, the formal concept lattice that can serve as the basis of a formal ontology of the studied subject domain. The initial empirical material, i.e. the data of multidimensional observations and experiments, is characterized by incompleteness and inconsistency, conditioned by the realities of accumulating empirical information. As a result, the formal context required for building the lattice can at first be represented only within the framework of some multivalued logic. It needs to be approximated in binary logic, since effective methods for deriving formal concepts have been developed only for unambiguous (binary) formal contexts. The exact solution of this problem, which must respect the constraints on the existence of object properties in the studied subject domain, is difficult and in a certain sense does not match the expectations of the subject exploring the domain. For defuzzification of the initial formal context, a heuristic was proposed whose idea is to localize the approximation of the "soft" context within every group of dependent properties of each object of the learning sample. The model reflecting such restrictions is formed as a hierarchy of groups of dependent properties, which predetermines the recursive, multi-pass nature of the developed defuzzification algorithm.
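A sketch of the localization idea only, under an assumed binarization rule: the "soft" context is binarized group by group, so the approximation is decided inside each group of dependent properties rather than over the whole context at once. The keep-the-strongest-property rule below is an illustrative assumption, not the paper's recursive multi-pass algorithm.

```python
import numpy as np

def defuzzify(context: np.ndarray, groups: list[list[int]], threshold: float = 0.5):
    """context: objects x properties membership matrix with values in [0, 1]."""
    binary = np.zeros_like(context, dtype=int)
    for obj in range(context.shape[0]):
        for group in groups:                       # groups of dependent properties
            values = context[obj, group]
            best = group[int(np.argmax(values))]
            if values.max() >= threshold:          # localized decision inside the group
                binary[obj, best] = 1
    return binary

# Three objects, four properties; columns 0-1 and 2-3 form dependent groups
soft = np.array([[0.9, 0.2, 0.4, 0.7],
                 [0.3, 0.6, 0.8, 0.1],
                 [0.4, 0.4, 0.2, 0.3]])
print(defuzzify(soft, groups=[[0, 1], [2, 3]]))
```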
{"title":"Defuzzification of the initial context in Formal Concept Analysis","authors":"D. Samoilov, V. A. Semenova, S. Smirnov, Y. Mezentsev, D. Zhukov, E. Zentsova, Y. Goshin, K. Pugachev, A. Korobeynikov, A. Menlitdinov, V. Lyuminarskiy, Yu Kuzelin, O. A. Kuznetsova, A. Yumaganov","doi":"10.18287/1613-0073-2019-2416-1-9","DOIUrl":"https://doi.org/10.18287/1613-0073-2019-2416-1-9","url":null,"abstract":"The research field is the problem of extracting from the initial empirical material the formal concept lattice, which can serve as the basis of the formal ontology of the studied subject domain. The initial empirical material, i.e. the data of multidimensional observations and experiments, is characterized by incompleteness and inconsistency, conditioned by realities of empirical information accumulation. This leads to the fact that required for lattice building formal context can be previously presented only within the framework of some multivalued logic. It needs to be approximated in binary logic, since effective methods for derivation of formal concepts are developed only for unambiguous (binary) formal contexts. The exact solution of this problem, considering the properties existence constraints of objects in the studied subject domain, is difficult and in a certain sense is inadequate to expectations of subject exploring the subject domain. For defuzzification of the initial formal context heuristic was proposed, idea of which is to localize the approximation task of \"soft\" context within every group of dependent properties of each object of learning sample. The model reflecting such restrictions is formed as hierarchy of groups of dependent properties, which predetermines the recursive and multi-pass nature of the developed defuzzification algorithm.","PeriodicalId":10486,"journal":{"name":"Collection of selected papers of the III International Conference on Information Technology and Nanotechnology","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2019-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"90474552","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Hybridization of fuzzy time series and fuzzy ontologies in the diagnosis of complex technical systems
Pub Date: 2019-01-01 | DOI: 10.18287/1613-0073-2019-2416-252-259
N. Yarushkina, V. Moshkin, I. Andreev, G. I. Ishmuratova
The article provides a formal description of fuzzy ontologies and of the representation of elements of fuzzy axioms in FuzzyOWL notation. An ontological model for assessing the state of helicopter units has been developed. In the proposed approach, the state of a complex technical system is summarized by inference over a fuzzy ontology. As part of this work, experiments were conducted to search for anomalous situations and for possibly faulty helicopter units using the developed approach to integrating fuzzy time series and fuzzy ontologies. The proposed hybridization of fuzzy time series and fuzzy ontologies made it possible to reliably recognize anomalous situations with a certain degree of truth and to find the possibly faulty units corresponding to each anomalous situation.
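A minimal sketch of the fuzzy inference step only: trapezoidal membership functions describe fuzzy terms of two monitored parameters, and the degree of truth of an "anomalous situation" rule is taken as the minimum of the term memberships (a standard fuzzy AND). The terms, scales, and the rule itself are illustrative assumptions; the paper derives them from fuzzy time series and a FuzzyOWL ontology.

```python
def trapezoid(x: float, a: float, b: float, c: float, d: float) -> float:
    """Trapezoidal membership: 0 below a, rises to 1 on [b, c], 0 above d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# RULE (assumed): IF vibration is High AND oil temperature is High
#                 THEN the situation for the monitored unit is Anomalous
def anomaly_degree(vibration: float, oil_temp: float) -> float:
    mu_vib_high = trapezoid(vibration, 4.0, 6.0, 10.0, 12.0)       # assumed scale, mm/s
    mu_temp_high = trapezoid(oil_temp, 90.0, 110.0, 150.0, 160.0)  # assumed scale, deg C
    return min(mu_vib_high, mu_temp_high)                          # degree of truth

print(anomaly_degree(vibration=7.2, oil_temp=118.0))               # -> 1.0
print(anomaly_degree(vibration=5.0, oil_temp=95.0))                # -> 0.25
```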
{"title":"Hybridization of fuzzy time series and fuzzy ontologies in the diagnosis of complex technical systems","authors":"N. Yarushkina, V. Moshkin, I. Andreev, G. I. Ishmuratova","doi":"10.18287/1613-0073-2019-2416-252-259","DOIUrl":"https://doi.org/10.18287/1613-0073-2019-2416-252-259","url":null,"abstract":"The article provides a formal description of fuzzy ontologies and features of the representation of elements of fuzzy axioms in FuzzyOWL notation. An ontological model for assessing the state of helicopter units has been developed. According to the proposed approach, the summarizing of the state of a complex technical system is carried out by means of an inference based on a fuzzy ontology. As part of this work, experiments were conducted to search for anomalous situations and search for possible faulty helicopter units using the developed approach to the integration of fuzzy time series and fuzzy ontology. The proposed approach of hybridization of fuzzy time series and fuzzy ontologies made it possible to reliably recognize anomalous situations with a certain degree of truth, and to find possible faulty aggregates corresponding to each anomalous situation.","PeriodicalId":10486,"journal":{"name":"Collection of selected papers of the III International Conference on Information Technology and Nanotechnology","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2019-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"81873300","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}