Decoding of the human genome over the past decades has brought DNA profiling, a computationally intensive operation, into focus. The search space for these problems is extremely large and requires specialized hardware and algorithms to perform the necessary sequence analysis. In this paper, we propose an innovative and scalable approach to exact multi-pattern matching of nucleotide sequences that harnesses the massively parallel computing power of commodity graphics processing units. Our approach places careful consideration on the preprocessing of DNA datasets and on runtime performance, while exploiting the full capabilities of the heterogeneous platform it runs on. Finally, we evaluate our models against real-world DNA sequences.
"Towards real-time DNA biometrics using GPU-accelerated processing", Mario Reja, Ciprian-Petrisor Pungila and V. Negru. Log. J. IGPL, published 2020-09-09. doi:10.1093/jigpal/jzaa034
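The abstract does not detail the GPU algorithm, but exact multi-pattern matching of nucleotide sequences is classically solved with an Aho-Corasick automaton; the CPU-side Python sketch below illustrates the technique (a point of reference only, not the authors' implementation; all names are illustrative):

```python
from collections import deque

def build_automaton(patterns):
    # Trie stored as parallel lists: goto transitions, failure links, outputs.
    goto, fail, out = [{}], [0], [set()]
    for p in patterns:
        s = 0
        for ch in p:
            if ch not in goto[s]:
                goto.append({}); fail.append(0); out.append(set())
                goto[s][ch] = len(goto) - 1
            s = goto[s][ch]
        out[s].add(p)
    # BFS from the root's children to compute failure links.
    q = deque(goto[0].values())
    while q:
        s = q.popleft()
        for ch, t in goto[s].items():
            q.append(t)
            f = fail[s]
            while f and ch not in goto[f]:
                f = fail[f]
            fail[t] = goto[f].get(ch, 0)
            out[t] |= out[fail[t]]  # inherit matches ending at the fallback state
    return goto, fail, out

def search(text, patterns):
    """Return (start_index, pattern) for every exact occurrence in text."""
    goto, fail, out = build_automaton(patterns)
    s, hits = 0, []
    for i, ch in enumerate(text):
        while s and ch not in goto[s]:
            s = fail[s]
        s = goto[s].get(ch, 0)
        for p in out[s]:
            hits.append((i - len(p) + 1, p))
    return hits
```

A single pass over the text reports all occurrences of all patterns, which is what makes the automaton attractive for large nucleotide datasets.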
Electroencephalogram (EEG) plays an essential role in analysing and recognizing brain-related diseases. EEG has been increasingly used as a new type of biometrics in person identification and verification systems. These EEG-based systems are important components in applications for both police and civilian work, and both areas process huge amounts of EEG data. Storing and transmitting such volumes of data is a significant challenge, typically addressed with data compression techniques. Lossy compression is used for EEG data as it provides a higher compression ratio (CR) than lossless techniques. However, lossy compression can negatively influence the performance of EEG-based person identification and verification systems through the loss of information in the reconstructed data. To address this, we propose introducing performance measures as additional features when evaluating lossy compression techniques for EEG data. Our research explores whether a common value of CR exists for different systems such that datasets with lossy compression provide almost the same system performance as datasets without it. We performed experiments on EEG-based person identification and verification systems using two large EEG datasets, CHB-MIT Scalp and Alcoholism, to investigate the relationship between standard lossy compression measures and our proposed system performance measures under two lossy compression techniques: discrete wavelet transform with adaptive arithmetic coding, and discrete wavelet transform with set partitioning in hierarchical trees. Our experimental results showed that a common value of CR exists for different systems: 70 for person identification systems and 50 for person verification systems.
"Biometric recognition system performance measures for lossy compression on EEG signals", Binh Nguyen, Wanli Ma and D. Tran. Log. J. IGPL, published 2020-09-09. doi:10.1093/jigpal/jzaa033
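The compression ratio (CR) that the study evaluates against can be illustrated with a toy lossy scheme. The sketch below is not the paper's wavelet coders, just a one-level Haar transform with detail-coefficient thresholding, and CR is measured by retained-coefficient count (an illustrative proxy, not a bitstream measure):

```python
def haar_1level(x):
    # One-level Haar DWT: pairwise averages (approximation) and differences (detail).
    avg = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x) - 1, 2)]
    det = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x) - 1, 2)]
    return avg, det

def lossy_compress(x, threshold):
    # Zero out small detail coefficients; this is where information is lost.
    avg, det = haar_1level(x)
    det_t = [d if abs(d) >= threshold else 0.0 for d in det]
    return avg, det_t

def reconstruct(avg, det):
    # Inverse one-level Haar transform.
    out = []
    for a, d in zip(avg, det):
        out.extend([a + d, a - d])
    return out

def compression_ratio(x, avg, det):
    # Coefficient-count proxy for CR: samples stored / coefficients kept.
    kept = len(avg) + sum(1 for d in det if d != 0.0)
    return len(x) / kept
```

Raising the threshold increases CR but degrades the reconstruction, which is exactly the trade-off the proposed performance measures quantify for biometric systems.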
Yeray Mezquita, Roberto Casado-Vara, Alfonso González-Briones, Javier Prieto, J. Corchado
Logistics services involve a wide range of transport operations between distributors and clients. Currently, the large number of intermediaries is a challenge for this sector, as it complicates all the processes. To address that problem, we propose a system that uses smart contracts to remove intermediaries and speed up logistics activities. Our new model combines smart contracts and a multi-agent system in a single platform that improves the current logistics system by increasing organization and security and by removing several human intermediaries to automate its processes, making distribution times significantly faster. This approach also makes it possible to apply penalties to parties that do not comply with the platform's terms of use.
"Blockchain-based architecture for the control of logistics activities: Pharmaceutical utilities case study". Log. J. IGPL, published 2020-09-08. doi:10.1093/jigpal/jzaa039
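The penalty mechanism mentioned above is not specified in the abstract; as a hedged illustration of what such a contract clause could encode, a hypothetical late-delivery settlement rule might look like this (all names and terms are invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class DeliveryContract:
    # Hypothetical contract terms, not taken from the paper.
    price: float             # amount owed on on-time delivery
    deadline_hours: float    # agreed delivery window
    penalty_per_hour: float  # deduction per hour of delay

    def settle(self, actual_hours: float) -> float:
        """Amount paid to the carrier once delivery is confirmed."""
        delay = max(0.0, actual_hours - self.deadline_hours)
        return max(0.0, self.price - delay * self.penalty_per_hour)
```

In a smart-contract deployment, logic of this shape would execute automatically on-chain when delivery is confirmed, which is what removes the human intermediary from settlement.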
G. Crişan, C. Pintea, A. Calinescu, Corina Pop Sitar, P. Pop
Meeting security requirements in transportation is nowadays a must. Intelligent transport systems (ITSs) provide the support for addressing this challenge, thanks to their ability to make real-time adaptive decisions. We propose a new variant of the travelling salesman problem (TSP) that integrates security constraints inspired by ITSs. This optimization problem, called the secure TSP, imposes a set of security constraints on its integer variables. Similarities with fuzzy logic are presented alongside the mathematical model of the introduced TSP variant.
"Secure traveling salesman problem with intelligent transport systems features". Log. J. IGPL, published 2020-09-08. doi:10.1093/jigpal/jzaa035
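The abstract does not reproduce the mathematical model, but the flavour of a security-constrained TSP can be conveyed with a brute-force sketch in which every edge carries a security level and a tour is feasible only if all its edges meet a minimum level. This constraint form is an assumption for illustration, not the paper's formulation:

```python
from itertools import permutations

def secure_tsp(dist, secure, min_level):
    """Brute force: shortest closed tour from city 0 using only edges
    whose security level is >= min_level (illustrative constraint)."""
    n = len(dist)
    best_tour, best_len = None, float("inf")
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        edges = list(zip(tour, tour[1:]))
        if any(secure[a][b] < min_level for a, b in edges):
            continue  # tour violates the security constraint
        length = sum(dist[a][b] for a, b in edges)
        if length < best_len:
            best_tour, best_len = tour, length
    return best_tour, best_len
```

Tightening `min_level` shrinks the feasible set and can only lengthen (or eliminate) the optimal tour, which is the price paid for security.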
We prove the equivalence of the semantic version of Tarski’s theorem on the undefinability of truth with the semantic version of the diagonal lemma and also show the equivalence of a syntactic version of Tarski’s undefinability theorem with a weak syntactic diagonal lemma. We outline two seemingly diagonal-free proofs for these theorems from the literature and show that the syntactic version of Tarski’s theorem can deliver Gödel–Rosser’s incompleteness theorem.
"Tarski's Undefinability Theorem and the Diagonal Lemma", Saeed Salehi. Log. J. IGPL, published 2020-09-01. doi:10.1093/jigpal/jzab016
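For context, the two results being related admit compact standard statements; these are the classical textbook formulations, not necessarily the exact semantic and syntactic variants studied in the paper:

```latex
% Diagonal lemma: for every formula \varphi(x) with one free variable,
% there is a sentence \sigma such that
T \vdash \sigma \leftrightarrow \varphi(\ulcorner \sigma \urcorner).
% Tarski's undefinability theorem (semantic form): there is no formula
% \mathrm{Tr}(x) such that, for every sentence \sigma,
\mathbb{N} \models \sigma \leftrightarrow \mathrm{Tr}(\ulcorner \sigma \urcorner),
% where \ulcorner \cdot \urcorner denotes the Gödel numbering.
```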
Miguel Pérez-Gaspar, Alejandro Hernández-Tello, J. R. A. Ramírez, Mauricio Osorio
In memoriam José Arrazola Ramírez (1962–2018). The logic $\textbf{G}^{\prime}_3$ was introduced by Osorio et al. in 2008; it is a three-valued logic, closely related to the paraconsistent logic $\textbf{CG}^{\prime}_3$ introduced by Osorio et al. in 2014. The logic $\textbf{CG}^{\prime}_3$ is defined in terms of a multi-valued semantics and has the property that each theorem of $\textbf{G}^{\prime}_3$ is a theorem of $\textbf{CG}^{\prime}_3$. Kripke-type semantics were given to $\textbf{CG}^{\prime}_3$ in two different ways by Borja et al. in 2016. In this work, we continue the study of $\textbf{CG}^{\prime}_3$, obtaining a Hilbert-type axiomatic system and proving a soundness and completeness theorem for this logic.
"An axiomatic approach to CG′3 logic". Log. J. IGPL, published 2020-07-27. doi:10.1093/jigpal/jzaa014
Gonzalo de la Torre-Abaitua, L. F. Lago-Fernández, David Arroyo
In cybersecurity, there is a call for adaptive, accurate and efficient procedures for identifying performance shortcomings and security breaches. The increasing complexity of both Internet services and traffic creates a scenario that in many cases impedes the proper deployment of intrusion detection and prevention systems. Although it is common practice to monitor network and application activity, there is no general methodology for codifying and interpreting the recorded events. Moreover, this lack of methodology erodes the possibility of diagnosing whether event detection and recording are adequately performed. As a result, there is an urgent need for general codification and classification procedures that can be applied to any type of security event in any activity log. This work focuses on defining such a method using the normalized compression distance (NCD). The NCD is parameter-free and can be applied to determine the distance between events expressed as strings. As a first step towards a methodology for the integral interpretation of security events, this work is devoted to the characterization of web logs. On the grounds of the NCD, we propose an anomaly-based procedure for identifying web attacks in web logs. Given a web query as stored in a security log, an NCD-based feature vector is created and classified using a support vector machine. The method is tested on the CSIC-2010 data set, and the results are analysed with respect to similar proposals.
"On the application of compression-based metrics to identifying anomalous behaviour in web traffic". Log. J. IGPL, published 2020-07-24. doi:10.1093/jigpal/jzz062
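The NCD itself has a compact standard definition: for a compressor C, NCD(x, y) = (C(xy) − min(C(x), C(y))) / max(C(x), C(y)), where C(·) is the compressed size. A minimal sketch using zlib as the compressor (the paper does not commit to zlib; that choice is an assumption here):

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance with zlib as the compressor C.
    Smaller values indicate more shared structure between x and y."""
    cx = len(zlib.compress(x, 9))
    cy = len(zlib.compress(y, 9))
    cxy = len(zlib.compress(x + y, 9))
    return (cxy - min(cx, cy)) / max(cx, cy)
```

Because a real compressor is only an approximation of Kolmogorov complexity, NCD(x, x) is close to but not exactly 0; what matters for anomaly detection is that similar log entries score markedly lower than unrelated ones.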
Esteban Jove, J. Casteleiro-Roca, Héctor Quintián-Pardo, D. Simić, J. A. M. Pérez, J. Calvo-Rolle
A large part of technological advances, especially in industry, has focused on the optimization of production processes. However, the detection of anomalies remains a great challenge in fields like industry, medicine and stock markets. The present work addresses anomaly detection on a control level plant. We propose the application of different intelligent techniques that allow one-class classifiers to be obtained from real data collected during correct plant operation. The performance of each classifier is assessed and validated against real, deliberately induced faults, achieving successful overall results.
"Anomaly detection based on one-class intelligent techniques over a control level plant". Log. J. IGPL, published 2020-07-24. doi:10.1093/jigpal/jzz057
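The abstract does not name the one-class techniques used; the simplest member of that family, a centroid-distance detector fitted only on normal-operation samples, can be sketched as follows (purely illustrative, not the authors' classifiers; the margin factor is an assumption):

```python
import math

class CentroidDetector:
    """Minimal one-class classifier: learn the centroid of normal-operation
    samples and flag points farther away than the worst training distance
    inflated by a small margin."""

    def fit(self, samples, margin=1.1):
        n, dim = len(samples), len(samples[0])
        self.center = [sum(s[i] for s in samples) / n for i in range(dim)]
        self.radius = margin * max(self._dist(s) for s in samples)
        return self

    def _dist(self, p):
        # Euclidean distance to the learned centroid.
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, self.center)))

    def is_anomaly(self, p):
        return self._dist(p) > self.radius
```

The key property shared with the techniques in the paper is that only normal data is needed for training; faults are detected as departures from the learned region.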
This paper presents a support information management system for a wind power (WP) producer that owns an energy storage system (ESS) and participates in a day-ahead electricity market. Energy storage can not only play a leading role in mitigating the effect of the uncertainty faced by a WP producer, but also allows wind energy to be converted into electric energy, stored, and then released at favourable hours. This storage provides a capability for arbitrage, allowing an increase in the profit of a WP producer, but it must be supported by a suitable problem formulation. The formulation proposed for the support information management system is a stochastic approach written as a mixed-integer linear programming (MILP) problem. WP and market prices are treated as stochastic processes represented by a set of scenarios. The charging/discharging of the ESS depends on the scenarios of market prices and of WP. The effectiveness of the proposed formulation is tested by comparing case studies using data from the Iberian Electricity Market. The comparison is in favour of the proposed treatment of stochasticity.
"Wind Power with Energy Storage Arbitrage in Day-ahead Market by a Stochastic MILP Approach", I. Gomes, R. Melício, V. Mendes and H. Pousinho. Log. J. IGPL, published 2020-07-24. doi:10.1093/jigpal/jzz054
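A generic scenario-based expected-profit objective of the kind described, with a storage energy balance, can be written as follows; the symbols and the single constraint shown are a sketch of the standard formulation, not the paper's exact MILP:

```latex
\max \; \sum_{s} \pi_{s} \sum_{t} \lambda_{s,t}
   \left( P^{\mathrm{W}}_{s,t} + P^{\mathrm{dis}}_{s,t} - P^{\mathrm{ch}}_{s,t} \right)
\quad \text{s.t.} \quad
E_{s,t} = E_{s,t-1} + \eta^{\mathrm{ch}} P^{\mathrm{ch}}_{s,t}
        - \frac{P^{\mathrm{dis}}_{s,t}}{\eta^{\mathrm{dis}}},
```
where $\pi_s$ is the probability of scenario $s$, $\lambda_{s,t}$ the market price, $P^{\mathrm{W}}$ the wind output, $P^{\mathrm{ch}}/P^{\mathrm{dis}}$ the ESS charging/discharging power, $E$ the stored energy, and $\eta^{\mathrm{ch}}/\eta^{\mathrm{dis}}$ the charge/discharge efficiencies. Binary commitment variables and power limits make the full problem a MILP.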