Pub Date: 2013-09-08 | DOI: 10.1109/BRICS-CCI-CBIC.2013.42
I. Batyrshin
It is surprising that over the last two decades many works in time series data mining and clustering have been concerned with measures of similarity between time series, but not with measures of association that can capture possible direct and inverse relationships between them. Inverse relationships can exist between the dynamics of prices and sales volumes, between the growth patterns of competing companies, between well production data in oilfields, or between wind velocity and air pollution concentration. The paper develops a theoretical basis for the analysis and construction of time series shape association measures. Starting from the axioms of time series shape association measures, it studies methods for constructing measures that satisfy these axioms. Several general methods for constructing such measures, suitable for measuring both time series shape similarity and shape association, are proposed. Time series shape association measures based on the Minkowski distance and data standardization methods are considered. Cosine similarity and Pearson's correlation coefficient are obtained as special cases of the proposed general methods, which can also be used to construct new association measures in data analysis.
{"title":"Constructing Time Series Shape Association Measures: Minkowski Distance and Data Standardization","authors":"I. Batyrshin","doi":"10.1109/BRICS-CCI-CBIC.2013.42","DOIUrl":"https://doi.org/10.1109/BRICS-CCI-CBIC.2013.42","url":null,"abstract":"It is surprising that last two decades many works in time series data mining and clustering were concerned with measures of similarity of time series but not with measures of association that can be used for measuring possible direct and inverse relationships between time series. Inverse relationships can exist between dynamics of prices and sell volumes, between growth patterns of competitive companies, between well production data in oilfields, between wind velocity and air pollution concentration etc. The paper develops a theoretical basis for analysis and construction of time series shape association measures. Starting from the axioms of time series shape association measures it studies the methods of construction of measures satisfying these axioms. Several general methods of construction of such measures suitable for measuring time series shape similarity and shape association are proposed. Time series shape association measures based on Minkowski distance and data standardization methods are considered. The cosine similarity and the Pearson's correlation coefficient are obtained as partial cases of the proposed general methods that can be used also for construction of new association measures in data analysis.","PeriodicalId":306195,"journal":{"name":"2013 BRICS Congress on Computational Intelligence and 11th Brazilian Congress on Computational Intelligence","volume":"32 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130214966","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2013-09-08 | DOI: 10.1109/BRICS-CCI-CBIC.2013.101
Vinicius Prado da Fonseca, P. Rosa
An RSSI-based localization system for a home wireless sensor network is proposed in this work. In order to support a robot assistant in pick-and-place tasks, our current system is capable of estimating the location of an object using the signal strength received by a mobile device in a ZigBee sensor network. Two models were used: (a) log-distance path loss, in which signal loss has a random component with a log-normal distribution, and (b) free-space decay, based on the decay law for a signal in open space. RSSI measurements were taken in the laboratory to apply the estimation method. Moreover, experiments with satisfactory results were performed on a public dataset to benchmark our results.
{"title":"Tracking Objects in a Smart Home","authors":"Vinicius Prado da Fonseca, P. Rosa","doi":"10.1109/BRICS-CCI-CBIC.2013.101","DOIUrl":"https://doi.org/10.1109/BRICS-CCI-CBIC.2013.101","url":null,"abstract":"A RSSI-based localization system on a home wireless sensor network is proposed in this work. In order to support a robot assistant in pick-and-place tasks, our current system is capable of estimating the localization of an object using the signal strength received by a mobile device in a ZigBee sensor network. Two models were utilized (a) log-distance path loss - model in which signal lost has a random influence with log-normal distribution, and (b) free space decay law - based on the decay law for a signal on an open space. RSSI measurements were done in laboratory for applying the estimation method. Moreover experiments with satisfactory results were done with a public dataset to benchmark our results.","PeriodicalId":306195,"journal":{"name":"2013 BRICS Congress on Computational Intelligence and 11th Brazilian Congress on Computational Intelligence","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128691197","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2013-09-08 | DOI: 10.1109/BRICS-CCI-CBIC.2013.77
Bilzã Araújo, Liang Zhao
Network-based Semi-Supervised Learning (NbSSL) propagates labels through affinity networks by taking advantage of the network topology, much like information spreading in trust networks. In NbSSL, not only the unlabeled data instances but also the labeled ones can bias the classification performance. Herein, we present results and a discussion of this phenomenon. Even the suitability of the free parameters of NbSSL algorithms varies according to the available labeled data. Hence, we propose a method for selecting representative data instances to be labeled for NbSSL. In our view, the representativeness of a node is related to how inhomogeneous its profile is with respect to the whole network. The proposed method uses complex network centrality measures to identify which nodes present an inhomogeneous profile. We perform this study by applying three NbSSL algorithms to Girvan-Newman and Lancichinetti-Fortunato-Radicchi modular networks. In the former, nodes with a high clustering coefficient are good representatives of the data, while in the latter, nodes with high betweenness are the good representatives. A high clustering coefficient means that the node lies in a densely connected motif (clique), whereas a high betweenness means that the node interconnects the modular structures. These results reveal the possibility of improving NbSSL performance by selecting representative data instances for manual labeling.
{"title":"Selecting Nodes with Inhomogeneous Profile for Labeling for Network-Based Semi-supervised Learning","authors":"Bilzã Araújo, Liang Zhao","doi":"10.1109/BRICS-CCI-CBIC.2013.77","DOIUrl":"https://doi.org/10.1109/BRICS-CCI-CBIC.2013.77","url":null,"abstract":"Network-based Semi-Supervised Learning (NbSSL) propagates labels in affinity-networks by taking advantage of the network topology likewise information spreading in trust networks. In NbSSL, not only the unlabeled data instances, but also the labeled ones, are able to bias the classification performance. Herein, we show some results and discussion on this phenomenon. Even the suitability of the free parameters of the NbSSL algorithms varies according to the available labeled data. Indeed, we propose a method for selecting representative data instances for labeling for NbSSL. In our sense the represent ability of a node is related to how inhomogeneous is its profile concerning the whole network. The proposed method uses Complex Networks centrality measures to identify which nodes present inhomogeneous profile. We perform this study by applying three NbSSL algorithms on Girvan-Newman and Lancichinetti-Fortunato-Radicchi modular networks. In the former, the nodes with high clustering coefficient are good representatives of the data and the nodes with high betweenness are the good representatives ones in the later. A high clustering coefficient means that the node lies in a much connected motif (clique) whereas a high betweenness means that the node lies interconnecting the modular structures. These results reveal the ability to improve the NbSSL performance by selecting representative data instances for manual labeling.","PeriodicalId":306195,"journal":{"name":"2013 BRICS Congress on Computational Intelligence and 11th Brazilian Congress on Computational Intelligence","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130782090","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2013-09-08 | DOI: 10.1109/BRICS-CCI-CBIC.2013.65
Paulo Pereira, S. Leitão, E. Pires
This paper presents a study on the optimal supply of the energy service, using simulations of network operation scenarios, in order to optimize resources and minimize the following variables: operation cost, energy losses, generation cost and consumer shedding. These simulations create optimal operation models of the network, allowing the system operator to obtain the knowledge needed to establish procedures to be performed in contingency situations, in order to anticipate and minimize drawbacks. The simulations were performed using a multiobjective particle swarm optimization algorithm. The algorithm was applied to the IEEE 14-bus network, where the optimal power flow was evaluated with the MATPOWER tool to establish an optimal electrical operating model that minimizes the associated costs.
{"title":"State Operation Optimization in Electrical Networks","authors":"Paulo Pereira, S. Leitão, E. Pires","doi":"10.1109/BRICS-CCI-CBIC.2013.65","DOIUrl":"https://doi.org/10.1109/BRICS-CCI-CBIC.2013.65","url":null,"abstract":"This paper makes a study about optimal supply of the energy service, using simulations of network operation scenarios, in order to optimize resources and minimize the variables: operation cost, energy losses, generation cost and consumers shedding. These simulations create optimal operation models of the network, allowing the system operator obtain knowledge to take pre-established procedures that must be performed in situations of contingency in order to forecast and minimize drawbacks. The simulations were performed using a multiobjective particle swarm optimization algorithm. The algorithm was applied to the IEEE 14 Bus network where the optimal power flow was evaluated by MATPOWER tool to establish an optimal electrical working model to minimize the associated costs.","PeriodicalId":306195,"journal":{"name":"2013 BRICS Congress on Computational Intelligence and 11th Brazilian Congress on Computational Intelligence","volume":"36 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128504609","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2013-09-08 | DOI: 10.1109/BRICS-CCI-CBIC.2013.96
C. Silva, Frederico Gadelha Guimarães
Power distribution network reconfiguration demands changing the current state of the network in order to reach optimal operation according to some previously defined figures of merit. This paper presents a new methodology based on multi-agent systems for power distribution network reconfiguration aimed at minimizing power losses, based on game theory. The principal characteristic of the game is the interpretation of the payoff matrix as having physical meaning, which allows better decisions to be taken in order to improve the overall performance of the network. Test cases with 100 buses/1 feeder and 83 buses/11 feeders in operating mode were taken as examples of application of the proposed algorithm and to illustrate its success.
{"title":"Reconfiguration of Power Distribution Networks by Multi-agent Systems","authors":"C. Silva, Frederico Gadelha Guimarães","doi":"10.1109/BRICS-CCI-CBIC.2013.96","DOIUrl":"https://doi.org/10.1109/BRICS-CCI-CBIC.2013.96","url":null,"abstract":"Power Distribution Network Reconfiguration demands the change of current state of the network in order to reach optimal operation according to some previouly defined figures of merit. This paper presents a new methodology based on Multi-Agent Systems for power distribution network reconfiguration aiming at minimizing power losses based on game theory. The principal characteristic of the game is the interpretation of the payoff matrix as having physical meaning. This way allowed better decisions to be taken in order to improve the overall performance of the network. Test cases with 100 buses/1 feeder and 83 buses/11 feeders in operation mode were taken as example of application of the proposed algorithm and to illustrate its success.","PeriodicalId":306195,"journal":{"name":"2013 BRICS Congress on Computational Intelligence and 11th Brazilian Congress on Computational Intelligence","volume":"97 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132739481","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2013-09-08 | DOI: 10.1109/BRICS-CCI-CBIC.2013.69
Y. Orlova
The paper describes a computer-aided system for automating the initial stages of multi-component software design, based on the semantic analysis of text. Automation leads to an increase in the quality of software development. The paper considers an approach to automating the initial stages of software design. Program tools are developed that provide automated semantic analysis of technical documentation, automated construction of models, and synthesis of the structure and natural-language description of the software.
{"title":"Approach to Automation of the Initial Stages of Software Design","authors":"Y. Orlova","doi":"10.1109/BRICS-CCI-CBIC.2013.69","DOIUrl":"https://doi.org/10.1109/BRICS-CCI-CBIC.2013.69","url":null,"abstract":"The paper describes the computer-aided system for automation of the initial stages of multi-component software design, which is based on the semantic analysis of text. Automation lead to increasing in the quality of software development. The paper considers an approach to automation of the initial stages of software design. Program tools, which provide automated semantic analysis of technical documentation, automated construction of models, synthesis of structure and natural language description of the program software, are developed.","PeriodicalId":306195,"journal":{"name":"2013 BRICS Congress on Computational Intelligence and 11th Brazilian Congress on Computational Intelligence","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129319471","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2013-09-08 | DOI: 10.1109/BRICS-CCI-CBIC.2013.51
J. Grobler, A. Engelbrecht
This paper investigates various strategies for the management of solution space diversity within the context of a meta-hyperheuristic algorithm. The adaptive local search meta-hyperheuristic (ALSHH), which applies a local search algorithm whenever the population diversity strays outside a predetermined solution space diversity profile, is proposed. ALSHH was shown to compare favourably with algorithms that apply local search and diversity maintenance strategies at constant intervals throughout the optimization run. Good performance is also demonstrated with respect to two other popular multi-method algorithms.
{"title":"Solution Space Diversity Management in a Meta-hyperheuristic Framework","authors":"J. Grobler, A. Engelbrecht","doi":"10.1109/BRICS-CCI-CBIC.2013.51","DOIUrl":"https://doi.org/10.1109/BRICS-CCI-CBIC.2013.51","url":null,"abstract":"This paper investigates various strategies for the management of solution space diversity within the context of a meta-hyper heuristic algorithm. The adaptive local search meta-hyper heuristic (ALSHH), which adaptively applies a local search algorithm when the population diversity strays outside a predetermined solution space diversity profile, is proposed. ALSHH was shown to compare favourably with algorithms making use of local search and diversity maintenance strategies applied at constant intervals throughout the optimization run. Good performance is also demonstrated with respect to two other popular multi-method algorithms.","PeriodicalId":306195,"journal":{"name":"2013 BRICS Congress on Computational Intelligence and 11th Brazilian Congress on Computational Intelligence","volume":"213 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132129406","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2013-09-08 | DOI: 10.1109/BRICS-CCI-CBIC.2013.92
T. F. Oliveira, Ricardo T. A. De Oliveira, P. Firmino, Paulo S. G. de Mattos Neto, T. Ferreira
Artificial neural networks (ANNs) have been paramount for modeling and forecasting time series phenomena. It has been usual to suppose that each ANN model generates white noise as its prediction error. However, mostly because of disturbances not captured by each model, this supposition may be violated. On the other hand, adopting a single ANN model may lead to statistical bias and underestimation of uncertainty. The present paper introduces a two-step maximum likelihood method for correcting and combining ANN models. Applications involving single ANN models for the Dow Jones Industrial Average index and the S&P 500 series illustrate the usefulness of the proposed framework.
{"title":"Combination of Biased Artificial Neural Network Forecasters","authors":"T. F. Oliveira, Ricardo T. A. De Oliveira, P. Firmino, Paulo S. G. de Mattos Neto, T. Ferreira","doi":"10.1109/BRICS-CCI-CBIC.2013.92","DOIUrl":"https://doi.org/10.1109/BRICS-CCI-CBIC.2013.92","url":null,"abstract":"Artificial neural networks (ANN) have been paramount for modeling and forecasting time series phenomena. In this way it has been usual to suppose that each ANN model generates a white noise as prediction error. However, mostly because of disturbances not captured by each model, it is yet possible that such supposition is violated. On the other hand, to adopt a single ANN model may lead to statistical bias and underestimation of uncertainty. The present paper introduces a two-step maximum likelihood method for correcting and combining ANN models. Applications involving single ANN models for Dow Jones Industrial Average Index and S&P500 series illustrate the usefulness of the proposed framework.","PeriodicalId":306195,"journal":{"name":"2013 BRICS Congress on Computational Intelligence and 11th Brazilian Congress on Computational Intelligence","volume":"28 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129183222","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2013-09-08 | DOI: 10.1109/BRICS-CCI-CBIC.2013.39
M. Perkusich, A. Perkusich, Hyggo Oliveira de Almeida
Recently, Bayesian networks have become a popular technique to represent knowledge about uncertain domains and have been successfully used in applications in various areas. Even though there are several cases of success and Bayesian networks have proved capable of representing uncertainty in many different domains, there are still two significant barriers to building large-scale Bayesian networks: building the Directed Acyclic Graph (DAG) and the Node Probability Tables (NPTs). In this paper, we focus on the second barrier and present a method that generates NPTs through weighted expressions built from data collected from domain experts through a survey. Our method is limited to Bayesian networks composed only of ranked nodes. It consists of five steps: (i) define the network's DAG, (ii) run the survey, (iii) order the NPTs' relationships given their relative magnitudes, (iv) generate weighted functions and (v) generate the NPTs. The advantage of our method, compared with existing ones that use weighted expressions to generate NPTs, is the ability to quickly collect data from domain experts located around the world. We describe one case in which the method was used for validation purposes and show that it requires less time from each domain expert than other existing methods.
{"title":"Using Survey and Weighted Functions to Generate Node Probability Tables for Bayesian Networks","authors":"M. Perkusich, A. Perkusich, Hyggo Oliveira de Almeida","doi":"10.1109/BRICS-CCI-CBIC.2013.39","DOIUrl":"https://doi.org/10.1109/BRICS-CCI-CBIC.2013.39","url":null,"abstract":"Recently, Bayesian networks became a popular technique to represent knowledge about uncertain domains and have been successfully used for applications in various areas. Even though there are several cases of success and Bayesian networks have been proved to be capable of representing uncertainty in many different domains, there are still two significant barriers to build large-scale Bayesian networks: building the Directed Acyclic Graph (DAG) and the Node Probability Tables (NPTs). In this paper, we focus on the second barrier and present a method that generates NPTs through weighted expressions generated using data collected from domain experts through a survey. Our method is limited to Bayesian networks composed only of ranked nodes. It consists of five steps: (i) define network's DAG, (ii) run the survey, (iii) order the NPTs' relationships given their relative magnitudes, (iv) generate weighted functions and (v) generate NPTs. The advantage of our method, comparing with existing ones that use weighted expressions to generate NPTs, is the ability to quickly collect data from domain experts located around the world. We describe one case in which the method was used for validation purposes and showed that this method requires less time from each domain expert than other existing methods.","PeriodicalId":306195,"journal":{"name":"2013 BRICS Congress on Computational Intelligence and 11th Brazilian Congress on Computational Intelligence","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129114171","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2013-09-08 | DOI: 10.1109/BRICS-CCI-CBIC.2013.78
Rodrigo Silva, H. S. Lopes, W. Godoy
Vehicle Ad hoc Networks (VANETs) provide an opportunity for innovation in the transportation area, enabling services for Intelligent Transportation Systems (ITS). Because of VANET features such as a highly dynamic network topology and frequent disconnections, it is desirable to establish, at a given moment, routes that deliver messages quickly while having a low probability of disconnection. This leads to a multiobjective problem. In this work we propose a multiobjective heuristic algorithm, based on ACO (Ant Colony Optimization), to find routes considering the best trade-off between the shortest path (number of nodes in a route) and the lowest probability of disconnection. Simulations were done with three different scenarios: static routing, static routing with obstacles, and dynamic routing. Results were very promising, were obtained with small computational effort, and allow the use of the algorithm for real-time optimization.
{"title":"A Heuristic Algorithm Based on Ant Colony Optimization for Multi-objective Routing in Vehicle Ad Hoc Networks","authors":"Rodrigo Silva, H. S. Lopes, W. Godoy","doi":"10.1109/BRICS-CCI-CBIC.2013.78","DOIUrl":"https://doi.org/10.1109/BRICS-CCI-CBIC.2013.78","url":null,"abstract":"Vehicle Ad hoc Network (VANET) provides an opportunity for innovation in the transportation area, enabling services for Intelligent Transportation System (ITS). Because of VANET features, such as highly dynamic networks topology and frequent discontinuity, it is desirable to establish, at a given moment, routes for fast delivery of messages, having a low probability of disconnection. This leads to a multiobjective problem. In this work we propose multiobjective heuristic algorithm, based on ACO (Ant Colony Optimization) to find routes considering the best commitment between the shortest path (number of nodes in a route) and the lowest probability of disconnection. Simulations were done with three different scenarios: static routing, static routing with obstacles, and dynamic routing. Results were very promising, obtained with small computational effort, and allowing the use of the algorithm for real-time optimization.","PeriodicalId":306195,"journal":{"name":"2013 BRICS Congress on Computational Intelligence and 11th Brazilian Congress on Computational Intelligence","volume":"32 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131959642","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}