Ontology and semantic wiki for an Intangible Cultural Heritage inventory
Pub Date: 2013-11-21 | DOI: 10.1109/CLEI.2013.6670653
Renzo Stanley, H. Astudillo
Increasing globalization has made the preservation of Intangible Cultural Heritage (ICH) an urgent need, and UNESCO's states parties have committed to building collaborative ICH inventories. Many traditional inventories become obsolete quickly because they rely on rigid data models and/or because data acquisition requires scarce specialists. This article proposes implementing a participative inventory as a semantic wiki, combining the audience's familiarity with free text, the expressive power of ontologies, and the benefits of wikis for controlled social enrichment. The approach has been validated in a publicly accessible pilot inventory that combines an OntoWiki instance, a new ontology based on UNESCO's classification, and the data of an existing Chilean ICH inventory. Preliminary results indicate that the semantic-based catalog is much more flexible than existing traditional catalogs. This collaborative enrichment will enable citizens to become actively involved in recording their intangible heritage.
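The paper's actual ontology is not reproduced in the abstract. As a rough illustration of an ontology keyed to UNESCO's five ICH domains, here is a minimal sketch using Python's rdflib; the namespace IRI, class names, and the sample record are assumptions for illustration only, not the authors' schema.

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS

# Hypothetical namespace; the paper's real ontology IRI is not given.
ICH = Namespace("http://example.org/ich#")

g = Graph()
g.bind("ich", ICH)

# UNESCO's 2003 Convention names five broad ICH domains; model them
# as subclasses of a generic HeritageElement class.
g.add((ICH.HeritageElement, RDF.type, RDFS.Class))
for domain in ["OralTradition", "PerformingArt", "SocialPractice",
               "NatureKnowledge", "TraditionalCraftsmanship"]:
    g.add((ICH[domain], RDFS.subClassOf, ICH.HeritageElement))

# A sample inventory record, the kind of resource a semantic-wiki page
# (e.g., in OntoWiki) would expose for collaborative enrichment.
g.add((ICH.CuecaDance, RDF.type, ICH.PerformingArt))
g.add((ICH.CuecaDance, RDFS.label, Literal("Cueca dance", lang="en")))

print(g.serialize(format="turtle"))
```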
{"title":"Ontology and semantic wiki for an Intangible Cultural Heritage inventory","authors":"Renzo Stanley, H. Astudillo","doi":"10.1109/CLEI.2013.6670653","DOIUrl":"https://doi.org/10.1109/CLEI.2013.6670653","url":null,"abstract":"The increasing globalization has made the preservation of the Intangible Cultural Heritage (ICH) an urgent need, and the UNESCO's states parties have compromised to make collaborative inventories of ICH. Many traditional inventories become obsolete quickly because they present rigid data models and/or because data adquisition from scarce specialists are required. This article proposes to introduce a participative inventory as a semantic wiki, combining the familiarity of the audience with the free text, the expressive power of ontologies, and the benefits of wikis for the controled social enrichment. The scope has been validated in a pilot inventory with public access, that combine an Ontowiki instance, a new ontology based on UNESCO's clasification, and the data of an existent Chilean ICH inventory. Preliminary results indicate that the semantic-based catalog is much more flexible than existing traditional catalogs. Using these collaborative enrichment will enable active involvement to citizenship in recording their immaterial heritage.","PeriodicalId":184399,"journal":{"name":"2013 XXXIX Latin American Computing Conference (CLEI)","volume":"133 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-11-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127305500","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Calibration of the parameters of ESS system for Forest Fire prediction
Pub Date: 2013-11-21 | DOI: 10.1109/CLEI.2013.6670617
G. Bianchini, Paola Caymes-Scutari
Forest fires are a major risk factor with strong ecological-environmental and socio-economic impact, which makes their study and modeling very important. However, the models frequently carry a certain level of uncertainty in some input parameters, which must be approximated or estimated because of the difficulty of accurately measuring the conditions of the phenomenon in real time. This has led to the development of several uncertainty-reduction methods, whose trade-off between accuracy and complexity can vary significantly. The Evolutionary-Statistical System (ESS) is a method that aims to reduce this uncertainty by combining Statistical Analysis, High Performance Computing (HPC), and Parallel Evolutionary Algorithms (PEAs). PEAs use several parameters that require adjustment and that determine the quality of the results; calibrating these parameters is crucial for reaching good performance. This paper presents an empirical study of parameter tuning to evaluate the effectiveness of different configurations and their impact on forest fire prediction.
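The abstract does not spell out ESS's tuning procedure. The following is a minimal, hypothetical sketch of the kind of configuration sweep such a calibration involves, with a toy one-max evolutionary algorithm standing in for an ESS prediction run; the parameter grid, fitness, and all numeric values are illustrative assumptions, not the paper's setup.

```python
import random
from itertools import product

def run_ea(pop_size, mutation_rate, generations=50, n_genes=10, seed=0):
    """Toy EA maximizing the number of 1-bits. It stands in for one ESS
    run; the real system evolves candidate input-parameter vectors for
    a fire-spread model."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(generations):
        best = max(pop, key=sum)
        pop = [[(1 - g) if rng.random() < mutation_rate else g for g in best]
               for _ in range(pop_size)]
        pop.append(best)  # elitism: never lose the best individual
    return sum(max(pop, key=sum)) / n_genes  # solution quality in [0, 1]

# Exhaustive sweep over a small configuration grid, averaged over seeds,
# mirroring the idea of evaluating the effectiveness of each configuration.
for pop_size, mut in product([10, 20, 40], [0.01, 0.05, 0.1]):
    quality = sum(run_ea(pop_size, mut, seed=s) for s in range(5)) / 5
    print(f"pop={pop_size:3d} mut={mut:.2f} -> avg quality {quality:.3f}")
```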
{"title":"Calibration of the parameters of ESS system for Forest Fire prediction","authors":"G. Bianchini, Paola Caymes-Scutari","doi":"10.1109/CLEI.2013.6670617","DOIUrl":"https://doi.org/10.1109/CLEI.2013.6670617","url":null,"abstract":"Forest fires are a major risk factor with strong impact at ecological-environmental and socio-economical levels, reasons why their study and modeling is very important. However, the models frequently have a certain level of uncertainty in some input parameters given that they must be approximated or estimated, as a consequence of diverse difficulties to accurately measure the conditions of the phenomenon in real time. This has resulted in the development of several methods of uncertainty reduction, whose trade-off between accuracy and complexity can vary significantly. The system ESS (Evolutionary-Statistical System) is a method whose aim is to reduce the uncertainty, by combining Statistical Analysis, High Performance Computing (HPC) and Parallel Evolutionary Algorithms (PEA). The PEA use several parameters that require adjustment and that determine the quality of their use. The calibration of the parameters is a crucial task for reaching a good performance. This paper presents an empirical study of the parameters tuning to evaluate the effectiveness of different configurations and the impact on their use in the Forest Fires prediction.","PeriodicalId":184399,"journal":{"name":"2013 XXXIX Latin American Computing Conference (CLEI)","volume":"94 51","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-11-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114052314","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Towards a model for the development and assessment of competencies through formative projects
Pub Date: 2013-11-21 | DOI: 10.1109/CLEI.2013.6670666
S. Cardona, J. Velez, Sergio Tobón
The competence-based formation model has oriented educational policies in different countries during the last decades. The socio-formative approach builds on this model and serves as a referent in Latin America for orienting formation and competence assessment. It uses the methodology of formative projects: an articulated set of pedagogical strategies applied over time to solve problems of the context. This paper presents a model for competence development and assessment using formative projects as part of a technological architecture that personalizes the students' educational experience through a virtual learning environment. The model is structured according to the pedagogical guidelines of the University Corporation CIFE, in Mexico. Finally, a case study is presented to validate the proposed model with the formation methodology in a virtual CIFE course.
{"title":"Towards a model for the development and assessment of competencies through formative projects","authors":"S. Cardona, J. Velez, Sergio Tobón","doi":"10.1109/CLEI.2013.6670666","DOIUrl":"https://doi.org/10.1109/CLEI.2013.6670666","url":null,"abstract":"The competence-based formation model has oriented educational policies in different countries during the last decades. From this model, the socio-formative approach has its basis and works as a referent in Latin America to orient the formation and competence assessment. The socio-formative approach uses the methodology of the formative projects by means of an articulated set of pedagogical strategies that are used along the time to solve the problems of the context. This paper presents a model for competences development and assessment using formative projects as part of a technological architecture to personalize the students' educational experience through a virtual learning environment. The model is structured considering the pedagogical guidelines of the University Corporation CIFE, in Mexico. Finally, a case study is presented to validate the proposed model with formation methodology in a virtual course in the CIFE.","PeriodicalId":184399,"journal":{"name":"2013 XXXIX Latin American Computing Conference (CLEI)","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-11-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123196220","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
About fuzzy query processing
Pub Date: 2013-11-21 | DOI: 10.1109/CLEI.2013.6670605
José Tomás Cadenas, Leonid José Tineo Rodríguez, R. T. Pereira
Traditional Relational Database Management System (RDBMS) query languages suffer from rigidity. There are proposals that address this problem using fuzzy sets, such as SQLf. The trouble with fuzzy query languages is that they have been implemented with loosely or medium coupled architectures, adding undesired overhead. This paper presents the theoretical and pragmatic foundations for providing SQLf with tight coupling, together with an original experimental study that demonstrates the performance of the tight coupling approach for fuzzy querying.
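SQLf itself is not detailed in the abstract. The Python sketch below only illustrates the semantics of a fuzzy predicate with a satisfaction threshold; the table, the membership bounds, and the commented query syntax are approximations for illustration, and it says nothing about the coupling architecture, which is the paper's actual concern.

```python
def mu_well_paid(salary, low=1500.0, high=2500.0):
    """Membership degree of 'well paid': 0 below low, 1 above high,
    linear in between (bounds are invented for the example)."""
    if salary <= low:
        return 0.0
    if salary >= high:
        return 1.0
    return (salary - low) / (high - low)

employees = [("Ana", 1400), ("Luis", 2000), ("Rosa", 2600), ("Juan", 2300)]

# Roughly an SQLf-style query (syntax approximated):
#   SELECT name FROM employees WHERE salary = well_paid THRESHOLD 0.5
threshold = 0.5
answers = [(name, mu_well_paid(sal)) for name, sal in employees
           if mu_well_paid(sal) >= threshold]
for name, mu in sorted(answers, key=lambda t: -t[1]):
    print(f"{name}: satisfaction {mu:.2f}")
```

With tight coupling, this membership evaluation would run inside the RDBMS engine rather than in an external layer that post-filters crisp results, which is the overhead the paper aims to remove.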
{"title":"About fuzzy query processing","authors":"José Tomás Cadenas, Leonid José Tineo Rodríguez, R. T. Pereira","doi":"10.1109/CLEI.2013.6670605","DOIUrl":"https://doi.org/10.1109/CLEI.2013.6670605","url":null,"abstract":"Traditional Relational Database Management System (RDBMS) query languages suffer from rigidity. There are some proposals for solving this problem using fuzzy sets, such as SQLf. The trouble with fuzzy querying languages is that they have been proposed to be implemented with loosely or middle coupling architecture, adding undesired overhead. This paper deals with the theoretical and pragmatic foundations for providing SQLf with tight coupling. We present an original experimental study that evidences performance of tight coupling approach for fuzzy querying.","PeriodicalId":184399,"journal":{"name":"2013 XXXIX Latin American Computing Conference (CLEI)","volume":"5 3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-11-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123898238","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Methodology for detecting the feasibility of using data mining in an organization
Pub Date: 2013-11-21 | DOI: 10.1109/CLEI.2013.6670649
C. Rangel, F. Pacheco, J. Aguilar, Mariela Cerrada-Lozada, J. Altamiranda
This paper proposes a methodology to identify the feasibility of applying Data Mining (DM) techniques in an organization (institution or enterprise). The methodology can be applied when there are different sources of data and knowledge and their interactions are not well identified or defined. It consists of five phases defined to characterize the enterprise processes, their relationships, their actors, and the sources of knowledge and data. As a final result, the methodology identifies the problems of the institution/company that can be improved with DM tasks. It selects the processes of interest based on prioritization and identifies the DM tasks to perform, from the present and expected future scenarios. The use of this proposal is illustrated in two different types of organization: health and petroleum.
{"title":"Methodology for detecting the feasibility of using data mining in an organization","authors":"C. Rangel, F. Pacheco, J. Aguilar, Mariela Cerrada-Lozada, J. Altamiranda","doi":"10.1109/CLEI.2013.6670649","DOIUrl":"https://doi.org/10.1109/CLEI.2013.6670649","url":null,"abstract":"This paper proposes a methodology to identify the feasibility of applying Data Mining techniques (DM) in an organization (institutions or enterprises). This methodology can be applied when there are different sources of data and knowledge, and their interactions are not well identified or defined. The methodology consists of five phases defined to know and characterize the enterprise processes, its its relationships, its actors, and sources of knowledge and data. As a final result, the methodology defines the problems of the institution/company that can be improved with DM tasks MD. The methodology selects the processes of interest based on prioritization, and identifies DM tasks to perform, from the present and expected future scenarios. The utilization of this proposition is illustrated in two different types of organization, health and petroleum.","PeriodicalId":184399,"journal":{"name":"2013 XXXIX Latin American Computing Conference (CLEI)","volume":"73 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-11-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126267345","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Usage and limitations of free and open-source software in Costa Rican local governments
Pub Date: 2013-11-21 | DOI: 10.1109/CLEI.2013.6670669
F. Mata, A. Quesada
This paper presents results on the use of, and barriers faced by, free and open source software in local governments in Costa Rica. Information was gathered from these governments through an electronic survey, allowing a diagnostic of the situation.
{"title":"Usage and limitations of free and open-source software in Costa Rican local governmentsi","authors":"F. Mata, A. Quesada","doi":"10.1109/CLEI.2013.6670669","DOIUrl":"https://doi.org/10.1109/CLEI.2013.6670669","url":null,"abstract":"This paper presents results about the use and barriers that face free and open source software in the local governments in Costa Rica. Through an electronic survey, information was gathered from such governments, allowing to make a diagnostic of the situation.","PeriodicalId":184399,"journal":{"name":"2013 XXXIX Latin American Computing Conference (CLEI)","volume":"21 6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-11-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125842395","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Goal oriented techniques and methods: Goal refinement and levels of abstraction
Pub Date: 2013-11-21 | DOI: 10.1109/CLEI.2013.6670631
Mayela Delgado H, F. Losavio, A. Matteo
Goal analysis relies on goal identification and refinement processes to guide software development. In some cases goals are explicitly formulated; in others they are implicit in organizational actors' needs. Goals may be expressed at different levels of abstraction, which leads to decomposition into sub-goals to manage complexity. This research proposes a model for organizational context definition as a tool to help requirements engineers understand the organizational elements and their relationships, and also to drive goal analysis. The proposed model states different levels of abstraction, links goals with specific organizational elements, and contributes to refinement. In this work, the foremost Goal-Oriented Requirements Engineering techniques are reviewed, including their refinement approach and their applicability to the analysis of goals expressed at different levels of abstraction.
{"title":"Goal oriented techniques and methods: Goal refinement and levels of abstraction","authors":"Mayela Delgado H, F. Losavio, A. Matteo","doi":"10.1109/CLEI.2013.6670631","DOIUrl":"https://doi.org/10.1109/CLEI.2013.6670631","url":null,"abstract":"Goals analysis bases on goals identification and refinement processes to guide software development. In some cases, goals are explicitly formulated, in others cases goals are underlying inside organizational actors' needs. Different goals may be expressed in different levels of abstraction; it leads to decomposition in sub-goals for treatment of complexity. This research proposes a model for organizational context definition as a tool to support requirements engineers to understand the organizational elements and their relationships and, also, drive goals analysis. The proposed model states different levels of abstraction and links goals with specific organizational elements and contribute with the refinement. In this work, the foremost Goals-oriented Requirements Engineering techniques are explicated, including their refinement approach and their applicability for goals analysis expressed in different levels of abstraction.","PeriodicalId":184399,"journal":{"name":"2013 XXXIX Latin American Computing Conference (CLEI)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-11-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129183605","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
On the performance and use of spatial OLAP tools
Pub Date: 2013-11-21 | DOI: 10.1109/CLEI.2013.6670652
R. G. Bianchi, Guilherme Yuki Hatano, Thiago Luís Lopes Siqueira
Spatial data warehouses provide a means of carrying out spatial analysis together with the agile and flexible multidimensional analytical queries performed by SOLAP tools. However, these tools process queries with slow response times, and their functionality is still insufficient, for example in the support for a variety of spatial data warehouse schemas. In this paper, we conduct an experimental evaluation of existing open source software, as SOLAP tool and database management system, to analyze its functionality and query processing performance. Furthermore, we describe and evaluate our novel SOLAP tool, called MapQuery, which reuses efficient indices to boost query processing performance. Results derived from our performance evaluation indicate that MapQuery shortened query response times by at least 63% and up to 92% compared to existing solutions.
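As a rough illustration of what a SOLAP query computes, here is a toy Python sketch combining a spatial selection (a rectangular query window) with a roll-up over the time dimension. The fact table, coordinates, and window are invented; a real tool such as MapQuery would answer the spatial predicate through an efficient index rather than the linear scan shown here.

```python
from collections import defaultdict

# Toy fact table of a spatial data warehouse: (city, location, year, sales).
facts = [
    ("Sao Carlos", (2.0, 1.0), 2012, 120.0),
    ("Sao Carlos", (2.0, 1.0), 2013, 150.0),
    ("Campinas",   (5.0, 4.0), 2013,  90.0),
    ("Sorocaba",   (8.0, 2.0), 2013,  70.0),
]

def inside(point, window):
    """Spatial predicate: is the point inside the rectangular query window?"""
    (x, y), ((x1, y1), (x2, y2)) = point, window
    return x1 <= x <= x2 and y1 <= y <= y2

# SOLAP-style query: total sales per year for cities inside the window,
# i.e. a spatial selection followed by a roll-up on the time dimension.
window = ((0.0, 0.0), (6.0, 6.0))
totals = defaultdict(float)
for city, point, year, sales in facts:
    if inside(point, window):
        totals[year] += sales

for year in sorted(totals):
    print(year, totals[year])
```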
{"title":"On the performance and use of spatial OLAP tools","authors":"R. G. Bianchi, Guilherme Yuki Hatano, Thiago Luís Lopes Siqueira","doi":"10.1109/CLEI.2013.6670652","DOIUrl":"https://doi.org/10.1109/CLEI.2013.6670652","url":null,"abstract":"Spatial data warehouses provide a means of carrying out spatial analysis together with agile and flexible multidimensional analytical queries performed by SOLAP tools. However, queries are processed with slow query response times and functionalities of these tools are still insufficient, such as the support for a variety of spatial data warehouse schemas. In this paper, we conduct an experimental evaluation of existing open source software as SOLAP tool and database management system, to analyze its functionalities and query processing performance. Furthermore, we describe and evaluate our novel SOLAP tool called MapQuery that reuses efficient indices to boost query processing performance. Results derived from our performance evaluation indicated that MapQuery shortened the response time of queries from at least 63% up to 92% if compared to existing solutions.","PeriodicalId":184399,"journal":{"name":"2013 XXXIX Latin American Computing Conference (CLEI)","volume":"4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-11-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130435341","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Robust network design under uncertain traffic: An approach based on Genetic Algorithm
Pub Date: 2013-11-21 | DOI: 10.1109/CLEI.2013.6670659
Brenda Diaz-Baez, D. Pinto, C. V. Lücken
A network design is robust if it can handle any traffic requirement within certain bounds and physical network conditions. Robust network design is a complex problem of growing importance where, in general, the only information available is the traffic bounds of the network links. This work proposes a Genetic Algorithm to design robust networks with optimal link capacities, considering a stable routing with uncertain traffic that can be divided into k sub-routes. The uncertain traffic is handled using the hose model, which imposes a maximum input/output traffic for each network node. Experimental results for a set of instances with different numbers of routes show the advantage of stable routing over non-divisible routing (k = 1). However, increasing the value of k increases the number of viable solutions; thus, a trade-off between k and the quality of the solutions obtained by the proposed algorithm was detected.
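The paper's chromosome encoding and operators are not given in the abstract. As a hypothetical sketch of the general approach, the GA below assigns link capacities against precomputed worst-case loads (standing in for bounds derived from the hose model); the instance, penalty scheme, and operators are illustrative assumptions, not the authors' design.

```python
import random

rng = random.Random(42)

# Toy instance: 4 links; each has a worst-case traffic bound assumed to be
# derived offline from the hose model (here just fixed numbers).
worst_case_load = [30.0, 45.0, 20.0, 60.0]
COST_PER_UNIT = 1.0
PENALTY = 1000.0  # strong penalty for links that cannot carry their bound

def fitness(capacities):
    """Lower is better: capacity cost plus penalties for infeasible links."""
    cost = sum(c * COST_PER_UNIT for c in capacities)
    violations = sum(max(0.0, load - c)
                     for c, load in zip(capacities, worst_case_load))
    return cost + PENALTY * violations

def make_individual():
    return [rng.uniform(0.0, 100.0) for _ in worst_case_load]

def crossover(a, b):
    cut = rng.randrange(1, len(a))        # one-point crossover
    return a[:cut] + b[cut:]

def mutate(ind, rate=0.2):
    return [min(100.0, max(0.0, g + rng.gauss(0, 5))) if rng.random() < rate
            else g for g in ind]

pop = [make_individual() for _ in range(30)]
for gen in range(200):
    pop.sort(key=fitness)
    parents = pop[:10]                    # truncation selection
    children = [mutate(crossover(rng.choice(parents), rng.choice(parents)))
                for _ in range(20)]
    pop = parents + children

best = min(pop, key=fitness)
print([round(c, 1) for c in best], round(fitness(best), 1))
```

Under this penalty scheme the search converges toward capacities just above the worst-case loads, which is the intuition behind sizing links for robustness.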
{"title":"Robust network design under uncertain taffic an approach based on Genetic Algorithm","authors":"Brenda Diaz-Baez, D. Pinto, C. V. Lücken","doi":"10.1109/CLEI.2013.6670659","DOIUrl":"https://doi.org/10.1109/CLEI.2013.6670659","url":null,"abstract":"A network design is robust if it is able to deal with any traffic requirement under certain bounds and physical network conditions. The robust network design is a complex problem of growing importance where, in general, the only information available are traffic bounds of the network links. This work proposes a Genetic Algorithm to design robust networks with optimal capacity of links considering a stable routing with uncertain traffic that can be divided in k sub-routes. The uncertain traffic is handled by usign the hose model which imposes a maximum input/output traffic for each network node. Experimental results for a set of instances with different number of routes show the convenience of a stable routing against a non-divisible routing (k = 1). However, an increasing of the k value implies an increasing of the number of viable solutions, thus, a trade-off relation between k and the quality of solutions obtained by the proposed algorithm was detected.","PeriodicalId":184399,"journal":{"name":"2013 XXXIX Latin American Computing Conference (CLEI)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-11-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125388311","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
2-C6: A fine-grained algorithm to achieve 2-consistency
Pub Date: 2013-11-21 | DOI: 10.1109/CLEI.2013.6670595
Marlene Arangú, M. Salido
Most arc-consistency algorithms take for granted that CSPs are binary (all constraints involve two variables) and normalized (two different constraints do not involve exactly the same variables). When these algorithms prune values, propagation mechanisms are activated at two levels: value (fine-grained) and constraint (coarse-grained); values that might become inconsistent because of the pruning are re-checked to ensure their consistency. In this paper, we relax the assumption that the constraints are normalized and work on problems with non-normalized constraints (there may be more than one constraint involving the same two variables). In this type of problem, arc-consistency techniques cannot perform the same amount of pruning as 2-consistency techniques unless a normalization process is performed first. We propose the algorithm 2-C6, a reformulation of AC6 that achieves 2-consistency and performs fine-grained propagation. In empirical evaluations, we compare the performance of the proposed algorithm 2-C6 with the arc-consistency algorithms AC3 (coarse-grained), AC6, and AC7 (fine-grained), and with 2-C3, a coarse-grained 2-consistency algorithm. From these evaluations, we conclude that 2-consistency techniques are more appropriate for this type of problem.
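To see why non-normalized constraints defeat plain arc consistency, consider two constraints on the same variable pair. The Python sketch below contrasts the two pruning levels; it is a coarse "revise"-style toy for intuition only, not 2-C6's fine-grained support bookkeeping.

```python
def ac_revise(xdom, ydom, constraint):
    """Arc-consistency revise: keep x values supported by ONE constraint."""
    return [x for x in xdom if any(constraint(x, y) for y in ydom)]

def two_c_revise(xdom, ydom, constraints):
    """2-consistency revise on a non-normalized pair: a support y must
    satisfy ALL constraints on (X, Y) simultaneously."""
    return [x for x in xdom
            if any(all(c(x, y) for c in constraints) for y in ydom)]

X = Y = [0, 1, 2, 3]
cs = [lambda x, y: x < y, lambda x, y: x > y]  # jointly unsatisfiable pair

# Arc consistency treats each constraint in isolation: both revisions keep
# values, so the joint inconsistency goes unnoticed at this level.
print(ac_revise(ac_revise(X, Y, cs[0]), Y, cs[1]))  # -> [1, 2]

# 2-consistency looks for a single support across both constraints and
# wipes out the domain, detecting the inconsistency immediately.
print(two_c_revise(X, Y, cs))                       # -> []
```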
{"title":"2-C6: An fine-grained algorithm to achieve 2-consistency","authors":"Marlene Arangú, M. Salido","doi":"10.1109/CLEI.2013.6670595","DOIUrl":"https://doi.org/10.1109/CLEI.2013.6670595","url":null,"abstract":"Most arc-consistency algorithms take for granted that CSPs are binary (all constraints involve two variables) and normalized (two different constraints do not involve exactly the same variables). When these algorithms perform pruning of values, propagation mechanisms are activated at both levels: value (fine-grained), and constraint (coarse-grained). Thus, values that might become inconsistent because of the pruning are re-checked to ensure their consistency. In this paper, we relax the assumption that the constraints are normalized and we work on problems with non-normalized constraints (there may be more than one constraint that involves the same two variables). In this type of problems, arc consistency techniques are not able to perform the same amount of pruning as 2-consistency techniques, unless a normalization process is performed previously. In this paper we propose the Algorithm 2-C6, which is a reformulation of AC6. The algorithm 2-C6 achieves 2-consistency and performs the finegrained propagations. In empirical evaluations, we compare the performance of the proposed algorithm 2-C6 with the following arc-consistency algorithms: AC3, AC6 and AC7 (coarse-grained and fine-grained, respectively) and with 2-C3, which is a 2-consistency coarse-grained algorithm. From these evaluations, we conclude that the 2-consistency techniques are more appropriated for this type of problem.","PeriodicalId":184399,"journal":{"name":"2013 XXXIX Latin American Computing Conference (CLEI)","volume":"61 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-11-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125659299","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}