Modeling of learning scenario adopting case study strategy by recursive entity modeling method
Amal Rifai, M'hamed Bakrim, R. Messoussi, A. Sadiq
2016 4th IEEE International Colloquium on Information Science and Technology (CiSt)
Pub Date: 2016-10-01 · DOI: 10.1109/CIST.2016.7805104
In this paper, we model a learning scenario that adopts the case study strategy, using the recursive entity modeling method (REMM). The purpose is, on the one hand, to test the modifications we made to REMM in previous work and to demonstrate the method's ability to model active learning strategies, and, on the other hand, to design a pedagogical scenario model based on the case study strategy that will be useful for designing scenarios that adopt it.
A data warehouse for local good governance monitoring and assessment — Case study of local registry office in Morocco
Houda Chakiri, M. E. Mohajir
Pub Date: 2016-10-01 · DOI: 10.1109/CIST.2016.7804850
We have developed and deployed an e-Government system named E-FES that facilitates citizens' and employees' access to local government information and services in a local registry office (Bureau d'état Civil, "BEC"). E-FES has been running since 2006 in several municipalities in both rural and urban areas. This has given us access to a considerable amount of data, which enabled us to design and implement a data warehouse to monitor BEC performance. The objective of the data warehouse is to provide easy access to measurable, accurate, consistent and integrated local government data for better and faster decision making and for statistical purposes, and to assess and improve the usability of the system and BEC performance. Moreover, the reports issued from the data warehouse serve as metrics of local good governance. In this paper, we present the steps taken to build a data warehouse for BEC monitoring and relate the results to the good governance attributes defined by the UNDP.
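A fact/dimension design of the kind the abstract describes can be sketched with an in-memory database. This is a minimal illustrative star schema for monitoring registry-office activity; every table, column, and sample value below is invented for the example, not taken from the paper's actual schema.

```python
import sqlite3

# Minimal star-schema sketch for monitoring registry-office (BEC)
# performance: one fact table of service requests joined to office
# and date dimensions. All names and figures are illustrative.
conn = sqlite3.connect(":memory:")
c = conn.cursor()
c.executescript("""
CREATE TABLE dim_office (office_id INTEGER PRIMARY KEY, name TEXT, area TEXT);
CREATE TABLE dim_date   (date_id INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE fact_request (
    office_id INTEGER REFERENCES dim_office(office_id),
    date_id   INTEGER REFERENCES dim_date(date_id),
    requests  INTEGER,
    avg_processing_minutes REAL
);
""")
c.executemany("INSERT INTO dim_office VALUES (?,?,?)",
              [(1, "Fes Centre", "urban"), (2, "Rural Annex", "rural")])
c.executemany("INSERT INTO dim_date VALUES (?,?,?)",
              [(1, 2016, 9), (2, 2016, 10)])
c.executemany("INSERT INTO fact_request VALUES (?,?,?,?)",
              [(1, 1, 120, 14.0), (1, 2, 150, 12.0),
               (2, 1, 30, 25.0), (2, 2, 45, 22.0)])

# A typical monitoring query: requests handled per area.
rows = c.execute("""
SELECT o.area, SUM(f.requests)
FROM fact_request f JOIN dim_office o USING (office_id)
GROUP BY o.area ORDER BY o.area
""").fetchall()
print(rows)  # [('rural', 75), ('urban', 270)]
```

Separating slowly-changing descriptive attributes (dimensions) from additive measures (the fact table) is what makes the per-area, per-period roll-ups used for governance reporting cheap to express.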
Hybrid assessment method for programming assignments
Soundous Zougari, Mariam Tanana, A. Lyhyaoui
Pub Date: 2016-10-01 · DOI: 10.1109/CIST.2016.7805112
In this work, we address automatic assessment of programming assignments. The objective is to provide immediate feedback to learners and to save teachers from manually reviewing all the students' solutions. We present a method that merges results from dynamic and static analysis to ensure a reliable and objective evaluation. While the dynamic analysis is based on a unit testing framework, the static analysis focuses on finding an adequate structural similarity measure after transforming the programs into control flow graphs.
Automatic keyphrase extraction: An overview of the state of the art
Zakariae Alami Merrouni, B. Frikh, B. Ouhbi
Pub Date: 2016-10-01 · DOI: 10.1109/CIST.2016.7805062
Keyphrases are useful for a variety of tasks in information retrieval and natural language processing, such as text summarization, automatic indexing, clustering/classification, ontology learning, and building and conceptualizing particular knowledge domains. However, assigning keyphrases manually is time consuming and expensive in terms of human resources, so there is a need to automate the task. A wide range of keyphrase extraction techniques have been proposed, but they still suffer from low accuracy and poor performance. This paper presents a state of the art of automatic keyphrase extraction approaches and identifies their strengths and weaknesses. We also discuss why some techniques perform better than others and how the task of automatic keyphrase extraction can be improved.
Indoor localization by particle map matching
Karim El Mokhtari, S. Reboul, J. Choquel, B. Amami, M. Benjelloun
Pub Date: 2016-10-01 · DOI: 10.1109/CIST.2016.7804999
This article presents an indoor localization approach that combines map matching with a circular particle filter defined in a Bayesian framework. The technique relies only on velocity and heading observations coupled with a map of the road network; no prior knowledge of the initial position is required. A circular distribution is used to match the vehicle's heading with the direction of the road, which makes it possible to detect turns and provide a more accurate position estimate. The algorithm is assessed with a synthetic dataset in a real context.
Towards the conception of a new approach for modeling interdependencies in Critical Infrastructures
Lamiae Chtioui, Amine Baïna, M. Bellafkih
Pub Date: 2016-10-01 · DOI: 10.1109/CIST.2016.7805054
Critical infrastructures are the set of systems essential to the normal functioning of modern society. Their failure could have severe consequences for the economy and the population and may undermine security, so their protection is a requirement. Nowadays, critical infrastructures are becoming tightly coupled and interdependent through the exchange of products and services, which makes their protection harder: a failure of one component of an infrastructure may affect the performance of other infrastructures. The work presented in this paper deals with critical infrastructure protection with respect to interdependencies. We compare several modeling approaches used to study interdependencies in critical infrastructures, then model them with UML sequence diagrams to understand how they operate, in order to extract their properties and strengths and use them in our proposed approach. Our purpose is to design a modeling approach that helps understand and reduce interdependency vulnerabilities, and that is general enough to apply to any critical infrastructure case study.
Comparing dynamic programming based algorithms in traffic signal control system
Biao Yin, M. Dridi, A. E. Moudni
Pub Date: 2016-10-01 · DOI: 10.1109/CIST.2016.7804957
In this paper, we compare three dynamic programming based algorithms for optimal and near-optimal solutions of the traffic signal control problem: backward dynamic programming (BDP), forward dynamic programming (FDP), and approximate dynamic programming (ADP). The traffic signal control model at an isolated intersection is formulated as a discrete-time Markov decision process in a stochastic traffic environment. Optimal solutions by the BDP and FDP algorithms are considered for stochastic and deterministic state transitions, respectively. The near-optimal ADP solution adopts a linear function approximation to overcome the computational complexity. In simulation, the three control algorithms are compared in different traffic scenarios in terms of average traffic delay and vehicle stops.
A cloud services composition approach based on customer's models and cloud services vendor's rules and specifications
Ouadi Belmokhtar, D. Chiadmi
Pub Date: 2016-10-01 · DOI: 10.1109/CIST.2016.7804963
Cloud computing is set to become the next paradigm of IT. Using Cloud services has become key to taking business strategy to the next level, whether for academic institutions or industry, for small or large businesses. For a Cloud Service Vendor (CSV) building a Cloud software solution, a single atomic Cloud service can rarely meet all of a customer's requirements, given its basic functionality. CSVs therefore need efficient Cloud service composition techniques and tools to handle customers' demanding requests for Cloud service based software solutions. Cloud service composition is a research area that has emerged in the last few years, but the available composition approaches are oriented towards Cloud users: in the literature, they are essentially based on user preferences alone to propose a Cloud solution that meets the user's needs. This paper presents our initial work on a novel Cloud service composition approach based on injecting customer models and CSV rules and specifications into the stages of the composition process. Our approach is designed to be used by CSVs. To demonstrate its feasibility, we propose a preliminary framework architecture built from four key components. First, through the GUI component, the CSV submits a request that expresses customer preferences together with CSV rules and specifications. After handling the request, the framework core either looks up candidate Cloud services in its own local repository or collaborates with third-party CSVs. Second, once the right services are discovered and selected, the framework injects a set of CSV composition rules and specifications combined with a set of customer preferences at the composition stage. Finally, the framework core launches an execution plan to build a workflow diagram, which formalizes the resulting composite Cloud service and is displayed in the GUI.
Encoding Arabic rhetorical structure: A methodology for the extraction of Arabic lexical information from TEI-encoded classical sources
S. Olivieri, Ivana Pepe, Ilaria Cicola
Pub Date: 2016-10-01 · DOI: 10.1109/CIST.2016.7805077
Text encoding is considered the most effective starting point for storing and retrieving data, with trees of information and lists of concordances as its first immediate results; a much wider range of results opens up once a complete encoding process is accomplished. The three case studies described in this paper give an overall view of the preliminary steps of a wider project on tagset fine-tuning and on adapting existing tools for data analysis.
Proposition of improvement areas in most heavy and light stemmer algorithms: novel stemmer EST.Stemmer
H. Manssouri, S. Farrah, E. Ziyati, M. Ouzzif
Pub Date: 2016-10-01 · DOI: 10.1109/CIST.2016.7804967
Analysis of the Arabic language has become a necessity because of its rapid growth. In this paper, we propose EST.Stemmer, a novel stemming algorithm for Arabic text that introduces several important improvements addressing known issues in heavy and light stemmers.