An encryption algorithm inspired from DNA
Pub Date: 2010-11-29 | DOI: 10.1109/ICMWI.2010.5648076
Souhila Sadeg, Mohamed Gougache, N. Mansouri, H. Drias
DNA cryptography is a promising new direction in cryptography research that emerged with progress in the field of DNA computing. DNA can be used not only to store and transmit information, but also to perform computations. The massive parallelism and extraordinary information density inherent in this molecule are exploited for cryptographic purposes, and several DNA-based algorithms have been proposed for encryption, authentication, and so on. The main current difficulties of DNA cryptography are the absence of a theoretical basis, high-tech laboratory requirements, and computational limitations. In this paper, a symmetric-key block cipher algorithm is proposed. It includes a step that simulates ideas from the processes of transcription (transfer from DNA to mRNA) and translation (from mRNA into amino acids). We believe this algorithm is computationally efficient and very secure, since it was designed following the recommendations of experts in cryptography and focuses on applying Shannon's fundamental principles of confusion and diffusion. Tests were conducted and the results are very satisfactory.
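The abstract does not spell out the cipher itself, so the toy sketch below only illustrates the biological metaphor it mentions: bytes are encoded as DNA bases, a transcription-like step maps them to an mRNA strand, and a translation-like step substitutes codons under a key. The base mappings, codon table, and key handling are all invented for illustration and are not the authors' algorithm.

```python
# Illustrative sketch only: a toy "transcription/translation" transform, not the
# cipher proposed in the paper. Base mappings and the codon handling are invented.

BASES = "ACGT"
COMPLEMENT = {"A": "U", "C": "G", "G": "C", "T": "A"}  # DNA -> mRNA (toy rule)

def bytes_to_dna(data: bytes) -> str:
    """Encode each byte as 4 bases (2 bits per base)."""
    return "".join(BASES[(b >> s) & 0b11] for b in data for s in (6, 4, 2, 0))

def transcribe(dna: str) -> str:
    """Transcription-like step: map the DNA strand to an mRNA strand."""
    return "".join(COMPLEMENT[b] for b in dna)

def translate(mrna: str, key: int) -> str:
    """Translation-like step: substitute each codon (3 bases) using a keyed table."""
    alphabet = "ACGU"
    def codon_index(codon: str) -> int:
        i = 0
        for b in codon:
            i = i * 4 + alphabet.index(b)
        return i
    mrna += "A" * (-len(mrna) % 3)                  # pad to a whole number of codons
    out = []
    for p in range(0, len(mrna), 3):
        idx = (codon_index(mrna[p:p + 3]) + key) % 64   # toy keyed substitution
        out.append(f"{idx:02d}")
    return "".join(out)

ciphertext = translate(transcribe(bytes_to_dna(b"hello")), key=17)
print(ciphertext)
```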
{"title":"An encryption algorithm inspired from DNA","authors":"Souhila Sadeg, Mohamed Gougache, N. Mansouri, H. Drias","doi":"10.1109/ICMWI.2010.5648076","DOIUrl":"https://doi.org/10.1109/ICMWI.2010.5648076","url":null,"abstract":"DNA cryptography is a new promising direction in cryptography research that emerged with the progress in DNA computing field. DNA can be used not only to store and transmit information, but also to perform computations. The massive parallelism and extraordinary information density inherent in this molecule are exploited for cryptographic purposes, and several DNA based algorithms are proposed for encryption, authentification and so on. The current main difficulties of DNA cryptography are the absence of theoretical basis, the high tech lab requirements and computation limitations. In this paper, a symmetric key bloc cipher algorithm is proposed. It includes a step that simulates ideas from the processes of transcription (transfer from DNA to mRNA) and translation (from mRNA into amino acids). This algorithm is, we believe, efficient in computation and very secure, since it was designed following recommendations of experts in cryptography and focuses on the application of the fundamental principles of Shannon: Confusion and diffusion. Tests were conducted and the results are very satisfactory.","PeriodicalId":404577,"journal":{"name":"2010 International Conference on Machine and Web Intelligence","volume":"4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-11-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127151538","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Building a neural network-based English-to-Arabic transfer module from an unrestricted domain
Pub Date: 2010-11-29 | DOI: 10.1109/ICMWI.2010.5648157
Rasha Al Dam, A. Guessoum
This paper presents a transfer module for an English-to-Arabic Machine Translation System (MTS) that uses an English-to-Arabic bilingual corpus. We propose to build the transfer module as a new transfer-based machine translation system based on Artificial Neural Networks (ANNs). The idea is to let the ANN-based transfer module automatically learn correspondences between source- and target-language structures from a large set of English sentences and their Arabic translations. The paper presents the methodology for building the corpus, then introduces the approach followed to develop the transfer module, and finally presents the experimental results, which are very encouraging.
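As a minimal sketch of the general idea (learning a mapping from source-language structures to target-language structures with a neural network), assuming invented feature encodings and structure labels that are not taken from the paper:

```python
# Minimal sketch, not the paper's system: an MLP that maps (invented) feature
# encodings of English sentence structures to Arabic structure labels.
import numpy as np
from sklearn.neural_network import MLPClassifier

# Hypothetical encoding: each English structure is a small feature vector
# (e.g., SVO flag, presence of an auxiliary, number of noun phrases).
X = np.array([
    [1, 0, 2],   # "The boy reads a book"
    [1, 1, 2],   # "The boy is reading a book"
    [0, 0, 1],   # "A tall boy"
    [0, 1, 1],
])
y = ["verbal_VSO", "verbal_VSO", "nominal", "nominal"]  # invented target labels

model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X, y)

print(model.predict([[1, 0, 3]]))  # predicted Arabic structure for a new sentence
```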
{"title":"Building a neural network-based English-to-Arabic transfer module from an unrestricted domain","authors":"Rasha Al Dam, A. Guessoum","doi":"10.1109/ICMWI.2010.5648157","DOIUrl":"https://doi.org/10.1109/ICMWI.2010.5648157","url":null,"abstract":"This paper presents a Transfer Module for an English-to-Arabic Machine Translation System (MTS) using an English-to-Arabic Bilingual Corpus. We propose an approach to build a transfer module by building a new transfer-based system for machine translation using Artificial Neural Networks (ANN). The idea is to allow the ANN-based transfer module to automatically learn correspondences between source and target language structures using a large set of English sentences and their Arabic translations. The paper presents the methodology for corpus building. It then introduces the approach that has been followed to develop the transfer module. It finally presents the experimental results which are very encouraging.","PeriodicalId":404577,"journal":{"name":"2010 International Conference on Machine and Web Intelligence","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-11-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127455357","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Integrating legacy systems in a SOA using an agent based approach for information system agility
Pub Date: 2010-11-29 | DOI: 10.1109/ICMWI.2010.5648011
H. Faycal, D. Habiba, Mellah Hakima
This paper presents an approach based on multi-agent systems (MAS) for encapsulating the features of traditional applications, also called legacy systems. We focus in particular on legacy systems based on Common Object Request Broker Architecture (CORBA) technology. The main objective of the encapsulation is to simplify the integration of this kind of application into a service-oriented architecture (SOA). We design an interface using the Java Agent DEvelopment framework (JADE), which enables automatic generation of code for CORBA clients and ontology classes. The proposed system creates a representative agent for each feature to be wrapped and allows the composition of functions according to predefined templates. The system uses the Web Service Integration Gateway (WSIG) to publish the capabilities of the representative agents as web services that can then be used in an SOA.
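The paper's implementation relies on JADE and WSIG (Java); the sketch below only illustrates the wrapping pattern in plain Python, with the agent class, service description format, and legacy stub all invented for illustration:

```python
# Conceptual sketch only (the actual system uses JADE and WSIG in Java):
# a "representative agent" wraps one legacy feature and exposes a service
# description that an SOA registry could consume. All names are illustrative.

class RepresentativeAgent:
    def __init__(self, feature_name, legacy_callable):
        self.feature_name = feature_name          # wrapped legacy feature
        self.legacy_callable = legacy_callable    # e.g., a CORBA client stub method

    def service_description(self):
        """What would be published (via WSIG in the real system) to the SOA."""
        return {"service": self.feature_name, "type": "wrapped-legacy"}

    def invoke(self, *args):
        return self.legacy_callable(*args)

def legacy_get_balance(account_id):               # stand-in for a CORBA stub call
    return {"account": account_id, "balance": 120.0}

agent = RepresentativeAgent("getBalance", legacy_get_balance)
print(agent.service_description())
print(agent.invoke("ACC-42"))
```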
{"title":"Integrating legacy systems in a SOA using an agent based approach for information system agility","authors":"H. Faycal, D. Habiba, Mellah Hakima","doi":"10.1109/ICMWI.2010.5648011","DOIUrl":"https://doi.org/10.1109/ICMWI.2010.5648011","url":null,"abstract":"This paper presents an approach based on multi-agent system (MAS) for encapsulating the features of traditional applications also called legacy systems. We focus our interest particularly on legacy based on Comment Object Request Broker Architecture (CORBA) technology. The encapsulation main objective is to simplify the possibilities for integrating this kind of application in a service-oriented architecture (SOA). We design an interface using Java Agent DEvelopment framework (JADE), which enables automatic generation of code for CORBA clients and ontology classes. The proposed system creates for each feature to wrap a representative agent, and allows the composition of functions according to predefined templates. The system uses the Web Service Integration Gateway (WSIG) to publish capacities of representative agents as web services that will be used in a SOA.","PeriodicalId":404577,"journal":{"name":"2010 International Conference on Machine and Web Intelligence","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-11-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117057208","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A metacomputing approach for the winner determination problem in combinatorial auctions
Pub Date: 2010-11-29 | DOI: 10.1109/ICMWI.2010.5647909
Kahina Achour, Louiza Slaouti, D. Boughaci
Grid computing is an innovative approach that allows the use of computing resources which are far apart and connected by wide area networks. This technology has become extremely popular for optimizing computing resources and managing data and computing workloads. The aim of this paper is to propose a metacomputing approach for the winner determination problem (WDP) in combinatorial auctions. The proposed approach is a hybrid genetic algorithm adapted to the WDP and implemented on a grid computing platform.
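The abstract does not give the operators or parameters of the hybrid genetic algorithm, so the sketch below is only a plain GA for the WDP on an invented auction instance: chromosomes select subsets of bids, and the fitness is the revenue of conflict-free selections.

```python
# Minimal GA sketch for the winner determination problem, not the paper's
# hybrid algorithm. The auction instance and GA settings are invented.
import random

random.seed(0)
# Each bid: (set of requested items, price offered)
BIDS = [({0, 1}, 6.0), ({1, 2}, 5.0), ({2, 3}, 7.0), ({0, 3}, 4.0), ({4}, 3.0)]

def fitness(selection):
    """Total revenue if no item is sold twice, otherwise 0 (infeasible)."""
    taken, revenue = set(), 0.0
    for chosen, (items, price) in zip(selection, BIDS):
        if chosen:
            if taken & items:
                return 0.0
            taken |= items
            revenue += price
    return revenue

def evolve(pop_size=30, generations=100, p_mut=0.1):
    pop = [[random.randint(0, 1) for _ in BIDS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                    # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(BIDS))          # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < p_mut else g for g in child]
            children.append(child)
        pop = parents + children
    best = max(pop, key=fitness)
    return best, fitness(best)

print(evolve())   # selects bids whose item sets do not overlap
```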
{"title":"A metacomputing approach for the winner determination problem in combinatorial auctions","authors":"Kahina Achour, Louiza Slaouti, D. Boughaci","doi":"10.1109/ICMWI.2010.5647909","DOIUrl":"https://doi.org/10.1109/ICMWI.2010.5647909","url":null,"abstract":"Grid computing is an innovative approach permitting the use of computing resources which are far apart and connected by Wide Area Networks. This recent technology has become extremely popular to optimize computing resources and manage data and computing workloads. The aim of this paper is to propose a metacomputing approach for the winner determination problem in combinatorial auctions (WDP). The proposed approach is a hybrid genetic algorithm adapted to the WDP and implemented on a grid computing platform.","PeriodicalId":404577,"journal":{"name":"2010 International Conference on Machine and Web Intelligence","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-11-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131113985","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Efficient extraction of news articles based on RSS crawling
Pub Date: 2010-11-29 | DOI: 10.1109/ICMWI.2010.5647851
George Adam, C. Bouras, V. Poulopoulos
The expansion of the World Wide Web has led to a state where a vast number of Internet users have to overcome the major problem of discovering the information they need. Hundreds of web pages and weblogs are generated or changed on a daily basis. The main problem arising from this continuous generation and alteration of web pages is the discovery of useful information, a task that is difficult even for experienced Internet users. Many mechanisms have been built to tackle this information-discovery puzzle, and they are mostly based on crawlers that browse the WWW, download pages, and collect the information that might be of interest to users. In this manuscript we describe a mechanism that fetches web pages containing news articles from major news portals and blogs. This mechanism is built to support tools that acquire news articles from all over the world, process them, and present them back to end users in a personalized manner.
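As a minimal sketch of RSS-based fetching (not the paper's crawler), using only the Python standard library and a placeholder feed URL:

```python
# Minimal sketch: fetch an RSS feed and extract the fields needed for later
# article processing. The feed URL is a placeholder to replace with a real feed.
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://example.com/news/rss.xml"   # placeholder feed

def fetch_articles(feed_url):
    with urllib.request.urlopen(feed_url, timeout=10) as resp:
        root = ET.fromstring(resp.read())
    articles = []
    for item in root.iter("item"):               # standard RSS 2.0 <item> entries
        articles.append({
            "title": item.findtext("title", default=""),
            "link": item.findtext("link", default=""),
            "summary": item.findtext("description", default=""),
        })
    return articles

for article in fetch_articles(FEED_URL):
    print(article["title"], "->", article["link"])
```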
{"title":"Efficient extraction of news articles based on RSS crawling","authors":"George Adam, C. Bouras, V. Poulopoulos","doi":"10.1109/ICMWI.2010.5647851","DOIUrl":"https://doi.org/10.1109/ICMWI.2010.5647851","url":null,"abstract":"The expansion of the World Wide Web has led to a state where a vast amount of Internet users face and have to overcome the major problem of discovering desired information. It is inevitable that hundreds of web pages and weblogs are generated daily or changing on a daily basis. The main problem that arises from the continuous generation and alteration of web pages is the discovery of useful information, a task that becomes difficult even for the experienced internet users. Many mechanisms have been constructed and presented in order to overcome the puzzle of information discovery on the Internet and they are mostly based on crawlers which are browsing the WWW, downloading pages and collect the information that might be of user interest. In this manuscript we describe a mechanism that fetches web pages that include news articles from major news portals and blogs. This mechanism is constructed in order to support tools that are used to acquire news articles from all over the world, process them and present them back to the end users in a personalized manner.","PeriodicalId":404577,"journal":{"name":"2010 International Conference on Machine and Web Intelligence","volume":"22 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-11-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127040545","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Towards ontological model accuracy's scalability: Application to the Pervasive Computer Supported Collaborative Work
Pub Date: 2010-11-29 | DOI: 10.1109/ICMWI.2010.5647999
K. Hamadache, L. Lancieri
In this paper we define an ontological model to accurately represent context in Pervasive Computer Supported Collaborative Work (PCSCW). A major issue in this domain is the mass of information required to correctly depict a situation. Since users and devices must be represented according to multiple aspects (physical, computational, social, ...), the amount of information can quickly become unmanageable. Moreover, since a PCSCW context model has to be usable on resource-limited devices such as cell phones, GPS receivers, and ADSL modems, we needed a more efficient way to represent information. In this perspective, the model we propose offers the possibility of representing a situation with more or less precision, that is to say with more or less abstraction. The final goal of this work is to provide a model able to reason with either a precise or a fuzzy description of a situation.
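As a toy illustration of the precision/abstraction idea (not the paper's ontology), the same device context can be reported at a fine or a coarse level, using an invented concept hierarchy:

```python
# Toy illustration of precision/abstraction scaling, not the paper's ontology:
# the same device context can be described coarsely or precisely.
ABSTRACTION = {                      # invented concept hierarchy
    "smartphone": "mobile_device",
    "gps_receiver": "mobile_device",
    "adsl_modem": "network_equipment",
}

def describe(device, battery_pct, level="fine"):
    if level == "coarse":            # abstract view for resource-limited reasoning
        return {"device": ABSTRACTION.get(device, "device"),
                "power": "low" if battery_pct < 20 else "ok"}
    return {"device": device, "battery_pct": battery_pct}   # precise view

print(describe("smartphone", 12, level="fine"))
print(describe("smartphone", 12, level="coarse"))
```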
{"title":"Towards ontological model accuracy's scalability: Application to the Pervasive Computer Supported Collaborative Work","authors":"K. Hamadache, L. Lancieri","doi":"10.1109/ICMWI.2010.5647999","DOIUrl":"https://doi.org/10.1109/ICMWI.2010.5647999","url":null,"abstract":"In this paper we define an ontological model to accurately represent the context in Pervasive Computer Supported Collaborative Work. A major issue in this domain is the mass of information required to correctly depict a situation. As we need to represent users and devices according to multiple aspects (physical, computational, social …) the amount of information can quickly become unmanageable. Besides, as a PCSCW context model has to be usable on limited resources devices such as cell phones, GPS, ADSL Modems we needed a more efficient way to represent information. In this perspective the model we propose offers the possibility to represent a situation with more or less precision; that is to say with more or less abstraction. The final goal of this work is then to provide a model able to reason with a precise or fuzzy description of a situation.","PeriodicalId":404577,"journal":{"name":"2010 International Conference on Machine and Web Intelligence","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-11-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122116444","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Disparity map estimation with neural network
Pub Date: 2010-11-29 | DOI: 10.1109/ICMWI.2010.5648182
Nadia Baha Touzene, S. Larabi
This work aims at defining a new approach for computing a dense disparity map from a pair of stereo images using neural networks. Our approach is divided into two main tasks. The first computes an initial disparity map using a neural method (BP), while the second applies a simple neural refinement to the initial disparity map so that an accurate result can be obtained. In the literature, the matching score is based only on pixel intensities. In this work we introduce two additional features, the gradient magnitude and the orientation of the gradient vector of the pixels, which give a truer measure of similarity between pixels. Experiments on real data sets were conducted to evaluate the proposed method.
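The abstract mentions combining pixel intensity with gradient magnitude and orientation in the matching score; the sketch below shows such a combined per-pixel cost with invented weights, leaving out the paper's neural network and refinement stage:

```python
# Minimal sketch of a matching cost combining intensity with gradient magnitude
# and orientation; weights are invented, and this is not the paper's network.
import numpy as np

def gradients(img):
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy), np.arctan2(gy, gx)     # magnitude, orientation

def matching_cost(left, right, max_disp, w_int=1.0, w_mag=0.5, w_ori=0.5):
    mag_l, ori_l = gradients(left)
    mag_r, ori_r = gradients(right)
    h, w = left.shape
    cost = np.full((h, w, max_disp + 1), np.inf)
    for d in range(max_disp + 1):
        # Compare each left pixel at column x with the right pixel at x - d.
        li, ri = left[:, d:], right[:, : w - d]
        lm, rm = mag_l[:, d:], mag_r[:, : w - d]
        lo, ro = ori_l[:, d:], ori_r[:, : w - d]
        cost[:, d:, d] = (w_int * np.abs(li - ri)
                          + w_mag * np.abs(lm - rm)
                          + w_ori * np.abs(np.angle(np.exp(1j * (lo - ro)))))
    return cost.argmin(axis=2)                      # winner-takes-all disparity

left = np.random.rand(40, 60)
right = np.roll(left, 3, axis=1)                    # synthetic 3-pixel shift
print(matching_cost(left, right, max_disp=8))
```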
{"title":"Disparity map estimation with neural network","authors":"Nadia Baha Touzene, S. Larabi","doi":"10.1109/ICMWI.2010.5648182","DOIUrl":"https://doi.org/10.1109/ICMWI.2010.5648182","url":null,"abstract":"This work aims at defining a new approach for a dense disparity map computing based on the neural networks from a pair of stereo images. Our approach has been divided into two main tasks. The first one deals with computing the initial disparity map using a neuronal method (BP). Whereas the second one presents a simple method to refine the initial disparity map using neural refinement so that an accurate result can be acquired. In the literature, the matching score is based only on the pixel intensities. We introduce in this work two additional features: the gradient magnitude and orientation of the gradient vector of pixels which gives a true degree of similarity between pixels. Experimental results on real data sets were conducted for evaluating the proposed method.","PeriodicalId":404577,"journal":{"name":"2010 International Conference on Machine and Web Intelligence","volume":"225 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-11-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114378762","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A morphological analysis of Arabic language based on multicriteria decision making: TAGHIT system
Pub Date: 2010-11-29 | DOI: 10.1109/ICMWI.2010.5647958
Cheragui Mohamed Amine, Hoceini Youssef, Abbas Moncef
In this paper, we present our work on Arabic morphology, and especially on mechanisms for resolving morphological ambiguity in Arabic text. This research has given birth to the TAGHIT system, a morphosyntactic tagger for Arabic. The originality of our work lies in implementing, within our system, a new approach to disambiguation, different from those that currently exist, based on principles and techniques from multicriteria decision making.
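The abstract does not specify which multicriteria method TAGHIT uses, so the sketch below only illustrates the general weighted-sum idea on invented candidate analyses, criteria, and weights:

```python
# Toy weighted-sum illustration of multicriteria disambiguation; the criteria,
# weights, and candidate analyses are invented and not TAGHIT's actual method.

# Candidate morphological analyses for one ambiguous Arabic token, scored on
# several criteria (values in [0, 1] are invented for illustration).
CANDIDATES = {
    "noun(kitab, 'book')":   {"context_fit": 0.8, "frequency": 0.6, "agreement": 0.9},
    "verb(kataba, 'wrote')": {"context_fit": 0.4, "frequency": 0.7, "agreement": 0.3},
}
WEIGHTS = {"context_fit": 0.5, "frequency": 0.2, "agreement": 0.3}

def score(criteria):
    return sum(WEIGHTS[name] * value for name, value in criteria.items())

best = max(CANDIDATES, key=lambda c: score(CANDIDATES[c]))
print(best, round(score(CANDIDATES[best]), 3))
```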
{"title":"A morphological analysis of Arabic language based on multicriteria decision making: TAGHIT system","authors":"Cheragui Mohamed Amine, Hoceini Youssef, Abbas Moncef","doi":"10.1109/ICMWI.2010.5647958","DOIUrl":"https://doi.org/10.1109/ICMWI.2010.5647958","url":null,"abstract":"In this paper, we present our work on Arabic morphology and especially the mechanisms for resolving the morphological ambiguity in Arabic text. These researches, which have given birth to TAGHIT system which is a morphosyntactic tagger for Arabic, where the originality of our work lies in the implementation of our internal system of a new approach to disambiguation different from those that currently exist, which is based on the principles and techniques issued from multicriteria decision making.","PeriodicalId":404577,"journal":{"name":"2010 International Conference on Machine and Web Intelligence","volume":"4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-11-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125715099","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A priori replica placement strategy in data grid
Pub Date: 2010-11-29 | DOI: 10.1109/ICMWI.2010.5647925
Zakia Challal, T. Bouabana-Tebibel
The use of grid computing is becoming increasingly important in areas requiring large amounts of data and computation. To provide better access times and fault tolerance in such systems, replication is one of the main techniques employed. The effectiveness of a replication model depends on several factors, including the replica placement strategy. In this paper, we propose an a priori replica placement strategy that optimizes the distances between the data hosted on the grid.
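As an illustration of the underlying placement problem (not necessarily the strategy proposed in the paper), the greedy sketch below places replicas so as to minimize the total distance from every grid site to its nearest replica, on an invented distance matrix:

```python
# Greedy sketch of replica placement on an invented 5-site grid: pick replica
# sites that minimize the total distance to the nearest replica. This shows the
# general problem, not the paper's exact a priori strategy.

DIST = [                      # symmetric distances between 5 grid sites
    [0, 4, 7, 3, 9],
    [4, 0, 2, 6, 5],
    [7, 2, 0, 8, 1],
    [3, 6, 8, 0, 4],
    [9, 5, 1, 4, 0],
]

def total_cost(replicas):
    return sum(min(DIST[site][r] for r in replicas) for site in range(len(DIST)))

def greedy_placement(k):
    replicas = []
    for _ in range(k):
        best = min((s for s in range(len(DIST)) if s not in replicas),
                   key=lambda s: total_cost(replicas + [s]))
        replicas.append(best)
    return replicas

placement = greedy_placement(k=2)
print(placement, total_cost(placement))
```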
{"title":"A priori replica placement strategy in data grid","authors":"Zakia Challal, T. Bouabana-Tebibel","doi":"10.1109/ICMWI.2010.5647925","DOIUrl":"https://doi.org/10.1109/ICMWI.2010.5647925","url":null,"abstract":"The use of grid computing is becoming increasingly important in the areas requiring large quantity of data and calculation. To provide better access time and fault tolerance in such systems, the replication is one of the main issues for this purpose. The effectiveness of a replication model depends on several factors, including the replicas placement strategy. In this paper, we propose an a priori replicas placement strategy optimizing distances between the data hosted on the grid.","PeriodicalId":404577,"journal":{"name":"2010 International Conference on Machine and Web Intelligence","volume":"105 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-11-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123235448","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Latent semantic analysis-based image auto annotation
Pub Date: 2010-11-29 | DOI: 10.1109/ICMWI.2010.5648152
Mahdia Bakalem, N. Benblidia, S. Oukid
Image retrieval is a particular case of information retrieval. It adds more complex mechanisms for retrieving relevant images: visual content analysis and/or additional textual content. Image auto-annotation is a technique that associates text with images and makes it possible to retrieve image documents in the same way as textual documents, as in classical information retrieval. Image auto-annotation is therefore an effective technology for improving image retrieval. In this work, we propose a first version of the AnnotB-LSA algorithm for image auto-annotation. Integrating the LSA model makes it possible to extract latent semantic relations among the textual descriptors and to reduce the ambiguity (polysemy, synonymy) between image annotations.
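As a minimal sketch of the LSA step alone (not the AnnotB-LSA algorithm), with an invented term-by-image matrix: a truncated SVD places co-occurring annotation terms close together in the latent space, which helps with the synonymy problem mentioned in the abstract.

```python
# Minimal sketch of the LSA step only: build a term-by-image matrix, take a
# truncated SVD, and compare annotation terms in the latent space. The matrix
# values and dimensions are invented; this is not the AnnotB-LSA algorithm.
import numpy as np

TERMS = ["sea", "ocean", "beach", "car", "road"]
# Rows = annotation terms, columns = annotated training images (invented counts).
A = np.array([
    [2, 1, 0, 0],   # sea
    [1, 2, 0, 0],   # ocean
    [1, 1, 0, 0],   # beach
    [0, 0, 2, 1],   # car
    [0, 0, 1, 2],   # road
], dtype=float)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
term_vecs = U[:, :k] * s[:k]          # terms projected into the k-dim latent space

def similarity(t1, t2):
    a, b = term_vecs[TERMS.index(t1)], term_vecs[TERMS.index(t2)]
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(similarity("sea", "ocean"))     # high: the latent space captures co-occurrence
print(similarity("sea", "car"))       # low
```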
{"title":"Latent semantic analysis-based image auto annotation","authors":"Mahdia Bakalem, N. Benblidia, S. Oukid","doi":"10.1109/ICMWI.2010.5648152","DOIUrl":"https://doi.org/10.1109/ICMWI.2010.5648152","url":null,"abstract":"The image retrieval is a particular case of information retrieval. It adds more complex mechanisms to relevance image retrieval: visual content analysis and/or additional textual content. The image auto annotation is a technique that associates text to image, and permits to retrieve image documents as textual documents, thus as in information retrieval. The image auto annotation is then an effective technology for improving the image retrieval. In this work, we propose the AnnotB-LSA algorithm in its first version for the image auto-annotation. The integration of the LSA model permits to extract the latent semantic relations in the textual describers and to minimize the ambiguousness (polysemy, synonymy) between the annotations of images.","PeriodicalId":404577,"journal":{"name":"2010 International Conference on Machine and Web Intelligence","volume":"76 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-11-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127488149","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}