A novel robust camera mouse for disabled people (RCMDP)
Pub Date: 2016-04-05 | DOI: 10.1109/IACS.2016.7476114 | Pages: 217-220
P. Gyawal, A. Alsadoon, P. Prasad, L. S. Hoe, A. Elchouemi
Individuals with limb disabilities are often unable to operate a computer with their limbs. This paper proposes a new robust method for implementing a camera mouse for such users. The method combines face recognition with extraction of the user's eye locations. Experiments, described in the Experiment and Results section of this paper, verify that the results are more accurate than the current camera-mouse solution. The paper explains how the system operates, analyzes the advantages of the device, and identifies the problems associated with earlier systems; its main purpose, however, is to put forward a modified version of the system.
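As a rough illustration of the pipeline the abstract outlines (face detection followed by eye-location extraction that drives the cursor), the sketch below uses OpenCV Haar cascades and a simple frame-to-screen scaling. This is a hedged stand-in, not the paper's actual detector or mapping.

```python
# Illustrative sketch only: the paper's exact detector and cursor mapping are
# not reproduced here; OpenCV Haar cascades are used as a stand-in.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def eye_centers(frame):
    """Return (x, y) centres of detected eyes in a BGR frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    centers = []
    for (fx, fy, fw, fh) in face_cascade.detectMultiScale(gray, 1.3, 5):
        roi = gray[fy:fy + fh, fx:fx + fw]
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(roi):
            centers.append((fx + ex + ew // 2, fy + ey + eh // 2))
    return centers

def to_screen(point, frame_size, screen_size):
    """Map an eye position in the camera frame to screen coordinates."""
    (px, py), (fw, fh), (sw, sh) = point, frame_size, screen_size
    return int(px * sw / fw), int(py * sh / fh)

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)          # default webcam
    ok, frame = cap.read()
    if ok:
        eyes = eye_centers(frame)
        if eyes:
            h, w = frame.shape[:2]
            print("cursor target:", to_screen(eyes[0], (w, h), (1920, 1080)))
    cap.release()
```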
{"title":"A novel robust camera mouse for disabled people (RCMDP)","authors":"P. Gyawal, A. Alsadoon, P. Prasad, L. S. Hoe, A. Elchouemi","doi":"10.1109/IACS.2016.7476114","DOIUrl":"https://doi.org/10.1109/IACS.2016.7476114","url":null,"abstract":"Some individuals who have the disability of the limbs cannot use the computer with their limbs. This paper is aimed to propose a new robust method for the implementation of the camera mouse for such disabled people. This method consists of the face recognition and extraction of user eyes location. The experiment is conducted to verify the results that were more accurate than the current solution to camera mouse which is described in the Experiment and result section of this paper. It aims to introduce how this system operates and analyze its advantages of the device. It also identifies the problems associated with past developments of systems. However the main purpose of this paper is to put forward a modified version of the system.","PeriodicalId":6579,"journal":{"name":"2016 7th International Conference on Information and Communication Systems (ICICS)","volume":"16 1","pages":"217-220"},"PeriodicalIF":0.0,"publicationDate":"2016-04-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"74486231","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Cultural impact on agile projects: Universal agile culture model (UACM)
Pub Date: 2016-04-05 | DOI: 10.1109/IACS.2016.7476067 | Pages: 292-297
M. R. R. Lazwanthi, A. Alsadoon, P. Prasad, S. Sager, A. Elchouemi
The number of companies adopting agile methodologies is growing rapidly, which raises questions about the impact of cultural factors on agile projects. Adopting an agile culture brings its own challenges. Successful implementation therefore requires understanding the relationship between agile culture and organizational culture, the cultural issues associated with each agile methodology, and the agile practices and techniques needed to address those issues. A questionnaire survey was also conducted with experienced agile team members. In addition, a well-designed Universal Agile Culture Model (UACM) is important to guide organizations, management, and team members toward higher levels of success in agile projects. The proposed model is based on a literature review and on the analysis of the questionnaire survey results.
{"title":"Cultural impact on agile projects: Universal agile culture model (UACM)","authors":"M. R. R. Lazwanthi, A. Alsadoon, P. Prasad, S. Sager, A. Elchouemi","doi":"10.1109/IACS.2016.7476067","DOIUrl":"https://doi.org/10.1109/IACS.2016.7476067","url":null,"abstract":"There is an enormous growth in the number of companies that are adapting to agile methodology. However questions regarding the impact of culture factors on agile projects arise. Adapting to this agile culture can come with a few challenges. Hence for successful implementation of agile culture it is important to understand the relationship between agile culture and culture of an organization, the cultural issues related to each methodology of the agile culture and the agile practices and techniques that are necessary to address these issues. Questionnaire survey has conducted with agile experienced team members as well. Adding to these factors having a well designed Universal Agile Culture Model (UACM) that can guide the organization, management and team members to achieve higher levels of success in the progress of agile projects is important. This model has proposed based on literature review and the analysis from the questionnaire survey results.","PeriodicalId":6579,"journal":{"name":"2016 7th International Conference on Information and Communication Systems (ICICS)","volume":"12 4 1","pages":"292-297"},"PeriodicalIF":0.0,"publicationDate":"2016-04-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"90172786","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Completing IEEE 802.11e implementation in NS-3
Pub Date: 2016-04-05 | DOI: 10.1109/IACS.2016.7476109 | Pages: 190-195
Islam Obaidat, M. Alsmirat, Y. Jararweh
Since the IEEE 802.11 standard was first released, it has gained worldwide attention and has been implemented in a huge variety of devices. The IEEE 802.11e standard was developed as an amendment to the original 802.11 standard to provide QoS support. Several simulators provide an implementation of this standard so that its performance can be studied and analyzed. The NS-3 simulator implements IEEE 802.11e, but we found that the implementation is incomplete. In this work, we complete the implementation of IEEE 802.11e in NS-3.
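For context on what 802.11e's "QoS support" differentiates, the amendment's EDCA mechanism maps each frame's user priority to one of four access categories with different contention parameters. The sketch below is plain illustrative Python, not ns-3 C++; the values shown are the nominal EDCA defaults for an OFDM PHY and should be checked against the standard or the ns-3 sources.

```python
# Conceptual sketch of EDCA traffic differentiation (not ns-3 code).
# Parameter values are the nominal defaults for an OFDM PHY
# (aCWmin = 15, aCWmax = 1023); treat them as illustrative.
from dataclasses import dataclass

@dataclass(frozen=True)
class EdcaParams:
    aifsn: int
    cw_min: int
    cw_max: int

EDCA_DEFAULTS = {
    "AC_BK": EdcaParams(aifsn=7, cw_min=15, cw_max=1023),  # background
    "AC_BE": EdcaParams(aifsn=3, cw_min=15, cw_max=1023),  # best effort
    "AC_VI": EdcaParams(aifsn=2, cw_min=7,  cw_max=15),    # video
    "AC_VO": EdcaParams(aifsn=2, cw_min=3,  cw_max=7),     # voice
}

# 802.1D user priority -> EDCA access category
UP_TO_AC = {1: "AC_BK", 2: "AC_BK", 0: "AC_BE", 3: "AC_BE",
            4: "AC_VI", 5: "AC_VI", 6: "AC_VO", 7: "AC_VO"}

def classify(user_priority: int) -> tuple:
    """Pick the transmit queue (and its contention parameters) for a frame."""
    ac = UP_TO_AC[user_priority]
    return ac, EDCA_DEFAULTS[ac]

if __name__ == "__main__":
    for up in range(8):
        ac, p = classify(up)
        print(f"UP {up} -> {ac}: AIFSN={p.aifsn}, CWmin={p.cw_min}, CWmax={p.cw_max}")
```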
{"title":"Completing IEEE 802.11e implementation in NS-3","authors":"Islam Obaidat, M. Alsmirat, Y. Jararweh","doi":"10.1109/IACS.2016.7476109","DOIUrl":"https://doi.org/10.1109/IACS.2016.7476109","url":null,"abstract":"Since IEEE 802.11 standard first released, it gained a worldwide attention. As a result, a huge variety of devices have implemented it. IEEE 802.11e standard was developed as an amendment to the original 802.11 standard to provide QoS support. Several simulators provides an implementation for this standard in order to study and analyze its performance. NS-3 simulator implements IEEE 802.11e, but unfortunately, we found that the implementation is not complete. In this work, we complete the implementation of the IEEE 802.11e in NS-3.","PeriodicalId":6579,"journal":{"name":"2016 7th International Conference on Information and Communication Systems (ICICS)","volume":"59 1","pages":"190-195"},"PeriodicalIF":0.0,"publicationDate":"2016-04-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"86259854","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Mining PM2.5 and traffic conditions for air quality
Pub Date: 2016-04-05 | DOI: 10.1109/IACS.2016.7476082 | Pages: 33-38
Xu Du, A. Varde
Fine particle pollution is related to road traffic conditions. In this work, we analyze particulate matter with a diameter of less than 2.5 micrometers (PM2.5) together with traffic conditions. This is done on multicity data to study the relationships in the context of environmental modeling, with the goal of supporting prediction of PM2.5 concentration and the resulting air quality. We deploy data mining algorithms for association rules, clustering, and classification to discover knowledge from the data sets concerned. The results are used to develop a prototype tool that predicts PM2.5, and hence air quality, for public health and safety. This paper describes our approach and experiments with examples of PM2.5 prediction that support decision making for potential users in a smart-cities context, including city dwellers, environmental scientists, and urban planners. The novel aspects of this work are the multicity PM2.5 analysis by data mining and the resulting air quality prediction tool, which, to the best of our knowledge, is the first of its kind.
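As a hedged illustration of the clustering-plus-classification style of mining the abstract mentions, here is a minimal scikit-learn sketch on synthetic traffic/PM2.5 data. The data sets, features, and the 35 ug/m3 cut-off are invented for the example and are not the paper's.

```python
# Illustrative sketch, not the authors' pipeline: hypothetical traffic features
# (volume, average speed) are clustered and then used to classify air quality.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in data: [traffic_volume, avg_speed_kmh], PM2.5 in ug/m3.
traffic = rng.uniform([100, 20], [2000, 90], size=(200, 2))
pm25 = 5 + 0.02 * traffic[:, 0] - 0.1 * traffic[:, 1] + rng.normal(0, 3, 200)

# Step 1: group similar traffic conditions (unsupervised).
traffic_regime = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(traffic)

# Step 2: classify air quality from traffic features + regime (supervised).
# 35 ug/m3 is used here purely as an example cut-off for "unhealthy".
labels = (pm25 > 35).astype(int)
X = np.column_stack([traffic, traffic_regime])
clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, labels)

sample = np.array([[1500, 30, 0]])   # heavy, slow traffic in regime 0
print("predicted unhealthy PM2.5:", bool(clf.predict(sample)[0]))
```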
{"title":"Mining PM2.5 and traffic conditions for air quality","authors":"Xu Du, A. Varde","doi":"10.1109/IACS.2016.7476082","DOIUrl":"https://doi.org/10.1109/IACS.2016.7476082","url":null,"abstract":"Fine particle pollution is related to road traffic conditions. In this work, we analyze Particulate Matter with a diameter less than 2.5 micrometers, called PM2.5, along with traffic conditions. This is done for multicity data to study the relationships in the context of environmental modeling. The goal behind this modeling is to support prediction of PM2.5 concentration and resulting air quality. We deploy data mining algorithms in association rules, clustering and classification to discover knowledge from the concerned data sets. The results are used to develop a prototype tool for the prediction of PM2.5 and hence air quality for public health and safety. This paper describes our approach and experiments with examples of PM2.5 prediction that would be helpful for decision support to potential users in a smart cities context. These users include city dwellers, environmental scientists and urban planners. Novel aspects of this work are multicity PM2.5 analysis by data mining and the resulting air quality prediction tool, the first of its kind, to the best of our knowledge.","PeriodicalId":6579,"journal":{"name":"2016 7th International Conference on Information and Communication Systems (ICICS)","volume":"16 1","pages":"33-38"},"PeriodicalIF":0.0,"publicationDate":"2016-04-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"89305837","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
HASO: A hot-page aware scheduling optimization method in virtualized NUMA systems
Pub Date: 2016-04-01 | DOI: 10.1109/IACS.2016.7476088 | Pages: 68-73
Butian Huang, Jianhai Chen, Qinming He, Bei Wang, Zhenguang Liu, Yuxia Cheng
Under CPU and memory overcommitment, it is inevitable that some NUMA nodes will be overloaded and become hot nodes, degrading VM application performance in virtualized NUMA (vNUMA) systems. However, the virtual machine monitor (VMM) cannot effectively track the NUMA topology and the distribution of hot memory pages among NUMA nodes. Aiming at eliminating hot nodes and load imbalance in a vNUMA system, this paper proposes a hot-page aware scheduling optimization method (HASO) and implements a HASO scheduling system. We first monitor the load state of each NUMA node, identify the hot nodes, and within them select the hot VMs that cause the hotspots. We then predict the distribution of the hot VM's future hot memory pages and evaluate the cost of migrating those pages between NUMA nodes. Finally, the hot pages of the hot VM with the minimum migration cost are migrated to idle nodes, eliminating the hot node and improving VM application performance. Compared with the default VMM scheduling mechanism, our HASO method not only improves the memory-intensive benchmark cg by up to 27.06% and the stream benchmark by up to 15.63%, but also balances the load across NUMA nodes.
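The selection logic outlined above (find hot nodes, pick a hot VM, minimize page-migration cost, move pages to an idle node) can be sketched as a greedy procedure. The threshold, cost model, and names below are hypothetical stand-ins, not the paper's implementation.

```python
# Conceptual sketch of the hot-node / hot-VM / hot-page selection described
# above; the load threshold, the cost model, and all names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Node:
    node_id: int
    load: float                       # e.g. normalised memory-bandwidth load
    vm_hot_pages: dict = field(default_factory=dict)  # vm -> predicted hot pages

HOT_THRESHOLD = 0.8
PAGE_MIGRATION_COST = 1.0             # abstract cost unit per page

def pick_migration(nodes):
    """Return (hot_vm, source_node, target_node, cost) or None."""
    hot = [n for n in nodes if n.load > HOT_THRESHOLD]
    idle = sorted((n for n in nodes if n.load <= HOT_THRESHOLD), key=lambda n: n.load)
    if not hot or not idle:
        return None
    src = max(hot, key=lambda n: n.load)              # most overloaded node
    # choose the VM whose predicted hot pages are cheapest to move
    vm, pages = min(src.vm_hot_pages.items(), key=lambda kv: len(kv[1]))
    cost = len(pages) * PAGE_MIGRATION_COST
    return vm, src, idle[0], cost

nodes = [
    Node(0, 0.95, {"vm1": set(range(400)), "vm2": set(range(120))}),
    Node(1, 0.35, {"vm3": set(range(50))}),
]
choice = pick_migration(nodes)
if choice:
    vm, src, dst, cost = choice
    print(f"migrate hot pages of {vm}: node {src.node_id} -> node {dst.node_id}, cost {cost}")
```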
{"title":"HASO: A hot-page aware scheduling optimization method in virtualized NUMA systems","authors":"Butian Huang, Jianhai Chen, Qinming He, Bei Wang, Zhenguang Liu, Yuxia Cheng","doi":"10.1109/IACS.2016.7476088","DOIUrl":"https://doi.org/10.1109/IACS.2016.7476088","url":null,"abstract":"In the situation of CPU and memory overcommit, it is inevitable that some NUMA nodes will be overloaded or hotspotted and become hot nodes, leading to the VM application performance degradation in virtualized NUMA (vNUMA) systems. However, the virtual machine monitor (VMM) can not be aware of the NUMA feature and the distribution array of hot memory pages amongst NUMA nodes effectively. Aiming at eliminating the hot nodes and load imbalance in a vNUMA system, this paper proposes a hot-page aware scheduling optimization method (HASO) and implements a HASO scheduling system. At first we monitor the NUMA node load state, find the hot nodes in which we choose the hot VMs that cause the node hotspots. Then, we predict the distribution of future hot memory pages of the hot VM, and evaluate the cost of migrating the hot pages between NUMA nodes. At last, the hot pages of hot VM with minimized migration cost are migrated to idling nodes, so as to eliminate the hot node and improve the VM application performance. In contrast to the default scheduling mechanism of VMM, our HASO scheduling method can not only improve the memory intensive benchmark cg by up to 27.06% and the benchmark stream by up to 15.63%, but also balance the load of NUMA nodes.","PeriodicalId":6579,"journal":{"name":"2016 7th International Conference on Information and Communication Systems (ICICS)","volume":"24 1","pages":"68-73"},"PeriodicalIF":0.0,"publicationDate":"2016-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"90481796","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A Petri-net based approach for software evolution
Pub Date: 2016-04-01 | DOI: 10.1109/IACS.2016.7476122 | Pages: 264-269
Mohd Anuaruddin bin Ahmadon, S. Yamaguchi, B. Gupta
In conventional software development, a program evolves as programmers make changes to its source code. This makes it difficult to develop a new version or to verify important software specifications against the original design. In this paper, we introduce a model-driven development approach to support software evolution. We propose two key methods. First, a reverse engineering method translates a software program into a Petri net model. Second, a model-driven verification method confirms that important execution sequences of the software model are preserved throughout the evolution. In our approach, a program's code can always be reconstructed as a model and verified even when changes are made at the source-code level, or vice versa; in other words, the approach is bidirectional. We illustrate the proposed method with an example of a multithreaded program.
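A minimal Petri-net sketch, assuming a toy two-transition net rather than a real translated program, shows the kind of check such verification performs: whether a required execution sequence is still firable from the current marking.

```python
# Minimal Petri-net sketch: places hold tokens, transitions consume/produce
# them, and a required execution sequence is checked for firability. The tiny
# two-transition net below is a made-up stand-in for a translated program.
class PetriNet:
    def __init__(self, marking, transitions):
        # marking: place -> token count
        # transitions: name -> (input places, output places)
        self.marking = dict(marking)
        self.transitions = transitions

    def enabled(self, t):
        ins, _ = self.transitions[t]
        return all(self.marking.get(p, 0) > 0 for p in ins)

    def fire(self, t):
        if not self.enabled(t):
            raise RuntimeError(f"transition {t} not enabled")
        ins, outs = self.transitions[t]
        for p in ins:
            self.marking[p] -= 1
        for p in outs:
            self.marking[p] = self.marking.get(p, 0) + 1

    def can_execute(self, sequence):
        """True if the whole sequence fires from the current marking."""
        snapshot = dict(self.marking)
        try:
            for t in sequence:
                self.fire(t)
            return True
        except RuntimeError:
            return False
        finally:
            self.marking = snapshot

# p_start -> t_init -> p_ready -> t_run -> p_done
net = PetriNet({"p_start": 1},
               {"t_init": (["p_start"], ["p_ready"]),
                "t_run":  (["p_ready"], ["p_done"])})
print(net.can_execute(["t_init", "t_run"]))   # True
print(net.can_execute(["t_run"]))             # False: t_run needs p_ready
```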
{"title":"A Petri-net based approach for software evolution","authors":"Mohd Anuaruddin bin Ahmadon, S. Yamaguchi, B. Gupta","doi":"10.1109/IACS.2016.7476122","DOIUrl":"https://doi.org/10.1109/IACS.2016.7476122","url":null,"abstract":"In conventional software development, a program evolves as programmers make changes to its source code. Thus, the process of developing new version or verifying important software specifications based on its original design is difficult. In this paper, we introduced a model-driven development approach to support software evolution. We proposed two key methods in our approach. First, we proposed a reverse engineering method by translating a software program into a Petri net model. Second, we proposed a model-driven verification method to confirm that important execution sequence of the software model can be preserved throughout the evolution. In our approach, a program's code can always be reconstructed as a model and be verified even though changes are made at the source code level or vice versa. In other words, our approach is bidirectional. Then, we illustrated the proposed method with an example of a multithreaded program.","PeriodicalId":6579,"journal":{"name":"2016 7th International Conference on Information and Communication Systems (ICICS)","volume":"5 6","pages":"264-269"},"PeriodicalIF":0.0,"publicationDate":"2016-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91438163","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Automatic parallel programming using the Descartes specification language
Pub Date: 2016-04-01 | DOI: 10.1109/IACS.2016.7476068 | Pages: 298-303
N. Sakhnini, Venkata N. Inukollu, J. E. Urban
Automatic programming can be defined as developing software at a high level of abstraction. The definition is not precise because what is meant by automatic programming has changed over time. The goal of automatic programming is for the programmer to set the specifications of a program and for the computer to generate that program's source code. Specification languages vary in their properties; the Descartes specification language is known to be comprehensible and easy to construct. Descartes represents specifications by defining a system's inputs and outputs, as well as the relationships between them as functions, and has been extended to support concurrent systems. These features make Descartes a good basis for this research effort, which studied automatic programming approaches and created a shortcut between specification and implementation, with all its benefits, by transforming Descartes specifications into C source code automatically. Automatic programming can apply to any field of knowledge that can be automated; the scope of this research project was therefore restricted to a few case studies involving parallel programming.
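As a toy illustration of the specification-to-code idea only, a few lines of Python can emit C source from an input/output description. The dictionary format below is invented for this sketch and is not Descartes syntax.

```python
# Toy illustration of specification-to-code generation. The dictionary format
# is invented for this sketch and is NOT Descartes syntax; it only shows the
# idea of emitting C source from an input/output description.
SPEC = {
    "name": "add",
    "inputs": [("int", "a"), ("int", "b")],
    "output": ("int", "a + b"),
}

def emit_c(spec):
    ret_type, expr = spec["output"]
    params = ", ".join(f"{t} {n}" for t, n in spec["inputs"])
    return (f"{ret_type} {spec['name']}({params}) {{\n"
            f"    return {expr};\n"
            f"}}\n")

print(emit_c(SPEC))
# int add(int a, int b) {
#     return a + b;
# }
```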
{"title":"Automatic parallel programming using the descartes specification language","authors":"N. Sakhnini, Venkata N. Inukollu, J. E. Urban","doi":"10.1109/IACS.2016.7476068","DOIUrl":"https://doi.org/10.1109/IACS.2016.7476068","url":null,"abstract":"Automatic programming can be defined as developing software in a high abstraction level. The definition of automatic programming is not precise because what is meant by automatic programming is changing over time. The goal of automatic programming has the programmer set the specifications of a program and the computer generate the source code of that program. There exists a group of specification languages that vary in their properties; the Descartes specification language is known to be comprehensible and easily constructible. Descartes represents the specifications by defining a system's inputs and outputs, as well as the relationship between these as functions. Descartes has been extended to support concurrent systems. These features made Descartes to be a good basis to build this research effort on. This research effort studied automatic programming approaches and created a shortcut between specifications and implementation with all its benefits. This research created a way to transform Descartes specifications into C source code automatically. Automatic programming can apply to all fields of knowledge that can be automated; therefore, the scope of this research project was restricted to a few case studies that involve parallel programming.","PeriodicalId":6579,"journal":{"name":"2016 7th International Conference on Information and Communication Systems (ICICS)","volume":"30 1","pages":"298-303"},"PeriodicalIF":0.0,"publicationDate":"2016-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"84519299","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A reversible conversion methodology: Between XML and object-relational models
Pub Date: 2016-04-01 | DOI: 10.1109/IACS.2016.7476063 | Pages: 270-275
M. Machkour, K. Afdel, Y. I. Khamlichi
XML is a standard for exchanging data between sites and heterogeneous applications. To exploit these data in database systems based on the relational model, conversion algorithms and methods have been developed. To do the same with object-relational systems, which extend relational systems, this paper proposes a methodology for converting an XML schema that respects a DTD (Document Type Definition) into an object-relational schema. The methodology is reversible, so the result of the conversion can be used to rebuild the initial XML schema.
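A hedged sketch of the general idea, not the paper's conversion rules: a small DTD-like element description is mapped to object-relational DDL (Oracle-style CREATE TYPE / CREATE TABLE OF). The element names and typing choices are illustrative.

```python
# Illustrative sketch only: maps a small DTD-like element description to
# object-relational DDL. The element names and the Oracle-style "CREATE TYPE"
# syntax are examples, not the paper's conversion rules.
ELEMENTS = {
    # element: list of (child, type) pairs; "TEXT" stands for #PCDATA content
    "address": [("street", "TEXT"), ("city", "TEXT")],
    "person":  [("name", "TEXT"), ("address", "address")],
}

def to_ddl(elements, root):
    stmts = []
    for name, children in elements.items():
        cols = []
        for child, ctype in children:
            sql_type = "VARCHAR2(200)" if ctype == "TEXT" else f"{ctype}_t"
            cols.append(f"  {child} {sql_type}")
        stmts.append(f"CREATE TYPE {name}_t AS OBJECT (\n" + ",\n".join(cols) + "\n);")
    stmts.append(f"CREATE TABLE {root}_tab OF {root}_t;")
    return "\n\n".join(stmts)

print(to_ddl(ELEMENTS, "person"))
```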
{"title":"A reversible conversion methodology: Between XML and object-relational models","authors":"M. Machkour, K. Afdel, Y. I. Khamlichi","doi":"10.1109/IACS.2016.7476063","DOIUrl":"https://doi.org/10.1109/IACS.2016.7476063","url":null,"abstract":"XML is a standard for data exchanging between sites and heterogeneous applications. To exploit these data by database systems based on relational model, algorithms and methods of conversion have been developed. To do same with object-relational systems representing an extension of relational systems we propose in this paper a methodology to convert a XML schema respecting a DTD (Document Type Definition) into a schema of object-relational model. This methodology is reversible so that the result of conversion can be used to rebuild the initial XML schema.","PeriodicalId":6579,"journal":{"name":"2016 7th International Conference on Information and Communication Systems (ICICS)","volume":"96 1","pages":"270-275"},"PeriodicalIF":0.0,"publicationDate":"2016-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"83938193","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A proposed method on painting authentication using Contourelet transform (PAUCT)
Pub Date: 2016-04-01 | DOI: 10.1109/IACS.2016.7476113 | Pages: 213-216
R. Karki, A. Alsadoon, P. Prasad, A. M. S. Rahma, A. Elchouemi
This paper presents a method based on the contourlet transform for art authentication. Sample digital images from different artists are converted to grayscale, and background patches are extracted from the grayscale images. The extracted patches are transformed using the contourlet transform and modeled with a Hidden Markov Tree (HMT). The Fisher distance is then calculated to measure the stylistic similarity between the artists' brushwork, and the result is analyzed to determine the authenticity of the paintings. We analyze different existing solutions, determine their limitations, and propose a new hybrid method that overcomes them. The method is tested on a sample of twenty painting images, consisting of paintings by Van Gogh and other artists, separated into Van Gogh and non-Van Gogh paintings.
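The similarity step can be illustrated with one common form of the Fisher criterion, (m1 - m2)^2 / (s1^2 + s2^2), computed per feature and averaged. The random arrays below stand in for contourlet/HMT features, which are not reproduced here, so this is a sketch of the comparison only.

```python
# Sketch of the similarity measure only: per-feature Fisher criterion averaged
# over features. The random arrays stand in for contourlet/HMT patch features.
import numpy as np

def fisher_distance(feats_a, feats_b):
    """feats_*: (n_patches, n_features) arrays of per-patch features."""
    m1, m2 = feats_a.mean(axis=0), feats_b.mean(axis=0)
    v1, v2 = feats_a.var(axis=0), feats_b.var(axis=0)
    ratio = (m1 - m2) ** 2 / (v1 + v2 + 1e-12)   # small epsilon avoids /0
    return float(ratio.mean())

rng = np.random.default_rng(1)
artist_a = rng.normal(0.0, 1.0, size=(40, 8))    # reference brushwork features
artist_b = rng.normal(0.6, 1.2, size=(40, 8))    # questioned painting features

# Larger distance -> less similar brushwork -> more likely a different hand.
print(f"Fisher distance: {fisher_distance(artist_a, artist_b):.3f}")
```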
{"title":"A proposed method on painting authentication using Contourelet transform (PAUCT)","authors":"R. Karki, A. Alsadoon, P. Prasad, A. M. S. Rahma, A. Elchouemi","doi":"10.1109/IACS.2016.7476113","DOIUrl":"https://doi.org/10.1109/IACS.2016.7476113","url":null,"abstract":"This paper presents a method based on contourlet transform for the purpose of art authentication. The sample digital images from the different artist are converted to the gray scale images. The background patches from the gray scale images are extracted. The extracted patches are transformed using Contourelet transform and modeled using Hidden Markov Tree (HMT). The Fisher distance information is calculated to measure for stylistic similarity between the brushwork of the artist. The result obtained is analyzed to determine the authenticity of the paintings. In this method we have analyzed different current existing solutions determine their limitations and propose a new hybrid method that can overcome those limitations. It is tested on the sample of twenty images of the paintings, consisting of the paintings from the Van Gog and other artist separated as the Van Gog and Non Van Gog paintings.","PeriodicalId":6579,"journal":{"name":"2016 7th International Conference on Information and Communication Systems (ICICS)","volume":"18 1","pages":"213-216"},"PeriodicalIF":0.0,"publicationDate":"2016-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"84190677","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Using bagging and boosting algorithms for 3D object labeling
Pub Date: 2016-04-01 | DOI: 10.1109/IACS.2016.7476070 | Pages: 310-315
Omar Herouane, L. Moumoun, T. Gadi
Machine learning has recently become an interesting research field for 3D object preprocessing. However, few algorithms using this automatic technique have been proposed to learn the parts of 3D objects. This paper presents two simple and efficient approaches to learning the parts of a 3D object. The approaches use Bagging or multiclass Boosting algorithms together with the Shape Spectrum Descriptor (SSD) to build classification models. The trained models assign an appropriate label to each part of every 3D object in the database. The high quality of the quantitative and qualitative results demonstrates the efficiency and performance of the proposed approaches.
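A hedged scikit-learn sketch of the two learning schemes named in the abstract, applied to synthetic stand-ins for per-part SSD histograms; the real descriptors, labels, and database are not reproduced here.

```python
# Illustrative sketch: bagging and boosting classifiers trained on synthetic
# stand-ins for per-part Shape Spectrum Descriptor histograms.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n_parts, n_bins, n_labels = 300, 16, 4          # e.g. head/body/limb/tail

# Fake SSD histograms: each part label gets a different dominant bin.
labels = rng.integers(0, n_labels, n_parts)
X = rng.random((n_parts, n_bins))
X[np.arange(n_parts), labels * (n_bins // n_labels)] += 2.0
X /= X.sum(axis=1, keepdims=True)               # normalise like a histogram

base = DecisionTreeClassifier(max_depth=3, random_state=0)
bagging = BaggingClassifier(base, n_estimators=50, random_state=0)
boosting = AdaBoostClassifier(base, n_estimators=50, random_state=0)

for name, model in [("bagging", bagging), ("boosting", boosting)]:
    model.fit(X[:200], labels[:200])
    acc = model.score(X[200:], labels[200:])
    print(f"{name} accuracy on held-out parts: {acc:.2f}")
```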
{"title":"Using bagging and boosting algorithms for 3D object labeling","authors":"Omar Herouane, L. Moumoun, T. Gadi","doi":"10.1109/IACS.2016.7476070","DOIUrl":"https://doi.org/10.1109/IACS.2016.7476070","url":null,"abstract":"Machine learning has recently become an interesting research field in 3D objects preprocessing. However, few algorithms using this automatic technique have been proposed to learn 3D objects parts. The aim of this paper is to present two simple and efficient approaches to learn parts of a 3D object. These approaches use Bagging or multiclass Boosting algorithms and the Shape Spectrum Descriptor (SSD) to build the classification models. The trained models will assign an appropriate label to each part of the 3D object of the database. The high quality of the quantitative and qualitative results obtained demonstrated the efficiency and the performance of the proposed approaches.","PeriodicalId":6579,"journal":{"name":"2016 7th International Conference on Information and Communication Systems (ICICS)","volume":"10 1","pages":"310-315"},"PeriodicalIF":0.0,"publicationDate":"2016-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"75407550","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}