Pub Date: 2011-12-01  DOI: 10.1109/CIS.2011.6169128
Title: Hybrid Particle Swarm Optimization with parameter selection approaches to solve Flow Shop Scheduling Problem
Authors: Xuefeng Zhang, Xuanye An, M. Koshimura, H. Fujita, R. Hasegawa
Abstract: A Hybrid Particle Swarm Optimization (HPSO) with parameter selection approaches is proposed to solve the Flow Shop Scheduling Problem (FSSP) with the objective of minimizing makespan. The HPSO integrates the basic structure of Particle Swarm Optimization (PSO) with features borrowed from Tabu Search (TS) and Simulated Annealing (SA). The algorithm works from a population of candidate schedules and generates new populations of neighbor and cooling schedules by applying suitable small perturbation schemes. Furthermore, PSO is very sensitive to its parameter settings: modifying a single parameter may cause a considerable change in the result. Two new classes of adaptive selection for the inertia weight and the acceleration coefficients are therefore introduced. Extensive experiments on benchmarks of different scales validate the effectiveness of our approaches compared with other well-established methods. The experimental results yield new upper bounds for some unsolved problems and better solutions within a reasonable time. In addition, the proposed algorithms converge to the stopping criteria significantly faster.

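Minimizing makespan is the stated objective of the HPSO above. As a point of reference, here is a minimal sketch of how the makespan of a permutation flow-shop schedule is computed; the 3-job, 2-machine processing times are hypothetical, and this is only the objective function, not the paper's HPSO.

```python
def makespan(sequence, proc):
    """Makespan of a permutation flow-shop schedule.

    sequence: job indices in processing order
    proc[j][m]: processing time of job j on machine m
    """
    n_machines = len(proc[0])
    # completion[m] = completion time of the most recent job on machine m
    completion = [0] * n_machines
    for j in sequence:
        for m in range(n_machines):
            # a job starts on machine m when both the machine is free and
            # the job has finished on the previous machine
            earlier = completion[m - 1] if m > 0 else 0
            completion[m] = max(completion[m], earlier) + proc[j][m]
    return completion[-1]

# Hypothetical instance: 3 jobs, 2 machines
proc = [[3, 2], [1, 4], [2, 2]]
print(makespan([0, 1, 2], proc))  # 11
```

A metaheuristic such as the paper's HPSO searches over permutations (`sequence`) to minimize this value.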
Pub Date: 2011-09-01  DOI: 10.1109/CIS.2011.6169146
Title: Using probabilistic networks for the London plan knowledge representation
Authors: A. Hosseinian-Far, H. Jahankhani, E. Pimenidis
Abstract: The London Plan is the Mayor of London's strategic plan. Policy 4A puts forward the "Tackling climate change" scenario. Many research studies have analyzed the environmental and social aspects of the policy, but an extensive financial assessment is lacking. The Influence Diagrams in this paper concern the financial assessment of Policy 4A.2 of the London Plan. The problem is complex, uncertain, and requires prediction. Probabilistic Networks are novel tools for policy Knowledge Representation. This paper outlines the construction of Influence Diagrams for the London Plan case. It also analyses the variables, the functions, and the relationships between them, and gives an overview of ID evaluation. In addition, an introduction to efficiency analysis of the ID is given.

Pub Date: 2011-09-01  DOI: 10.1109/CIS.2011.6169135
Title: Sentences Simplification for Automatic summarization
Authors: Abdullah Bawakid, M. Oussalah
Abstract: In this paper, we emphasize the need for conserving space within sentences by introducing a Sentences Simplification Module (SSM). The module aims to shorten sentences via either splitting or compression. We describe how the module is integrated into a Wikipedia-based summarization framework, and we highlight the performance differences obtained from introducing such a module by running a series of evaluations.

Pub Date: 2011-09-01  DOI: 10.1109/CIS.2011.6169148
Title: The design methodology for the verification of hybrid dynamical systems
Authors: M. Pluska, D. Sinclair
Abstract: This work presents the OHMS methodology. Its main aim is to design a model of a complex system that is easy to process by a formal model-checking procedure. The outcome is a verification report showing the safety of the system. As a novel approach, the complex mathematical notation is hidden from the user in favour of an object-based approach with graphical notation. This gives the user a better experience and more flexibility in the design. At the end of the process, the user is still provided with the formal verification report, which helps to ensure a correct design.

Pub Date: 2011-09-01  DOI: 10.1109/CIS.2011.6169147
Title: A novel technique for object oriented relational database design
Authors: H. Attaran, A. Hosseinian Far
Abstract: This paper reflects on a new method for designing and implementing object-oriented relational databases. An introduction to the conventional methods is given first. The novelty of the algorithm is that it derives the database and its relational tables directly from the system's classes; system class recognition is also embedded within the proposed method. Furthermore, the proposed method is presented by means of a case study, and already implemented systems are introduced. Other advantages of the new method and further work are outlined.

Pub Date: 2011-09-01  DOI: 10.1109/CIS.2011.6169127
Title: Electrocardiogram beat classification using classifier fusion based on Decision Templates
Authors: Atena Sajedin, R. Ebrahimpour, Tahmoures Younesi Garousi
Abstract: This paper presents a "Decision Templates" (DTs) approach to developing a customized Electrocardiogram (ECG) beat classifier, in an effort to further improve the performance of ECG classification. Taking advantage of the Undecimated Wavelet Transform (UWT), which also serves as a tool for noise reduction, we extracted ten ECG morphological features as well as one timing-interval feature. For classification we used a number of diverse multilayer perceptron (MLP) neural networks as base classifiers, trained with the back-propagation algorithm. We then employed and compared different combination methods. Tested on the MIT-BIH arrhythmia database, the approach shows significant performance enhancement.

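The Decision Templates combiner described above can be sketched in a few lines: each class's template is the mean of the base classifiers' soft outputs over that class's training samples, and a new sample is assigned to the class whose template is nearest to its decision profile. The toy profiles below are hypothetical, and Euclidean distance is just one of the similarity measures used with DTs.

```python
import numpy as np

def build_templates(profiles, labels, n_classes):
    """Decision template for class c = mean decision profile over the
    training samples of class c.
    profiles: array (n_samples, n_classifiers, n_classes) of soft outputs."""
    return np.array([profiles[labels == c].mean(axis=0) for c in range(n_classes)])

def dt_classify(profile, templates):
    """Assign the class whose template is closest (Euclidean) to the profile."""
    dists = [np.linalg.norm(profile - t) for t in templates]
    return int(np.argmin(dists))

# Hypothetical soft outputs of 2 base classifiers on 4 training samples
profiles = np.array([
    [[0.9, 0.1], [0.8, 0.2]],
    [[0.7, 0.3], [0.9, 0.1]],
    [[0.2, 0.8], [0.1, 0.9]],
    [[0.3, 0.7], [0.2, 0.8]],
])
labels = np.array([0, 0, 1, 1])
templates = build_templates(profiles, labels, n_classes=2)
print(dt_classify(np.array([[0.85, 0.15], [0.8, 0.2]]), templates))  # 0
```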
Pub Date: 2011-09-01  DOI: 10.1109/CIS.2011.6169129
Title: Planning of multiple autonomous vehicles using RRT
Authors: R. Kala, K. Warwick
Abstract: Criteria such as driving safety and overall travel efficiency have led to increasing attempts at vehicle autonomy, wherein different vehicles can plan their journeys, maneuver according to the scenario, and communicate with each other to create an error-free travel plan. In this paper we present the use of Rapidly-exploring Random Trees (RRT) for the planning of multiple vehicles in traffic scenarios. The planner for each vehicle uses RRT to generate a travel plan. Spline curves are used to smooth the path generated by the RRT, which follows non-holonomic constraints. Priority is used as a coordination mechanism, wherein a higher-priority vehicle attempts to avoid all lower-priority vehicles. The planner attempts to find the maximum speed at which the vehicle may travel and the corresponding path. Experimental results show that, using this approach, multiple vehicles may be planned to travel in a fairly complex obstacle grid. Further, the vehicles exhibited behaviors commonly seen in everyday driving, including vehicle following and overtaking.

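The core RRT loop the planner builds on is simple: sample a random point, steer the nearest tree node a small step toward it, and keep the new node if it is collision-free. The sketch below is a minimal holonomic 2-D version with an assumed 10x10 workspace; the paper's planner additionally handles non-holonomic constraints, spline smoothing, and priority-based coordination, none of which are shown here.

```python
import math
import random

def rrt(start, goal, is_free, step=0.5, iters=2000, goal_tol=0.5):
    """Minimal 2-D RRT sketch (assumed geometry; not the paper's planner).
    Grows a tree from start toward random samples; returns a start-to-goal
    path once a node lands within goal_tol of the goal, else None."""
    random.seed(0)  # deterministic for illustration
    nodes = [start]
    parent = {0: None}
    for _ in range(iters):
        sample = (random.uniform(0, 10), random.uniform(0, 10))
        # nearest existing node to the sample
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], sample))
        nx, ny = nodes[i]
        d = math.dist((nx, ny), sample)
        if d == 0:
            continue
        # steer one step from the nearest node toward the sample
        new = (nx + step * (sample[0] - nx) / d, ny + step * (sample[1] - ny) / d)
        if not is_free(new):
            continue  # reject nodes in collision
        parent[len(nodes)] = i
        nodes.append(new)
        if math.dist(new, goal) < goal_tol:
            # walk parent pointers back to the root to recover the path
            path, k = [], len(nodes) - 1
            while k is not None:
                path.append(nodes[k])
                k = parent[k]
            return path[::-1]
    return None

path = rrt((1.0, 1.0), (9.0, 9.0), is_free=lambda p: True)
```

In the multi-vehicle setting, a lower-priority vehicle's `is_free` test would also reject states that conflict with the trajectories already committed by higher-priority vehicles.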
Pub Date: 2011-09-01  DOI: 10.1109/CIS.2011.6169144
Title: Time series analyses for forecasting network intrusions
Authors: J. Nehinbe
Abstract: Intrusion Detection Systems are fast-growing techniques for monitoring and garnering electronic evidence about suspicious activities that signify threats to computer systems. Generally, these mechanisms describe and record patterns of suspicious packets as alerts in the form of intrusion logs. Analysts must then validate the content of each intrusion log to ascertain the validity of each alert, and a high level of expertise is required to interpret each alert. As a result, time and resources are unduly spent at the expense of countermeasures that ought to be proactively initiated to thwart attacks in progress. Accordingly, a TSA-Log analyzer that uses a computationally fast technique and a uniform baseline to determine patterns of intrusions is proposed in this paper. Validations carried out on five publicly available datasets demonstrate that the propagation strategies of intrusions, efficient countermeasures, and the extent of similarity of intrusions can be forecast given knowledge of the patterns of alerts in intrusion logs.

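The abstract does not specify the time-series technique TSA-Log uses, but the general idea of forecasting from alert patterns can be illustrated with the simplest possible pipeline: bucket raw alert timestamps into a count series and extrapolate it. The bucketing granularity, the moving-average forecaster, and the sample timestamps below are all illustrative assumptions, not the paper's method.

```python
from collections import Counter

def alert_rate_series(alert_times, bucket=60):
    """Bucket raw alert timestamps (seconds) into per-bucket counts."""
    counts = Counter(int(t) // bucket for t in alert_times)
    horizon = max(counts) + 1 if counts else 0
    return [counts.get(i, 0) for i in range(horizon)]

def moving_average_forecast(series, window=3):
    """One-step-ahead forecast: mean of the last `window` observations."""
    tail = series[-window:]
    return sum(tail) / len(tail)

# Illustrative alert timestamps (seconds since the start of the log)
series = alert_rate_series([5, 30, 70, 130, 140, 150])
print(series)                          # [2, 1, 3]
print(moving_average_forecast(series)) # 2.0
```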
Pub Date: 2011-09-01  DOI: 10.1109/CIS.2011.6169130
Title: Logical reasoning and decision making
Authors: D. Ong, S. Khaddaj, R. Bashroush
Abstract: Most intelligent systems have some form of decision-making mechanism built into their organisation, and these normally include a logical reasoning element in their design. This paper reviews and compares the different logical reasoning strategies, and addresses the accuracy and precision of decision making by formulating a tolerance-to-imprecision view that can be used in conjunction with the various reasoning strategies.

Pub Date: 2011-09-01  DOI: 10.1109/CIS.2011.6169134
Title: New information content metric and nominalization relation for a new WordNet-based method to measure the semantic relatedness
Authors: M. A. Hadj Taieb, Mohamed Ben Aouicha, M. Tmar, Abdelmajid Ben Hamadou
Abstract: Semantic similarity techniques compute the semantic similarity (common shared information) between two concepts according to language or domain resources such as ontologies, taxonomies, and corpora. They constitute important components of most Information Retrieval (IR) and knowledge-based systems. Taking semantics into account requires external semantic resources, coupled with the initial documentation, over which semantic similarity measurements are needed to compare concepts. This paper presents a new approach for measuring semantic relatedness between words and concepts. It combines a new information content (IC) metric using the WordNet thesaurus with the nominalization relation provided by the Java WordNet Library (JWNL). Specifically, the proposed method makes thorough use of the hypernym/hyponym relation (the noun and verb "is a" taxonomy) without external corpus statistics. Mainly, we use the subgraph formed by the hypernyms of the concerned concept, which inherits all the features of its hypernyms, and we quantify the contribution of each concept in this subgraph to the concept's information content. When tested on a common dataset of word-pair similarity ratings, the proposed approach outperforms other computational models. It gives the highest correlation value, 0.70, with a benchmark based on human similarity judgments, notably on the large dataset of 260 Finkelstein word pairs (Appendices 1 and 2).

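The hypernym-subgraph idea above can be sketched on a toy "is-a" taxonomy: collect a concept's ancestors, then score specificity from the size of that set. Both the tiny taxonomy and the logarithmic IC formula below are illustrative stand-ins (WordNet and the authors' exact metric are not reproduced here); the point is only that deeper concepts, with larger hypernym subgraphs, receive higher IC.

```python
import math

def hypernym_closure(concept, hypernyms):
    """All ancestors of a concept in an 'is-a' taxonomy
    (toy stand-in for the WordNet hypernym subgraph)."""
    seen, stack = set(), [concept]
    while stack:
        for h in hypernyms.get(stack.pop(), []):
            if h not in seen:
                seen.add(h)
                stack.append(h)
    return seen

def info_content(concept, hypernyms, n_concepts):
    """Illustrative IC: more specific concepts (larger hypernym subgraph)
    carry more information. A simplification, not the authors' metric."""
    subgraph_size = len(hypernym_closure(concept, hypernyms)) + 1  # + concept itself
    return math.log(subgraph_size) / math.log(n_concepts)

# Toy taxonomy: entity > animal > {dog, cat}
tax = {"dog": ["animal"], "cat": ["animal"], "animal": ["entity"]}
print(info_content("dog", tax, 4) > info_content("animal", tax, 4))  # True
```

A Resnik-style relatedness score would then compare two concepts via the IC of their shared subsumers, e.g. `hypernym_closure(a, tax) & hypernym_closure(b, tax)`.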