Pub Date: 2013-04-28 | DOI: 10.1109/ICMSAO.2013.6552560
M. Qais, Zeyad AbdulWahid
In this paper, we introduce several modifications to the standard particle swarm optimization (PSO) algorithm to obtain better results. We modify the velocity equation by inserting triangular (trigonometric) functions, cosine and sine, increasing the inertia weight, and introducing a new method to avoid stagnation. The modified algorithm, named Triangular Particle Swarm Optimization (TriPSO), was tested on five well-known benchmark functions (Sphere, Ackley, Rastrigin, Rosenbrock, and Schwefel p2.26). The results are compared with those of standard PSO and several published improved PSO variants (SPSO, PSO-XD, CPSO-S, and PSO-P5); the comparison shows that TriPSO achieves the best results.
{"title":"A new method for improving particle swarm optimization algorithm (TriPSO)","authors":"M. Qais, Zeyad AbdulWahid","doi":"10.1109/ICMSAO.2013.6552560","DOIUrl":"https://doi.org/10.1109/ICMSAO.2013.6552560","url":null,"abstract":"In this paper, we introduced some modifications in the standard particles swarm optimization algorithm to get better results. We modified the velocity equation by inserting triangular functions (cosine and sine), increasing inertia weight and introducing a new method to avoid the stagnation problem. The modified algorithm named as Triangular Particle Swarm Optimization (TriPSO) was tested by five well-known benchmark functions (Sphere, Ackley, Rastrigin, Rosenbrock and Schwefel p2.26). The obtained results are compared with those of standard PSO and different published improved PSO algorithms (SPSO, PSO-XD, CPSO-S and PSO-P5), the comparison showed that TriPSO has the best results.","PeriodicalId":339666,"journal":{"name":"2013 5th International Conference on Modeling, Simulation and Applied Optimization (ICMSAO)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130653142","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
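The abstract does not give the exact TriPSO velocity equation, but the idea of modulating the two attraction terms with cosine and sine can be sketched as follows. The placement of the trigonometric factors, the constant inertia weight, and the bound handling are all illustrative assumptions, not the published update rule:

```python
import math
import random

def tri_pso(f, dim, n_particles=30, iters=200, w=0.9, c1=2.0, c2=2.0, bound=5.12):
    """PSO with a hypothetical trigonometric modulation of the attraction terms."""
    X = [[random.uniform(-bound, bound) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]
    pbest_val = [f(x) for x in X]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                theta = random.uniform(0.0, 2.0 * math.pi)
                # cosine scales the cognitive pull, sine the social pull (assumption)
                V[i][d] = (w * V[i][d]
                           + c1 * abs(math.cos(theta)) * (pbest[i][d] - X[i][d])
                           + c2 * abs(math.sin(theta)) * (gbest[d] - X[i][d]))
                X[i][d] = min(bound, max(-bound, X[i][d] + V[i][d]))
            val = f(X[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = X[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = X[i][:], val
    return gbest, gbest_val

sphere = lambda x: sum(v * v for v in x)   # Sphere benchmark from the paper
best, best_val = tri_pso(sphere, dim=5)
```

The same driver works for the other benchmarks by swapping in their objective functions.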
Pub Date: 2013-04-28 | DOI: 10.1109/ICMSAO.2013.6552562
Salma Mezghani, H. Chabchoub, B. Aouni
The bin packing problem (BPP) has many practical applications. The classical single-objective formulation allocates all items to the minimum number of bins. However, the BPP can be viewed as a bi-objective problem in which two conflicting objectives are optimized simultaneously: the total cost and the conflicts among items within the bins. Aggregating these objectives requires some compromise from the manager. In this paper, we propose a goal programming model with satisfaction functions that aggregates the objectives and explicitly integrates the manager's preferences.
{"title":"Manager's preferences in the Bi-Objectives Bin Packing Problem","authors":"Salma Mezghani, H. Chabchoub, B. Aouni","doi":"10.1109/ICMSAO.2013.6552562","DOIUrl":"https://doi.org/10.1109/ICMSAO.2013.6552562","url":null,"abstract":"The bin packing problem (BPP) a have many practical applications. The general single-objective formulation consists of allocating all objects in the minimum number of bins. However, BPP can be seen as a bi-objectives problem where the following objectives can be optimized simultaneously, the total cost and conflicts among the items within the bins. These objectives are conflicting. Their aggregation requires some compromise from the managers. In this paper, we will be proposing a goal programming model and the satisfaction functions to aggregate the objectives and explicitly integrates the manager's preferences.","PeriodicalId":339666,"journal":{"name":"2013 5th International Conference on Modeling, Simulation and Applied Optimization (ICMSAO)","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130504735","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
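The goal programming idea can be pictured on a tiny instance by brute force: each packing is scored by the weighted positive deviations of its two objectives from the manager's goals. The linear deviation weights below stand in for the authors' satisfaction functions, and all instance data, goals, and weights are invented for the example:

```python
from itertools import product

def goal_programming_bpp(sizes, conflicts, capacity, max_bins,
                         cost_goal, conflict_goal, w_cost=1.0, w_conf=1.0):
    """Score every feasible packing of a tiny instance by the weighted positive
    deviations of its two objectives (bins used, conflicts) from the goals."""
    n = len(sizes)
    best = None
    for assign in product(range(max_bins), repeat=n):
        loads = [0] * max_bins
        for item, b in enumerate(assign):
            loads[b] += sizes[item]
        if any(load > capacity for load in loads):
            continue                                  # capacity violated
        n_bins = len(set(assign))
        n_conf = sum(1 for i, j in conflicts if assign[i] == assign[j])
        # goal programming: penalize only deviations above each goal
        dev = w_cost * max(0, n_bins - cost_goal) + w_conf * max(0, n_conf - conflict_goal)
        key = (dev, n_bins, n_conf)
        if best is None or key < best[0]:
            best = (key, assign)
    return best

sizes = [4, 3, 3, 2, 2]
conflicts = [(0, 1), (2, 3)]          # item pairs that clash when packed together
(best_key, assignment) = goal_programming_bpp(
    sizes, conflicts, capacity=7, max_bins=3, cost_goal=2, conflict_goal=0)
```

On this instance a packing exists that meets both goals exactly: two bins, no conflicts.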
Pub Date: 2013-04-28 | DOI: 10.1109/ICMSAO.2013.6552544
O. Dridi, S. Krichen, A. Guitouni
The resource-constrained project scheduling problem (RCPSP) is a general scheduling problem in which activities must be scheduled so that the makespan is minimized. The RCPSP is known to be NP-hard; that is, it is hard to solve in reasonable computational time. Therefore, numerous metaheuristic approaches have been developed to find near-optimal solutions. Genetic algorithms have been applied to a wide variety of combinatorial optimization problems and have proved efficient. However, premature convergence may cause the search to stagnate in restricted regions of the search space. To address this drawback, and given the good performance of local search procedures, we propose a genetic local search algorithm (GLSA) for solving the RCPSP. Simulation results demonstrate that the proposed GLSA is an effective and efficient approach.
{"title":"Solving resource-constrained project scheduling problem by a genetic local search approach","authors":"O. Dridi, S. Krichen, A. Guitouni","doi":"10.1109/ICMSAO.2013.6552544","DOIUrl":"https://doi.org/10.1109/ICMSAO.2013.6552544","url":null,"abstract":"The resource-constrained project scheduling problem is a general scheduling problem which involving activities need to be scheduled such that the makespan is minimized. However, the RCPSP is confirmed to be an NP-hard combinatorial problem. Restated, it is hard to be solved in a reasonable computational time. Therefore, numerous metaheuristics-based approaches have been developed for finding near-optimal solution for RCPSP. Genetic algorithms have been applied to a wide variety of combinatorial optimization problems and have proved their efficiency. However, prematurely convergence may lead to search stagnation on restricted regions of the search space. To deal with this drawback and beside the good performances attained by local search procedures, a genetic local search algorithm for solving the RCPSP is proposed. Simulation results demonstrate that the proposed GLSA provides an effective and efficient approach for solving RCPSP.","PeriodicalId":339666,"journal":{"name":"2013 5th International Conference on Modeling, Simulation and Applied Optimization (ICMSAO)","volume":"22 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129458771","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
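The hybrid scheme (genetic operators plus a descent applied to each offspring) can be sketched on a simpler permutation problem, single-machine total weighted completion time, where adjacent-swap descent provably reaches the WSPT optimum. The operators and parameters below are generic choices, not the authors' RCPSP encoding:

```python
import random

def total_weighted_completion(p, w, perm):
    """Objective: sum of weighted completion times on one machine."""
    t, total = 0, 0
    for j in perm:
        t += p[j]
        total += w[j] * t
    return total

def genetic_local_search(p, w, pop_size=10, generations=30, seed=3):
    rnd = random.Random(seed)
    n = len(p)
    cost = lambda perm: total_weighted_completion(p, w, perm)

    def local_search(perm):
        # first-improvement adjacent-swap descent
        best, best_c = perm[:], cost(perm)
        improved = True
        while improved:
            improved = False
            for i in range(n - 1):
                cand = best[:]
                cand[i], cand[i + 1] = cand[i + 1], cand[i]
                c = cost(cand)
                if c < best_c:
                    best, best_c, improved = cand, c, True
        return best, best_c

    def order_crossover(p1, p2):
        a, b = sorted(rnd.sample(range(n), 2))
        mid = p1[a:b]
        fill = [g for g in p2 if g not in mid]
        return fill[:a] + mid + fill[a:]

    pop = []
    for _ in range(pop_size):
        perm = list(range(n))
        rnd.shuffle(perm)
        pop.append(local_search(perm))
    for _ in range(generations):
        parent1 = min(rnd.sample(pop, 3), key=lambda s: s[1])[0]   # tournament
        parent2 = min(rnd.sample(pop, 3), key=lambda s: s[1])[0]
        child = order_crossover(parent1, parent2)
        if rnd.random() < 0.2:                                     # swap mutation
            i, j = rnd.sample(range(n), 2)
            child[i], child[j] = child[j], child[i]
        pop.append(local_search(child))
        pop.remove(max(pop, key=lambda s: s[1]))                   # drop the worst
    return min(pop, key=lambda s: s[1])

p = [3, 1, 4, 2, 5, 2]   # processing times (toy data)
w = [1, 4, 2, 3, 1, 2]   # weights (toy data)
best_perm, best_cost = genetic_local_search(p, w)
```

For this objective, any sequence not sorted by p/w has an improving adjacent swap, so every locally optimal individual is the WSPT-optimal sequence.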
Pub Date: 2013-04-28 | DOI: 10.1109/ICMSAO.2013.6552689
Imen Hamdi, T. Loukil
In this paper, we consider the problem of scheduling n jobs in an m-machine permutation flowshop with time lags between consecutive operations of each job. The processing order of the jobs is the same on every machine, and the time lag is defined as the waiting time between consecutive operations. We use logic-based Benders decomposition to minimize the total number of tardy jobs over a long time horizon defined on the last machine: mixed integer linear programming (MILP) allocates jobs to time intervals of the horizon, and constraint programming (CP) schedules the jobs within each interval. A lower bound based on Moore's algorithm is also developed, and computational results are reported.
{"title":"Logic-based Benders decomposition to solve the permutation flowshop scheduling problem with time lags","authors":"Imen Hamdi, T. Loukil","doi":"10.1109/ICMSAO.2013.6552689","DOIUrl":"https://doi.org/10.1109/ICMSAO.2013.6552689","url":null,"abstract":"In this paper, we consider the problem of scheduling n jobs in an m-machine permutation flowshop with time lags between consecutive operations of each job. The processing order of jobs is the same for each machine. The time lag is defined as the waiting time between consecutive operations. We use logic-based Benders decomposition to minimize the total number of tardy jobs with long time horizon defined on the last machine. We combine Mixed Integer Linear programming (MILP) to allocate jobs to time intervals of the time horizon and scheduled using Constraint Programming (CP). Also, a lower bound based on Moore's algorithm is developed. Then, computational results are reported.","PeriodicalId":339666,"journal":{"name":"2013 5th International Conference on Modeling, Simulation and Applied Optimization (ICMSAO)","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129788855","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
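Moore's algorithm, on which the paper's lower bound is based, minimizes the number of tardy jobs on a single machine and is short enough to state in full (the instance data below is invented for illustration):

```python
def moore_hodgson(jobs):
    """Moore's algorithm: keep a maximum set of on-time jobs, so the number of
    tardy jobs is minimized. jobs: list of (processing_time, due_date)."""
    on_time, t = [], 0
    for p, d in sorted(jobs, key=lambda job: job[1]):   # earliest-due-date order
        on_time.append((p, d))
        t += p
        if t > d:                                       # latest job finishes tardy:
            k = max(range(len(on_time)), key=lambda i: on_time[i][0])
            t -= on_time[k][0]                          # eject the longest job so far
            on_time.pop(k)
    return on_time, len(jobs) - len(on_time)

jobs = [(2, 3), (4, 5), (3, 6), (1, 8), (4, 9)]         # (processing time, due date)
on_time, n_tardy = moore_hodgson(jobs)
```

The ejected jobs can be appended in any order after the on-time set without changing the objective.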
Pub Date: 2013-04-28 | DOI: 10.1109/ICMSAO.2013.6552696
S. Chaabouni, Salma Jammoussi, Y. Benayed
The objective of this work is to design a new method that integrates the Vapnik theory of support vector machines into data clustering. For this purpose we turn to bio-inspired metaheuristics, which develop models for classes of problems by drawing on behavior patterns studied in ethology; Particle Swarm Optimization (PSO) is one of the most recent and widely used methods of this kind. Inspired by this paradigm, we propose a new clustering method, PSvmC, which ensures the best separation of unlabeled data sets into two groups. It combines the basic principles of SVM with the PSO metaheuristic to solve the clustering problem, contributing to the analysis of multivariate data. The obtained groups are as homogeneous as possible: the intra-class value compares favorably with those obtained by hierarchical clustering, simple k-means, and EM algorithms on several benchmark databases.
{"title":"Particle swarm optimization for support vector clustering Separating hyper-plane of unlabeled data","authors":"S. Chaabouni, Salma Jammoussi, Y. Benayed","doi":"10.1109/ICMSAO.2013.6552696","DOIUrl":"https://doi.org/10.1109/ICMSAO.2013.6552696","url":null,"abstract":"The objective of this work is to design a new method to solve the problem of integrating the Vapnik theory, as regards support vector machines, in the field of clustering data. For this we turned to bio-inspired meta-heuristics. Bio-inspired approaches aim to develop models resolving a class of problems by drawing on patterns of behavior developed in ethology. For instance, the Particle Swarm Optimization (PSO) is one of the latest and widely used methods in this regard. Inspired by this paradigm we propose a new method for clustering. The proposed method PSvmC ensures the best separation of the unlabeled data sets into two groups. It aims specifically to explore the basic principles of SVM and to combine it with the meta-heuristic of particle swarm optimization to resolve the clustering problem. Indeed, it makes a contribution in the field of analysis of multivariate data. Obtained results present groups as homogeneous as possible. 
Indeed, the intra-class value is more efficient when comparing it to those obtained by Hierarchical clustering, Simple K-means and EM algorithms for different database of benchmark.","PeriodicalId":339666,"journal":{"name":"2013 5th International Conference on Modeling, Simulation and Applied Optimization (ICMSAO)","volume":"72 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123175346","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2013-04-28 | DOI: 10.1109/ICMSAO.2013.6552720
Gharbi Leila, Halioui Khamoussi
This paper examines informational market efficiency in the Islamic and conventional markets of the Gulf Cooperation Council (GCC) region, investigating whether Islamic markets are more or less efficient than conventional ones. Findings indicate that both the Dow Jones Islamic Market GCC and Dow Jones GCC indexes exhibit random-walk characteristics. However, we find an impact of the market illiquidity variable on Islamic stock prices, though to a smaller extent than in the conventional banking sector. We also observe that investor sentiment has large explanatory power for stock prices in both the Islamic and conventional banking sectors.
{"title":"Informational market efficiency in GCC region: A comparative study between Islamic and conventional markets","authors":"Gharbi Leila, Halioui Khamoussi","doi":"10.1109/ICMSAO.2013.6552720","DOIUrl":"https://doi.org/10.1109/ICMSAO.2013.6552720","url":null,"abstract":"This paper examines the informational market efficiency in the Islamic and conventional markets in the Gulf Cooperation Council (GCC) region. It aims to investigate whether Islamic markets would be more or less efficient than the conventional ones. Findings indicate that both Dow Jones Islamic Market GCC and Dow Jones GCC Indexes show characteristics of random walk. However, we find an impact of market illiquidity variable on Islamic stock prices but with small extent compared with conventional banking sectors. It is also observed that investor sentiment takes a large explanatory power in the explanation of the stock prices for both Islamic and conventional banking sectors.","PeriodicalId":339666,"journal":{"name":"2013 5th International Conference on Modeling, Simulation and Applied Optimization (ICMSAO)","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126381477","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2013-04-28 | DOI: 10.1109/ICMSAO.2013.6552701
Sondes Fayech, N. Essoussi, M. Limam
Protein secondary structure prediction is a key step in the prediction of protein tertiary structure. Many methods based on machine learning techniques, such as neural networks (NN) and support vector machines (SVM), have been proposed for predicting secondary structures. In this paper we propose a new method, DM-pred, which combines a protein clustering method to detect homologous sequences, a sequential pattern mining method to detect frequent patterns, feature extraction and quantification approaches to prepare features, and an SVM to predict structures. Tested on the most popular secondary structure datasets, DM-pred achieved a Q3 accuracy of 78.20% and an SOV of 76.49%, placing it among the top methods for protein secondary structure prediction.
{"title":"Data mining techniques to predict protein secondary structures","authors":"Sondes Fayech, N. Essoussi, M. Limam","doi":"10.1109/ICMSAO.2013.6552701","DOIUrl":"https://doi.org/10.1109/ICMSAO.2013.6552701","url":null,"abstract":"Protein secondary structure prediction is a key step in prediction of protein tertiary structure. There have emerged many methods based on machine learning techniques, such as neural networks (NN) and support vector machines (SVM), to focus on the prediction of the secondary structures. In this paper a new method, DM-pred, was proposed based on a protein clustering method to detect homologous sequences, a sequential pattern mining method to detect frequent patterns, features extraction and quantification approaches to prepare features and SVM method to predict structures. When tested on the most popular secondary structure datasets, DM-pred achieved a Q3 accuracy of 78.20% and a SOV of 76.49% which illustrates that it is one of the top range methods for protein secondary structure prediction.","PeriodicalId":339666,"journal":{"name":"2013 5th International Conference on Modeling, Simulation and Applied Optimization (ICMSAO)","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126866344","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2013-04-28 | DOI: 10.1109/ICMSAO.2013.6552665
A. Adamou-Mitiche, L. Mitiche, V. Sima
We use the two-dimensional windowing method to design a digital 2-D FIR filter with linear phase, circularly symmetric with respect to the origin of the frequency plane. To obtain an economical filter with high information efficiency, we then apply the balanced realization method to this full-order filter. The result is a linear-phase IIR filter whose frequency response is very close to that of the initial filter.
{"title":"On the synthesis of digital two dimensional filters based on FIR filters approximation","authors":"A. Adamou-Mitiche, L. Mitiche, V. Sima","doi":"10.1109/ICMSAO.2013.6552665","DOIUrl":"https://doi.org/10.1109/ICMSAO.2013.6552665","url":null,"abstract":"We use the two-dimensional windowing method to design a digital 2D-FIR filter with linear phase, circularly symmetric with respect to the origin of the frequency plane. To get an economical filter with high information efficiency, an interesting way is applying the balanced realization method to this full-order filter. As a result, a linear phase IIR filter is obtained whose frequency response is very close to that of the initial filter.","PeriodicalId":339666,"journal":{"name":"2013 5th International Conference on Modeling, Simulation and Applied Optimization (ICMSAO)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127304940","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
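A minimal numpy sketch of the window method for a circularly symmetric, linear-phase 2-D FIR lowpass filter; the grid size, Hamming window choice, and cutoff are illustrative assumptions, and the paper's balanced-realization reduction step is not shown:

```python
import numpy as np

def circular_lowpass_2d(N=21, cutoff=0.4):
    """Window-method design of a linear-phase, circularly symmetric 2-D FIR
    lowpass filter. cutoff is the radius in normalized frequency (Nyquist = 1)."""
    M = 128                                    # dense frequency grid for sampling
    f = np.fft.fftfreq(M) * 2                  # normalized frequencies in [-1, 1)
    FX, FY = np.meshgrid(f, f)
    ideal = (np.sqrt(FX**2 + FY**2) <= cutoff).astype(float)   # circular passband
    h = np.real(np.fft.fftshift(np.fft.ifft2(ideal)))          # zero-phase response
    c, r = M // 2, N // 2
    h = h[c - r:c + r + 1, c - r:c + r + 1]    # truncate to N x N taps
    w = np.hamming(N)
    h *= np.outer(w, w)                        # separable Hamming window
    h /= h.sum()                               # unit DC gain
    return h

h = circular_lowpass_2d()
```

The point symmetry of the tap matrix (h equals its 180-degree rotation) is what guarantees linear phase.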
Pub Date: 2013-04-28 | DOI: 10.1109/ICMSAO.2013.6552628
A. Smiti, Zied Elouedi
The success of a case-based reasoning (CBR) system depends on the quality of the case data and on the speed of the retrieval process, which can become expensive as the number of cases grows. To guarantee this quality, the contents of the case base must be maintained. This paper presents two case base maintenance methods. Both are based on the idea that applying clustering analysis to a large case base can efficiently build new case bases that are smaller and admit simpler maintenance operations. One method is based on a partitioning clustering technique, the other on a density-based clustering technique. Experiments show the effectiveness of our methods with respect to the performance criteria of the case base. In addition, we support our empirical evaluation with a new criterion called “competence”, showing that our methods build high-quality case bases while preserving the competence of the case bases.
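One way to picture the partitioning-based variant: cluster the case base with k-means, then keep only a few representatives per cluster. The retention policy below (cases nearest each centroid) is an illustrative stand-in, not the authors' exact maintenance operators, and the toy 2-D cases are invented:

```python
import math
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means on tuples of floats (stdlib only)."""
    rnd = random.Random(seed)
    centers = rnd.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for pt in points:
            j = min(range(k), key=lambda c: math.dist(pt, centers[c]))
            clusters[j].append(pt)
        centers = [tuple(sum(xs) / len(cl) for xs in zip(*cl)) if cl else centers[j]
                   for j, cl in enumerate(clusters)]
    return clusters

def maintain_case_base(cases, k=2, keep_per_cluster=2):
    """Shrink a case base: cluster it, then retain only the cases nearest each
    cluster centre as representatives."""
    kept = []
    for cl in kmeans(cases, k):
        if not cl:
            continue
        centroid = tuple(sum(xs) / len(cl) for xs in zip(*cl))
        kept.extend(sorted(cl, key=lambda c: math.dist(c, centroid))[:keep_per_cluster])
    return kept

cases = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1), (5.0, 5.0), (5.1, 4.9), (4.9, 5.2)]
kept = maintain_case_base(cases)
```

Retrieval then searches the reduced base, trading a little competence for speed.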
{"title":"Using clustering for maintaining case based reasoning systems","authors":"A. Smiti, Zied Elouedi","doi":"10.1109/ICMSAO.2013.6552628","DOIUrl":"https://doi.org/10.1109/ICMSAO.2013.6552628","url":null,"abstract":"The success of the Case Based Reasoning system depends on the quality of case data and the speed of the retrieval process that can be expensive in time especially when the number of cases gets large. To guarantee this quality, maintaining the contents of a case base becomes necessary. This paper presents two case base maintenance methods. They are mainly based on the idea that the clustering analysis to a large case base can efficiently build new case bases, which are smaller in size and can easily use simpler maintenance operations. One of method is based on partitioning clustering technique and the other one on density clustering technique. Experiments are provided to show the effectiveness of our methods taking into account the performance criteria of the case base. In addition, we support our empirical evaluation with using a new criterion called “competence” in order to show the efficiency of our methods in building high-quality case bases while preserving the competence of the case bases.","PeriodicalId":339666,"journal":{"name":"2013 5th International Conference on Modeling, Simulation and Applied Optimization (ICMSAO)","volume":"88 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127726841","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2013-04-28 | DOI: 10.1109/ICMSAO.2013.6552588
Ezzeddine Fatnassi, Olfa Chebbi, Jouhaina Chaouachi Siala
Alternatives to oil, together with advanced technology, are being used to overcome problems in public transportation systems. In this context, Personal Rapid Transit (PRT) is one of the newest transportation modes able to overcome many of public transit's problems. Unfortunately, this mode can waste a large amount of energy through the movement of empty vehicles. In this study, we present and formulate a static problem related to PRT that minimizes total energy consumption. To solve it, we adapt the iterated greedy heuristic (IGH) and propose four versions of the algorithm obtained by coupling it with simulated annealing. The algorithm is simple and effective, finding good-quality results in a short time.
{"title":"An iterated greedy heuristic for the static empty vehicle redistribution problem for the Personal Rapid Transit system","authors":"Ezzeddine Fatnassi, Olfa Chebbi, Jouhaina Chaouachi Siala","doi":"10.1109/ICMSAO.2013.6552588","DOIUrl":"https://doi.org/10.1109/ICMSAO.2013.6552588","url":null,"abstract":"Alternative energy for oil as well as advanced technology are used to overcome problems related to public transportation system. In this context, Personal Rapid Transit system (PRT) are among the newest transportation mode that can overcome many of the public transit's problems. Unfortunately, this kind of transportation mode can result a large amount of wasting energy due to the displacement of empty vehicles. In this study, we present and formulate a static problem related to PRT to minimize the total energy consumption. To solve this problem, an adaption of the iterated greedy heuristic (IGH) is represented. Four different versions of the algorithm are proposed as we couple it with the simulated annealing technique. The algorithm is simple and effective as it show how to find good quality results over a short period of time.","PeriodicalId":339666,"journal":{"name":"2013 5th International Conference on Modeling, Simulation and Applied Optimization (ICMSAO)","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114272578","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
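The iterated greedy scheme with a simulated-annealing acceptance criterion can be sketched generically. The toy objective (length of an open route over stations, a stand-in for empty-vehicle travel) and all parameters are assumptions for illustration, not the paper's formulation:

```python
import math
import random

def iterated_greedy(points, d=2, iters=200, temp=1.0, seed=1):
    """Iterated greedy skeleton: destroy d elements, rebuild by best insertion,
    accept worse solutions with probability exp(-delta / temp)."""
    rnd = random.Random(seed)

    def cost(seq):
        return sum(math.dist(points[seq[i]], points[seq[i + 1]])
                   for i in range(len(seq) - 1))

    cur = list(range(len(points)))
    rnd.shuffle(cur)
    cur_c = cost(cur)
    best, best_c = cur[:], cur_c
    for _ in range(iters):
        removed = rnd.sample(cur, d)                   # destruction phase
        partial = [s for s in cur if s not in removed]
        for s in removed:                              # greedy best-insertion
            pos = min(range(len(partial) + 1),
                      key=lambda i: cost(partial[:i] + [s] + partial[i:]))
            partial.insert(pos, s)
        new_c = cost(partial)
        # simulated-annealing acceptance criterion
        if new_c < cur_c or rnd.random() < math.exp(-(new_c - cur_c) / temp):
            cur, cur_c = partial, new_c
        if cur_c < best_c:
            best, best_c = cur[:], cur_c
    return best, best_c

stations = [(0, 0), (1, 0), (2, 0), (2, 1), (1, 1), (0, 1)]   # toy station layout
route, length = iterated_greedy(stations)
```

Varying the destruction size d and the temperature schedule yields the kind of algorithm variants the abstract describes.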