Pub Date: 2013-04-28 | DOI: 10.1109/ICMSAO.2013.6552581
Walid Abdelfattah, A. Rebai
Over the last ten years, many works have appeared dealing with the use of data envelopment analysis (DEA) for assessing supply chain efficiency. However, when analyzing the theoretical basis of some of these works, we found several issues that should be considered or reconsidered. This paper offers a theoretical framework proposing new perspectives on these issues in measuring the technical efficiency of supply chains. Focusing on dyadic chains (composed of only two members), the purpose herein is, first, to distinguish between cooperation and dominance when measuring the efficiency of chains and chain members and, second, to incorporate the decision makers' preferences that could be assigned to these chains and chain members. The proposed perspectives will help in identifying appropriate DEA models that provide a more realistic performance evaluation of supply chains.
Title: "Dyadic supply chains efficiency: A new theoretical framework to consider members power relationship and decision makers preferences"
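The abstract above builds on DEA's ratio notion of efficiency. As a minimal sketch of that notion only — with hypothetical data and fixed common weights, whereas a genuine DEA (CCR) model would choose the most favourable weights for each unit via linear programming — a dyadic chain can be scored member by member:

```python
# Minimal sketch of the ratio-efficiency idea underlying DEA, applied to a
# hypothetical dyadic chain. Weights are FIXED here for illustration; a real
# DEA model optimizes them per decision-making unit with an LP.

def ratio_efficiency(inputs, outputs, w_in, w_out):
    """Efficiency = weighted sum of outputs / weighted sum of inputs."""
    return (sum(w * y for w, y in zip(w_out, outputs)) /
            sum(v * x for v, x in zip(w_in, inputs)))

# Hypothetical dyad: the supplier's output becomes the manufacturer's input.
supplier = {"inputs": [10.0], "outputs": [8.0]}           # e.g. labour -> parts
manufacturer = {"inputs": [8.0, 4.0], "outputs": [12.0]}  # parts + capital -> product

e_supplier = ratio_efficiency(supplier["inputs"], supplier["outputs"], [1.0], [1.0])
e_manufacturer = ratio_efficiency(manufacturer["inputs"], manufacturer["outputs"],
                                  [1.0, 1.0], [1.0])
# One simple chain-level aggregate: the product of member efficiencies.
e_chain = e_supplier * e_manufacturer
print(e_supplier, e_manufacturer, e_chain)  # 0.8 1.0 0.8
```

How the chain score is aggregated from member scores is exactly where the paper's cooperation-versus-dominance distinction enters; the product used above is only one common convention.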
Pub Date: 2013-04-28 | DOI: 10.1109/ICMSAO.2013.6552712
R. Hachicha
In order to make workflow systems more adaptable to dynamic changes, we present a formal task model that makes it possible to define the various relations between tasks, to ensure the feasibility of workflow execution, and to provide solutions whenever a change occurs during execution.
Title: "A formal task model for flexible workflows systems"
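The paper's formal model is not reproduced in the abstract; as a hypothetical stand-in for one kind of feasibility check it mentions, inter-task relations can be viewed as a dependency graph, where the workflow is executable only if the graph is acyclic (task names below are invented):

```python
# Hypothetical illustration (not the paper's model): represent inter-task
# relations as a dependency graph and check executability = acyclicity
# via Kahn's topological sort.
from collections import deque

def execution_order(tasks, depends_on):
    """Return a feasible execution order, or None if the relations are cyclic."""
    indeg = {t: 0 for t in tasks}
    succ = {t: [] for t in tasks}
    for task, deps in depends_on.items():
        for d in deps:
            succ[d].append(task)
            indeg[task] += 1
    ready = deque(t for t in tasks if indeg[t] == 0)
    order = []
    while ready:
        t = ready.popleft()
        order.append(t)
        for s in succ[t]:
            indeg[s] -= 1
            if indeg[s] == 0:
                ready.append(s)
    return order if len(order) == len(tasks) else None

tasks = ["receive", "check", "approve", "archive"]
deps = {"check": ["receive"], "approve": ["check"], "archive": ["approve"]}
print(execution_order(tasks, deps))  # ['receive', 'check', 'approve', 'archive']
# A change that introduces a circular relation makes the workflow infeasible:
deps["receive"] = ["archive"]
print(execution_order(tasks, deps))  # None
```

Re-running such a check after every dynamic change is one simple way to detect when a modified workflow can no longer complete.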
Pub Date: 2013-04-28 | DOI: 10.1109/ICMSAO.2013.6552668
Laila Messaoudi, A. Rebai
Earlier works on Goal Programming models for the portfolio selection problem under uncertainty did not combine different types of uncertainty in a single problem; they assumed only stochastic or only fuzzy uncertainty. Such models may be too restrictive for modeling real-life decision-making problems, where randomness and fuzziness often coexist. In this paper, we develop a novel fuzzy goal programming model for solving a stochastic multi-objective portfolio selection problem, in which fuzzy chance-constrained goals are described along with imprecise importance relations among them. The developed model is used to build a new portfolio selection model that considers the trade-offs between expected return, Value-at-Risk (VaR), the price-earnings ratio, and the flexibility of the investor's preferences.
Title: "A fuzzy stochastic Goal Programming approach for solving portfolio selection problem"
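VaR, one of the trade-off criteria above, can be estimated in several ways; a minimal historical-simulation sketch (the paper instead embeds VaR in a fuzzy chance-constrained goal) is:

```python
# Minimal historical-simulation sketch of Value-at-Risk (VaR). VaR at level
# alpha is the loss threshold exceeded with probability at most 1 - alpha;
# here it is read off as an empirical quantile of past losses.
def historical_var(returns, alpha=0.95):
    losses = sorted(-r for r in returns)  # losses are negated returns
    k = min(int(alpha * len(losses)), len(losses) - 1)
    return losses[k]

# Hypothetical return sample:
returns = [0.02, -0.01, 0.005, -0.03, 0.01, -0.02, 0.015, 0.0, -0.005, 0.025]
print(historical_var(returns, alpha=0.9))  # 0.03: the 90% worst-case loss here
```

Quantile conventions differ between texts (interpolated vs. order-statistic estimators), so the index rule above is only one defensible choice.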
Pub Date: 2013-04-28 | DOI: 10.1109/ICMSAO.2013.6552696
S. Chaabouni, Salma Jammoussi, Y. Benayed
The objective of this work is to design a new method that integrates Vapnik's theory of support vector machines into data clustering. To this end we turned to bio-inspired metaheuristics, which develop models for solving classes of problems by drawing on patterns of behavior studied in ethology; Particle Swarm Optimization (PSO) is one of the most recent and widely used methods of this kind. Inspired by this paradigm, we propose a new clustering method, PSvmC, which separates unlabeled data sets into two groups. Specifically, it combines the basic principles of SVM with the PSO metaheuristic to solve the clustering problem, thereby contributing to the analysis of multivariate data. The obtained groups are as homogeneous as possible: the intra-class value is better than those obtained by hierarchical clustering, simple K-means, and EM algorithms on different benchmark databases.
Title: "Particle swarm optimization for support vector clustering: Separating hyper-plane of unlabeled data"
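PSvmC itself is not specified in the abstract; as a greatly simplified, hypothetical stand-in for the idea of letting PSO place a separator, the swarm below searches for a single 1-D threshold minimizing total within-group variance (no SVM margin involved):

```python
# Greatly simplified, hypothetical stand-in for the PSvmC idea: particle
# swarm optimization places a 1-D separating threshold so that the two
# resulting groups have minimal total within-group variance.
import random

def within_group_variance(data, threshold):
    groups = [[x for x in data if x <= threshold], [x for x in data if x > threshold]]
    total = 0.0
    for g in groups:
        if g:
            mean = sum(g) / len(g)
            total += sum((x - mean) ** 2 for x in g)
    return total

def pso_threshold(data, n_particles=10, iters=60, seed=1):
    rng = random.Random(seed)
    lo, hi = min(data), max(data)
    pos = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]                                     # personal bests
    pbest_f = [within_group_variance(data, p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g], pbest_f[g]              # global best
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            vel[i] = (0.7 * vel[i] + 1.5 * r1 * (pbest[i] - pos[i])
                      + 1.5 * r2 * (gbest - pos[i]))   # inertia + cognitive + social
            pos[i] += vel[i]
            f = within_group_variance(data, pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i], f
    return gbest

data = [1.0, 1.2, 0.9, 1.1, 5.0, 5.3, 4.8, 5.1]  # two obvious groups
t = pso_threshold(data)
print(1.2 < t < 4.8)  # True: the threshold lands in the gap between the groups
```

The authors' method optimizes an SVM-style separating hyper-plane rather than a raw threshold, but the swarm dynamics (inertia, cognitive, and social terms) are the same ingredient.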
Pub Date: 2013-04-28 | DOI: 10.1109/ICMSAO.2013.6552685
Saima Dhouib, S. Dhouib, H. Chabchoub
In this paper, an Artificial Bee Colony (ABC) metaheuristic is adapted to find the set of Pareto optimal solutions for Goal Programming (GP) problems. First, the GP model is converted into a multi-objective optimization problem (MOO) of minimizing deviations from fixed goals. Second, the ABC is adapted to the MOO by means of a weighted-sum formulation of the objective function: several scalarizations of the objective function are solved, each according to a weight vector with non-negative components. The efficiency of the proposed approach is demonstrated on nonlinear engineering design problems. In all problems, multiple solutions to the goal programming problem are found in short computational time using very few user-defined parameters.
Title: "Artificial bee colony metaheuristic to find pareto optimal solutions set for engineering design problems"
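The weighted-sum scalarization step described above can be sketched as follows (goal levels and weights are hypothetical; the paper's ABC would minimize this scalar for each weight vector):

```python
# Sketch of the weighted-sum scalarization applied to a GP model: the
# deviations of each objective value from its goal are collapsed into one
# scalar via a non-negative weight vector.
def weighted_deviation(values, goals, weights):
    """Scalarized GP objective: sum_i w_i * |f_i(x) - g_i|."""
    return sum(w * abs(v - g) for v, g, w in zip(values, goals, weights))

goals = [100.0, 50.0]   # hypothetical target levels for two objectives
weights = [0.6, 0.4]    # non-negative weight vector

print(weighted_deviation([90.0, 55.0], goals, weights))   # 0.6*10 + 0.4*5 = 8.0
print(weighted_deviation([100.0, 50.0], goals, weights))  # 0.0: all goals met
```

Sweeping the weight vector over many non-negative combinations and keeping each minimizer is what yields an approximation of the Pareto set.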
Pub Date: 2013-04-28 | DOI: 10.1109/ICMSAO.2013.6552629
S. Chargui, Hana Gharbi, M. Slimani
A general research subject in rainfall-runoff modeling is the assessment of space-time variability in event time series. A MATLAB program is developed to take account of the space and time distribution of rainfall. We focus on central Tunisia (the Merguellil and Skhira basins), where rainfall has been known for its high variability for over a decade. We introduce a variability matrix into a geomorphology-based transfer function. The robustness of the developed program is checked on real events from the Skhira basin data. Its potential is especially interesting in data-sparse regions, where the geomorphology-based approach can be applied in a robust and adjustable way and where accounting for the space and time variability of rainfall is more flexible.
Title: "A MATLAB program for identifying the rainfall variability in rainfall-runoff modeling in Semi arid region (Merguellil basin: Central Tunisia)"
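The paper's MATLAB program is not reproduced in the abstract; as a hypothetical sketch of the transfer-function idea it builds on, runoff can be modeled as the discrete convolution of effective rainfall with a unit hydrograph, with a variability matrix applied as per-cell, per-step rainfall weights (all numbers below are invented):

```python
# Hypothetical sketch (not the authors' program): cell-wise weighting of
# rainfall by a space-time variability matrix, followed by convolution with
# a transfer function (unit hydrograph) to obtain a runoff series.
def weighted_rainfall(rain_by_cell, variability, area_weights):
    """Collapse per-cell rainfall series into one basin series, cell-weighted."""
    steps = len(rain_by_cell[0])
    return [sum(variability[c][t] * area_weights[c] * rain_by_cell[c][t]
                for c in range(len(rain_by_cell)))
            for t in range(steps)]

def convolve(rain, transfer):
    """Discrete convolution: runoff[t] = sum_k rain[k] * transfer[t - k]."""
    out = [0.0] * (len(rain) + len(transfer) - 1)
    for k, r in enumerate(rain):
        for j, h in enumerate(transfer):
            out[k + j] += r * h
    return out

rain_by_cell = [[10.0, 0.0, 5.0], [2.0, 8.0, 0.0]]  # mm per step, 2 cells
variability = [[1.0, 1.0, 0.5], [1.0, 1.2, 1.0]]    # space-time weights
area_weights = [0.5, 0.5]                           # equal cell areas
transfer = [0.2, 0.5, 0.3]                          # unit hydrograph, sums to 1

rain = weighted_rainfall(rain_by_cell, variability, area_weights)
runoff = convolve(rain, transfer)
print(runoff)
```

Because the transfer function sums to one, total runoff volume equals total weighted rainfall; the variability matrix is what lets the same event be re-weighted in space and time without changing the geomorphology-based kernel.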
Pub Date: 2013-04-28 | DOI: 10.1109/ICMSAO.2013.6552664
Hana Sulieman
D-optimal experimental designs for precise parameter estimation are designs that minimize the determinant of the variance-covariance matrix of the parameter estimates based on the conventional parametric sensitivity coefficients. These coefficients are local measures of sensitivity defined by the first-order derivatives of the system model function with respect to the parameters of interest. For nonlinear models, linear sensitivity information fails to gauge the sensitivity behavior of the model; hence, the resulting determinant of the variance-covariance matrix may not give a true indication of the volume of the joint inference region for the system parameters. In this article, we employ the profile-based sensitivity coefficients developed by Sulieman et al. (2001, 2004) in D-optimal experimental designs. Profile-based sensitivity coefficients account for both model nonlinearity and parameter estimate correlations and are therefore expected to yield better precision of the parameter estimates when used in the optimization of a particular experimental design criterion. Some characteristics of the profile-based designs and related computational aspects are discussed. An application of the new designs to a nonlinear model case is also presented.
Title: "Profile-based sensitivity in the design of experiments for parameter precision"
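The conventional criterion the paper starts from can be sketched numerically: minimizing the determinant of the variance-covariance matrix is equivalent to maximizing det(JᵀJ), where J holds the first-order sensitivities. The model and candidate designs below are hypothetical, and the sensitivities are the conventional (not profile-based) ones:

```python
# Sketch of the conventional D-optimality criterion: among candidate designs
# (sets of measurement times), pick the one maximizing det(J^T J), where J is
# the matrix of first-order parametric sensitivities.
# Hypothetical two-parameter model: y(t) = a * (1 - exp(-b t)).
import math

def sensitivity_row(t, a, b):
    return [1.0 - math.exp(-b * t),    # dy/da
            a * t * math.exp(-b * t)]  # dy/db

def d_criterion(times, a, b):
    J = [sensitivity_row(t, a, b) for t in times]
    # 2x2 information matrix M = J^T J, then det(M).
    m11 = sum(r[0] * r[0] for r in J)
    m12 = sum(r[0] * r[1] for r in J)
    m22 = sum(r[1] * r[1] for r in J)
    return m11 * m22 - m12 * m12

a, b = 1.0, 0.5  # nominal parameter values (local design)
candidates = [(0.5, 1.0), (1.0, 4.0), (2.0, 8.0)]
best = max(candidates, key=lambda ts: d_criterion(ts, a, b))
print(best)  # (2.0, 8.0): well-spread times decorrelate the two sensitivities
```

Clustered early times make the two sensitivity columns nearly collinear (det ≈ 0), which is precisely the kind of distortion the profile-based coefficients are designed to diagnose more faithfully for nonlinear models.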
Pub Date: 2013-04-28 | DOI: 10.1109/ICMSAO.2013.6552580
Ketfi Nadhir, Djabali Chabane, B. Tarek
This paper proposes a Firefly Algorithm (FA) for the optimal placement and sizing of distributed generation (DG) in a radial distribution system, so as to minimize the total real power losses and improve the voltage profile. FA is a metaheuristic inspired by the flashing behavior of fireflies, the primary purpose of which is to act as a signal system attracting other fireflies. Metaheuristic algorithms are widely recognized as among the most practical approaches to hard optimization problems; their most attractive feature is that their application requires no special knowledge of the optimization problem. The IEEE 33-bus distribution test system is used to show the effectiveness of the FA, and a comparison with the Shuffled Frog Leaping Algorithm (SFLA) is also given.
Title: "Firefly algorithm based energy loss minimization approach for optimal sizing & placement of distributed generation"
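The attraction mechanism at the heart of FA can be sketched in one step: a dimmer firefly moves toward a brighter one with an attractiveness that decays with distance. The step below follows the standard FA update (with the random term switched off, alpha = 0, to keep it deterministic); the coefficients are illustrative, not the paper's settings:

```python
# One deterministic step of the standard firefly attraction rule:
# x_i <- x_i + beta0 * exp(-gamma * r^2) * (x_j - x_i) + alpha * noise,
# with the noise term omitted (alpha = 0) for illustration.
import math

def firefly_step(x_i, x_j, beta0=1.0, gamma=1.0, alpha=0.0):
    r2 = sum((a - b) ** 2 for a, b in zip(x_i, x_j))  # squared distance
    beta = beta0 * math.exp(-gamma * r2)              # attractiveness
    return [a + beta * (b - a) for a, b in zip(x_i, x_j)]

x_dim, x_bright = [0.0, 0.0], [1.0, 1.0]
moved = firefly_step(x_dim, x_bright)
print(moved)  # each coordinate moves a fraction exp(-2) of the way
```

In the DG application, "brightness" would be the (negated) real power loss of a candidate placement-and-size vector evaluated by a load-flow computation.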
Pub Date: 2013-04-28 | DOI: 10.1109/ICMSAO.2013.6552720
Gharbi Leila, Halioui Khamoussi
This paper examines informational market efficiency in the Islamic and conventional markets of the Gulf Cooperation Council (GCC) region, investigating whether Islamic markets are more or less efficient than conventional ones. Findings indicate that both the Dow Jones Islamic Market GCC and Dow Jones GCC indexes show characteristics of a random walk. However, we find an impact of market illiquidity on Islamic stock prices, though to a smaller extent than in the conventional banking sector. We also observe that investor sentiment has large explanatory power for stock prices in both the Islamic and conventional banking sectors.
Title: "Informational market efficiency in GCC region: A comparative study between Islamic and conventional markets"
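One elementary diagnostic behind random-walk claims (illustrative only — the paper's test battery is not given in the abstract) is that under a random walk, price changes are serially uncorrelated, so the lag-1 autocorrelation of returns should be near zero:

```python
# Illustrative random-walk diagnostic: lag-1 autocorrelation of returns.
# Values near 0 are consistent with a random walk; a trending series gives
# a clearly positive value. The return sample below is hypothetical.
def lag1_autocorr(xs):
    n = len(xs)
    mean = sum(xs) / n
    num = sum((xs[t] - mean) * (xs[t - 1] - mean) for t in range(1, n))
    den = sum((x - mean) ** 2 for x in xs)
    return num / den

returns = [0.01, -0.02, 0.015, 0.0, -0.01, 0.02, -0.005, 0.01]
print(lag1_autocorr(returns))
print(lag1_autocorr([1.0, 2.0, 3.0, 4.0, 5.0, 6.0]))  # 0.5: a trending series
```

Formal studies use stronger tools (variance-ratio tests, unit-root tests) than this single statistic.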
Pub Date: 2013-04-28 | DOI: 10.1109/ICMSAO.2013.6552689
Imen Hamdi, T. Loukil
In this paper, we consider the problem of scheduling n jobs in an m-machine permutation flowshop with time lags between consecutive operations of each job; the processing order of the jobs is the same on every machine, and the time lag is defined as the waiting time between consecutive operations. We use logic-based Benders decomposition to minimize the total number of tardy jobs over a long time horizon defined on the last machine: Mixed Integer Linear Programming (MILP) allocates jobs to time intervals of the horizon, and the jobs are then scheduled using Constraint Programming (CP). A lower bound based on Moore's algorithm is also developed, and computational results are reported.
Title: "Logic-based Benders decomposition to solve the permutation flowshop scheduling problem with time lags"
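The lower bound mentioned above builds on Moore's algorithm, which minimizes the number of tardy jobs on a single machine; the classic Moore-Hodgson procedure is sketched here on hypothetical job data (how the authors relax the flowshop to this single-machine bound is not detailed in the abstract):

```python
# Moore-Hodgson algorithm: schedule jobs in earliest-due-date order and,
# whenever a due date is missed, drop the longest job scheduled so far.
# The kept jobs are on time; the maximum possible number are kept.
import heapq

def moore_hodgson(jobs):
    """jobs: list of (processing_time, due_date). Returns # of on-time jobs."""
    on_time = []  # max-heap of processing times (stored negated)
    t = 0
    for p, d in sorted(jobs, key=lambda j: j[1]):  # earliest due date first
        heapq.heappush(on_time, -p)
        t += p
        if t > d:                        # current due date missed:
            t += heapq.heappop(on_time)  # remove the longest job so far
    return len(on_time)

jobs = [(2, 3), (3, 5), (2, 6), (4, 8)]  # (processing time, due date)
on_time = moore_hodgson(jobs)
print(on_time, len(jobs) - on_time)  # 3 1: three on-time jobs, one tardy
```

Any single-machine relaxation of the flowshop's last machine gives a count of unavoidable tardy jobs, hence a valid lower bound for the full problem.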