J. Bezdek and R. Hathaway. In: 2002 Annual Meeting of the North American Fuzzy Information Processing Society Proceedings. NAFIPS-FLINT 2002 (Cat. No. 02TH8622), published 2002-08-07. DOI: 10.1109/NAFIPS.2002.1018067
Partitioning the variables for alternating optimization of real-valued scalar fields
Summary form only given, as follows. Let f be a real-valued scalar field of the variable x, where x is partitioned into t non-overlapping subsets of variables X_i (i = 1, ..., t). Alternating optimization (AO) is an iterative procedure for minimizing (or maximizing) the function f(x) = f(X_1, X_2, ..., X_t) jointly over all variables by alternating restricted minimizations (or maximizations) over the individual subsets of variables X_1, ..., X_t. AO is the basis for the c-means clustering algorithm (t = 2), many forms of vector quantization (t = 2, 3 and 4), and the expectation-maximization algorithm (t = 4) for normal mixture decomposition. First we review where and how AO fits into the overall optimization landscape. Then we state (without proofs) two new theorems that give very general local and global convergence and rate-of-convergence results which hold for all partitionings of x. Finally, we discuss the important problem of choosing a "best" partitioning of the input variables for the AO approach. We show that the number of possible AO iteration schemes is larger than the number of standard partitions of the input variables. Two numerical examples illustrate that the selection of the t subsets of x has an important effect on the rate of convergence of the corresponding AO algorithm to a solution.
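The alternating scheme the abstract describes can be illustrated with a minimal sketch for the t = 2 case. The objective function below is an arbitrary smooth example chosen for this illustration (it does not come from the paper), picked so that each restricted minimization has a closed form obtained by setting the corresponding partial derivative to zero:

```python
# Alternating optimization (AO) sketch for t = 2 blocks.
# Illustrative objective (not from the paper):
#   f(u, v) = (u - 1)^2 + (v - 2)^2 + u*v
# Each restricted minimization is solved exactly:
#   df/du = 2(u - 1) + v = 0  ->  u = 1 - v/2
#   df/dv = 2(v - 2) + u = 0  ->  v = 2 - u/2

def ao_minimize(u=0.0, v=0.0, max_iters=50, tol=1e-12):
    """Alternate exact minimizations over the blocks u and v."""
    for _ in range(max_iters):
        u_new = 1.0 - v / 2.0       # argmin over block 1, block 2 held fixed
        v_new = 2.0 - u_new / 2.0   # argmin over block 2, block 1 held fixed
        if abs(u_new - u) < tol and abs(v_new - v) < tol:
            u, v = u_new, v_new
            break
        u, v = u_new, v_new
    return u, v

u_star, v_star = ao_minimize()
# The iterates contract geometrically (u_{k+1} = u_k / 4) toward the
# joint minimizer (0, 2), where both partial derivatives vanish.
print(u_star, v_star)
```

A different partitioning of the same variables would generally yield a different contraction factor, which is the paper's central point about the effect of block selection on the rate of convergence.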