Given a set of n distinct points in d-dimensional space that are hypothesized to lie on a hyperplane, robust statistical estimators have recently been proposed for the parameters of the model that best fits these points. This paper presents efficient algorithms for computing median-based robust estimators (e.g., the Theil-Sen and repeated median (RM) estimators) in high-dimensional space. We briefly review basic computational geometry techniques that were used to achieve efficient algorithms in the 2-D case. We then introduce generalizations of these techniques to higher dimensions. Geometric observations are followed by a presentation of O(n^(d−1) log n) expected-time algorithms for the d-dimensional Theil-Sen and RM estimators. Both algorithms are space optimal; i.e., they require O(n) storage for fixed d. Finally, an extension of the methodology to nonlinear domains is demonstrated.
{"title":"Computationally Efficient Algorithms for High-Dimensional Robust Estimators","authors":"Mount D.M., Netanyahu N.S.","doi":"10.1006/cgip.1994.1026","DOIUrl":"10.1006/cgip.1994.1026","url":null,"abstract":"<div><p>Given a set of <em>n</em> distinct points in <em>d</em>-dimensional space that are hypothesized to lie on a hyperplane, robust statistical estimators have been recently proposed for the parameters of the model that best fits these points. This paper presents efficient algorithms for computing median-based robust estimators (e.g., the Theil-Sen and repeated median (RM) estimators) in high-dimensional space. We briefly review basic computational geometry techniques that were used to achieve efficient algorithms in the 2-D case. Then generalization of these techniques to higher dimensions is introduced. Geometric observations are followed by a presentation of <em>O</em>(<em>n</em><sup><em>d</em> − 1</sup> log <em>n</em>) expected time algorithms for the <em>d</em>-dimensional Theil-Sen and RM estimators. Both algorithms are space optimal; i.e., they require <em>O</em>(<em>n</em>) storage, for fixed <em>d</em>. Finally, an extension of the methodology to nonlinear domain(s) is demonstrated.</p></div>","PeriodicalId":100349,"journal":{"name":"CVGIP: Graphical Models and Image Processing","volume":"56 4","pages":"Pages 289-303"},"PeriodicalIF":0.0,"publicationDate":"1994-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1006/cgip.1994.1026","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128252869","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Gaussians are useful in multiscale representation of video data. An algorithm is presented which approximates a sequence of uniformly spaced single-valued data by a sum of Gaussians with a prescribed accuracy. The scale-space image [6] of the data is used to estimate the number of Gaussians and their initial parameters. The Marquardt algorithm (J. SIAM 11(2), 1963, 431-441) is then used to optimize the parameters.
{"title":"Curve Fitting by a Sum of Gaussians","authors":"Goshtasby A., Oneill W.D.","doi":"10.1006/cgip.1994.1025","DOIUrl":"10.1006/cgip.1994.1025","url":null,"abstract":"<div><p>Gaussians are useful in multiscale representation of video data. An algorithm is presented which approximates a sequence of uniformly spaced single-valued data by a sum of Gaussians with a prescribed accuracy. The scale-space image [6] of the data is used to estimate the number of Gaussians and their initial parameters. The Marquardt algorithm (<em>J. SIAM</em> 11(2), 1963, 431-441) is then used to optimize the parameters.</p></div>","PeriodicalId":100349,"journal":{"name":"CVGIP: Graphical Models and Image Processing","volume":"56 4","pages":"Pages 281-288"},"PeriodicalIF":0.0,"publicationDate":"1994-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1006/cgip.1994.1025","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129723324","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Algorithms to process off-line Arabic handwriting prior to recognition are presented. The first algorithm converts smoothed and thinned images into polygonal approximations. The second algorithm determines the start vertex of writing. The third algorithm enforces temporal information by traversing the graph of the stroke in an order consistent with Arabic handwriting. It implements the following heuristic rule: the minimum-distance path that traverses the stroke's polygon from the start vertex to the end vertex has its vertices ordered as they were generated when the stroke was written. This third algorithm is developed from a standard solution of the Chinese postman's problem applied to the graph of the stroke. Special rules are applied to enforce temporal information on the stroke and obtain the most likely traversal consistent with Arabic handwriting. Unconstrained handwritten strokes written by five subjects (n = 4065) were used in testing. In 92.6% of the samples, the proposed algorithms restored the actual temporal information.
{"title":"Processing of Off-Line Handwritten Text: Polygonal Approximation and Enforcement of Temporal Information","authors":"Abuhaiba I.S.I., Holt M.J.J., Datta S.","doi":"10.1006/cgip.1994.1029","DOIUrl":"https://doi.org/10.1006/cgip.1994.1029","url":null,"abstract":"<div><p>Algorithms to process off-line Arabic handwriting prior to recognition are presented. The first algorithm converts smoothed and thinned images into polygonal approximations. The second algorithm determines the start vertex of writing. The third algorithm enforces temporal information by traversing the graph of the stroke in an order consistent with Arabic handwriting. It implements the following heuristic rule: the minimum distance path that traverses the stroke′s polygon from the start vertex to the end vertex has its vertices ordered as they were generated when the stroke was written. This third algorithm is developed from a standard solution of the Chinese postman′s problem applied to the graph of the stroke. Special rules to enforce temporal information on the stroke to obtain the most likely traversal that is consistent with Arabic handwriting are applied. Unconstrained handwritten strokes written by five subjects, (<em>n</em> = 4065) were used in testing. 
In 92.6% of the samples, the proposed algorithms restored the actual temporal information.</p></div>","PeriodicalId":100349,"journal":{"name":"CVGIP: Graphical Models and Image Processing","volume":"56 4","pages":"Pages 324-335"},"PeriodicalIF":0.0,"publicationDate":"1994-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1006/cgip.1994.1029","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"72279500","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
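When the stroke graph has all even-degree vertices, or exactly two odd-degree vertices at the start and end of writing, the Chinese postman solution degenerates to an Euler trail: every edge is traversed exactly once. A minimal Hierholzer-style sketch of such a traversal (vertex labels and the reduction to the Eulerian case are our simplification; the paper handles general stroke graphs with additional rules):

```python
from collections import defaultdict

def euler_trail(edges, start):
    """Hierholzer's algorithm: visit every edge of an undirected graph exactly once."""
    adj = defaultdict(list)
    for i, (u, v) in enumerate(edges):
        adj[u].append((v, i))
        adj[v].append((u, i))
    used, stack, trail = set(), [start], []
    while stack:
        u = stack[-1]
        while adj[u] and adj[u][-1][1] in used:
            adj[u].pop()               # discard edges already traversed
        if adj[u]:
            v, i = adj[u].pop()
            used.add(i)
            stack.append(v)            # walk along an unused edge
        else:
            trail.append(stack.pop())  # dead end: emit vertex
    return trail[::-1]
```

For example, a simple three-segment stroke `[(0, 1), (1, 2), (2, 3)]` started at vertex 0 yields the traversal `[0, 1, 2, 3]`, which is exactly the vertex order in which such a stroke would have been written.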
Given m points in the plane and a threshold t, a curve is defined to be robust if at least t points lie on it. Efficient algorithms for detecting robust curves are given; the key contribution is to use randomized sampling. In addition, an approximate version of the problem is introduced. A geometric solution to this problem is given; it too can be enhanced by randomization. These algorithms are readily generalized to solve the problem of robust curve detection in a scene of curve fragments: given a set of curve segments, a curve σ is defined to be robust if curve segments of total length at least l lie on σ. Again, both an exact and an approximate version of the problem are considered. The problems and solutions are closely related to the well-investigated Hough transform technique.
{"title":"On the Detection of Robust Curves","authors":"Cole R., Vishkin U.","doi":"10.1006/cgip.1994.1018","DOIUrl":"10.1006/cgip.1994.1018","url":null,"abstract":"<div><p>Given <em>m</em> points in the plane and a threshold <em>t</em>, a curve is defined to be robust if at least <em>t</em> points lie on it. Efficient algorithms for detecting robust curves are given; the key contribution is to use randomized sampling. In addition, an approximate version of the problem is introduced. A geometric solution to this problem is given; it too can be enhanced by randomization. These algorithms are readily generalized to solve the problem of robust curve detection in a scene of curve fragments: given a set of curve segments, a curve σ is defined to be robust if curve segments of total length at least <em>l</em> lie on σ. Again, both an exact and an approximate version of the problem are considered. The problems and solutions are closely related to the well-investigated Hough transform technique.</p></div>","PeriodicalId":100349,"journal":{"name":"CVGIP: Graphical Models and Image Processing","volume":"56 3","pages":"Pages 189-204"},"PeriodicalIF":0.0,"publicationDate":"1994-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1006/cgip.1994.1018","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120048622","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Image warping, often referred to as "rubber sheeting," represents the deformation of a domain image space into a range image space. This paper describes a technique that extends the definition of a rubber-sheet transformation to allow a polygonal region to be warped into one or more subsets of itself, where the subsets may be multiply connected. To do this, it constructs a set of "slits" in the domain image, corresponding to discontinuities and concavities in the range image, using a technique based on generalized Voronoi diagrams. The concept of the medial axis is extended to describe inner and outer medial contours of a polygon. Polygonal regions are decomposed into annular subregions, and path homotopies are introduced to describe the annular subregions. These constructions motivate the definition of a ladder, which guides the construction of the grid point pairs necessary to effect the warp itself.
{"title":"Subset Warping: Rubber Sheeting with Cuts","authors":"Landau P., Schwartz E.","doi":"10.1006/cgip.1994.1022","DOIUrl":"10.1006/cgip.1994.1022","url":null,"abstract":"<div><p>Image warping, often referred to as \"rubber sheeting,\" represents the deformation of a domain image space into a range image space. In this paper, a technique which extends the definition of a rubber-sheet transformation to allow a polygonal region to be warped into one or more subsets of itself, where the subsets may be multiply connected, is described. To do this, it constructs a set of \"slits\" in the domain image, which correspond to discontinuities and concavities in the range image, using a technique based on generalized Voronoi diagrams. The concept of medial axis is extended to describe inner and outer medial contours of a polygon. Polygonal regions are decomposed into annular subregions, and path homotopies are introduced to describe the annular subregions. These constructions motivate the definition of a <em>ladder</em>, which guides the construction of grid point pairs necessary to effect the warp itself.</p></div>","PeriodicalId":100349,"journal":{"name":"CVGIP: Graphical Models and Image Processing","volume":"56 3","pages":"Pages 247-266"},"PeriodicalIF":0.0,"publicationDate":"1994-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1006/cgip.1994.1022","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130236598","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A new method of peak analysis for threshold selection is presented. It is based on the wavelet transform, which provides a multiscale analysis of the information content of an image's histogram. We show that detecting the zero-crossings and local extrema of a wavelet transform of the histogram gives a complete characterization of the peaks in the histogram, that is to say, the values at which they start, end, and are extreme. These values are used for the unsupervised and automatic selection of a sequence of thresholds describing a coarse-to-fine analysis of histogram variation. Results of applying the proposed technique to several different images are presented.
{"title":"Automatic Threshold Selection Using the Wavelet Transform","authors":"Olivo J.C.","doi":"10.1006/cgip.1994.1019","DOIUrl":"10.1006/cgip.1994.1019","url":null,"abstract":"<div><p>A new method of peak analysis for threshold selection is presented. It is based on the wavelet transform which provides a multiscale analysis of the information content of the histogram of an image. We show that the detection of the zero-crossings and the local extrema of a wavelet transform of the histogram gives a complete characterization of the peaks in the histogram, that is to say, the values at which they start, end, and are extreme. These values are used for the unsupervised and automatic selection of a sequence of thresholds describing a coarse-to-fine analysis of histogram variation. The results of using the proposed technique are presented in the case of different images.</p></div>","PeriodicalId":100349,"journal":{"name":"CVGIP: Graphical Models and Image Processing","volume":"56 3","pages":"Pages 205-218"},"PeriodicalIF":0.0,"publicationDate":"1994-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1006/cgip.1994.1019","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"77968118","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A model for fusing the output of multiple segmentation modules is presented. The model is based on the particle system approach to modeling dynamic objects from computer graphics. The model also has built-in capabilities to extract regions, thin the edge image, remove "twigs," and close gaps in the contours. The model functions both as an effective data fusion technique and as a model of an important human visual process.
{"title":"A Particle System Model for Combining Edge Information from Multiple Segmentation Modules","authors":"Dayanand S., Uttal W.R., Shepherd T., Lunskis C.","doi":"10.1006/cgip.1994.1020","DOIUrl":"10.1006/cgip.1994.1020","url":null,"abstract":"<div><p>A model for fusing the output of multiple segmentation modules is presented. The model is based on the particle system approach to modeling dynamic objects from computer graphics. The model also has built-in capabilities to extract regions, thin the edge image, remove \"twigs,\" and close gaps in the contours. The model functions both as an effective data fusion technique and as a model of an important human visual process.</p></div>","PeriodicalId":100349,"journal":{"name":"CVGIP: Graphical Models and Image Processing","volume":"56 3","pages":"Pages 219-230"},"PeriodicalIF":0.0,"publicationDate":"1994-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1006/cgip.1994.1020","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130218713","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Given a polygonal curve P = [p1, p2, . . . , pn], the polygonal approximation problem considered calls for determining a new curve P′ = [p′1, p′2, . . . , p′m] such that (i) m is significantly smaller than n, (ii) the vertices of P′ are an ordered subset of the vertices of P, and (iii) any line segment [p′A, p′A + 1] of P′ that substitutes a chain [pB, . . . , pC] in P is such that for all i where B ≤ i ≤ C, the approximation error of pi with respect to [p′A, p′A + 1], according to some specified criterion and metric, is less than a predetermined error tolerance. Using the parallel-strip error criterion, we study the following problems for a curve P in R^d, where d = 2, 3: (i) minimize m for a given error tolerance and (ii) given m, find the curve P′ that has the minimum approximation error over all curves that have at most m vertices. These problems are called the min-# and min-ϵ problems, respectively. For R^2 and with any one of the L1, L2, or L∞ distance metrics, we give algorithms to solve the min-# problem in O(n^2) time and the min-ϵ problem in O(n^2 log n) time, improving the best known algorithms to date by a factor of log n. When P is a polygonal curve in R^3 that is strictly monotone with respect to one of the three axes, we show that if the L1 and L∞ metrics are used then the min-# problem can be solved in O(n^2) time and the min-ϵ problem can be solved in O(n^3) time. If distances are computed using the L2 metric then the min-# and min-ϵ problems can be solved in O(n^3) and O(n^3 log n) time, respectively. All of our algorithms exhibit O(n^2) space complexity. Finally, we show that if it is not essential to minimize m, simple modifications of our algorithms afford a reduction by a factor of n in both time and space.
{"title":"On Approximating Polygonal Curves in Two and Three Dimensions","authors":"Eu D., Toussaint G.T.","doi":"10.1006/cgip.1994.1021","DOIUrl":"https://doi.org/10.1006/cgip.1994.1021","url":null,"abstract":"<div><p>Given a polygonal curve <em>P</em> =[<em>p</em><sub>1</sub>, <em>p</em><sub>2</sub>, . . . , <em>p</em><sub><em>n</em></sub>], the polygonal approximation problem considered calls for determining a new curve <em>P</em>′ = [<em>p</em>′<sub>1</sub>, <em>p</em>′<sub>2</sub>, . . . , <em>p</em>′<sub><em>m</em></sub>] such that (i) <em>m</em> is significantly smaller than <em>n</em>, (ii) the vertices of <em>P</em>′ are an ordered subset of the vertices of <em>P</em>, and (iii) any line segment [<em>p</em>′<sub><em>A</em></sub>, <em>p</em>′<sub><em>A</em> + 1</sub> of <em>P</em>′ that substitutes a chain [<em>p</em><sub><em>B</em></sub>, . . . , <em>p</em><sub><em>C</em></sub>] in <em>P</em> is such that for all <em>i</em> where <em>B</em> ≤ <em>i</em> ≤ <em>C</em>, the approximation error of <em>p</em><sub><em>i</em></sub> with respect to [<em>p</em>′<sub><em>A</em></sub>, <em>p</em>′<sub><em>A</em> + 1</sub>], according to some specified criterion and metric, is less than a predetermined error tolerance. Using the <em>parallel-strip</em> error criterion, we study the following problems for a curve <em>P</em> in <em>R</em><sup><em>d</em></sup>, where <em>d</em> = 2, 3: (i) minimize <em>m</em> for a given error tolerance and (ii) given <em>m</em>, find the curve <em>P</em>′ that has the minimum approximation error over all curves that have at most <em>m</em> vertices. These problems are called the min-# and min-ϵ problems, respectively. 
For <em>R</em><sup>2</sup> and with any one of the <em>L</em><sub>1</sub>, <em>L</em><sub>2</sub>, or <em>L</em><sub>∞</sub> distance metrics, we give algorithms to solve the min-# problem in <em>O</em>(<em>n</em><sup>2</sup>) time and the min-ϵ problem in <em>O</em>(<em>n</em><sup>2</sup> log <em>n</em>) time, improving the best known algorithms to date by a factor of log <em>n</em>. When <em>P</em> is a polygonal curve in <em>R</em><sup>3</sup> that is strictly monotone with respect to one of the three axes, we show that if the <em>L</em><sub>1</sub> and <em>L</em><sub>∞</sub> metrics are used then the min-# problem can be solved in <em>O</em>(<em>n</em><sup>2</sup>) time and the min-ϵ problem can be solved in <em>O</em>(<em>n</em><sup>3</sup>) time. If distances are computed using the <em>L</em><sub>2</sub> metric then the min-# and min-ϵ problems can be solved in <em>O</em>(<em>n</em><sup>3</sup>) and <em>O</em>(<em>n</em><sup>3</sup> log <em>n</em>) time, respectively. All of our algorithms exhibit <em>O</em>(<em>n</em><sup>2</sup>) space complexity. Finally, we show that if it is not essential to minimize <em>m</em>, simple modifications of our algorithms afford a reduction by a factor of <em>n</em> for both time and space.</p></div>","PeriodicalId":100349,"journal":{"name":"CVGIP: Graphical Models and Image Processing","volume":"56 3","pages":"Pages 231-246"},"PeriodicalIF":0.0,"publicationDate":"1994-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1006/cgip.1994.1021","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134687087","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
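A simple cubic-time sketch of the min-# idea (our illustration; the paper achieves O(n²) in the plane, and uses the parallel-strip criterion rather than the tolerance-zone segment distance used here): mark a shortcut p_i → p_j valid if every skipped vertex lies within ε of the segment, then find the fewest-links path from p_1 to p_n by breadth-first search over valid shortcuts.

```python
import math
from collections import deque

def seg_dist(p, a, b):
    """Euclidean distance from point p to segment ab."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    L2 = dx * dx + dy * dy
    t = 0.0 if L2 == 0 else max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / L2))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def min_vertices(curve, eps):
    """Fewest-vertex approximation with every skipped vertex within eps of its shortcut."""
    n = len(curve)
    ok = lambda i, j: all(seg_dist(curve[k], curve[i], curve[j]) <= eps
                          for k in range(i + 1, j))
    prev = [None] * n
    dist = [math.inf] * n
    dist[0] = 0
    q = deque([0])
    while q:                           # BFS: shortest path in the shortcut DAG
        i = q.popleft()
        for j in range(i + 1, n):
            if dist[j] == math.inf and ok(i, j):
                dist[j], prev[j] = dist[i] + 1, i
                q.append(j)
    out, j = [], n - 1
    while j is not None:
        out.append(curve[j]); j = prev[j]
    return out[::-1]
```

On a nearly flat five-vertex curve, a tolerance of 0.1 collapses it to its two endpoints, while a tolerance of 0.01 forces all five vertices to be kept.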
An approximate algorithm based on the quasi-mechanical concept is proposed for solving the Rectangle Covering Problem. The phenomena of universal gravitation and screening are simulated. It is pointed out that valuable approximate algorithms can be obtained for a large number of NP-hard problems by following the quasi-mechanical procedure. Behind the quasi-mechanical procedure are the strategies of the priority method in modern recursion theory.
{"title":"A Quasi-mechanical Method for Solving the Rectangle Covering Problem-An Approach to Tackling NP Hard Problems","authors":"Huang W.Q., Wang G.Q.","doi":"10.1006/cgip.1994.1023","DOIUrl":"10.1006/cgip.1994.1023","url":null,"abstract":"<div><p>An approximate algorithm is proposed for solving the Rectangle Covering Problem based on the quasi-mechanical concept. The phenomenon due to universal gravitation and screen has been simulated. It is pointed out that valuable approximate algorithms can be obtained for large numbers of NP hard problems by following the quasi-mechanical procedure. Behind the quasi-mechanical procedure are the strategies in the priority method in modern recursive theory.</p></div>","PeriodicalId":100349,"journal":{"name":"CVGIP: Graphical Models and Image Processing","volume":"56 3","pages":"Pages 267-271"},"PeriodicalIF":0.0,"publicationDate":"1994-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1006/cgip.1994.1023","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120841809","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A hierarchical image smoothing method is presented which does not require user-specified parameters. For every pixel the largest centered window (7 × 7, 5 × 5, or 3 × 3) containing a constant patch is sought. The selection is made by comparing a locally computed homogeneity measure with its robust global estimate. If the window is declared homogeneous, the pixel is assigned the spatial average. Around discontinuities an adaptive least-squares smoothing method is applied over 3 × 3 windows. The performance of the algorithm is compared with that of several other smoothing techniques on additively corrupted images. The smoothing of synthetic aperture radar images serves as an example of multiplicative noise.
{"title":"Multiresolution Adaptive Image Smoothing","authors":"Meer P., Park R.H., Cho K.J.","doi":"10.1006/cgip.1994.1013","DOIUrl":"10.1006/cgip.1994.1013","url":null,"abstract":"<div><p>A hierarchical image smoothing method is presented which does not require user specified parameters. For every pixel the largest centered window (7 × 7, 5 × 5 or 3 × 3) containing a constant patch is sought. The selection is made by comparing a locally computed homogeneity measure with its robust global estimate. If the window is declared homogeneous, the pixel is assigned the spatial average. Around discontinuities an adaptive least squares smoothing method is applied for 3 × 3 windows. The performance of the algorithm is compared with several other smoothing techniques for additively corrupted images. The smoothing of synthetic aperture radar images is used as an example for multiplicative noise.</p></div>","PeriodicalId":100349,"journal":{"name":"CVGIP: Graphical Models and Image Processing","volume":"56 2","pages":"Pages 140-148"},"PeriodicalIF":0.0,"publicationDate":"1994-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1006/cgip.1994.1013","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"113954556","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}