
Latest Publications in CVGIP: Graphical Models and Image Processing

Computationally Efficient Algorithms for High-Dimensional Robust Estimators
Pub Date: 1994-07-01 DOI: 10.1006/cgip.1994.1026
Mount D.M., Netanyahu N.S.

Given a set of n distinct points in d-dimensional space that are hypothesized to lie on a hyperplane, robust statistical estimators have recently been proposed for the parameters of the model that best fits these points. This paper presents efficient algorithms for computing median-based robust estimators (e.g., the Theil-Sen and repeated median (RM) estimators) in high-dimensional space. We briefly review basic computational geometry techniques that were used to achieve efficient algorithms in the 2-D case. A generalization of these techniques to higher dimensions is then introduced. Geometric observations are followed by a presentation of O(n^(d−1) log n) expected-time algorithms for the d-dimensional Theil-Sen and RM estimators. Both algorithms are space optimal; i.e., they require O(n) storage for fixed d. Finally, an extension of the methodology to nonlinear domain(s) is demonstrated.
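The 2-D Theil-Sen estimator that the paper generalizes can be sketched by brute force: take the median of all pairwise slopes, then the median of the per-point intercepts. This O(n^2) sketch is illustrative only; the paper's contribution is precisely to avoid this enumeration with computational-geometry techniques.

```python
import itertools
import statistics

def theil_sen_2d(points):
    """Brute-force 2-D Theil-Sen line fit: the slope is the median of all
    pairwise slopes, the intercept the median of the residuals y - m*x.
    O(n^2) time; the paper's algorithms are far more efficient."""
    slopes = [
        (y2 - y1) / (x2 - x1)
        for (x1, y1), (x2, y2) in itertools.combinations(points, 2)
        if x2 != x1  # skip vertical pairs
    ]
    m = statistics.median(slopes)
    b = statistics.median(y - m * x for x, y in points)
    return m, b
```

Because medians rather than means are taken, a single gross outlier among the points leaves the fitted line essentially unchanged.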

CVGIP: Graphical Models and Image Processing, Vol. 56, No. 4, pp. 289-303.
Citations: 8
Curve Fitting by a Sum of Gaussians
Pub Date: 1994-07-01 DOI: 10.1006/cgip.1994.1025
Goshtasby A., Oneill W.D.

Gaussians are useful in multiscale representation of video data. An algorithm is presented which approximates a sequence of uniformly spaced single-valued data by a sum of Gaussians with a prescribed accuracy. The scale-space image [6] of the data is used to estimate the number of Gaussians and their initial parameters. The Marquardt algorithm (J. SIAM 11(2), 1963, 431-441) is then used to optimize the parameters.
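A hedged numpy sketch of one piece of such a pipeline: estimating initial parameters (a, mu, sigma) for a single Gaussian term from the data's weighted moments. In the paper this role is played by the scale-space image of the data, and the estimates are then refined by the Marquardt (Levenberg-Marquardt) iteration; the moment-based initializer below is an illustrative stand-in, not the paper's procedure.

```python
import numpy as np

def gaussian(x, a, mu, sigma):
    """Single Gaussian term a * exp(-(x - mu)^2 / (2 sigma^2))."""
    return a * np.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

def init_gaussian_from_moments(x, y):
    """Initial (a, mu, sigma) for one peak from the weighted first and
    second moments of the data -- a stand-in for the scale-space stage,
    to be refined by Levenberg-Marquardt optimization."""
    w = y / y.sum()                      # treat the samples as weights
    mu = float((w * x).sum())            # first moment -> center
    sigma = float(np.sqrt((w * (x - mu) ** 2).sum()))  # second moment -> width
    a = float(y.max())                   # peak height -> amplitude
    return a, mu, sigma
```

On data sampled from an isolated Gaussian the moments recover the true parameters closely; overlapping peaks are exactly why the paper resorts to a multiscale (scale-space) analysis instead.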

CVGIP: Graphical Models and Image Processing, Vol. 56, No. 4, pp. 281-288.
Citations: 86
Processing of Off-Line Handwritten Text: Polygonal Approximation and Enforcement of Temporal Information
Pub Date: 1994-07-01 DOI: 10.1006/cgip.1994.1029
Abuhaiba I.S.I., Holt M.J.J., Datta S.

Algorithms to process off-line Arabic handwriting prior to recognition are presented. The first algorithm converts smoothed and thinned images into polygonal approximations. The second algorithm determines the start vertex of writing. The third algorithm enforces temporal information by traversing the graph of the stroke in an order consistent with Arabic handwriting. It implements the following heuristic rule: the minimum-distance path that traverses the stroke's polygon from the start vertex to the end vertex has its vertices ordered as they were generated when the stroke was written. This third algorithm is developed from a standard solution of the Chinese postman's problem applied to the graph of the stroke. Special rules are applied to enforce temporal information on the stroke and obtain the most likely traversal consistent with Arabic handwriting. Unconstrained handwritten strokes (n = 4065) written by five subjects were used in testing. In 92.6% of the samples, the proposed algorithms restored the actual temporal information.
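The traversal step rests on finding a route that covers every edge of the stroke graph; after the Chinese-postman augmentation such a route is an Eulerian path. A minimal sketch of the core traversal (Hierholzer's algorithm) from a given start vertex, omitting the paper's edge-duplication step and its handwriting-specific tie-breaking rules:

```python
from collections import defaultdict

def eulerian_path(edges, start):
    """Hierholzer's algorithm: traverse every undirected edge of the
    stroke graph exactly once, beginning at the detected start vertex.
    Assumes such a traversal exists -- the paper guarantees this by
    duplicating edges (Chinese postman) beforehand."""
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    stack, path = [start], []
    while stack:
        v = stack[-1]
        if adj[v]:
            u = adj[v].pop()
            adj[u].remove(v)   # consume the undirected edge once
            stack.append(u)
        else:
            path.append(stack.pop())
    return path[::-1]          # vertices in traversal order
```

The choice of which incident edge to follow at each junction is exactly where the paper's heuristic ordering rules would plug in.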

CVGIP: Graphical Models and Image Processing, Vol. 56, No. 4, pp. 324-335.
Citations: 18
On the Detection of Robust Curves
Pub Date: 1994-05-01 DOI: 10.1006/cgip.1994.1018
Cole R., Vishkin U.

Given m points in the plane and a threshold t, a curve is defined to be robust if at least t points lie on it. Efficient algorithms for detecting robust curves are given; the key contribution is to use randomized sampling. In addition, an approximate version of the problem is introduced. A geometric solution to this problem is given; it too can be enhanced by randomization. These algorithms are readily generalized to solve the problem of robust curve detection in a scene of curve fragments: given a set of curve segments, a curve σ is defined to be robust if curve segments of total length at least l lie on σ. Again, both an exact and an approximate version of the problem are considered. The problems and solutions are closely related to the well-investigated Hough transform technique.
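The randomized-sampling idea can be illustrated RANSAC-style for lines: repeatedly sample a point pair, form the line through it, and accept the line when at least t input points lie on it. This is a simplified sketch, not the paper's analyzed algorithm; the function name, trial count, and tolerance below are illustrative choices.

```python
import random

def robust_lines(points, t, trials=200, tol=1e-9, seed=0):
    """Randomized detection of robust lines: sample point pairs and keep
    any induced line on which at least t points lie (within tol).
    A RANSAC-flavored sketch of the randomized-sampling idea."""
    rng = random.Random(seed)
    found = set()
    for _ in range(trials):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        a, b = y2 - y1, x1 - x2          # implicit line a*x + b*y + c = 0
        c = -(a * x1 + b * y1)
        norm = (a * a + b * b) ** 0.5
        inliers = sum(
            1 for (x, y) in points if abs(a * x + b * y + c) / norm <= tol
        )
        if inliers >= t:
            # normalize so the same line found twice collapses in the set
            a, b, c = (round(v / norm, 9) for v in (a, b, c))
            if a < 0 or (a == 0 and b < 0):
                a, b, c = -a, -b, -c
            found.add((a, b, c))
    return found
```

If a fraction p of the points lies on a robust line, a sampled pair hits it with probability about p^2, so the number of trials needed is independent of m; the paper makes this kind of argument precise.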

CVGIP: Graphical Models and Image Processing, Vol. 56, No. 3, pp. 189-204.
Citations: 0
Subset Warping: Rubber Sheeting with Cuts
Pub Date: 1994-05-01 DOI: 10.1006/cgip.1994.1022
Landau P., Schwartz E.

Image warping, often referred to as "rubber sheeting," represents the deformation of a domain image space into a range image space. In this paper, a technique which extends the definition of a rubber-sheet transformation to allow a polygonal region to be warped into one or more subsets of itself, where the subsets may be multiply connected, is described. To do this, it constructs a set of "slits" in the domain image, which correspond to discontinuities and concavities in the range image, using a technique based on generalized Voronoi diagrams. The concept of medial axis is extended to describe inner and outer medial contours of a polygon. Polygonal regions are decomposed into annular subregions, and path homotopies are introduced to describe the annular subregions. These constructions motivate the definition of a ladder, which guides the construction of grid point pairs necessary to effect the warp itself.

CVGIP: Graphical Models and Image Processing, Vol. 56, No. 3, pp. 247-266.
Citations: 7
Automatic Threshold Selection Using the Wavelet Transform
Pub Date: 1994-05-01 DOI: 10.1006/cgip.1994.1019
Olivo J.C.

A new method of peak analysis for threshold selection is presented. It is based on the wavelet transform, which provides a multiscale analysis of the information content of the histogram of an image. We show that the detection of the zero-crossings and the local extrema of a wavelet transform of the histogram gives a complete characterization of the peaks in the histogram, that is to say, the values at which they start, end, and are extreme. These values are used for the unsupervised and automatic selection of a sequence of thresholds describing a coarse-to-fine analysis of histogram variation. Results of applying the proposed technique to several different images are presented.
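A single-scale sketch of the peak/valley analysis: convolving the histogram with a derivative-of-Gaussian kernel (one fixed scale, standing in for the paper's full multiscale wavelet analysis) gives a smoothed derivative whose negative-to-positive zero-crossings mark valleys, i.e., threshold candidates. The function name and the fixed sigma are illustrative assumptions.

```python
import numpy as np

def valley_thresholds(hist, sigma=2.0):
    """Candidate thresholds as valleys of the smoothed histogram:
    convolve with an (unnormalized) derivative-of-Gaussian kernel at one
    scale and report the negative-to-positive zero-crossings of the
    resulting smoothed derivative. A single-scale stand-in for the
    paper's multiscale wavelet analysis."""
    radius = int(4 * sigma)
    t = np.arange(-radius, radius + 1, dtype=float)
    dgauss = -t * np.exp(-t ** 2 / (2 * sigma ** 2))  # d/dt of a Gaussian, up to scale
    d = np.convolve(hist, dgauss, mode="same")        # smoothed derivative of hist
    # a valley is where the smoothed derivative goes from negative to positive
    return [i for i in range(1, len(d)) if d[i - 1] < 0 <= d[i]]
```

For a bimodal histogram this returns a single threshold between the two modes; sweeping sigma from coarse to fine yields the sequence of thresholds the abstract describes.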

CVGIP: Graphical Models and Image Processing, Vol. 56, No. 3, pp. 205-218.
Citations: 42
A Particle System Model for Combining Edge Information from Multiple Segmentation Modules
Pub Date: 1994-05-01 DOI: 10.1006/cgip.1994.1020
Dayanand S., Uttal W.R., Shepherd T., Lunskis C.

A model for fusing the output of multiple segmentation modules is presented. The model is based on the particle-system approach to modeling dynamic objects in computer graphics. It also has built-in capabilities to extract regions, thin the edge image, remove "twigs," and close gaps in the contours. The model functions both as an effective data-fusion technique and as a model of an important human visual process.

CVGIP: Graphical Models and Image Processing, Vol. 56, No. 3, pp. 219-230.
Citations: 4
On Approximating Polygonal Curves in Two and Three Dimensions
Pub Date: 1994-05-01 DOI: 10.1006/cgip.1994.1021
Eu D., Toussaint G.T.

Given a polygonal curve P = [p1, p2, . . . , pn], the polygonal approximation problem considered calls for determining a new curve P′ = [p′1, p′2, . . . , p′m] such that (i) m is significantly smaller than n, (ii) the vertices of P′ are an ordered subset of the vertices of P, and (iii) any line segment [p′A, p′A+1] of P′ that substitutes a chain [pB, . . . , pC] in P is such that for all i where B ≤ i ≤ C, the approximation error of pi with respect to [p′A, p′A+1], according to some specified criterion and metric, is less than a predetermined error tolerance. Using the parallel-strip error criterion, we study the following problems for a curve P in R^d, where d = 2, 3: (i) minimize m for a given error tolerance and (ii) given m, find the curve P′ that has the minimum approximation error over all curves that have at most m vertices. These problems are called the min-# and min-ϵ problems, respectively. For R^2 and with any one of the L1, L2, or L∞ distance metrics, we give algorithms to solve the min-# problem in O(n^2) time and the min-ϵ problem in O(n^2 log n) time, improving the best known algorithms to date by a factor of log n. When P is a polygonal curve in R^3 that is strictly monotone with respect to one of the three axes, we show that if the L1 and L∞ metrics are used then the min-# problem can be solved in O(n^2) time and the min-ϵ problem can be solved in O(n^3) time. If distances are computed using the L2 metric then the min-# and min-ϵ problems can be solved in O(n^3) and O(n^3 log n) time, respectively. All of our algorithms exhibit O(n^2) space complexity. Finally, we show that if it is not essential to minimize m, simple modifications of our algorithms afford a reduction by a factor of n for both time and space.
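The min-# problem has a straightforward cubic-time rendering: build a graph whose edge (i, j) exists when every intermediate vertex fits within an eps-strip around the line through pts[i] and pts[j], then take a fewest-vertex path by breadth-first search. The sketch below uses the L2 parallel-strip criterion; the paper's O(n^2)-time algorithms are substantially more refined.

```python
import math
from collections import deque

def min_vertices_approx(pts, eps):
    """Min-# polygonal approximation (cubic-time sketch): shortcut i->j
    is allowed iff every intermediate point lies within eps of the line
    through pts[i] and pts[j] (L2 parallel-strip criterion); BFS over
    allowed shortcuts then yields a subchain with the fewest vertices."""
    n = len(pts)

    def strip_ok(i, j):
        (x1, y1), (x2, y2) = pts[i], pts[j]
        a, b = y2 - y1, x1 - x2          # line a*x + b*y + c = 0
        norm = math.hypot(a, b)
        if norm == 0:                    # coincident endpoints: fall back to distance
            return all(math.hypot(pts[k][0] - x1, pts[k][1] - y1) <= eps
                       for k in range(i + 1, j))
        c = -(a * x1 + b * y1)
        return all(abs(a * pts[k][0] + b * pts[k][1] + c) / norm <= eps
                   for k in range(i + 1, j))

    prev = [None] * n
    seen = [False] * n
    seen[0] = True
    q = deque([0])
    while q:
        i = q.popleft()
        if i == n - 1:
            break
        for j in range(i + 1, n):
            if not seen[j] and strip_ok(i, j):
                seen[j] = True
                prev[j] = i
                q.append(j)
    out, v = [], n - 1                   # walk back from the last vertex
    while v is not None:
        out.append(v)
        v = prev[v]
    return [pts[v] for v in reversed(out)]
```

Since every edge has unit cost, BFS finds a path with the fewest edges, i.e., an approximating curve with the fewest vertices for the given tolerance.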

CVGIP: Graphical Models and Image Processing, Vol. 56, No. 3, pp. 231-246.
Citations: 0
A Quasi-mechanical Method for Solving the Rectangle Covering Problem: An Approach to Tackling NP-Hard Problems
Pub Date: 1994-05-01 DOI: 10.1006/cgip.1994.1023
Huang W.Q., Wang G.Q.

An approximate algorithm is proposed for solving the Rectangle Covering Problem based on the quasi-mechanical concept. The phenomenon due to universal gravitation and screen has been simulated. It is pointed out that valuable approximate algorithms can be obtained for large numbers of NP hard problems by following the quasi-mechanical procedure. Behind the quasi-mechanical procedure are the strategies in the priority method in modern recursive theory.

CVGIP: Graphical Models and Image Processing, Vol. 56, No. 3, pp. 267-271.
Citations: 6
Multiresolution Adaptive Image Smoothing
Pub Date: 1994-03-01 DOI: 10.1006/cgip.1994.1013
Meer P., Park R.H., Cho K.J.

A hierarchical image smoothing method is presented which does not require user-specified parameters. For every pixel the largest centered window (7 × 7, 5 × 5, or 3 × 3) containing a constant patch is sought. The selection is made by comparing a locally computed homogeneity measure with its robust global estimate. If the window is declared homogeneous, the pixel is assigned the spatial average. Around discontinuities an adaptive least-squares smoothing method is applied to 3 × 3 windows. The performance of the algorithm is compared with several other smoothing techniques on additively corrupted images. The smoothing of synthetic aperture radar images is used as an example of multiplicative noise.
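The window-selection step can be sketched directly. As illustrative assumptions: the homogeneity measure is the window variance, its robust global estimate is the median of all 3 × 3 window variances, and pixels with no homogeneous window are simply left unchanged rather than given the paper's adaptive least-squares treatment.

```python
import numpy as np

def adaptive_smooth(img):
    """For each pixel, find the largest centered window (7x7, 5x5, 3x3)
    whose variance does not exceed a robust global homogeneity estimate
    (here: the median of all interior 3x3 window variances) and replace
    the pixel by that window's mean. Pixels with no homogeneous window
    are left unchanged (the paper applies adaptive least-squares there)."""
    h, w = img.shape
    # robust global homogeneity estimate from all interior 3x3 variances
    v3 = [img[i - 1:i + 2, j - 1:j + 2].var()
          for i in range(1, h - 1) for j in range(1, w - 1)]
    tau = float(np.median(v3))
    out = img.astype(float).copy()
    for i in range(h):
        for j in range(w):
            for r in (3, 2, 1):        # half-widths of 7x7, 5x5, 3x3
                if i - r < 0 or j - r < 0 or i + r >= h or j + r >= w:
                    continue           # window would fall off the image
                win = img[i - r:i + r + 1, j - r:j + r + 1]
                if win.var() <= tau:   # largest homogeneous window wins
                    out[i, j] = win.mean()
                    break
    return out
```

Averaging over the largest homogeneous window maximizes noise suppression in flat regions while the shrinking window sizes keep edges from being smeared.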

CVGIP: Graphical Models and Image Processing, Vol. 56, No. 2, pp. 140-148.
Citations: 63