Computer systems can be used to interconnect many different types of color devices: monitors, printers, film and video recorders. Techniques based on CIE standards provide a degree of “device independence” to color reproduction in these systems. However, reproducing tristimulus values, pixel by pixel, will not result in acceptable color reproduction for images or related color sets. Differences in gamut and appearance characteristics make an additional transformation step essential, a step we call gamut mapping. While this model has proven useful for color reproduction across media [11], there are still research problems left to be solved before it can be fully realized. This paper will discuss these issues and also some of the inherent limitations of this approach to color reproduction.
M. Stone, "Device Independent Color Reproduction," Applied Vision, doi:10.1364/av.1989.fa1.
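The need for a transformation step beyond pixel-by-pixel tristimulus matching can be illustrated with a minimal sketch. The XYZ-to-RGB matrix below uses sRGB-like primaries (an assumption; sRGB postdates this paper, and a real device would be characterized individually), and "slide toward mid-gray" is only one naive mapping strategy, not the paper's method:

```python
import numpy as np

# Notional XYZ -> monitor RGB matrix (sRGB-like primaries; an assumption,
# since actual monitor characterization is device-specific).
XYZ_TO_RGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

def in_gamut(rgb):
    return bool(np.all((rgb >= 0.0) & (rgb <= 1.0)))

def clip_toward_gray(xyz, gray_rgb=(0.5, 0.5, 0.5)):
    """Naive gamut mapping: if the requested tristimulus value falls outside
    the device gamut, slide it along the line toward mid-gray until every
    channel is reproducible."""
    rgb = XYZ_TO_RGB @ np.asarray(xyz, float)
    gray = np.asarray(gray_rgb, float)
    if in_gamut(rgb):
        return rgb
    lo, hi = 0.0, 1.0          # fraction of the way from gray back to rgb
    for _ in range(40):        # binary search for the gamut boundary
        mid = 0.5 * (lo + hi)
        if in_gamut(gray + mid * (rgb - gray)):
            lo = mid
        else:
            hi = mid
    return gray + lo * (rgb - gray)
```

Per-pixel clipping like this preserves neither the relationships among the colors of an image nor their appearance under the new medium, which is exactly why a considered gamut-mapping step is argued to be essential.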
A periodic moving stimulus can appear to move in the reverse direction if it is under-sampled in time, as in the "wagon wheel" effect caused by an inadequate frame rate in motion pictures. Sampling by a spatial array of sensors or pixels can produce a similar motion reversal for periodic patterns moving at any velocity, if the spatial sampling frequency is too low. These artifacts are well known to engineers who design discrete imaging systems. The artifact resulting from spatial under-sampling has been demonstrated in biological imaging systems (Goetz, 1965; Coletta and Williams, 1987). For example, insects tethered at the center of a rotating drum containing low spatial frequency vertical stripes exhibit an optomotor response: they rotate in the same direction as the stripes. However, these insects reverse their direction of motion when confronted with spatial frequencies that exceed the Nyquist frequency of their ommatidial array. This is just what one would expect from spatial aliasing by the regular array of insect ommatidia. Nancy Coletta and I have demonstrated a similar effect in human observers with drifting interference fringes, whose contrast is immune to optical degradation. In the parafoveal retina, high spatial frequency (but not low) gratings look like two-dimensional spatial noise and can appear to move in the opposite direction from their true direction of motion. This motion reversal can be demonstrated with a forced-choice technique. Subjects guessed the direction of motion of vertical, unity-contrast fringes whose direction was randomly determined on each trial. No feedback was provided. Percent correct falls significantly below chance performance at high spatial frequencies, indicating a reversal in the perceived direction of motion. At higher frequencies, the perceived direction of motion reverses a second time, and at even higher frequencies performance settles to chance.
David R. Williams, "Photoreceptor Sampling of Moving Images," Applied Vision, doi:10.1364/av.1989.wc1.
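The folding behavior the abstract describes — veridical motion below the Nyquist frequency, reversed motion between Nyquist and the sampling frequency, veridical again just beyond it — can be sketched in a few lines (a minimal frequency-folding model; the frequency unit is arbitrary, and real photoreceptor arrays are not perfectly regular samplers):

```python
def aliased_motion(f, fs):
    """Fold stimulus frequency f against sampling frequency fs.
    Returns (apparent_frequency, direction): +1 when the aliased component
    moves with the stimulus, -1 when it drifts in the opposite direction."""
    r = f % fs
    if r <= fs / 2:            # within the baseband: motion is veridical
        return r, +1
    return fs - r, -1          # folded back across Nyquist: motion reverses
```

Sweeping `f` upward through successive multiples of `fs / 2` reproduces the alternation between correct and reversed perceived direction reported in the forced-choice data.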
The fundamental objective of advanced television research is to produce beautiful, life-like pictures. The perceptual requirements for producing such vivid images include image size, brightness, contrast, color saturation and purity, and noise and artifact visibility in addition to spatial resolution.
C. R. Carlson and J. Bergen, "Visual Perception and the Evolution of Video," Applied Vision, doi:10.1364/av.1989.thc2.
A visual system is calibrated geometrically if its estimates of the spatial properties of a scene are accurate: straight lines are judged straight, angles are correctly estimated, collinear line segments are perceived to fall on a common line. A visual system can fail to be calibrated because of a mismatch between its optics and later visual processing: calibration of computer vision systems typically requires remapping the sensor inputs to compensate for spherical aberration in the camera lens [1].
L. Maloney, "Calibrating a Linear Visual System by Comparison of Inputs Across Camera/Eye Movements," Applied Vision, doi:10.1364/av.1989.wc4.
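Remapping sensor inputs to restore geometric calibration can be illustrated with a standard one-parameter radial-distortion model, `x_d = x_u * (1 + k1 * r^2)` (a generic computer-vision sketch under that assumed model; the paper's own procedure, comparing inputs across camera/eye movements, is different):

```python
import numpy as np

def undistort_points(pts, k1, center=(0.0, 0.0)):
    """Invert the radial distortion x_d = x_u * (1 + k1 * r_u^2) by
    fixed-point iteration, remapping distorted sensor coordinates back
    to the geometry of the scene."""
    pts = np.asarray(pts, float) - center
    und = pts.copy()
    for _ in range(20):    # fixed-point iteration; converges for small k1
        r2 = np.sum(und**2, axis=-1, keepdims=True)
        und = pts / (1.0 + k1 * r2)
    return und + center
```

After such a remap, straight lines in the scene project to straight lines in the corrected coordinates — the operational meaning of geometric calibration given in the abstract.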
At display or print time, images must be properly scaled with optional sharpening, tone-scale or color adjusted, and quantized either dynamically or simply, depending on the available color levels at the targeted device.
R. Ulichney, "Challenges in Device-Independent Image Rendering," Applied Vision, doi:10.1364/av.1989.fa4.
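The quantization step can be sketched with ordered dithering (an illustrative choice — Bayer thresholding is only one of the available techniques; the `levels` parameter stands in for the color levels available at the targeted device):

```python
import numpy as np

# 4x4 Bayer threshold matrix, normalized to [0, 1).
BAYER4 = np.array([[ 0,  8,  2, 10],
                   [12,  4, 14,  6],
                   [ 3, 11,  1,  9],
                   [15,  7, 13,  5]]) / 16.0

def ordered_dither(gray, levels=2):
    """Quantize a grayscale image in [0, 1] to `levels` output levels,
    using an ordered-dither threshold tile to trade spatial resolution
    for apparent tonal resolution."""
    gray = np.asarray(gray, float)
    h, w = gray.shape
    tile = np.tile(BAYER4, (h // 4 + 1, w // 4 + 1))[:h, :w]
    steps = levels - 1
    return np.clip(np.floor(gray * steps + tile), 0, steps) / steps
```

With `levels=2` this is classic bilevel halftoning; raising `levels` models devices that offer a few intermediate tones, which is the "depending on the available color levels" branch in the abstract.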
Image gathering and digital restoration are commonly treated as separate tasks. However, it is possible to gain significant improvements in fidelity, resolution, sharpness, and clarity when these two tasks are optimized together. In this paper, we demonstrate the improvements that can be gained when (1) the design of the image-gathering system is optimized for high information density rather than for conventional image reconstruction, and (2) the digital restoration of the image accounts for the aliasing as well as the blurring and noise in image gathering and practically eliminates the degradations that occur due to the blurring and raster effects in image reconstruction.
F. O. Huck, Sarah John, J. A. McCormick, and R. Narayanswamy, "Image Gathering and Digital Restoration: End-To-End Optimization for Visual Quality," Applied Vision, doi:10.1364/av.1989.tha9.
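A restoration stage that jointly accounts for blur and noise is classically expressed by the Wiener filter; the sketch below uses it as a textbook stand-in for the paper's end-to-end restoration (the Gaussian OTF and the noise-to-signal ratio are assumptions, and the paper additionally treats aliasing, which this sketch does not):

```python
import numpy as np

def gaussian_otf(shape, sigma=0.15):
    """Assumed image-gathering OTF: Gaussian falloff in spatial frequency."""
    fy = np.fft.fftfreq(shape[0])[:, None]
    fx = np.fft.fftfreq(shape[1])[None, :]
    return np.exp(-(fx**2 + fy**2) / (2.0 * sigma**2))

def wiener_restore(degraded, otf, nsr=1e-3):
    """Frequency-domain Wiener restoration: invert the OTF where it is
    strong, roll the gain off where the noise-to-signal ratio dominates."""
    G = np.fft.fft2(degraded)
    W = np.conj(otf) / (np.abs(otf)**2 + nsr)
    return np.real(np.fft.ifft2(W * G))
```

Optimizing `sigma` and `nsr` together — i.e., co-designing the gathering OTF and the restoration filter — rather than fixing the camera first and restoring afterward is the kind of end-to-end trade the paper advocates.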
One of the main problems that arises when designing color codes for electronic visual displays involves color selection. The colors must be distinctive and immediately recognizable as corresponding with the color names they represent. Otherwise, their meanings may be ambiguous, thereby defeating the code's purpose. We are approaching this problem by mapping the relationship between location on the CIE 1976 uniform chromaticity-scale (UCS) diagram and population stereotypes for color naming. This information should simplify the color selection process by helping the designer avoid, for example, specifying a "red" that actually appears orange. Thus, our project can be characterized as an attempt to improve on the Kelly (1943) color boundaries and is similar to an earlier effort by Haeusing (1976). It is also related to Boynton and Olson's (1987) work on focal colors. This paper describes our method, provides an overview of six experiments we have performed, and shows some representative results.
D. Post and C. Calhoun, "Color-Name Boundaries for Color Coding," Applied Vision, doi:10.1364/av.1989.pd1.
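The CIE 1976 UCS coordinates on which the color-name regions are mapped follow directly from tristimulus values; a minimal helper using the standard CIE formulas:

```python
def xyz_to_ucs(X, Y, Z):
    """CIE 1976 uniform chromaticity-scale coordinates (u', v')
    from tristimulus values: u' = 4X / (X + 15Y + 3Z),
    v' = 9Y / (X + 15Y + 3Z)."""
    d = X + 15.0 * Y + 3.0 * Z
    return 4.0 * X / d, 9.0 * Y / d
```

Plotting the naming data in (u', v') rather than (x, y) matters because distances on the UCS diagram are approximately perceptually uniform, so a boundary between, say, "red" and "orange" has roughly constant perceptual width along its length.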
The extraction of human facial features from images in binary form has been shown to be useful in at least two areas of low data-rate image coding. One is in the transmission of moving cartoons over the public switched telephone network [1]-[3]. Another is in model-based coding, where analysis of the camera signal is needed to line up the software model with the exterior world [4]. Binarization of the input image is also used as a first stage in machine recognition of faces [5].
D. Pearson and E. Hanna, "Operators for Facial Feature Extraction," Applied Vision, doi:10.1364/av.1989.wd1.
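The binarization front end that these applications share can be sketched with a global Otsu-style threshold (an illustrative stand-in: the paper evaluates specific feature-extraction operators, not this particular thresholding rule):

```python
import numpy as np

def binarize(gray, n_candidates=64):
    """Pick the threshold maximizing between-class variance (Otsu's
    criterion), then mark darker pixels -- candidate facial features
    such as eyes, brows, and mouth outline -- as foreground (1)."""
    flat = np.asarray(gray, float).ravel()
    best_t, best_score = flat.mean(), -1.0
    for t in np.linspace(flat.min(), flat.max(), n_candidates)[1:-1]:
        fg, bg = flat[flat < t], flat[flat >= t]
        if fg.size == 0 or bg.size == 0:
            continue
        score = fg.size * bg.size * (fg.mean() - bg.mean())**2
        if score > best_score:
            best_t, best_score = t, score
    mask = (np.asarray(gray, float) < best_t).astype(np.uint8)
    return mask, best_t
```

The appeal for low data-rate coding is that the binary map is cheap to transmit, while downstream analysis (cartoon rendering, model alignment, recognition) operates on the mask rather than the full grayscale signal.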
The input data from a scanner or a computer-generated image will often contain colors that are outside the printable gamut of a hard-copy device. This paper describes a method for obtaining a perceptual appearance match between the original image and the hard-copy output under these conditions.
John Meyer and Brian Barth, "Color Gamut Matching for Hard Copy," Applied Vision, doi:10.1364/av.1989.fa3.
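One appearance-oriented strategy for handling unprintable colors is to hold lightness and hue fixed and compress chroma smoothly into the printable range, so in-gamut colors barely move while out-of-gamut colors are pulled inside (an illustrative sketch in a Lab-like space with an assumed chroma limit `c_max`, not the paper's algorithm):

```python
import math

def compress_chroma(L, a, b, c_max):
    """Soft-knee chroma compression: preserve lightness L and hue angle,
    map chroma smoothly into [0, c_max]. Linear up to 80% of c_max,
    then asymptotic to c_max."""
    c = math.hypot(a, b)
    if c == 0.0:
        return L, 0.0, 0.0
    knee = 0.8 * c_max
    if c <= knee:
        c_new = c
    else:
        c_new = knee + (c_max - knee) * (1.0 - math.exp(-(c - knee) / (c_max - knee)))
    s = c_new / c
    return L, a * s, b * s
```

Compared with hard clipping at the gamut surface, the soft knee avoids flattening distinct saturated colors onto a single boundary color, which helps preserve the relationships a perceptual appearance match depends on.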
The distance at which an observer can just perform a particular visual observation task is perhaps the most practical metric of image quality for viewing instruments. The direct relation between the distance of a target and the scaling of its image upon the retina suggests a simple theoretical approach: is it possible to express image quality as some sort of effective retinal "pixel size", such as visual acuity, but in a more generalized form that includes the effects of luminance level, contrast, and noise? In this paper, experiments are described that determine the effective range first of a typical image-intensifier system and then of a thermal viewing system. The results are discussed in the light of the above question.
A. van Meeteren, "Effective Range of Viewing Instruments," Applied Vision, doi:10.1364/av.1989.tha1.
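One classical way to turn an effective "pixel size" into a range is the Johnson-criteria style calculation used for thermal viewers: a task demands some number of resolvable cycles across the target, and the instrument's limiting spatial frequency caps the distance at which that many cycles still fit (the numbers in the usage below are illustrative assumptions, not data from the paper):

```python
def effective_range_m(target_size_m, cycles_required, limit_freq_cyc_per_mrad):
    """Distance at which an instrument resolving limit_freq_cyc_per_mrad
    can still place cycles_required resolvable cycles across a target
    of the given size (small-angle approximation)."""
    theta_mrad = cycles_required / limit_freq_cyc_per_mrad  # needed subtense
    return target_size_m / (theta_mrad * 1e-3)              # mrad -> rad
```

For example, a 2.3 m target, a recognition task needing 4 cycles, and a limiting frequency of 1 cycle/mrad give a range of 575 m. Since luminance, contrast, and noise all depress the limiting frequency, they shorten the range in direct proportion — the dependence a generalized acuity measure is meant to capture.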