The compressive sensing framework states that a signal which has a sparse representation in a known basis may be reconstructed from samples obtained at a sub-Nyquist sampling rate. Due to its inherent properties, the Fourier domain is widely used in compressive sensing applications. Sparse signal recovery applications that use a small number of Fourier transform coefficients have made solutions to large-scale data recovery problems, such as images, applicable and more practical. The sparse reconstruction of two-dimensional images is performed using sampling patterns generated by taking into account the general frequency characteristics of natural images. In this work, instead of forming a general sampling pattern for infrared images of sea-surveillance scenarios, a special sampling pattern is obtained with a new iterative algorithm that uses a database of images recorded under similar conditions to extract the important frequency characteristics. Experimental results show that the proposed sampling pattern provides better sparse recovery performance than the baseline sampling methods proposed in the literature.
{"title":"A novel sampling method for the sparse recovery of infrared sea surveillance images","authors":"Serdar Çakır, Hande Uzeler, T. Aytaç","doi":"10.1117/12.2029878","DOIUrl":"https://doi.org/10.1117/12.2029878","url":null,"abstract":"The compressive sensing framework states that a signal which has sparse representation in a known basis may be reconstructed from samples obtained from a sub-Nyquist sampling rate. Due to its inherent properties, the Fourier domain is widely used in compressive sensing applications. Sparse signal recovery applications making use of a small number of Fourier Transform coe±cients have made solutions to large scale data recovery problems, i.e. images, applicable and more practical. The sparse reconstruction of two dimensional images is performed by making use of sampling patterns generated by taking into consideration the general frequency characteristics of natural images. In this work, instead of forming a general sampling pattern for infrared images of sea-surveillance scenarios, a special sampling pattern has been obtained by making use of a new iterative algorithm that uses a database containing images recorded under similar conditions to extract important frequency characteristics. It has been shown by experimental results that, the proposed sampling pattern provides better sparse recovery performance compared to the baseline sampling methods proposed in the literature.","PeriodicalId":344928,"journal":{"name":"Optics/Photonics in Security and Defence","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130793993","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Si-based sensors, in particular CMOS image sensors, have revolutionized low-cost imaging systems but to date have hardly been considered as candidates for gun muzzle flash detection, due to performance limitations and low SNR in the visible spectrum. In this study, a CMOS single photon avalanche diode (SPAD) module is used to record and sample muzzle flash events in the visible spectrum from representative weapons common on the modern battlefield. SPADs possess two crucial properties for muzzle flash imaging: very high photon detection sensitivity, coupled with the unique ability to convert the optical signal to a digital signal at the source pixel, thus practically eliminating readout noise. This enables high sampling frequencies in the kilohertz range without SNR degradation, in contrast to regular CMOS image sensors. To date, the SPAD has not been utilized for flash detection in an uncontrolled environment, such as gun muzzle flash detection. Gun propellant manufacturers use alkali salts to suppress secondary flashes ignited during the muzzle flash event. Common alkali salts are compounds based on potassium or sodium, with spectral emission lines around 769 nm and 589 nm, respectively. A narrow-band filter around the potassium emission doublet is used in this study to favor the muzzle flash signal over solar radiation. This research demonstrates the SPAD's ability to accurately sample and reconstruct the temporal behavior of the muzzle flash at visible wavelengths under the specified imaging conditions. The reconstructed signal is clearly distinguishable from background clutter through exploitation of the flash's temporal characteristics.
{"title":"Gun muzzle flash detection using a CMOS single photon avalanche diode","authors":"Tomer Merhav, V. Savuskan, Y. Nemirovsky","doi":"10.1117/12.2026923","DOIUrl":"https://doi.org/10.1117/12.2026923","url":null,"abstract":"Si based sensors, in particular CMOS Image sensors, have revolutionized low cost imaging systems but to date have hardly been considered as possible candidates for gun muzzle flash detection, due to performance limitations, and low SNR in the visible spectrum. In this study, a CMOS Single Photon Avalanche Diode (SPAD) module is used to record and sample muzzle flash events in the visible spectrum, from representative weapons, common on the modern battlefield. SPADs possess two crucial properties for muzzle flash imaging - Namely, very high photon detection sensitivity, coupled with a unique ability to convert the optical signal to a digital signal at the source pixel, thus practically eliminating readout noise. This enables high sampling frequencies in the kilohertz range without SNR degradation, in contrast to regular CMOS image sensors. To date, the SPAD has not been utilized for flash detection in an uncontrolled environment, such as gun muzzle flash detection. Gun propellant manufacturers use alkali salts to suppress secondary flashes ignited during the muzzle flash event. Common alkali salts are compounds based on Potassium or Sodium, with spectral emission lines around 769nm and 589nm, respectively. A narrow band filter around the Potassium emission doublet is used in this study to favor the muzzle flash signal over solar radiation. This research will demonstrate the SPAD's ability to accurately sample and reconstruct the temporal behavior of the muzzle flash in the visible wavelength under the specified imaging conditions. The reconstructed signal is clearly distinguishable from background clutter, through exploitation of flash temporal characteristics.","PeriodicalId":344928,"journal":{"name":"Optics/Photonics in Security and Defence","volume":"88 5","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120921331","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Snipers and other optically guided weapon systems are serious threats in military operations. We have studied a SWIR (short-wave infrared) camera-based system with the capability to detect and locate snipers both before and after a shot over a large field of view. The high-frame-rate SWIR camera allows resolution of the temporal profile of muzzle flashes, the infrared signature associated with the ejection of the bullet from the rifle. The capability of this system to detect and discriminate sniper muzzle flashes has been verified by FOI in earlier studies. In this work we have extended the system by adding a laser channel for optics detection. A laser diode with a slit-shaped beam profile is scanned over the camera field of view to detect retro-reflections from optical sights. The optics detection system has been tested at distances up to 1.15 km, showing the feasibility of detecting rifle scopes in full daylight. The high-speed camera makes it possible to discriminate false alarms by analyzing the temporal data. The intensity variation caused by atmospheric turbulence enables discrimination of small sights from larger reflectors due to aperture averaging, even though the targets cover only a single pixel. It is shown that optics detection can be integrated with muzzle flash detection by adding a scanning rectangular laser slit. The overall optics detection capability obtained by continuous surveillance of a relatively large field of view looks promising. This type of multifunctional system may become an important tool to detect snipers before and after a shot.
{"title":"Combined hostile fire and optics detection","authors":"Carl Brännlund, J. Tidström, M. Henriksson, L. Sjöqvist","doi":"10.1117/12.2028846","DOIUrl":"https://doi.org/10.1117/12.2028846","url":null,"abstract":"Snipers and other optically guided weapon systems are serious threats in military operations. We have studied a SWIR (Short Wave Infrared) camera-based system with capability to detect and locate snipers both before and after shot over a large field-of-view. The high frame rate SWIR-camera allows resolution of the temporal profile of muzzle flashes which is the infrared signature associated with the ejection of the bullet from the rifle. The capability to detect and discriminate sniper muzzle flashes with this system has been verified by FOI in earlier studies. In this work we have extended the system by adding a laser channel for optics detection. A laser diode with slit-shaped beam profile is scanned over the camera field-of-view to detect retro reflection from optical sights. The optics detection system has been tested at various distances up to 1.15 km showing the feasibility to detect rifle scopes in full daylight. The high speed camera gives the possibility to discriminate false alarms by analyzing the temporal data. The intensity variation, caused by atmospheric turbulence, enables discrimination of small sights from larger reflectors due to aperture averaging, although the targets only cover a single pixel. It is shown that optics detection can be integrated in combination with muzzle flash detection by adding a scanning rectangular laser slit. The overall optics detection capability by continuous surveillance of a relatively large field-of-view looks promising. This type of multifunctional system may become an important tool to detect snipers before and after shot.","PeriodicalId":344928,"journal":{"name":"Optics/Photonics in Security and Defence","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129235199","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Recent terrorist attacks, and the possibility of such actions in the future, have forced the development of security systems for critical infrastructures that embrace sensor technologies and the technical organization of systems. The perimeter protection of stationary objects used until now, based on a ring with two-zone fencing and visual cameras with illumination, is being effectively displaced by multisensor systems consisting of: visible technology (day/night cameras registering the optical contrast of a scene), thermal technology (inexpensive bolometric cameras recording the thermal contrast of a scene) and active ground radars (microwave and millimetre wavelengths that record and detect reflected radiation). Merging these three different technologies into one system requires a methodology for selecting the technical conditions of installation and the parameters of the sensors. This procedure enables the construction of a system with correlated range, resolution, field of view and object identification. An important technical problem connected with the multispectral system is its software, which couples the radar with the cameras. This software can be used for automatic focusing of the cameras, automatic guiding of the cameras to an object detected by the radar, tracking of the object and localization of the object on a digital map, as well as target identification and alerting. Based on a “plug and play” architecture, the system provides flexibility and simple integration of sensors and devices in TCP/IP networks. Using a graphical user interface it is possible to control the sensors, monitor streaming video and other data over the network, visualize the results of the data fusion process and obtain detailed information about detected intruders on a digital map. The system provides high-level applications and reduces operator workload with features such as sensor-to-sensor cueing from detection devices, automatic e-mail notification and alarm triggering. The paper presents the structure and selected elements of a critical infrastructure protection solution based on a modular multisensor security system. The system description focuses mainly on the methodology for selecting sensor parameters. The results of tests in real conditions are also presented.
{"title":"System for critical infrastructure security based on multispectral observation-detection module","authors":"P. Trzaskawka, M. Kastek, M. Życzkowski, R. Dulski, M. Szustakowski, W. Ciurapinski, J. Barela","doi":"10.1117/12.2028740","DOIUrl":"https://doi.org/10.1117/12.2028740","url":null,"abstract":"Recent terrorist attacks and possibilities of such actions in future have forced to develop security systems for critical infrastructures that embrace sensors technologies and technical organization of systems. The used till now perimeter protection of stationary objects, based on construction of a ring with two-zone fencing, visual cameras with illumination are efficiently displaced by the systems of the multisensor technology that consists of: visible technology – day/night cameras registering optical contrast of a scene, thermal technology – cheap bolometric cameras recording thermal contrast of a scene and active ground radars – microwave and millimetre wavelengths that record and detect reflected radiation. Merging of these three different technologies into one system requires methodology for selection of technical conditions of installation and parameters of sensors. This procedure enables us to construct a system with correlated range, resolution, field of view and object identification. Important technical problem connected with the multispectral system is its software, which helps couple the radar with the cameras. This software can be used for automatic focusing of cameras, automatic guiding cameras to an object detected by the radar, tracking of the object and localization of the object on the digital map as well as target identification and alerting. Based on “plug and play” architecture, this system provides unmatched flexibility and simplistic integration of sensors and devices in TCP/IP networks. Using a graphical user interface it is possible to control sensors and monitor streaming video and other data over the network, visualize the results of data fusion process and obtain detailed information about detected intruders over a digital map. System provide high-level applications and operator workload reduction with features such as sensor to sensor cueing from detection devices, automatic e-mail notification and alarm triggering. The paper presents a structure and some elements of critical infrastructure protection solution which is based on a modular multisensor security system. System description is focused mainly on methodology of selection of sensors parameters. The results of the tests in real conditions are also presented.","PeriodicalId":344928,"journal":{"name":"Optics/Photonics in Security and Defence","volume":"54 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121502076","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
We report realistic performance simulation results for a new MWIR camera. It is designed for early detection of long-distance missile plumes at ranges of a few hundred kilometers. The camera design uses a number of refractive optical elements and an IR detector. Both the imaging and the radiometric performance of the camera are investigated using large-scale ray tracing that includes target and background scene models. The missile plume radiance was calculated using a CFD-type radiative transfer algorithm and used as the light source for the ray tracing computation. The atmospheric background was estimated using MODTRAN, utilizing path thermal radiance, single/multiple scattered radiance and transmittance. The ray tracing simulation results demonstrate that the camera would satisfy the imaging and radiometric performance requirements in field operation in the target MWIR band.
{"title":"Performance simulation model for a new MWIR camera for missile plume detection","authors":"Jeeyeon Yoon, D. Ryu, Sangmin Kim, S. Seong, Jieun Kim, Sug-Whan Kim, W. Yoon","doi":"10.1117/12.2029245","DOIUrl":"https://doi.org/10.1117/12.2029245","url":null,"abstract":"We report realistic performance simulation results for a new MWIR camera. It is designed for early detection of long distance missile plumes over few hundreds kilometer in the distance range. The camera design uses a number of refractive optical element and a IR detector. Both imaging and radiometric performance of the camera are investigated by using large scale ray tracing including targets and background scene models. Missile plume radiance was calculated from using CFD type radiative transfer algorithm and used as the light source for ray tracing computation. The atmospheric background was estimated using MODTRAN utilizing path thermal radiance, single/multiple scattered radiance and transmittance. The ray tracing simulation results demonstrate that the camera would satisfy the imaging and radiometric performance requirements in field operation at the target MWIR band.","PeriodicalId":344928,"journal":{"name":"Optics/Photonics in Security and Defence","volume":"62 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115440137","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Infrared guided missile seekers utilizing pulse width modulation (PWM) in target tracking are among the threats against air platforms. To achieve "soft-kill" protection of one's own platform against these types of threats, one needs to examine carefully the seeker operating principle together with its special electronic counter-countermeasure (ECCM) capability. One of the cost-effective means of soft-kill protection is to use flare decoys according to an optimized dispensing program. Such an optimization requires a good understanding of the threat seeker, the capabilities of the air platform and the engagement scenario between them. Modeling and simulation is a very powerful tool to gain valuable insight and understand the underlying phenomenology. A careful interpretation of simulation results is crucial to draw valuable conclusions from the data. In such an interpretation there are many factors (features) which affect the results. Therefore, powerful statistical tools and pattern recognition algorithms are of special interest in the analysis. In this paper, we show how self-organizing maps (SOMs), one of those powerful tools, can be used in analyzing the effectiveness of various flare dispensing programs against a PWM seeker. We perform several Monte Carlo runs for a typical engagement scenario in a MATLAB-based simulation environment. In each run, we randomly change the flare dispensing program and obtain the corresponding class, "successful" or "unsuccessful", depending on whether the flare dispensing program deceives the seeker or not. Then, in the analysis phase, we use SOMs to interpret and visualize the results.
{"title":"Analyzing the effectiveness of flare dispensing programs against pulse width modulation seekers using self-organizing maps","authors":"M. C. Sahingil, M. Aslan","doi":"10.1117/12.2029331","DOIUrl":"https://doi.org/10.1117/12.2029331","url":null,"abstract":"Infrared guided missile seekers utilizing pulse width modulation in target tracking is one of the threats against air platforms. To be able to achieve a “soft-kill” protection of own platform against these type of threats, one needs to examine carefully the seeker operating principle with its special electronic counter-counter measure (ECCM) capability. One of the cost-effective ways of soft kill protection is to use flare decoys in accordance with an optimized dispensing program. Such an optimization requires a good understanding of the threat seeker, capabilities of the air platform and engagement scenario information between them. Modeling and simulation is very powerful tool to achieve a valuable insight and understand the underlying phenomenology. A careful interpretation of simulation results is crucial to infer valuable conclusions from the data. In such an interpretation there are lots of factors (features) which affect the results. Therefore, powerful statistical tools and pattern recognition algorithms are of special interest in the analysis. In this paper, we show how self-organizing maps (SOMs), which is one of those powerful tools, can be used in analyzing the effectiveness of various flare dispensing programs against a PWM seeker. We perform several Monte Carlo runs for a typical engagement scenario in a MATLAB-based simulation environment. In each run, we randomly change the flare dispending program and obtain corresponding class: “successful” or “unsuccessful”, depending on whether the corresponding flare dispensing program deceives the seeker or not, respectively. Then, in the analysis phase, we use SOMs to interpret and visualize the results.","PeriodicalId":344928,"journal":{"name":"Optics/Photonics in Security and Defence","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126733454","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A novel Sagnac fiber optic sensor employing time delay estimation for distributed detection and location is proposed and demonstrated. The sensor employs a Sagnac interferometer as the interfering unit. A broadband, low-coherence source is spectrally sliced into two wavelength bands using a wavelength division multiplexer. The sensor therefore consists of two Sagnac interferometers, multiplexed with the broadband light source, the interfering unit and the sensing fiber by the wavelength division multiplexer, and hence four detected signals at two different wavelengths are obtained. After a demodulation scheme based on a 3×3 coupler, two signals with a fixed time delay are obtained, and the location of the disturbance is determined by time delay estimation, which makes the localization comparatively accurate. Experimental results show that the sensor's low location error makes it especially advantageous for intrusion detection applications.
{"title":"A novel Sagnac fiber optic sensor employing time delay estimation for distributed detection and location","authors":"Yuan Wu, Pang Bian, B. Jia, Qian Xiao","doi":"10.1117/12.2028311","DOIUrl":"https://doi.org/10.1117/12.2028311","url":null,"abstract":"A novel Sagnac fiber optic sensor employing time delay estimation for distributed detection and location is proposed and demonstrated. The sensor employs Sagnac interferometer as interfering unit. A broadband, low-coherence source is spectrally sliced into two wavelength bands using wavelength division multiplexer. Therefore, the sensor consists of two Sagnac interferometers, multiplexed with a broadband light source, interfering unit and sensing fiber by wavelength division multiplexer, and hence four detected signals with two different wavelengths are obtained. After the demodulation scheme based on 3×3 coupler, two signals with fixed time delay are achieved and the location of the disturbance gained by time delay estimation enables the localization comparably accurate. Experimental results show that the sensor is especially advantageous for low location error to the application of intrusion detecting.","PeriodicalId":344928,"journal":{"name":"Optics/Photonics in Security and Defence","volume":"7 6","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134035238","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A cavity blackbody is the appropriate IR reference source for IR sensors which require high radiance levels. Thanks to its light-trap structure, it combines high, wavelength-independent emissivity with fast warm-up and high stability. However, the drawback of this structure is that it leads to a prohibitive cooling time. HGH has developed a method to speed up the cooling.
{"title":"Improving cooling of cavity blackbodies","authors":"C. Barrat, Gildas Chauvel","doi":"10.1117/12.2028874","DOIUrl":"https://doi.org/10.1117/12.2028874","url":null,"abstract":"A cavity blackbody is the appropriate IR reference source for IR sensors which require high radiance levels. It combines high emissivity independent from wavelength and high speed warm up and high stability thanks to its light trap structure. However, the inconvenient of this structure is that it leads to a prohibitive cooling time. HGH developed a method to speed up the cooling time.","PeriodicalId":344928,"journal":{"name":"Optics/Photonics in Security and Defence","volume":"159 11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130409800","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Panoramic objectives are becoming, thanks to the availability of large-area digital sensors, a widespread optical system for capturing a very wide field of view (FoV). A typical panoramic lens has a view angle of 360° in azimuth (the plane orthogonal to the optical axis), just like a fish-eye, and plus or minus tens of degrees in elevation angle, i.e. above and below the horizon. The most common panoramic lenses use a curved, usually aspheric, mirror placed in front of a commercial objective to capture a 360° area around the horizon. More recent designs use a catadioptric element instead of a mirror. Both solutions have the drawback of obscuring the frontal view of the objective, producing the classic "donut-shaped" image in the focal plane. We present here a panoramic lens in which the frontal field is made available to be imaged in the focal plane, by means of a frontal optic, together with the panoramic field, producing a FoV of 360° in azimuth and 260° in elevation; it thus has the capabilities of a fish-eye plus those of a panoramic lens, and we call it a hyper-hemispheric lens. We also designed a lens in which the frontal optic has a different paraxial focal length with respect to the equivalent panoramic part; with this solution one can image, on the same sensor, the panoramic field plus an enlargement of a portion of it: this is the bifocal panoramic lens. Both lenses have been designed and realized, and we show here the optical schemes, the nominal performances and some example pictures.
{"title":"Hyper-hemispheric and bifocal panoramic lenses","authors":"C. Pernechele","doi":"10.1117/12.2028099","DOIUrl":"https://doi.org/10.1117/12.2028099","url":null,"abstract":"Panoramic objectives are becoming, due to the availability of large area digital sensors, a diffuse optical system to catch very wide field of view (FoV). Typical panoramic lens have a view angle of 360° in azimuth (the plane orthogonal to the optical axis), just like a fish-eye, and plus and minus tens of degrees in elevation angle, i. e. above and below the horizon. Most common panoramic lenses use a curved, usually aspheric, mirror placed in front of a commercial objective to capture a 360° area around the horizon. More recent design use a catadiopter instead of a mirror. Both the solutions have the draw-back effect to obscure the frontal view of the objective, producing the classic \"donut-shape\" image in the focal plane. We present here a panoramic lens in which the frontal field is make available to be imaged in the focal plane, by means of a frontal optics, together with the panoramic field, producing a FoV of 360° in azimuth and 260° in elevation; it have then the capabilities of a fish eye plus those of a panoramic lens: we call it hyper-hemispheric lens. We design also a lens in which the frontal optics have a different paraxial focal length with respect to the equivalent panoramic; with this solution one can image, in the same sensor, the panoramic field plus an enlargement of a portion of it: that's the bifocal panoramic lens. Both the lenses have been designed and realized and we show here the optical scheme, the nominal performances and some pictures as an example.","PeriodicalId":344928,"journal":{"name":"Optics/Photonics in Security and Defence","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132091204","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
An uneven response of particular detectors (pixels) to the same incident power of infrared radiation is an inherent feature of microbolometer focal plane arrays. As a result, image degradation occurs, known as fixed pattern noise (FPN), which distorts the thermal representation of the observed scene and impairs the parameters of a thermal camera. In order to compensate for such non-uniformity, several non-uniformity correction (NUC) methods are applied in the digital data processing modules implemented in thermal cameras. The coefficients required to perform the non-uniformity correction procedure (NUC coefficients) are determined by calibrating the camera against uniform radiation sources (blackbodies). Non-uniformity correction is performed in a digital processing unit in order to remove the FPN pattern from the registered thermal images. The relevant correction coefficients are calculated on the basis of recorded detector responses to several values of radiant flux emitted from reference IR radiation sources (blackbodies). The measurement of the correction coefficients requires a specialized setup, in which uniform, extended radiation sources with high temperature stability are among the key elements. The measurement stand for NUC developed at the Institute of Optoelectronics, MUT, comprises two integrated extended blackbodies with the following specifications: area 200×200 mm, stabilized absolute temperature range +15 °C to +100 °C, and uniformity of temperature distribution across the entire surface of ±0.014 °C. The test stand, the method used for the measurement of NUC coefficients and the results obtained during measurements conducted on a prototype thermal camera are presented in the paper.
{"title":"Test stand for non-uniformity correction of microbolometer focal plane arrays used in thermal cameras","authors":"M. Krupiński, J. Barela, K. Firmanty, M. Kastek","doi":"10.1117/12.2028633","DOIUrl":"https://doi.org/10.1117/12.2028633","url":null,"abstract":"Uneven response of particular detectors (pixels) to the same incident power of infrared radiation is an inherent feature of microbolometer focal plane arrays. As a result an image degradation occurs, known as Fixed Pattern Noise (FPN), which distorts the thermal representation of an observed scene and impairs the parameters of a thermal camera. In order to compensate such non-uniformity, several NUC correction methods are applied in digital data processing modules implemented in thermal cameras. Coefficients required to perform the non-uniformity correction procedure (NUC coefficients) are determined by calibrating the camera against uniform radiation sources (blackbodies). Non-uniformity correction is performed in a digital processing unit in order to remove FPN pattern in the registered thermal images. Relevant correction coefficients are calculated on the basis of recorded detector responses to several values of radiant flux emitted from reference IR radiation sources (blackbodies). The measurement of correction coefficients requires specialized setup, in which uniform, extended radiation sources with high temperature stability are one of key elements. Measurement stand for NUC correction developed in Institute of Optoelectronics, MUT, comprises two integrated extended blackbodies with the following specifications: area 200×200 mm, stabilized absolute temperature range +15 °C÷100 °C, and uniformity of temperature distribution across entire surface ±0.014 °C. Test stand, method used for the measurement of NUC coefficients and the results obtained during the measurements conducted on a prototype thermal camera will be presented in the paper.","PeriodicalId":344928,"journal":{"name":"Optics/Photonics in Security and Defence","volume":"179 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122277867","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}