We report on the design and testing of a two-color dynamic scene projector system based on the MIRAGE-XL infrared scene projector. The system optically combines two 1024x1024 MIRAGE-XL resistive arrays. Algorithms derived for two-color operation are discussed, and system performance data are presented, including radiometric performance, sub-pixel spatial co-registration, and compensation for spectral crosstalk.
{"title":"A two-color 1024x1024 dynamic infrared scene projection system","authors":"J. Laveigne, G. Franks, Marcus Prewarski","doi":"10.1117/12.2016254","DOIUrl":"https://doi.org/10.1117/12.2016254","url":null,"abstract":"We report on the design and testing of a 2-color dynamic scene projector system based on the MIRAGE-XL infrared scene projector. The system is based on the optical combination of two 1024x1024 MIRAGE-XL resistive arrays. Algorithms derived for 2-color operation are discussed and system performance data is presented, including radiometric performance, sub-pixel spatial co-registration and compensation for spectral cross-talk.","PeriodicalId":338283,"journal":{"name":"Defense, Security, and Sensing","volume":"49 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-07-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117172766","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
D. Mansur, R. Vaillancourt, Ryan Benedict-Gill, S. Newbry, Julia Rentz Dupuis
OPTRA is developing a next-generation digital micromirror device (DMD) based two-band infrared scene projector (IRSP) with infinite bit depth independent of frame rate and an order-of-magnitude improvement in contrast over the state of the art. Traditionally, DMD-based IRSPs have offered larger format and superior uniformity and pixel operability relative to resistive and diode arrays; however, they have been limited in contrast and by the inherent bit-depth/frame-rate tradeoff imposed by pulse width modulation (PWM). OPTRA’s high dynamic range IRSP (HIDRA SP) has broken this dependency with a dynamic structured illumination solution. The HIDRA SP uses a source-conditioning DMD to impose the structured illumination on two projector DMDs, one for each spectral band. The source-conditioning DMD is operated in binary mode, and the relay optics that form the structured illumination act as a low-pass spatial filter. The structured illumination is therefore spatially grayscaled and, more importantly, is analog with no PWM. In addition, the structured illumination concentrates energy where bright objects will be projected and extinguishes energy in dark regions; the result is a significant improvement in contrast. The projector DMDs are operated with 8-bit PWM; however, the total projected image is analog with no bit-depth/frame-rate dependency. In this paper we describe our progress toward the development, build, and test of a prototype HIDRA SP.
{"title":"High-dynamic range DMD-based infrared scene projector","authors":"D. Mansur, R. Vaillancourt, Ryan Benedict-Gill, S. Newbry, Julia Rentz Dupuis","doi":"10.1117/12.2014390","DOIUrl":"https://doi.org/10.1117/12.2014390","url":null,"abstract":"OPTRA is developing a next-generation digital micromirror device (DMD) based two-band infrared scene projector (IRSP) with infinite bit-depth independent of frame rate and an order of magnitude improvement in contrast over the state of the art. Traditionally DMD-based IRSPs have offered larger format and superior uniformity and pixel operability relative to resistive and diode arrays, however, they have been limited in contrast and also by the inherent bitdepth / frame rate tradeoff imposed by pulse width modulation (PWM). OPTRA’s high dynamic range IRSP (HIDRA SP) has broken this dependency with a dynamic structured illumination solution. The HIDRA SP uses a source conditioning DMD to impose the structured illumination on two projector DMDs – one for each spectral band. The source conditioning DMD is operated in binary mode, and the relay optics which form the structured illumination act as a low pass spatial filter. The structured illumination is therefore spatially grayscaled and more importantly is analog with no PWM. In addition, the structured illumination concentrates energy where bright object will be projected and extinguishes energy in dark regions; the result is a significant improvement in contrast. The projector DMDs are operated with 8-bit PWM, however the total projected image is analog with no bit-depth / frame rate dependency. In this paper we describe our progress towards the development, build, and test of a prototype HIDRA SP.","PeriodicalId":338283,"journal":{"name":"Defense, Security, and Sensing","volume":"20 4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-07-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123507028","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Performance parameters influence the design of a Flight Motion Simulator (FMS) and affect its dynamic accuracy. A highly dynamic simulator needs low inertias and lightweight gimbals. This is counterproductive for a system with high position accuracy, which requires a stiff, rigid structure with minimal deflections. The critical parameters that drive an FMS design are payload size, accuracy, and dynamic requirements.
{"title":"The design of flight motion simulators: high accuracy versus high dynamics","authors":"R. W. Mitchell","doi":"10.1117/12.2013985","DOIUrl":"https://doi.org/10.1117/12.2013985","url":null,"abstract":"The performance parameters influence the design of a Flight Motion Simulator (FMS) and affect its dynamic accuracies. A highly dynamic simulator needs low inertias and lightweight gimbals. This is counterproductive for a system with high position accuracies. A simulator with high position accuracy requires a stiff, rigid system with minimal deflections. Critical parameters that affect the FMS design are payload sizes, accuracies, and dynamic requirements.","PeriodicalId":338283,"journal":{"name":"Defense, Security, and Sensing","volume":"38 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-07-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125732131","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A model is described for the problem of optimally projecting a pixelated light source onto a pixelated imaging sensor, in the context where the projected source is used for performance testing of the sensor. The model can be used, for example, to compute the paraxial design requirements of the projection lens when the parameters of all other subsystems in the problem are fixed. For remote sensing applications, where the performance of a sensor focused at infinity is to be tested, the projector lens becomes a collimator. Optimal projection for sensor performance testing then requires that the projector pixels not be spatially resolved by the imaging sensor, that the entrance pupil of the sensor be overfilled without vignetting, and, where feasible, that the sensor field of view be overfilled. The model uses paraxial analytical ray-tracing approximations to provide a set of equations, used in an associated spreadsheet, that determine the basic collimator requirements such as effective focal length, f/#, and relief distance, given the geometrical characteristics of the projector spatial light modulator and the sensor under test. Beyond this, the model provides intuition and guidance prior to detailed computerized ray tracing.
{"title":"Analytic determination of optimal projector lens design requirements for pixilated projectors used to test pixilated imaging sensors","authors":"J. Rice","doi":"10.1117/12.2018504","DOIUrl":"https://doi.org/10.1117/12.2018504","url":null,"abstract":"A model is described for the problem of optimally projecting a pixellated light source onto a pixellated imaging sensor, in the context that the projected source is used for performance testing of the sensor. The model can be used, for example, to compute the paraxial design requirements of the projection lens, given that the parameters of all other subsystems in the problem are fixed. For remote sensing applications, where the performance of a sensor focused at infinity is to be tested, the projector lens becomes a collimator. For optimal projection when using the source for performance testing of the sensor, one then requires that the projector pixels are not spatially resolved by the imaging sensor, the entrance pupil of the sensor is overfilled without vignetting, and also, where feasible, the sensor field of view is overfilled. The model uses paraxial analytical ray tracing approximations to provide a set of equations that are used in an associated spreadsheet to determine the basic collimator requirements such as effective focal length, f/#, and relief distance, given the geometrical characteristics of the projector spatial light modulator and the sensor under test. Beyond this, the model provides a sense of intuition and guidance prior to detailed computerized ray tracing.","PeriodicalId":338283,"journal":{"name":"Defense, Security, and Sensing","volume":"58 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-07-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116937845","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Chad L. Christie, Efthimios Gouthas, O. Williams, L. Swierkowski
At DSTO, a real-time scene generation framework, VIRSuite, has been developed in recent years, within which trials data are predominantly used for modelling the radiometric properties of the simulated objects. Since in many cases the data are insufficient, a physics-based simulator capable of predicting the infrared signatures of objects and their backgrounds has been developed as a new VIRSuite module. It includes transient heat conduction within the materials, and boundary conditions that take into account the heat fluxes due to solar radiation, wind convection and radiative transfer. In this paper, an overview is presented, covering both the steady-state and transient performance.
{"title":"Dynamic thermal signature prediction for real-time scene generation","authors":"Chad L. Christie, Efthimios Gouthas, O. Williams, L. Swierkowski","doi":"10.1117/12.2015656","DOIUrl":"https://doi.org/10.1117/12.2015656","url":null,"abstract":"At DSTO, a real-time scene generation framework, VIRSuite, has been developed in recent years, within which trials data are predominantly used for modelling the radiometric properties of the simulated objects. Since in many cases the data are insufficient, a physics-based simulator capable of predicting the infrared signatures of objects and their backgrounds has been developed as a new VIRSuite module. It includes transient heat conduction within the materials, and boundary conditions that take into account the heat fluxes due to solar radiation, wind convection and radiative transfer. In this paper, an overview is presented, covering both the steady-state and transient performance.","PeriodicalId":338283,"journal":{"name":"Defense, Security, and Sensing","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-07-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125886398","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The Missile Defense Transfer Radiometer (MDXR) is designed to calibrate collimated and flood infrared sources over the fW/cm2 to W/cm2 power range at wavelengths from 3 μm to 28 μm. The MDXR operates in three different modes: as a filter radiometer, as a Fourier-transform spectrometer (FTS) based spectroradiometer, and as an absolute cryogenic radiometer (ACR). Since 2010, the MDXR has measured the collimated infrared irradiance at the output port of seven different infrared test chambers at several facilities. We present a selection of results from these calibration efforts, compared with signal predictions from the respective chamber models for the three MDXR calibration modes. We also compare the results with previous measurements of the same chambers made with a legacy transfer radiometer, the NIST BXR. In general, the results are found to agree within their combined uncertainties, with the MDXR having 30% lower uncertainty and greater spectral coverage.
{"title":"Calibration of IR test chambers with the missile defense transfer radiometer","authors":"S. Kaplan, S. I. Woods, A. Carter, T. Jung","doi":"10.1117/12.2015982","DOIUrl":"https://doi.org/10.1117/12.2015982","url":null,"abstract":"The Missile Defense Transfer Radiometer (MDXR) is designed to calibrate infrared collimated and flood sources over the fW/cm2 to W/cm2 power range from 3 μm to 28μ m in wavelength. The MDXR operates in three different modes: as a filter radiometer, a Fourier-transform spectrometer (FTS)-based spectroradiometer, and as an absolute cryogenic radiometer (ACR). Since 2010, the MDXR has made measurements of the collimated infrared irradiance at the output port of seven different infrared test chambers at several facilities. We present a selection of results from these calibration efforts compared to signal predictions from the respective chamber models for the three different MDXR calibration modes. We also compare the results to previous measurements made of the same chambers with a legacy transfer radiometer, the NIST BXR. In general, the results are found to agree within their combined uncertainties, with the MDXR having 30 % lower uncertainty and greater spectral coverage.","PeriodicalId":338283,"journal":{"name":"Defense, Security, and Sensing","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-07-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126041127","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
B. Weiss, L. J. Fronczek, E. Morse, Z. Kootbally, C. Schlenoff
Transformative Apps (TransApps) is a Defense Advanced Research Projects Agency (DARPA) funded program whose goal is to develop a range of militarily relevant software applications (“apps”) to enhance the operational effectiveness of military personnel on (and off) the battlefield. TransApps is also developing a military apps marketplace to facilitate rapid development and dissemination of applications that address user needs by connecting engaged communities of end users with development groups. The National Institute of Standards and Technology’s (NIST) role in the TransApps program is to design and implement evaluation procedures to assess the performance of: 1) the various software applications, 2) software-hardware interactions, and 3) the supporting online application marketplace. Specifically, NIST is responsible for evaluating 50+ tactically relevant applications operating on numerous Android™-powered platforms. NIST efforts include functional regression testing and quantitative performance testing. This paper discusses the evaluation methodologies employed to assess the performance of three key program elements: 1) handheld-based applications and their integration with various hardware platforms, 2) client-based applications, and 3) network technologies operating on both the handheld and client systems, along with their integration into the application marketplace. Handheld-based applications are assessed using a combination of utility- and usability-based checklists and quantitative performance tests. Client-based applications are assessed to replicate current overseas disconnected operations (i.e., no network connectivity between handhelds) and to assess the connected operations envisioned for later use. Finally, networked applications are assessed on handhelds to establish performance baselines for when connectivity becomes commonplace.
{"title":"Performance assessments of Android-powered military applications operating on tactical handheld devices","authors":"B. Weiss, L. J. Fronczek, E. Morse, Z. Kootbally, C. Schlenoff","doi":"10.1117/12.2014771","DOIUrl":"https://doi.org/10.1117/12.2014771","url":null,"abstract":"Transformative Apps (TransApps) is a Defense Advanced Research Projects Agency (DARPA) funded program whose goal is to develop a range of militarily-relevant software applications (“apps”) to enhance the operational-effectiveness of military personnel on (and off) the battlefield. TransApps is also developing a military apps marketplace to facilitate rapid development and dissemination of applications to address user needs by connecting engaged communities of endusers with development groups. The National Institute of Standards and Technology’s (NIST) role in the TransApps program is to design and implement evaluation procedures to assess the performance of: 1) the various software applications, 2) software-hardware interactions, and 3) the supporting online application marketplace. Specifically, NIST is responsible for evaluating 50+ tactically-relevant applications operating on numerous Android™-powered platforms. NIST efforts include functional regression testing and quantitative performance testing. This paper discusses the evaluation methodologies employed to assess the performance of three key program elements: 1) handheld-based applications and their integration with various hardware platforms, 2) client-based applications and 3) network technologies operating on both the handheld and client systems along with their integration into the application marketplace. Handheld-based applications are assessed using a combination of utility and usability-based checklists and quantitative performance tests. Client-based applications are assessed to replicate current overseas disconnected (i.e. no network connectivity between handhelds) operations and to assess connected operations envisioned for later use. Finally, networked applications are assessed on handhelds to establish baselines of performance for when connectivity will be common usage.","PeriodicalId":338283,"journal":{"name":"Defense, Security, and Sensing","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-06-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127861762","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A two-stage hierarchical unsupervised learning system has been proposed for modeling complex dynamic surveillance and cyberspace systems. Using a modification of the expectation-maximization learning approach, we introduced a three-layer approach to learning concepts from input data: features, objects, and situations. Using the Bernoulli model, this approach models each situation as a collection of objects and each object as a collection of features. Further complexity is added through clutter features and clutter objects. During learning, only binary feature information (presence or absence) is provided at the lowest level. The system attempts to simultaneously determine the probability of each situation and the presence of the corresponding objects from the detected features. The proposed approach demonstrated robust performance after a short training period. This paper discusses the hierarchical learning system in the broader context of different feedback mechanisms between layers and highlights challenges on the road to practical applications.
{"title":"The two stages hierarchical unsupervised learning system for complex dynamic scene recognition","authors":"James Graham, A. O'Connor, I. Ternovskiy, R. Ilin","doi":"10.1117/12.2018754","DOIUrl":"https://doi.org/10.1117/12.2018754","url":null,"abstract":"The two stage hierarchical unsupervised learning system has been proposed for modeling complex dynamic surveillance and cyberspace systems. Using a modification of the expectation maximization learning approach, we introduced a three layer approach to learning concepts from input data: features, objects, and situations. Using the Bernoulli model, this approach models each situation as a collection of objects, and each object as a collection of features. Further complexity is added with the addition of clutter features and clutter objects. During the learning process, at the lowest level, only binary feature information (presence or absence) is provided. The system attempts to simultaneously determine the probabilities of the situation and presence of corresponding objects from the detected features. The proposed approach demonstrated robust performance after a short training period. This paper discusses this hierarchical learning system in a broader context of different feedback mechanisms between layers and highlights challenges on the road to practical applications.","PeriodicalId":338283,"journal":{"name":"Defense, Security, and Sensing","volume":"112 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-06-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128030718","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
We applied a two-stage unsupervised hierarchical learning system to model complex dynamic surveillance and cyberspace monitoring systems using a non-commercial version of the NeoAxis visualization software. The hierarchical scene learning and recognition approach is based on hierarchical expectation maximization and was linked to a 3D graphics engine to validate learning and classification results and to understand the human-autonomous system relationship. Scene recognition is performed by feeding synthetically generated data to a dynamic logic algorithm. The algorithm performs hierarchical recognition of the scene by first examining object features to determine which objects are present, and then determining the scene based on the objects present. This paper presents a framework within which low-level data linked to higher-level visualization can support a human operator and be evaluated in a detailed and systematic way.
{"title":"Complex scenes and situations visualization in hierarchical learning algorithm with dynamic 3D NeoAxis engine","authors":"James Graham, I. Ternovskiy","doi":"10.1117/12.2018833","DOIUrl":"https://doi.org/10.1117/12.2018833","url":null,"abstract":"We applied a two stage unsupervised hierarchical learning system to model complex dynamic surveillance and cyber space monitoring systems using a non-commercial version of the NeoAxis visualization software. The hierarchical scene learning and recognition approach is based on hierarchical expectation maximization, and was linked to a 3D graphics engine for validation of learning and classification results and understanding the human – autonomous system relationship. Scene recognition is performed by taking synthetically generated data and feeding it to a dynamic logic algorithm. The algorithm performs hierarchical recognition of the scene by first examining the features of the objects to determine which objects are present, and then determines the scene based on the objects present. This paper presents a framework within which low level data linked to higher-level visualization can provide support to a human operator and be evaluated in a detailed and systematic way.","PeriodicalId":338283,"journal":{"name":"Defense, Security, and Sensing","volume":"65 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-06-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124850792","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
J. Ziegler, H. Bitterlich, R. Breiter, M. Bruder, D. Eich, P. Fries, R. Wollrab, J. Wendler, J. Wenisch
Based on its well-established 640×512 pixel, 15 µm pitch detector for staring applications, which is produced at AIM in high quantities with reproducibly high yield and superior performance, AIM has developed MWIR and LWIR 1280×1024 pixel designs with a 15 µm pixel pitch to exploit the advantages of large-format detectors for IR system applications. Benefitting from the continuous advancement of traditional liquid phase epitaxy (LPE) n-on-p technology, excellent electro-optical performance over a wide range of operating temperatures as well as enhanced long-term and thermal-cycle stability have been achieved for this new and challenging detector format. In parallel, the performance of MCT material grown by molecular beam epitaxy (MBE), which is under development to take advantage of third-generation device architectures and the alternative GaAs substrate material, is being evaluated for this application. In this paper, we present the results of electro-optical detector characterizations and IR images of MWIR and LWIR 1280×1024 FPAs fabricated by LPE. We also demonstrate the progress of MBE development at AIM and present electro-optical figures of merit, e.g., NETD and operability, of MWIR and LWIR 1280×1024 FPAs with MCT layers grown on GaAs by MBE.
{"title":"Large-format MWIR and LWIR detectors at AIM","authors":"J. Ziegler, H. Bitterlich, R. Breiter, M. Bruder, D. Eich, P. Fries, R. Wollrab, J. Wendler, J. Wenisch","doi":"10.1117/12.2015241","DOIUrl":"https://doi.org/10.1117/12.2015241","url":null,"abstract":"Based on its well established 640×512 pixel, 15 µm pitch detector for a staring application, which is produced at AIM in high quantities at reproducible high yield and with superior performance, AIM has developed an MWIR and LWIR 1280×1024 pixel design with a 15 µm pixel pitch to make use of the advantages of large format detectors for IR systems applications. Benefitting from the continuous advancement of traditional liquid phase epitaxy (LPE) n-on-p technology, excellent electro-optical performance over a wide range of operating temperatures as well as enhanced long-term and thermal cycle stability have been achieved for this new and challenging detector format. In parallel, the performance of MCT material grown by molecular beam epitaxy (MBE), which is currently under development to take advantage of 3rd generation device architecture and the alternative GaAs substrate material, is evaluated for this application. In this paper, we will present the results of electro-optical detector characterizations and IR images of MWIR and LWIR 1280×1024 FPAs fabricated by LPE. We demonstrate the progress in MBE development at AIM and present electro-optical figures of merit, e.g., NETD and the operability of MWIR and LWIR 1280×1024 FPAs with MCT layers grown on GaAs by MBE.","PeriodicalId":338283,"journal":{"name":"Defense, Security, and Sensing","volume":"647 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-06-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115115065","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}