Built-in self-test of the VLSI content addressable filestore
Pub Date: 1991-10-26 | DOI: 10.1109/TEST.1991.519492
R. Illman, Terry Bird, G. Catlow, S. Clarke, Len Theobald, G. Willetts
The implementation of quasi-exhaustive BIST in a VLSI Content Addressable File Store (CAFS) system built from four ASIC designs and commodity memory chips is described. A novel application of BIST at the system level for improved system reliability and maintenance is discussed.
{"title":"Built-in self-test of the VLSI content addressable filestore","authors":"R. Illman, Terry Bird, G. Catlow, S. Clarke, Len Theobald, G. Willetts","doi":"10.1109/TEST.1991.519492","DOIUrl":"https://doi.org/10.1109/TEST.1991.519492","url":null,"abstract":"The implementation of quasi-exhaustive BlST in a VLSl Content Addressable File Store (CAFS) system built from four ASIC designs and commodity memory chips is described. A novel application of BIST at the system level for improved system reliability and maintenance is discussed.","PeriodicalId":272630,"journal":{"name":"1991, Proceedings. International Test Conference","volume":"53 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1991-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122020549","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A WORKSTATION ENVIRONMENT FOR BOUNDARY SCAN INTERCONNECT TESTING
Pub Date: 1991-10-26 | DOI: 10.1109/TEST.1991.519779
T. Moore
The testing of logic devices incorporating boundary scan per the IEEE 1149.1 standard has been shown to be a practical method for testing and diagnosing loaded-board interconnect. A highly flexible workstation-based boundary scan interconnect test system has been used to provide low-cost interconnect verification. This paper describes the suite of tools, their integration, and their application to different test requirements.
{"title":"A WORKSTATION ENVIRONMENT FOR BOUNDARY SCAN INTERCONNECT TESTING","authors":"T. Moore","doi":"10.1109/TEST.1991.519779","DOIUrl":"https://doi.org/10.1109/TEST.1991.519779","url":null,"abstract":"The testing of logic devices incorporating boundary scan with the IEEE 1149.1 standard has been shown to be a practical method to test and diagnose loaded board interconnect. A highly flexible workstation based boundary scan interconnect tesr system has been used to provide low cost interconnect verification. This paper will describe the suite of tools, their integration, and their application to diFerent test requirements.","PeriodicalId":272630,"journal":{"name":"1991, Proceedings. International Test Conference","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1991-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126663356","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
AN INTELLIGENT APPROACH TO AUTOMATIC TEST EQUIPMENT
Pub Date: 1991-10-26 | DOI: 10.1109/TEST.1991.519702
W. Simpson, J. Sheppard
In diagnosing a failed system, a smart technician would choose which tests to perform based on the context of the situation. Currently, test program sets do not fault-isolate within the context of a situation. Instead, testing follows a rigid, predetermined fault-isolation sequence based on an embedded fault tree. Current test programs do not tolerate instrument failure and cannot redirect testing by incorporating new information. However, there is a new approach to automatic testing that emulates the best features of a trained technician yet, unlike the development of rule-based expert systems, does not require a trained technician to build the knowledge base. This new approach is model-based and has evolved over the last 10 years. This evolution has led to the development of several maintenance tools and an architecture for intelligent automatic test equipment (ATE). The architecture has been implemented for testing two cards from an AV-8B power supply.
{"title":"AN INTELLIGENT APPROACH TO AUTOMATIC TEST EQUIPMENT","authors":"W. Simpson, J. Sheppard","doi":"10.1109/TEST.1991.519702","DOIUrl":"https://doi.org/10.1109/TEST.1991.519702","url":null,"abstract":"In diagnosing a failed system, a smart technician would choose tests to be performed based on the context of the situation. Currently, test program sets do not fault-. isolate within the context of a situation. Instead, testing follows a rigid, predetermined, fault-isolation sequence that is based on an embedded fault tree. Current test programs do not tolerate instrument failure and cannot redirect testing by incorporating new information. However, there is a new approach to automatic testing that emulates the best features of a trained technician yet, unlike the development of rule-based expert systems, does not require a trained technician to build the knowledge base. This new approach is model-based and has evolved over the last 10 years. This evolution has led to the development of several maintenance tools and an architecture for intelligent automatic test equipment (ATE). The architecture has been implemented for testing two cards from an AV-8B power supply.","PeriodicalId":272630,"journal":{"name":"1991, Proceedings. International Test Conference","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1991-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126319088","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
IS BURN-IN BURNED OUT?
Pub Date: 1991-10-26 | DOI: 10.1109/TEST.1991.519790
Charles C. Packard
{"title":"IS BURN-IN BURNED OUT?","authors":"Charles C. Packard","doi":"10.1109/TEST.1991.519790","DOIUrl":"https://doi.org/10.1109/TEST.1991.519790","url":null,"abstract":"","PeriodicalId":272630,"journal":{"name":"1991, Proceedings. International Test Conference","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1991-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125532753","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
DSP CALIBRATION FOR ACCURATE TIME WAVEFORM RECONSTRUCTION
Pub Date: 1991-10-26 | DOI: 10.1109/TEST.1991.519765
Eric Rosenfeld, Bradford Sumner
DSP waveform synthesizers cause signal distortion because of inherent limitations in their ability to construct continuous-time waveforms from discrete samples. Often this distortion is minimized by increasing the sampling rate of the synthesizer, thereby decreasing the processing bandwidth. This paper presents a software-based calibration routine for correcting the generated waveform. This technique uses a software equalizer to estimate and correct for the system response. The equalizer is created using a recursive least-squares (RLS) algorithm and has the form of a transversal FIR filter. This filter can then be used to prewarp the input waveform sequence. The results of this calibration method are presented in the final section.

Introduction

The problem addressed in this paper is accurate time-waveform generation. The actual output of a synthesizer is a time waveform transformed from a sequence of digital samples. Here, transformation should be understood as an inherent limitation on waveform reconstruction which can be described for all synthesizers. Specifically, this paper will not treat harmonic distortion or noise effects. In fact, this paper will show that the inherent limitations of waveform synthesizers can be successfully modelled as linear effects. There are three important types of transformation which will be discussed: sin(x)/x distortion, spectral images, and group delay. Often these problems are solved by simply increasing the synthesizer sampling rate. This paper will present a new DSP-based technique for directly calibrating these sources of distortion without increasing the sampling rate. This technique is intended for multitone and complex waveform testing. The issues addressed by this paper must also be addressed in single-tone testing, but there the solution is simpler. For single-tone testing it is only necessary to generate calibration factors for gain and phase at a small number of frequencies, whereas the technique described here attempts to calibrate across an entire frequency band.

Waveform Synthesizers and their Limitations
{"title":"DSP CALIBRATION FOR ACCURATE TIME WAVEFORM RECONSTRUCTION","authors":"Eric Rosenfeld, Bradford Sumner","doi":"10.1109/TEST.1991.519765","DOIUrl":"https://doi.org/10.1109/TEST.1991.519765","url":null,"abstract":"DSP waveform synthesizers cause signal distortion because of inherent limitations in their ability to construct continuous lime waveforms from discrete samples. Often this distortion is minimized by increasing the sampling rate of the synthesizer, thereby decreasing the processing bandwidth. This paper presentsa software-based, calibration routine for correcting the generated waveform. This technique uses a software equalizer to estimate and correct for the system response. The equalizer is created using a recursive least-squares (ULS) algorithm and has the form of a transversal, FIR filter. Thisfilter can then be used toprewarp the input waveform sequence. The results of this calibration method are presented in the final section. Tntroduction The problem addressed in this paper is accurate time-waveform generation. The actual output of a synthesizer is a time waveform transformed from a sequence of digital samples. Here, transformation should be understood as an inherent limitation on waveform reconstruction which can be described for all synthesizers. Specifically, this paper will not treat harmonic distortion or noise effects. In fact, this paper will show that the inherent limitations of waveform synthesizers can be successfully modelled as linear effects. There are three important types of transformation which will be discussed: sinex+m-x distortion, spectral images and group delay . Often these problems are solved by simply increasing the synthesizer sampling rate. This paper will present a new DSP-based technique for directly calibrating these sources of distortion without increasing the sampling rate. Paper 36.3 986 This technique is intended for multitone and complex waveform testing. The issues addressed by this paper must also be addressedin singletone testing, but there the solution is simpler. For singletone testing it is only necessary to generate calibration factors for gain and phase at a small number of frequencies, whereas the technique described here attempts to calibrate across an entire frequency band. iVaveform Synthesizers and their Limitations","PeriodicalId":272630,"journal":{"name":"1991, Proceedings. International Test Conference","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1991-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125661837","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Formalizing Signature Analysis for Control Flow Checking of Pipelined RISC Microprocessors
Pub Date: 1991-10-26 | DOI: 10.1109/TEST.1991.519759
X. Delord, G. Saucier
This paper focuses on the adaptation of a concurrent control-flow checking technique to pipelined RISC microprocessors. This technique, called embedded signature monitoring (ESM), verifies the validity of the instructions executed by the processor. Numerous ESM schemes have been studied for non-pipelined processors, but up-to-date machines pose new problems. The instruction pipeline of these processors makes it difficult to know which of the fetched instructions are actually executed: the pipeline may be flushed when a flow-control instruction is executed or when an exception is taken. A behavioural model is presented for the pipelines of recent processors. It is used to propose a new, simple ESM scheme compatible with these processors. This scheme is demonstrated on the Motorola MC88100 RISC processor. The design of a signature monitor dedicated to this processor is presented and hardware costs are discussed.
{"title":"Formalizing Signature Analysis for Control Flow Checking of Pipelined RISC Microprocessors","authors":"X. Delord, G. Saucier","doi":"10.1109/TEST.1991.519759","DOIUrl":"https://doi.org/10.1109/TEST.1991.519759","url":null,"abstract":"This paper focuses on the adaptation of a concurrent control-flow checking technique to pipelined RISC microprocessors. This technique, called embedded signature monitoring (ESM), verifies the validity of the instructions executed by the processor. Numerous ESM schemes have been studied with non pipelined processors but up-to-date machines pose new problems. The instruction pipeline of these processors makes difficult to know which instructions are actually executed among the fetched ones: the pipeline may be flushed when a jlowcontrol instruction is executed or when an exception is taken. A behavioural model is presented for the pipeline of most recent processors. It is used to propose a new simple ESM scheme compatible with these processors. This scheme is experienced on the Motorola MC88100 RISC processor. The design of a signature monitor dedicated to this processor is presented and hardware costs are discussed.","PeriodicalId":272630,"journal":{"name":"1991, Proceedings. International Test Conference","volume":"46 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1991-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130512511","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
TESTABILITY FEATURES OF THE 68HC16Z1
Pub Date: 1991-10-26 | DOI: 10.1109/TEST.1991.519502
J. Lyon, Michael E. Gladden, E. Hartung, Eric Hoang, K. Raghunathan
The purpose of this paper is to describe the testability features implemented in Motorola's recently completed design of a sixteen-bit microcontroller, the 68HC16Z1. The discussion includes a brief introduction to the 68HC16Z1 and the test objectives and organization, along with descriptions of design-for-test (DFT) techniques and structures.
{"title":"TESTABILITY FEATURES OF THE 68HC16Z1","authors":"J. Lyon, Michael E. Gladden, E. Hartung, Eric Hoang, K. Raghunathan","doi":"10.1109/TEST.1991.519502","DOIUrl":"https://doi.org/10.1109/TEST.1991.519502","url":null,"abstract":"The purpose of this paper is to describe the testability features implemented in Motorola's recently completed design of a sixteen bit microcontroller, the 68HC16Z1. The discussion includes a brief introduction to the 68HC16Z1, test objectives and organization along with descriptions of design for test (DR) techniques and structures.","PeriodicalId":272630,"journal":{"name":"1991, Proceedings. International Test Conference","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1991-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130310685","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Languages to Support Boundary-Scan Test
Pub Date: 1991-10-26 | DOI: 10.1109/TEST.1991.519780
C. Maunder
Abstraction From Detail

Traditionally, test data interchange has occurred at a very low level, involving detailed binary data and signal timings. Where boundary-scan is extensively used, interchange can occur at an abstracted level, provided that the target test system can comprehend the transmitted data. For example, where all testing is to be achieved through the 1149.1 interface, interchange may be at the level of instructions and associated data values, with the method of application being implied from a statement that the circuit complies with the standard. A key advantage of boundary-scan-based testing is that test data can be used both in the factory and later in life, for example during field fault diagnosis or in depot repair. Whereas a test system designed for use in a factory environment may offer high throughput, support for fault diagnosis, and other 'top-end' features, testers intended for field use may be based on off-the-shelf notebook PCs, possibly with plug-in modules that support boundary-scan test access. This latter type of 'tester' may be limited to low-throughput go/no-go testing. The objective for future test data interchange standards should be to allow the same basic test programme to be used both in the factory and in the field. To use an analogy with microprocessor software, test data interchange is moving from 'microcode' to 'assembler'.
{"title":"Languages to Support Boundary-Scan Test","authors":"C. Maunder","doi":"10.1109/TEST.1991.519780","DOIUrl":"https://doi.org/10.1109/TEST.1991.519780","url":null,"abstract":"ion From Detail Traditionally, test data interchange has occurred at a very low level -involving detailed binary data and signal timings. Where boundary-scan is extensively used, interchange can occur at an abstracted level -provided that the target test system can comprehend the transmitted data. For example, where all testing is to be achieved through the 1149.1 interface, interchange may be at the level of instructions and associated data values, with the method of application being implied from a statement that the circuit complies with the standard. A key advantage of boundary-scan-based testing is that test data can be used both in the factory and later in life -for example, during field fault diagnosis or in depot repair. Whereas a test system designed for use in a factory environment may offer high-throughput, support for fault diagnosis, and other 'top-end features, testers intended for field use may be based on off-the-shelf notebook PCs, possibly with plug-in modules that support boundary-scan test access. This latter type of 'tester' may be limited to low-throughput go-nogo testing. The objective for future test data interchange standards should be to allow the same basic test programme to be used both in the factory and in the field To use an analogy with microprocessor software, test data interchange is moving from 'microcode' to 'assembler'.","PeriodicalId":272630,"journal":{"name":"1991, Proceedings. International Test Conference","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1991-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130947704","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Built-in self-test for high-speed data-path circuitry
Pub Date: 1991-10-26 | DOI: 10.1109/TEST.1991.519493
C. Stroud
A practical application and case study of a Built-In Self-Test (BIST) technique for high-speed data-path circuitry is described. The approach has been implemented in six VLSI devices developed for broadband packet switching applications. The technique provides high fault coverage (> 90%) with a low area overhead penalty (< 4%) and no impact on performance. The BIST approach is used for all levels of testing and, at the system level, performs full circuit-board BIST with diagnostic resolution to the faulty component or interconnect.
{"title":"Built-in self-test for high-speed data-path circuitry","authors":"C. Stroud","doi":"10.1109/TEST.1991.519493","DOIUrl":"https://doi.org/10.1109/TEST.1991.519493","url":null,"abstract":"A practical application and case s,tudy of a Built-In Self-Test (BIST) technique for high-speed data-path circuitry is described. The approach has been implemented in six VLSI devices developed for broadband packet switching applications. The technique provides high fault coverage (> 90%) with low area overhead penalty (< 4%) and no impact to performance. The BIST approach is used for all levels of testing and, at the system level, performs full circuit board BIST with diagnostic resolution to the faulty component or interconnect.","PeriodicalId":272630,"journal":{"name":"1991, Proceedings. International Test Conference","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1991-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131249067","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
On Optimizing Wafer-Probe Testing for Product Quality Using Die-Yield Prediction
Pub Date: 1991-10-26 | DOI: 10.1109/TEST.1991.519514
A. Singh, C. M. Krishna
We propose a new adaptive testing procedure that uses spatial defect clustering information to optimize test lengths during wafer-probe testing. For the same average test lengths, our approach shows better than a factor-of-two improvement in average defect levels. It further allows the separation of high-quality dies with defect levels more than an order of magnitude better than the average for the production run. Our proposal is orthogonal to all other approaches for improving defect quality and can be combined with them.
{"title":"On Optimizing Wafer-Probe Testing for Product Quality Using Die-Yield Prediction","authors":"A. Singh, C. M. Krishna","doi":"10.1109/TEST.1991.519514","DOIUrl":"https://doi.org/10.1109/TEST.1991.519514","url":null,"abstract":"We propose a new adaptive testing procedure that uses spatial defect clustering information to optimize test lengths during wafer-probe testing. For the same average test lengths, our approach shows better than a factor-of-two improvement in average defect levels. It further allows the separation of high-quality dies with defect levels more than an order of magnitude better than the average for the production run. Our proposal is orthogonal to all other approaches for improving defect quality and can be combined with them.","PeriodicalId":272630,"journal":{"name":"1991, Proceedings. International Test Conference","volume":"93 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1991-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133536023","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}