A 13-Bit 10-MHz ADC Background-Calibrated with Real-Time Oversampling Calibrator

T. Shu, B. Song, K. Bacrania
{"title":"A 13-Bit 10-Mhz Adc Background-Calibrated with Real-Time Oversampling Calibrator","authors":"T. Shu, B. Song, K. Bacrania","doi":"10.1109/VLSIC.1994.586166","DOIUrl":null,"url":null,"abstract":"A real-time digital-domain code-error calibration technique is developed and demonstrated in this fully-Merential5-volt 13-bit IO-= BiCMOS ADC. The calibration process does not interfere with the normal operation of the converter but improves its linearity in real time while the converter is working. The core of this technique is an oversampling sigma-delta ratio calibrator working synchronously with the converter in background.' 1. Introdiictiou Existing self-calibration techniques for data converters interrupt normal conversion cycle for calibration[ I]. Temperature variation, device parameter drift, etc., may cause the calibration data to become invalid unless the converter is periodically re-calibrated. Other analogdomain calibration techniques are limited by the analog signal accuracy and circuit noise[2]. A novel digital-domain, code-error background calibration technique is implemented in this experimental 13-bit 10-MHZ ADC. The calibration procedure is virtually trans arent to the normal converter operation, thus allowing tg e converter to operate continuously even while being calibrated. 2.Real-Time Digital Calibration Technique In a multi-stage type ADC, the linearity of the D-to-A converter @AC) in the first stage determines the overall transfer characteristics of the ADC. To calibrate the ADC it is necessary to correct the linearity error of the fist stage DAC. In this ADC, a resistor-string DAC is used in the first stage because it allows both the calibrator circuit and the ADC amplifier to tap different DAC outputs simultaneously without affecting each other, provided that the DAC outputs settle within the clock phase. The DAC output errors are digitized by the calibrator and later subtracted from the raw output codes with the code-error calibration technique[3], To measure the resistor DAC error, a ratio-measurement method has been used. A switched-capacitor subtracter is used to measure the mid-point voltage error of a given section of the resistor string. A simplified diagram is shown in Figxe 1. Assuming ideal conditions,C, = C2 = C,,","PeriodicalId":350730,"journal":{"name":"Proceedings of 1994 IEEE Symposium on VLSI Circuits","volume":"49 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1994-06-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of 1994 IEEE Symposium on VLSI Circuits","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/VLSIC.1994.586166","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 8

Abstract

A real-time digital-domain code-error calibration technique is developed and demonstrated in this fully-differential 5-volt 13-bit 10-MHz BiCMOS ADC. The calibration process does not interfere with the normal operation of the converter but improves its linearity in real time while the converter is working. The core of this technique is an oversampling sigma-delta ratio calibrator working synchronously with the converter in the background.

1. Introduction

Existing self-calibration techniques for data converters interrupt the normal conversion cycle for calibration [1]. Temperature variation, device parameter drift, etc., may cause the calibration data to become invalid unless the converter is periodically re-calibrated. Other analog-domain calibration techniques are limited by the analog signal accuracy and circuit noise [2]. A novel digital-domain, code-error background calibration technique is implemented in this experimental 13-bit 10-MHz ADC. The calibration procedure is virtually transparent to the normal converter operation, thus allowing the converter to operate continuously even while being calibrated.

2. Real-Time Digital Calibration Technique

In a multi-stage ADC, the linearity of the D-to-A converter (DAC) in the first stage determines the overall transfer characteristics of the ADC. To calibrate the ADC it is necessary to correct the linearity error of the first-stage DAC. In this ADC, a resistor-string DAC is used in the first stage because it allows both the calibrator circuit and the ADC amplifier to tap different DAC outputs simultaneously without affecting each other, provided that the DAC outputs settle within the clock phase. The DAC output errors are digitized by the calibrator and later subtracted from the raw output codes with the code-error calibration technique [3]. To measure the resistor-DAC error, a ratio-measurement method is used: a switched-capacitor subtracter measures the mid-point voltage error of a given section of the resistor string. A simplified diagram is shown in Figure 1. Assuming ideal conditions, C1 = C2 = C…
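To make the two operations concrete, the sketch below shows, in Python, (a) the mid-point ratio measurement of one resistor-string section and (b) the digital subtraction of a stored code error from a raw output code. This is a minimal illustration only: the function names, the 8-segment first stage, and the numeric error values are assumptions made for the example, not details taken from the paper.

    # Hypothetical sketch of the two digital-domain steps described above.
    # Segment errors are assumed to have already been digitized by the
    # background sigma-delta ratio calibrator; here they are a lookup table.

    def midpoint_ratio_error(v_top: float, v_bottom: float, v_mid: float) -> float:
        # Ratio measurement of one resistor-string section: the error is how
        # far the tapped mid-point deviates from the average of the section's
        # end-points, which is what the switched-capacitor subtracter reports
        # under the ideal-capacitor assumption C1 = C2.
        return v_mid - 0.5 * (v_top + v_bottom)

    def correct_code(raw_code: int, segment: int, code_error_lsb: list) -> float:
        # Digital code-error calibration: subtract the calibrated error of the
        # first-stage DAC segment (in LSBs) from the raw output code.
        return raw_code - code_error_lsb[segment]

    # Illustrative use: 8 first-stage segments with made-up error values.
    code_error_lsb = [0.0, 0.4, -0.7, 0.2, 0.0, -0.3, 0.5, -0.1]
    print(correct_code(4096, 2, code_error_lsb))        # -> 4096.7
    print(midpoint_ratio_error(1.000, 0.000, 0.5012))   # -> 0.0012 V mid-point error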