Exact knowledge of the morphology of coronary vessel systems is of increasing importance for clinical applications. The quantification of morphologic vessel parameters is an essential aid in diagnosis, therapy planning and verification of surgical results. 3-D reconstruction from biplane angiograms is well suited for these purposes. Most present reconstruction systems assume a fixed isocenter, which does not reflect real conditions: even if the devices are adjusted with special phantoms, the systems may shift during angulation. Mechanical solutions to this problem would be complex and expensive, and handling the diffuse isocenter merely by a generalized intersection point cannot deliver sufficient results for quantitative evaluations. In the new approach presented here, the authors drop the assumption of a stable isocenter and consider the real mechanical properties of biplane imaging systems.
{"title":"Biplane coronary angiography: accurate quantitative 3-D reconstruction without isocenter","authors":"A. Wahle, E. Wellnhofer, H. Oswald, E. Fleck","doi":"10.1109/CIC.1993.378495","DOIUrl":"https://doi.org/10.1109/CIC.1993.378495","url":null,"abstract":"Exact knowledge on the morphology of coronary vessel systems is of increasing importance for clinical applications. The quantification of morphologic vessel parameters is an essential aid in diagnosis, therapy planning and verification of surgical results. 3-D reconstruction from biplane angiograms is well suited for these purposes. Most of the present reconstruction systems assume a fixed isocenter, which does not reflect the real conditions. Even if the devices were adjusted by special phantoms, the systems may shift during angulation. Mechanical solutions of this problem would be complex and expensive. Handling the diffuse isocenter problem just by a generalized intersection point cannot deliver sufficient results for quantitative evaluations. In the authors' new approach presented here, they dropped the assumption of a stable isocenter and considered the real mechanical properties of biplane imaging systems.<<ETX>>","PeriodicalId":20445,"journal":{"name":"Proceedings of Computers in Cardiology Conference","volume":"19 1","pages":"97-100"},"PeriodicalIF":0.0,"publicationDate":"1993-09-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"88957704","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
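The paper's calibration details are not reproduced in this abstract, but the core geometric consequence of dropping the isocenter assumption can be illustrated: the two back-projection rays of a matched image-point pair generally no longer intersect, so the 3-D point must be reconstructed from two skew rays. A minimal numpy sketch of that reconstruction (an illustrative stand-in, not the authors' method):

```python
import numpy as np

def closest_point_between_rays(p1, d1, p2, d2):
    """Midpoint of the common perpendicular of two skew rays.

    Each ray is given by an origin p (X-ray focus) and a direction d
    toward the detected image point.  Without a shared isocenter the
    two back-projection rays generally do not intersect, so the
    reconstructed 3-D point is taken halfway between the closest
    points of the two rays; the gap size indicates matching quality.
    """
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    b = d1 @ d2                      # cosine between the two rays
    r = p2 - p1
    # Least-squares solution of |(p1 + t1*d1) - (p2 + t2*d2)| -> min.
    t1 = (r @ d1 - b * (r @ d2)) / (1.0 - b * b)
    t2 = (b * (r @ d1) - r @ d2) / (1.0 - b * b)
    c1 = p1 + t1 * d1
    c2 = p2 + t2 * d2
    return 0.5 * (c1 + c2), np.linalg.norm(c1 - c2)
```

For two rays that do intersect, the returned gap is zero and the midpoint is the intersection itself, so the formulation degrades gracefully to the fixed-isocenter case.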
The authors present a new approach to the ill-conditioned inverse problem of electrocardiography which employs finite element techniques to generate a truncated eigenvector expansion that stabilizes the inversion. The body surface potentials are expanded in terms of the eigenvectors, and a least-squares fit to the measured body surface potentials determines the coefficients of the expansion. This expansion is then used directly to determine the potentials on the surface of the heart.
{"title":"A generalized matrix eigensystem approach to the inverse problem of electrocardiography","authors":"R. Throne, L. Olson","doi":"10.1109/CIC.1993.378444","DOIUrl":"https://doi.org/10.1109/CIC.1993.378444","url":null,"abstract":"The authors present a new approach to the ill-conditioned inverse problem of electrocardiography which employs finite element techniques to generate a truncated eigenvector expansion which stabilizes the inversion. The body surface potentials are expanded in terms of the eigenvectors, and a least squares fit to the measured body surface potentials is used to determine the coefficients of the expansion. This expansion is then used directly to determine the potentials on the surface of the heart.<<ETX>>","PeriodicalId":20445,"journal":{"name":"Proceedings of Computers in Cardiology Conference","volume":"22 1","pages":"301-303"},"PeriodicalIF":0.0,"publicationDate":"1993-09-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"81585708","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
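The truncated-expansion idea can be sketched in a few lines of numpy, with SVD-derived singular vectors standing in for the authors' finite-element eigenvectors (an illustrative simplification, not their formulation):

```python
import numpy as np

def truncated_expansion_solve(A, b, k):
    """Regularize an ill-conditioned inverse problem A x = b by
    restricting x to the span of the first k right singular vectors
    of A and fitting the expansion coefficients by least squares.

    Truncating the expansion discards the small-singular-value
    directions that amplify measurement noise, which is what
    stabilizes the inversion.
    """
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    # Least-squares coefficients of b in the first k left singular
    # vectors, divided by the corresponding singular values.
    c = (U[:, :k].T @ b) / s[:k]
    return Vt[:k].T @ c
```

With k equal to the full rank this reproduces the ordinary least-squares solution; shrinking k trades fidelity for noise suppression.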
Intracoronary ultrasound (ICUS) is a valuable tool in the study of coronary artery disease. Accurate and robust methods using real-time automated boundary detection algorithms are needed to quantify lumen and intimal-medial areas. The authors' approach transforms ICUS boundary detection into a recirculant multilayer graph problem with local minima. Dynamic programming (DP) and simulated annealing (SA) are two fundamentally different approaches to this optimization problem, both known to converge to the global minimum; however, the time to convergence for SA is impractical. The authors compare a new optimized implementation of SA, called compensated simulated annealing, with DP.
{"title":"Compensated simulated annealing vs. dynamic programming used for boundary detection in intracoronary ultrasound","authors":"T. Johnson, W. Snyder, D. Herrington","doi":"10.1109/CIC.1993.378491","DOIUrl":"https://doi.org/10.1109/CIC.1993.378491","url":null,"abstract":"Intracoronary ultrasound (ICUS) is a valuable tool in the study of coronary artery disease. Accurate and robust methods using real-time automated boundary detection algorithms are needed to quantify lumen and intimal-medial areas. The authors' approach transforms ICUS boundary detection into a recirculant multilayer graph problem with local minima. Dynamic programming (DP) and simulated annealing (SA) are two fundamentally different approaches to this optimization problem which are known to converge to the global minimum. However, time to convergence for SA is impractical. The authors compare a new optimized implementation of SA called compensated simulated annealing with DP.<<ETX>>","PeriodicalId":20445,"journal":{"name":"Proceedings of Computers in Cardiology Conference","volume":"32 1","pages":"113-116"},"PeriodicalIF":0.0,"publicationDate":"1993-09-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"86175075","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
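The DP side of the comparison can be sketched as a minimal-cost path through a cost image whose columns are angular positions around the catheter and whose rows are candidate radii. This is an illustrative reconstruction, not the authors' implementation; in particular it omits the recirculant closure constraint that makes the boundary meet itself after a full revolution:

```python
import numpy as np

def dp_boundary(cost, max_step=1):
    """Minimal-cost boundary through a cost image: columns are angular
    positions, rows candidate radii; consecutive columns may differ by
    at most max_step rows (smoothness constraint)."""
    n_r, n_a = cost.shape
    acc = cost.astype(float).copy()          # accumulated cost
    back = np.zeros((n_r, n_a), dtype=int)   # backpointers
    for j in range(1, n_a):
        for i in range(n_r):
            lo, hi = max(0, i - max_step), min(n_r, i + max_step + 1)
            k = lo + int(np.argmin(acc[lo:hi, j - 1]))
            acc[i, j] = cost[i, j] + acc[k, j - 1]
            back[i, j] = k
    # Trace the optimal path back from the cheapest end node.
    path = [int(np.argmin(acc[:, -1]))]
    for j in range(n_a - 1, 0, -1):
        path.append(int(back[path[-1], j]))
    return path[::-1]
```

Because every column is solved exactly given its predecessor, DP reaches the global minimum of this graph in a single pass, which is the property that makes it the natural baseline against annealing.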
D. Voudouris, M. Strintzis, N. Maglaveras, C. Pappas
A new hidden Markov model (HMM) structure with vector-valued observation sequences is developed for the characterization of cardiac arrhythmias and other irregularities in multiple-lead ECG recordings. An analysis procedure is then proposed which generalises the HMM analysis developed for single-lead ECGs.
{"title":"Use of hidden Marcov models for the analysis of multiple lead ECG recordings","authors":"D. Voudouris, M. Strintzis, N. Maglaveras, C. Pappas","doi":"10.1109/CIC.1993.378294","DOIUrl":"https://doi.org/10.1109/CIC.1993.378294","url":null,"abstract":"A new hidden Marcov model (HMM) structure with vector-valued observation sequences is developed for the characterization of cardiac arrhythmias and other irregularities in multiple-lead ECG recordings. An analysis procedure is then proposed which generalises the HMM analysis developed for single-lead ECGs.<<ETX>>","PeriodicalId":20445,"journal":{"name":"Proceedings of Computers in Cardiology Conference","volume":"24 1","pages":"899-902"},"PeriodicalIF":0.0,"publicationDate":"1993-09-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"88333462","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
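HMM decoding with vector-valued observations can be sketched with a Viterbi pass in which each state emits a multi-lead sample; unit-covariance Gaussian emissions are assumed here as an illustrative simplification, not the authors' model:

```python
import numpy as np

def viterbi(obs, log_pi, log_A, means):
    """Most likely state sequence for vector-valued observations.

    obs    : list of observation vectors (one per time step)
    log_pi : log initial state probabilities, shape (S,)
    log_A  : log transition matrix, shape (S, S)
    means  : per-state emission means (unit-covariance Gaussians
             assumed for simplicity)
    """
    T, S = len(obs), len(means)
    # Emission log-likelihoods up to a constant.
    ll = np.array([[-0.5 * np.sum((o - m) ** 2) for m in means]
                   for o in obs])
    delta = log_pi + ll[0]
    psi = np.zeros((T, S), dtype=int)
    for t in range(1, T):
        trans = delta[:, None] + log_A       # (prev, next) scores
        psi[t] = np.argmax(trans, axis=0)
        delta = trans[psi[t], np.arange(S)] + ll[t]
    states = [int(np.argmax(delta))]
    for t in range(T - 1, 0, -1):
        states.append(int(psi[t][states[-1]]))
    return states[::-1]
```

The step from single-lead to multiple-lead analysis is visible only in the emission term: the observation is a vector across leads rather than a scalar, while the state machinery is unchanged.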
Volume rendering techniques are particularly useful for the visualization of echographic data, which appear noisy, irregular and fuzzy. Data were acquired by means of an annular transducer rotating around its axis in increments of 3.6 degrees, allowing the acquisition of 50 ECG-gated 2-D echography images. The 2-D images, acquired in polar coordinates, were remapped onto a Cartesian grid to produce a cube of data. A transfer function that maps the 3-D echo data into colour and opacity visual parameters is used to enhance the regions of interest. A suitable shading algorithm allows the data to be viewed three-dimensionally by means of a simple illumination technique. The final 2-D image is obtained by means of a back-to-front ray-casting algorithm.
{"title":"Volume rendering for 3-D echocardiography visualization","authors":"A. Sarti, C. Lamberti, G. Erbacci, R. Pini","doi":"10.1109/CIC.1993.378467","DOIUrl":"https://doi.org/10.1109/CIC.1993.378467","url":null,"abstract":"Volume rendering techniques are particularly useful for the visualization of echographic data which appear noisy, irregular and fuzzy. Data were acquired by means of an annular transducer which rotates around its axis with increments of 3.6 degrees allowing the acquisition of 50 ECG-gated 2-D echography images. The 2-D images acquired in polar coordinates were remapped over a cartesian grid to produce a cube of data. A transfer function that allows the mapping of the 3-D echo data into colour and opacity visual parameters to enhance the regions of interest has been considered. A suitable shading algorithm allows the data to be viewed 3-dimensionally by means of a simple illumination technique. The final 2-D image is obtained by means of a back to front ray casting algorithm.<<ETX>>","PeriodicalId":20445,"journal":{"name":"Proceedings of Computers in Cardiology Conference","volume":" 23","pages":"209-212"},"PeriodicalIF":0.0,"publicationDate":"1993-09-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91413127","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
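The back-to-front compositing at the heart of such a ray caster can be sketched for a single ray; the transfer function here is an illustrative stand-in for the colour/opacity mapping the abstract describes:

```python
def composite_back_to_front(samples, transfer):
    """Back-to-front compositing of scalar samples along one ray.

    transfer maps a scalar sample to a (colour, opacity) pair.
    samples are ordered front to back, so we walk them in reverse
    and apply the over operator C = c*a + C_behind*(1 - a).
    """
    colour = 0.0
    for s in reversed(samples):
        c, a = transfer(s)
        colour = c * a + colour * (1.0 - a)
    return colour
```

A fully opaque front sample (opacity 1) hides everything behind it, while low opacities let deeper structures shine through, which is exactly how the transfer function enhances selected regions of the echo volume.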
E. Micheli-Tzanakou, C. Yi, W. Kostis, D. Shindler, J. Kostis
Neural networks (NNs) have been found useful in many biomedical applications. The authors' purpose is to apply NNs to two specific problems in cardiology, namely, the diagnosis of myocardial infarction from echocardiograms and the prediction of the vital status of patients who have suffered one. The authors used NNs to discriminate between normal and infarcted myocardium by looking at intensity changes; the intensities of selected regions are used for training and testing. In predicting the vital status of patients who have suffered acute myocardial infarction, the authors used a large database (MIDAS) with follow-ups. The NN in this case has two hidden layers, with 18 patient variables from the MIDAS dataset as inputs. The NN was again trained with the feedback algorithm ALOPEX and tested with unknown data.
{"title":"Myocardial infarction: diagnosis and vital status prediction using neural networks","authors":"E. Micheli-Tzanakou, C. Yi, W. Kostis, D. Shindler, J. Kostis","doi":"10.1109/CIC.1993.378462","DOIUrl":"https://doi.org/10.1109/CIC.1993.378462","url":null,"abstract":"Neural networks (NNs) have been found useful in many biomedical applications. The authors' purpose is to apply NNs to two specific problems in cardiology, namely, diagnosis of echocardiograms for myocardial infarction and prediction of vital status of patients that suffered such. The authors used NNs to discriminate between normal and infarcted myocardium, by looking at intensity changes. The intensities of selected regions are used for training and testing. In predicting the vital status of patients that have suffered acute myocardial infarction, the authors used a large database (MIDAS) with follow-ups. The NN in this case has two hidden layers with 18 patient variables from the MIDAS dataset as inputs. The NN was again trained with the feedback algorithm ALOPEX and tested with unknown data.<<ETX>>","PeriodicalId":20445,"journal":{"name":"Proceedings of Computers in Cardiology Conference","volume":"21 1","pages":"229-232"},"PeriodicalIF":0.0,"publicationDate":"1993-09-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"87320659","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Nonstationary analysis of ECGs (especially the ST segment) was performed using the Wigner-Ville distribution (WVD) and wavelet transforms. The analysis was done on multiple leads of the same subject and on subjects with normal ECG, ischemia, necrosis and infarct. All data came from the CSE multilead database. It was found that the spectrotemporal maps did not differ considerably from lead to lead, while substantial changes in the spectrotemporal maps, reflecting the existence of nonstationarities, exist among the above-mentioned pathological states. These changes were evident mainly in the QRS complex and the ST segment; only in the infarcted subject did such changes persist over the whole P-QRS-T complex. The WVD was found superior to the wavelet transform, offering better time- and frequency-domain resolution and superior computational performance.
{"title":"Nonstationary ECG analysis using Wigner-Ville transform and wavelets","authors":"P. Kotsas, C. Pappas, M. Strintzis, N. Maglaveras","doi":"10.1109/CIC.1993.378394","DOIUrl":"https://doi.org/10.1109/CIC.1993.378394","url":null,"abstract":"Nonstationary analysis of ECGs (especially the ST segment) was performed using the Wigner-Ville distribution (WVD) and wavelet transforms. The analysis was done on multiple leads of the same subject and on subjects with normal ECG, ischemia, necrosis and infarct. All data came from the CSE multilead database. It was found that the spectrotemporal maps were not considerably different from lead to lead and that substantial changes in spectrotemporal maps concerning the existence of nonstationarities exist among the above-mentioned pathological states. These changes were evident mainly in the QRS complex and the ST segment. Only in the infarcted subject did such changes persist over the whole P-QRS-T complex. The WVD was found superior from the wavelet transform in having better time- and frequency-domain resolution and superior computational performance.<<ETX>>","PeriodicalId":20445,"journal":{"name":"Proceedings of Computers in Cardiology Conference","volume":"1 1","pages":"499-502"},"PeriodicalIF":0.0,"publicationDate":"1993-09-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"90662871","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
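A minimal sketch of the discrete pseudo-WVD, computed as an FFT over lag of the instantaneous autocorrelation, shows the generic textbook form of the transform (not the authors' implementation; windowing and cross-term suppression choices are omitted):

```python
import numpy as np

def pseudo_wvd(x):
    """Discrete pseudo Wigner-Ville distribution.

    For each time index n, form the instantaneous autocorrelation
    r[m] = x[n+m] * conj(x[n-m]) over all valid lags m, then take
    the FFT over the lag axis.  Note the factor-2 frequency scaling
    inherent to the WVD: a tone at bin k appears at bin 2k.
    """
    x = np.asarray(x, dtype=complex)
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        mmax = min(n, N - 1 - n)       # lags that stay inside the signal
        r = np.zeros(N, dtype=complex)
        for m in range(-mmax, mmax + 1):
            r[m % N] = x[n + m] * np.conj(x[n - m])
        W[n] = np.real(np.fft.fft(r))
    return W
```

The time axis of the result has the full resolution of the signal and the frequency axis the full FFT resolution at every instant, which is the property the abstract credits over the wavelet transform.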
D. Olinic, F. Trémel, P. Defaye, N. Olinic, S. Nedevschi, R. Vlaicu, B. Denis
Fragmented ventricular depolarizations (FVD) were assumed to be recorded as low-amplitude deflections (5-100 μV) on high-resolution recordings of averaged unfiltered electrocardiograms. Scores were defined that quantify FVD over the entire QRS complex and, respectively, in the terminal part of the QRS complex, in all three X, Y and Z leads. In 55 patients with ventricular tachycardia (VT), the detection of late potentials (at 40 Hz) was significantly related to the duration of the QRS complex and to the score and duration of FVD in the terminal QRS complex. Compared with late-potential detection, an FVD score >3 over the entire QRS complex, and not only an FVD score >2 in the terminal QRS, significantly improved the identification of VT patients, without increasing the prevalence of false-positive results in the control groups of hypertensive patients and of subjects without cardiopathy.
{"title":"Quantification of fragmented ventricular depolarizations over the entire QRS complex for improving the identification of patients with ventricular tachycardia","authors":"D. Olinic, F. Trémel, P. Defaye, N. Olinic, S. Nedevschi, R. Vlaicu, B. Denis","doi":"10.1109/CIC.1993.378502","DOIUrl":"https://doi.org/10.1109/CIC.1993.378502","url":null,"abstract":"Fragmented ventricular depolarizations (FVD) were assumed to be recorded as low amplitude deflections (5-100 /spl mu/V) on the high resolution recordings of averaged unfiltered electrocardiograms. Scores that quantify FVD over the entire QRS complex and, respectively, in the terminal part of the QRS complex, in all the three X, Y and Z leads, were defined. In 55 patients with ventricular tachycardia, the detection of late potentials (at 40 Hz) was significantly related to the duration of the QRS complex and to the Score and duration of FVD in the terminal QRS complex. As compared to late potentials detection, a FVD Score >3 over the entire QRS complex, and not only a FVD Score >2 in the terminal QRS, significantly improved the identification of VT patients, without increasing the prevalence of false positive results in the control groups of hypertensive patients and of subjects without cardiopathy.<<ETX>>","PeriodicalId":20445,"journal":{"name":"Proceedings of Computers in Cardiology Conference","volume":"12 1","pages":"69-72"},"PeriodicalIF":0.0,"publicationDate":"1993-09-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91217568","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
T. Linderer, W. Wunderlich, F. Fischer, R. Schroder
The authors studied the effect of various catheter sizes and floating image zoom on catheter-calibrated vessel diameters. The pixel sizes of vessel phantoms (0.3-4.5 mm diameter) were obtained by optimum weighted edge detection. The 2.0 mm and the 3.0 mm phantoms served as scaling devices for catheter calibration. Image zoom varied from 1- to 6-fold. The authors found that reference-object calibration causes a rotation of the measured native pixel-diameter curves, whereby the amount of rotation depends on the size of the scaling catheter and especially affects diameters in the normal range of coronaries. It is concluded that using different catheter sizes and varying image zoom should be discouraged in long-term studies of coronary artery disease, since they feign non-existent lesion changes.
{"title":"Quantitative coronary arteriography: the impact of image zoom and reference object size on diameter measurements","authors":"T. Linderer, W. Wunderlich, F. Fischer, R. Schroder","doi":"10.1109/CIC.1993.378336","DOIUrl":"https://doi.org/10.1109/CIC.1993.378336","url":null,"abstract":"The authors studied the effect various catheter sizes and floating image zoom on catheter calibrated vessel diameters. The pixel sizes of vessel phantoms (0.3-4.5 mm o) were obtained by optimum weighted edge detection. The 2.0 mm and the 3.0 mm phantom served as scaling device for catheter calibration. Image zoom varied from 1 to 6 fold. The authors found, that reference object calibration causes a rotation of measured native pixel diameter curves, whereby the amount of rotation depends on the size of the scaling catheter and affects especially diameters in the normal range of coronaries. It is concluded that using different catheter sizes and varying image zoom should be discouraged in long-term studies of coronary artery disease, since they feign non-existent lesion changes.<<ETX>>","PeriodicalId":20445,"journal":{"name":"Proceedings of Computers in Cardiology Conference","volume":"121 1","pages":"579-582"},"PeriodicalIF":0.0,"publicationDate":"1993-09-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"85968015","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
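The catheter-calibration step underlying these measurements is a simple rescaling, which is why any error in the assumed reference-object size propagates multiplicatively into every vessel diameter. A minimal sketch with illustrative names:

```python
def calibrated_diameter(vessel_px, catheter_px, catheter_mm):
    """Catheter calibration: the known catheter diameter fixes the
    mm-per-pixel scale, which then converts measured vessel pixel
    diameters to millimetres.  An error in catheter_mm (or in its
    measured pixel size) scales every vessel diameter by the same
    factor."""
    scale = catheter_mm / catheter_px   # mm per pixel
    return vessel_px * scale
```

For example, if a 2.0 mm catheter images as 20 pixels, a vessel measuring 30 pixels is reported as 3.0 mm; swapping in a differently sized scaling catheter between studies shifts this ratio and can mimic lesion change where none exists.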
K. Akazawa, T. Uchiyama, S. Tanaka, A. Sasamori, E. Harasawa
Proposes a new adaptive method of data compression for digital ambulatory electrocardiograms (ECGs) that considers the diagnostic significance of each segment of the ECG. The R-wave is detected, followed by multi-template matching of the detected beat and judgment of the noise level; the templates are created successively during processing. The residual signal (the difference between the original ECG and the best-fit template) is approximated with the FAN data-compression method SAPA2 (Scan-Along Polygonal Approximation) and then encoded. The error threshold of FAN is decreased during the P-wave segments and increased during the noise segments; the maximum error of the reconstructed signal at each time is known. The method is applied to ECGs from the AHA (American Heart Association) database and its usefulness is demonstrated; e.g. the bit rate is approximately 400 bps at 8% PRD (percent RMS difference) and 200 bps at 15% PRD.
{"title":"Adaptive data compression of ambulatory ECG using multi templates","authors":"K. Akazawa, T. Uchiyama, S. Tanaka, A. Sasamori, E. Harasawa","doi":"10.1109/CIC.1993.378395","DOIUrl":"https://doi.org/10.1109/CIC.1993.378395","url":null,"abstract":"Proposes a new adaptive method of data compression for digital ambulatory electrocardiograms (ECGs), considering the diagnostic significance of each segment of the ECG. The R-wave is detected, followed by multi-template matching of the detected beat and judgment of the noise level; the templates are successively created during processing. The residual signal (the difference between the original ECG and the best-fit template) is approximated with the FAN data compression method SAPA2 (Scan-Along Polygonal Approximation) and then encoded. The error threshold of FAN is decreased during the P-wave segments and increased during the noise segments; the maximum error of the reconstructed signal at each time is known. This method is applied to ECGs of the AHA (American Heart Association) database and its usefulness is indicated; e.g. the bit rate is approximately 400 bps at 8% PRD (percent RMS difference) and 200 bps at 15% PRD.<<ETX>>","PeriodicalId":20445,"journal":{"name":"Proceedings of Computers in Cardiology Conference","volume":"7 1","pages":"495-498"},"PeriodicalIF":0.0,"publicationDate":"1993-09-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"85973940","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
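The first-order FAN idea, keeping a sample only when the fan of admissible line slopes from the last stored sample closes, can be sketched as follows. This is an illustrative reconstruction under a fixed error threshold eps, not the authors' SAPA2 code (which additionally varies eps per ECG segment):

```python
def fan_compress(x, eps):
    """First-order FAN polygonal approximation.

    Maintain the steepest (lo) and shallowest (up) slopes of lines
    from the last stored sample that pass within +/-eps of every
    intermediate sample.  When lo exceeds up, no single line segment
    fits any more: store the previous sample as a vertex and restart
    the fan there.  Returns the indices of the kept samples; the
    reconstruction error is bounded by eps at every point.
    """
    kept = [0]
    i0 = 0
    up, lo = float("inf"), float("-inf")
    for i in range(1, len(x)):
        dt = i - i0
        up = min(up, (x[i] + eps - x[i0]) / dt)
        lo = max(lo, (x[i] - eps - x[i0]) / dt)
        if lo > up:                    # fan closed: emit a vertex
            kept.append(i - 1)
            i0 = i - 1
            up = (x[i] + eps - x[i0]) / (i - i0)
            lo = (x[i] - eps - x[i0]) / (i - i0)
    kept.append(len(x) - 1)
    return kept
```

On slowly varying segments the fan stays open for long runs and few samples survive, while noisy or steep segments force frequent vertices; raising eps during noise, as the abstract describes, is what keeps the bit rate low there.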