IT security: developing a response to increasing risks
C. Peter Waegemann
International Journal of Bio-Medical Computing 43(1): 5-8. Pub Date: 1996-10-01. DOI: 10.1016/S0020-7101(96)01220-2

The move to computerization in health care requires attention to security issues, because the risks of violating patients' privacy rights are increasing dramatically. As providers in many countries move toward computerization, it is important to understand its dangers: unauthorized users can access, copy, alter, delete, or distort hundreds or thousands of medical records within minutes, and information can be compromised by individuals as well as by system failures. The potential harm to patients and to society must be understood, and the relationship between privacy rights, confidentiality measures, and system security measures must be addressed. It is dangerous to build computer systems in health care without establishing appropriate security measures. Special attention should be given to the weaknesses of the Internet and to the requirements of network security for a future 'global information infrastructure'. The Internet is based on messaging, as are many communication standards such as EDIFACT and HL7; messaging systems in general need to be examined with regard to security and accountability.
Displacement correction and surface reconstruction of the retina using scanning laser ophthalmoscopic images
Karl-Hans Englmeier, Rainer Herpers, Isabel Künzer, Marko Obermaier, Markus Altmann
International Journal of Bio-Medical Computing 42(3): 191-204. Pub Date: 1996-08-01. DOI: 10.1016/0020-7101(96)01197-X

A method for three-dimensional surface reconstruction of the retina in the area of the papilla is presented. The reconstruction is based on a sequence of discrete gray-level images of the retina recorded by a scanning laser ophthalmoscope (SLO). The underlying assumption of the algorithm developed here is that depth information is encoded in the brightness values of the individual pixels in addition to the ordinary 2-D spatial information: the brightness at an image position depends on the degree of reflection of a confocal laser beam, and only surface structures located directly in the focal plane of the beam produce a strong response to the laser light. The displacements between the single images of a sequence are assumed to be approximately linear and are corrected by applying the cepstrum technique. Depth is then estimated from the volumetric representation of the image sequence by searching, at every image position, for the maximal brightness value within a computed depth profile. Disturbances occurring during recording cause incorrect local depth estimates in the resulting images; these are corrected by specially developed surface-improvement processes. The work concludes with a comparison of several approaches to reducing noise and disturbances in SLO image data.
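The central depth-estimation step, searching each pixel's brightness profile through the confocal stack for its maximum, can be sketched as follows. The array layout, function name, and toy data are assumptions for illustration; the sketch omits the displacement-correction and surface-improvement stages.

```python
import numpy as np

# Sketch of depth-from-brightness: in a confocal image stack indexed
# (depth, y, x), the surface depth at each pixel is taken as the slice
# index where that pixel's brightness profile peaks.
def estimate_depth_map(stack):
    """Return a 2-D map of depth indices, one per image position."""
    return np.argmax(stack, axis=0)

# Toy stack: 5 focal planes of a 2x2 image, brightest at known depths.
stack = np.zeros((5, 2, 2))
stack[1, 0, 0] = 1.0   # pixel (0, 0) reflects most at depth 1
stack[4, 1, 1] = 1.0   # pixel (1, 1) reflects most at depth 4
depth = estimate_depth_map(stack)
```

Recording disturbances would show up here as spurious peak positions in single profiles, which is what the surface-improvement step described above corrects.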
The complexity of transactions: a means for assessing interoperability
Pieter J. Toussaint, Fenno P. Ottes, Albert R. Bakker
International Journal of Bio-Medical Computing 42(3): 225-231. Pub Date: 1996-08-01. DOI: 10.1016/0020-7101(96)01204-4

Interoperability is a major focal point of activities within the informatics community in general, and the medical informatics community in particular. In both Europe and the USA, standardization efforts are being pursued to enable interoperability. However, even when the technical requirements are met, interoperability is sometimes not feasible because the required message exchange is too complex. This complexity is influenced by at least three factors: the volume of the data to be exchanged, the functionality of the information exchange, and the communication standard adopted.
Multispectral magnetic resonance images segmentation using fuzzy Hopfield neural network
Jzau-Sheng Lin, Kuo-Sheng Cheng, Chi-Wu Mao
International Journal of Bio-Medical Computing 42(3): 205-214. Pub Date: 1996-08-01. DOI: 10.1016/0020-7101(96)01199-3

This paper presents a fuzzy Hopfield neural network for segmenting multispectral MR brain images. The proposed approach is a new unsupervised 2-D Hopfield neural network based on the fuzzy clustering technique: it combines a 2-D Hopfield neural network with the fuzzy c-means clustering algorithm, making a parallel implementation for segmenting multispectral MR brain images feasible. To generate feasible results, a fuzzy c-means clustering strategy is embedded in the Hopfield network, eliminating the need to find weighting factors in the energy function, which is formulated on the 'within-class scatter matrix' principle, a basic concept commonly used in pattern classification. The proposed fuzzy c-means strategy is also proven to converge and to allow the network to learn more effectively than a conventional Hopfield neural network. Experimental results show that a near-optimal solution can be obtained with the fuzzy Hopfield network based on the within-class scatter matrix.
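The clustering strategy the network embeds can be illustrated with a generic, textbook fuzzy c-means on 1-D data. This is a sketch of the alternating membership/center update only, not the authors' 2-D Hopfield formulation; sample values and parameters are invented.

```python
import numpy as np

# Plain fuzzy c-means: alternate between fuzzified cluster centers and
# the standard membership update until the partition stabilizes.
def fuzzy_c_means(x, c=2, m=2.0, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    u = rng.random((c, x.size))
    u /= u.sum(axis=0)                      # memberships sum to 1 per point
    for _ in range(iters):
        w = u ** m                          # fuzzified memberships
        centers = (w @ x) / w.sum(axis=1)   # weighted cluster centers
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        u = d ** (-2.0 / (m - 1.0))         # standard membership update
        u /= u.sum(axis=0)
    return centers, u

x = np.array([0.0, 0.1, 0.2, 5.0, 5.1, 5.2])   # two obvious clusters
centers, u = fuzzy_c_means(x)
```

In the paper's setting the same soft memberships are driven by the network's energy function rather than by this explicit loop.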
Algorithms for robust nonlinear regression with heteroscedastic errors
László Tóthfalusi, László Endrényi
International Journal of Bio-Medical Computing 42(3): 181-190. Pub Date: 1996-08-01. DOI: 10.1016/0020-7101(96)01173-7

Nonlinear regression algorithms were compared by Monte Carlo simulation in settings where the measurement error depended on the measured values (heteroscedasticity) and was possibly contaminated with outliers. The tested least-squares (LSQ) algorithms either required user-supplied weights to accommodate heteroscedasticity or estimated the weights within the procedure. Robust versions of the LSQ algorithms, namely robust iteratively reweighted (IRR) and least absolute value (LAV) regression, were also considered. The comparisons were based on the efficiency of the estimated parameters and their resistance to outliers. By these criteria, extended least squares (ELSQ) was the most reliable of the tested LSQ algorithms. The IRR versions were slightly more efficient than the LAV versions when there were no outliers, but offered weaker protection against outliers than the LAV variants.
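A minimal sketch of the iteratively reweighted idea, shown on a linear model for brevity (the paper's setting is nonlinear regression, and the Huber weight function and MAD scale below are standard robust-statistics choices, not necessarily the authors' exact scheme):

```python
import numpy as np

def irls_huber(X, y, k=1.345, iters=20):
    """Robust linear fit: ordinary LSQ start, then Huber reweighting."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(iters):
        r = y - X @ beta
        s = np.median(np.abs(r)) / 0.6745 + 1e-12   # robust (MAD) scale
        u = np.abs(r) / s                           # standardized residuals
        w = k / np.maximum(u, k)                    # Huber weights: 1 inside k, k/u beyond
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
    return beta

# Line y = 1 + 2x with one gross outlier at x = 7.
x = np.arange(10, dtype=float)
X = np.column_stack([np.ones_like(x), x])
y = 1.0 + 2.0 * x
y[7] += 50.0
beta = irls_huber(X, y)
```

An LAV variant would instead minimize the sum of absolute residuals, trading some no-outlier efficiency for stronger outlier protection, which matches the trade-off the abstract reports.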
Building intelligent alarm systems by combining mathematical models and inductive machine learning techniques Part 2—Sensitivity analysis
B. Müller, A. Hasman, J.A. Blom
International Journal of Bio-Medical Computing 42(3): 165-179. Pub Date: 1996-08-01. DOI: 10.1016/0020-7101(96)01210-X

In an earlier study, an approach was described for generating intelligent alarm systems for monitoring patient ventilation via mathematical simulation and machine learning; however, the ventilator settings were not varied. In this study we investigated whether an alarm system could be created that achieves satisfactory classification performance under a wide variety of ventilator settings, by varying the inspiratory-to-expiratory time (I:E) ratio, tidal volume, and respiratory rate. In a first experiment, three patient data sets were modeled, each with a different I:E ratio. Part of each data set was used to construct an alarm system for that I:E ratio, and the remaining part was used to test the alarm systems' performance. The three training sets were also combined to construct a single alarm system, which was tested with all three test sets. Finally, all alarm systems were tested with data generated by a patient simulator. Similar experiments were performed for tidal volume and respiratory rate. It was concluded that an optimally functioning alarm system should contain a library of rule sets, one for each set of ventilator settings; a second-best alternative is to take all possible settings into consideration when constructing the training set. Classification performance of the trees trained with multiple ventilator settings ranged from 98 to 100% on all test sets; on the independent patient-simulator data it ranged from 80 to 100%.
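The paper's conclusion, a library of rule sets indexed by ventilator settings, amounts to a dispatch table consulted at run time. The setting keys, signal names, and thresholds below are invented for illustration; in the study the rule sets themselves were induced by machine learning from simulated patient data.

```python
# Hypothetical rule sets, one per ventilator-setting combination.
def rules_ie_1_to_2(sample):
    return "alarm" if sample["peak_pressure"] > 35 else "ok"

def rules_ie_1_to_3(sample):
    return "alarm" if sample["peak_pressure"] > 30 else "ok"

RULE_LIBRARY = {
    "I:E 1:2": rules_ie_1_to_2,
    "I:E 1:3": rules_ie_1_to_3,
}

def classify(settings, sample):
    """Dispatch to the rule set trained for the current settings."""
    return RULE_LIBRARY[settings](sample)

verdict = classify("I:E 1:3", {"peak_pressure": 32})
```

The same monitored sample can thus classify differently under different settings, which is exactly why a single fixed rule set performed worse in the study.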
Parametric analysis of heart rate variability during hemodialysis
Silvio Cavalcanti, Lorenzo Chiari, Stefano Severi, Guido Avanzolini, Guido Enzmann, Claudio Lamberti
International Journal of Bio-Medical Computing 42(3): 215-224. Pub Date: 1996-08-01. DOI: 10.1016/0020-7101(96)01205-6

The problem of evaluating the short-term autonomic response to hypovolemia in patients under chronic hemodialysis treatment is considered. Power spectra of beat-to-beat heart rate variability were evaluated during dialysis in twenty hemodynamically stable and unstable patients using a parametric technique. The autoregressive model coefficients were calculated by the modified covariance method, and the model order was selected according to the minimum description length criterion. The results demonstrate that stable and unstable patients present markedly different spectral patterns. The efficiency of the compensatory response to hemodialysis-induced hypovolemia was evaluated through the ratio of the powers in the LF and HF bands. Stable patients exhibit an LF/HF ratio greater than one, with large fluctuations over the whole dialysis session; in contrast, all unstable patients are characterized by an LF/HF ratio lower than one with reduced variability over time. This result suggests that the hemodynamic instability of hypotension-prone patients may be due to a deficiency in the short-term compensatory response to hemodialysis-induced hypovolemia.
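The band-power ratio at the heart of the analysis can be illustrated with a plain FFT periodogram. The paper instead estimates the spectrum with an autoregressive model (modified covariance method, minimum description length order selection), so only the LF/HF ratio step is shared here, and the signal is synthetic.

```python
import numpy as np

# LF/HF power ratio from a periodogram, using the conventional
# low-frequency (0.04-0.15 Hz) and high-frequency (0.15-0.40 Hz) bands.
def lf_hf_ratio(x, fs=4.0, lf=(0.04, 0.15), hf=(0.15, 0.40)):
    x = x - x.mean()
    psd = np.abs(np.fft.rfft(x)) ** 2
    f = np.fft.rfftfreq(x.size, d=1.0 / fs)
    p_lf = psd[(f >= lf[0]) & (f < lf[1])].sum()
    p_hf = psd[(f >= hf[0]) & (f < hf[1])].sum()
    return p_lf / p_hf

# Evenly resampled heart-rate series dominated by a 0.1 Hz (LF) rhythm.
t = np.arange(0, 300, 1.0 / 4.0)
hr = 0.05 * np.sin(2 * np.pi * 0.10 * t) + 0.01 * np.sin(2 * np.pi * 0.25 * t)
ratio = lf_hf_ratio(hr)
```

In the paper's terms, a series like this one (ratio well above one) would match the stable-patient pattern, while HF dominance would suggest instability.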
Information technology in diabetes care 'Diabeta': 23 years of development and use of a computer-based record for diabetes care
Peter Sönksen, Charles Williams
International Journal of Bio-Medical Computing 42(1): 67-77. Pub Date: 1996-07-01. DOI: 10.1016/0020-7101(96)81526-1

In this article we have stressed that a diabetes care information system should be useful to, usable by, and actually used by carers at the point of patient contact. Information resulting from such encounters should, at no extra cost, serve the needs of communication, audit, research, and management. Diabeta is a clinical record system supporting the management of patients with diabetes. It has grown 'organically' within an academic clinical unit over a period of 23 years. It is used for every encounter with the clinicians in our diabetes team and, as such, contains an immense amount of objective clinical experience. This experience can be interrogated very easily by computer-naive clinicians using a remarkable interactive program ('Datascan') with statistical procedures 'embedded' in the APL computer code, eliminating the need to 'export' the data into a statistical package. The latest PC-based version is extremely fast, and this immense body of clinical experience can be carried around on a notebook PC, available for exploration at any time. This makes 'evidence-based medicine' available in a remarkably flexible way, since it shares the accumulated objective experience of literally dozens of clinicians over a period that now extends to 23 years. It adds a completely new dimension to the term 'clinical experience' and is unattainable with manual records. It would be naive to assume that such systems are easy to design, build, or implement, or that the initial capital outlay will be small, although costs are falling continuously. Medicine is a highly complex activity whose essential basis is human interaction. Introducing technology into this interaction requires sensitivity to the wishes and requirements of individuals, and protection of their exchanges from third parties. The potential of computers in diabetes care is so great that these issues must be addressed through continuing research, development, evaluation, and funding of new systems. This must be led by the medical profession, not the computer industry.
Applying the object paradigm to a centralized database for a cardiology division
R. Fabbretti, P.-A. Dorsaz, P.-A. Doriot, W. Rutishauser
International Journal of Bio-Medical Computing 42(1): 129-134. Pub Date: 1996-07-01. DOI: 10.1016/0020-7101(96)01191-9

To master the overwhelming quantity of data produced by the different laboratories of our Cardiology Division, we are developing a centralized database. Our aim is to improve the quality of diagnoses and therapies by constituting patient-centered medical files that logically integrate the results of the different examinations and allow rapid access to patient data. The database must be accessible from a heterogeneous set of PCs, Macintoshes, and UNIX workstations; it must have an ergonomic graphical user interface and generate reports that can be sent to the patient's physician. It is well known that the requirements of a medical database make its conceptual analysis very difficult: as medical knowledge continually evolves, examination protocols change and the data sets must be updated. The maintenance of classical databases is usually expensive because it requires specialized staff to alter the database structure and to adapt the user interface. To allow for flexibility, modularity, code reusability, and reliability, the object paradigm was applied to a classical relational database. By combining data structure and behavior in single entities, it is possible to build generic user interfaces that can easily be tailored to the needs of every laboratory in our Cardiology Division.
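The idea of bundling data and behavior so that generic interfaces can be derived from object metadata can be sketched as follows; the class names, fields, and form-generation rule are hypothetical illustrations, not taken from the system described.

```python
# Hedged sketch: each examination type declares its fields once; generic
# routines derive an entry-form description and a report line from that
# metadata, so adding a field requires no interface rework.
class Examination:
    fields = []  # (name, type, unit) triples declared by subclasses

    def __init__(self, **values):
        for name, _, _ in self.fields:
            setattr(self, name, values.get(name))

    @classmethod
    def form_spec(cls):
        # A generic UI builder could render one widget per entry here.
        return [f"{name} ({typ.__name__}, {unit})" for name, typ, unit in cls.fields]

    def report(self):
        # A generic report generator, shared by every examination type.
        return "; ".join(f"{name}={getattr(self, name)}" for name, _, _ in self.fields)

class Echocardiography(Examination):
    fields = [("patient_id", int, "-"), ("ejection_fraction", float, "%")]

exam = Echocardiography(patient_id=42, ejection_fraction=55.0)
```

A new laboratory would add a subclass with its own field list, and the shared form and report machinery would adapt without changes, which is the tailoring the abstract describes.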
Future of patient records. Care for records for care: an appraisal of the practicalities of electronic patient record. Proceedings of the 8th European Health Records Conference. Maastricht, The Netherlands, May 21-24, 1996
International Journal of Bio-Medical Computing 42(1-2): 1-163. Pub Date: 1996-07-01.