CombiTool is a new computer program for the analysis of combination effects of biologically active agents. It performs model calculations and an analysis of experimental combination effects for two or three agents according to both the Bliss independence and the Loewe additivity criteria. Zero interaction response surfaces are calculated from single-agent dose–response relations and compared to experimental combination data. The calculation of response surfaces for Loewe additivity is based on a new approach which combines the implicit definition equation in terms of doses alone with single-agent dose–response relations. The simultaneous analysis of experimental data according to both Loewe additivity and Bliss independence within one program can hopefully contribute to a better understanding of the meaning and limits of the two criteria. CombiTool has a built-in graphics facility which allows the direct visualization of the response surfaces or the corresponding contour plots and the experimental data.
{"title":"CombiTool—A New Computer Program for Analyzing Combination Experiments with Biologically Active Agents","authors":"Valeska Dreβler, Gerhard Müller, Jürgen Sühnel","doi":"10.1006/cbmr.1999.1509","DOIUrl":"10.1006/cbmr.1999.1509","url":null,"abstract":"<div><p>CombiTool is a new computer program for the analysis of combination effects of biologically active agents. It performs model calculations and an analysis of experimental combination effects for two or three agents according to both the Bliss independence and the Loewe additivity criteria. Zero interaction response surfaces are calculated from single-agent dose–response relations and compared to experimental combination data. The calculation of response surfaces for Loewe additivity is based on a new approach which combines the implicit definition equation in terms of doses alone with single-agent dose–response relations. The simultaneous analysis of experimental data according to both Loewe additivity and Bliss independence within one program can hopefully contribute to a better understanding of the meaning and limits of the two criteria. CombiTool has a built-in graphics facility which allows the direct visualization of the response surfaces or the corresponding contour plots and the experimental data.</p></div>","PeriodicalId":75733,"journal":{"name":"Computers and biomedical research, an international journal","volume":"32 2","pages":"Pages 145-160"},"PeriodicalIF":0.0,"publicationDate":"1999-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1006/cbmr.1999.1509","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"21206812","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Understanding the risks and benefits of available treatments represents an essential element of clinical practice. Previous work has demonstrated that knowledge of net benefits and net risks can inform our decisions on whether or not to administer a particular treatment or order a diagnostic test. A wider application of this model has been difficult because data on net benefits and net risks are not directly reported. We used the more frequently reported data on treatment efficacy (E) and risks (Rrx) to obtain an equation for the treatment threshold probability above which treatment should be given and below which it should be withheld. A diagnostic test should only be performed if the probability of disease lies between the testing threshold and the treatment threshold. We first described the theoretical background for these calculations. We then used the JavaScript programming language to write a computer program with which physicians can calculate these threshold probabilities effortlessly over the Internet. In most clinical situations we do not have to achieve maximum diagnostic certainty in order to act. However, we should never treat, or order a diagnostic test, if the risk of the treatment is greater than its efficacy. The minimally required E/R ratio of a particular treatment is equal to the reciprocal of the mortality/morbidity of untreated disease. Similarly, the lowest number of patients needed to be treated (NNT) for therapy to be worth administering is equal to the reciprocal of the treatment risk. We show how evidence-based summary measures of therapeutic effects, such as treatment efficacy, harms, and NNT, can successfully be integrated within a decision-analytic model. This in turn should facilitate wider use of quantitative benefit–risk analysis. Direct and immediate access to the formulas described here over the Internet should make this task even easier in everyday clinical decision making.
{"title":"Using the Internet to Calculate Clinical Action Thresholds","authors":"Iztok Hozo , Benjamin Djulbegovic","doi":"10.1006/cbmr.1998.1505","DOIUrl":"10.1006/cbmr.1998.1505","url":null,"abstract":"<div><p>Understanding the risks and benefits of available treatments represents an essential element of clinical practice. Previous work has demonstrated that knowledge of net benefits and net risks can relate to our decisions on whether or not to administer a particular treatment or order a diagnostic test. A wider application of this model has been difficult because data on net benefits and net risks are not directly reported. We used more frequently reported data on treatment efficacy (<em>E</em>) and risks (<em>R</em><sub><em>rx</em></sub>) to obtain an equation for the treatment threshold probability above which treatment should be given and below which it should be withheld. The diagnostic test should only be performed if the probability of a disease is between the testing threshold and the treatment threshold. We first described a theoretical background for these calculations. We then used a JavaScript programming language to write a computer program which physicians can use to calculate these threshold probabilities effortlessly through the Internet. In most clinical situations we do not have to achieve maximum diagnostic certainty in order to act. However, we should never treat or order a diagnostic test if the risk of the treatment is greater than its efficacy. The minimally required<em>E/R</em>ratio of a particular treatment is equal to the reciprocal value of the mortality/morbidity of untreated disease. Similarly, the lowest number of patients needed to be treated (NNT) for therapy to be worth administering is equal to the reciprocal of the treatment risk. We show how evidence-based summary measures of therapeutic effects, such as the treatment efficacy, harms, and NNT, can successfully be integrated within a decision analytic model. This in turn will facilitate wider use of the quantitative benefit–risk analysis. Accessing the Internet for direct and immediate approach to the formulas described here should make this task even easier in everyday clinical decision making.</p></div>","PeriodicalId":75733,"journal":{"name":"Computers and biomedical research, an international journal","volume":"32 2","pages":"Pages 168-185"},"PeriodicalIF":0.0,"publicationDate":"1999-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1006/cbmr.1998.1505","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"21206813","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
One of the problems in the management of the diabetic patient is balancing the dose of insulin without knowing exactly how the patient's blood glucose concentration will respond. Being able to predict the blood glucose level would simplify management. This paper describes an attempt to predict blood glucose levels using a hybrid AI technique combining the principal component method and neural networks. With this approach, no complicated models or algorithms need be considered. The results obtained from this fairly simple model show a correlation coefficient of 0.76 between the observed and the predicted values during the first 15 days of prediction. With this technique, all the factors affecting the patient's blood glucose level are taken into account, since they are integrated in the data collected during this time period. It must be emphasized that the present method results in an individual model, valid for that particular patient over a limited period of time. However, the method itself has general validity, since blood glucose variations over time have similar properties in any diabetic patient.
{"title":"Prediction of Blood Glucose Levels in Diabetic Patients Using a Hybrid AI Technique","authors":"Jan John Liszka-Hackzell","doi":"10.1006/cbmr.1998.1506","DOIUrl":"10.1006/cbmr.1998.1506","url":null,"abstract":"<div><p>One of the problems in the management of the diabetic patient is to balance the dose of insulin without exactly knowing how the patient's blood glucose concentration will respond. Being able to predict the blood glucose level would simplify the management. This paper describes an attempt to predict blood glucose levels using a hybrid AI technique combining the principal component method and neural networks. With this approach, no complicated models or algorithms need be considered. The results obtained from this fairly simple model show a correlation coefficient of 0.76 between the observed and the predicted values during the first 15 days of prediction. By using this technique, all the factors affecting this patient's blood glucose level are considered, since they are integrated in the data collected during this time period. It must be emphasized that the present method results in an individual model, valid for that particular patient under a limited period of time. However, the method itself has general validity, since the blood glucose variations over time have similar properties in any diabetic patient.</p></div>","PeriodicalId":75733,"journal":{"name":"Computers and biomedical research, an international journal","volume":"32 2","pages":"Pages 132-144"},"PeriodicalIF":0.0,"publicationDate":"1999-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1006/cbmr.1998.1506","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"21206922","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Electroencephalogram (EEG) visualization software containing two-dimensional (2D) and three-dimensional (3D) brain mapping modules was developed. The input to the program is standard clinical individual-patient data recorded using digital EEG and magnetic resonance imaging (MRI). The software utilizes several techniques, such as heuristic triangulation, ray casting, Gouraud shading, and image fusion, to form multimodal 3D images. The program has been applied to the 3D visualization of various EEG signals, “cortical” EEG signals, and potential fields generated by a computer model. The developed program appears to operate efficiently and intuitively in the PC/Windows environment.
{"title":"Implementation of Three-Dimensional EEG Brain Mapping","authors":"Tomi Heinonen , Antti Lahtinen , Veikko Häkkinen","doi":"10.1006/cbmr.1998.1503","DOIUrl":"10.1006/cbmr.1998.1503","url":null,"abstract":"<div><p>The electroencephalogram (EEG) visualization software was developed containing two-dimensional (2D) and three-dimensional (3D) brain mapping modules. The input to the program is standard clinical individual patient data recorded using digital EEG and magnetic resonance imaging (MRI). The software utilizes several techniques, such as heuristic triangulation, ray casting, Gouraud shading, and image fusion to form multimodal 3D images. The program has been applied to the 3D visualization of various EEG signals, “cortical” EEG signals, and potential fields generated by a computer model. The developed program appears to operate efficiently and intuitively in PC/Windows environment.</p></div>","PeriodicalId":75733,"journal":{"name":"Computers and biomedical research, an international journal","volume":"32 2","pages":"Pages 123-131"},"PeriodicalIF":0.0,"publicationDate":"1999-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1006/cbmr.1998.1503","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"21206921","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Many vital substances in cells, such as receptors, transporters, and ion channels, occur associated with membranes. To an increasing extent their precise localization is demonstrated by immunocytochemical methods, including labeling with gold particles followed by electron microscopy. PALIREL has primarily been developed to facilitate such research, enabling rapid analysis of the topographic relations of particles (gold or others) to neighboring linear interfaces (membranes). After digitization of membranes and particles, the program allows, in particular, computation of (1) the particle number and the number per unit length of membrane, in individual bins (membrane lengths) interactively defined along the membrane; (2) the distance of each particle from the membrane; (3) the particle number and density (number per μm²) in zones defined along (over and under) the membrane; and (4) the particle number and density in “zonebins” resulting from zones and bins being defined simultaneously. If a segment of a different nature, such as a synapse, occurs somewhere in the membrane, the quantitative data can be obtained separately for that segment and the adjoining parts of the membrane. PALIREL allows interactive redefinition of bins, zones, or objects (particle-line files) while other definitions are retained. The results can be presented on screen as tables and histograms and printed on request. A dedicated graphic routine permits on-screen inspection of lines, particles, zones, and bins. PALIREL is equally applicable to biological investigations of other kinds in which the topographic relations of points (structures represented as points) to lines (boundaries) are to be examined. PALIREL is available from the authors on a noncommercial basis.
{"title":"PALIREL, a Computer Program for Analyzing Particle-to-Membrane Relations, with Emphasis on Electron Micrographs of Immunocytochemical Preparations and Gold Labeled Molecules","authors":"H.K. Ruud, T.W. Blackstad","doi":"10.1006/cbmr.1999.1508","DOIUrl":"10.1006/cbmr.1999.1508","url":null,"abstract":"<div><p>Many vital substances, such as receptors, transporters, and ion channels, in cells occur associated with membranes. To an increasing extent their precise localization is demonstrated by immunocytochemical methods including labeling with gold particles followed by electron microscopy. <em>PALIREL</em> has primarily been developed to facilitate such research, enabling rapid analysis of topographic relations of particles (gold or others) to neighboring linear interfaces (membranes). After digitization of membranes and particles, the program particularly allows computation of (1) the particle number and number per unit length of membrane, in individual <em>bins</em> (membrane lengths) interactively defined along the membrane; (2) the distance of each particle from the membrane; (3) the particle number, and the density (number per μm<sup>2</sup>), in <em>zones</em> defined along (over and under) the membrane; and (4) the particle number and density in “zonebins” resulting from zones and bins being defined simultaneously. If there occurs, somewhere in the membrane, a segment of different nature, such as a synapse, the quantitative data may be had separately for that and the adjoining parts of the membrane. <em>PALIREL</em> allows interactive redefinition of bins, zones, or objects (particle-line files) while other definitions are retained. The results can be presented on the screen as tables and histograms and be printed on request. A dedicated graphic routine permits inspection on screen of lines, particles, zones, and bins. <em>PALIREL</em> is equally applicable to biological investigations of other kinds, in which the topographic relations of points (structures represented as points) to lines (boundaries) are to be examined. <em>PALIREL</em> is available from the authors on a noncommercial basis.</p></div>","PeriodicalId":75733,"journal":{"name":"Computers and biomedical research, an international journal","volume":"32 2","pages":"Pages 93-122"},"PeriodicalIF":0.0,"publicationDate":"1999-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1006/cbmr.1999.1508","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"21206920","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
In a significant proportion of individuals, the physiologic decrease of muscle tone during sleep results in increased collapsibility of the upper respiratory airway. At peak inspiratory flow, the pharyngeal soft tissues may collapse and cause airflow limitation or even complete occlusion of the upper airway (sleep apnea). While there are plenty of methods to detect sleep apnea, only a few can be used to monitor flow limitation in sleeping individuals. Nasal prongs connected to a pressure sensor provide information about the nasal airflow over time. This paper documents a method for automatically classifying each nasal inspiratory pressure profile into one class without flow limitation or one of six flow-limited classes. The recognition of the sample signals consists of three phases: preprocessing, primitive extraction, and word parsing. In the last phase, a sequence of signal primitives is treated as a word, and its membership is tested in the attribute grammars constructed for the signal categories. In practical tests the method gave surprisingly high performance. Classifying 94% of the inspiratory profiles in agreement with the visual judgment of an expert physician, the method was considered good enough to warrant further testing in well-defined patient populations to determine the pressure-profile distributions of different subject classes.
{"title":"Classification of Nasal Inspiratory Flow Shapes by Attributed Finite Automata","authors":"T. Aittokallio , O. Nevalainen , U. Pursiheimo , T. Saaresranta , O. Polo","doi":"10.1006/cbmr.1998.1499","DOIUrl":"10.1006/cbmr.1998.1499","url":null,"abstract":"<div><p>In a significant proportion of individuals, the physiologic decrease of muscle tone during sleep results in increased collapsibility of the upper respiratory airway. At peak inspiratory flow, the pharyngeal soft tissues may collapse and cause airflow limitation or even complete occlusion of the upper airway (<em>sleep apnea</em>). While there are plenty of methods to detect sleep apnea, only a few can be used to monitor flow limitation in sleeping individuals. Nasal prongs connected to pressure sensor provide information of the nasal airflow over time. This paper documents a method to automatically classify each nasal inspiratory pressure profile into one without flow limitation or six flow-limited ones. The recognition of the sample signals consists of three phases: preprocessing, primitive extraction, and word parsing phases. In the last one, a sequence of signal primitives is treated as a word and we test its membership in the attribute grammars constructed to the signal categories. The method gave in practical tests surprisingly high performance. Classifying 94;pc of the inspiratory profiles in agreement with the visual judgment of an expert physician, the performance of the method was considered good enough to warrant further testing in well-defined patient populations to determine the pressure profile distributions of different subject classes.</p></div>","PeriodicalId":75733,"journal":{"name":"Computers and biomedical research, an international journal","volume":"32 1","pages":"Pages 34-55"},"PeriodicalIF":0.0,"publicationDate":"1999-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1006/cbmr.1998.1499","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"20939828","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Investigation of heart rate variability is the subject of considerable interest in physiology, clinical medicine, and clinical pharmacology. The functional assessment of the autonomic nervous system by observation of its main actors, the sympathetic and parasympathetic branches, emphasizes the importance of autonomic regulation under different physiological circumstances, in several disease states, and under drug therapy. This paper describes a PC-based system designed with LabView that performs time-domain and frequency-domain analyses of heart rate variability as suggested by the guidelines of the European Society of Cardiology and the North American Society of Pacing and Electrophysiology. Examples of heart rate variability in different physiological states are given, along with their analysis and evaluation by the system described.
{"title":"Design of a PC-Based System for Time-Domain and Spectral Analysis of Heart Rate Variability","authors":"Holger G. Adelmann","doi":"10.1006/cbmr.1998.1502","DOIUrl":"10.1006/cbmr.1998.1502","url":null,"abstract":"<div><p>Investigation of heart rate variability is the subject of considerable interest in physiology, clinical medicine, and clinical pharmacology. The functional assessment of the autonomic nerve system by observation of its main actors, the sympathetic and parasympathetic branch, is emphasizing the importance of autonomic regulation under different physiological circum stances, in several disease states, and under drug therapy. This paper describes a PC-based system designed with LabView that performs time-domain and frequency-domain analyses of heart rate variability as suggested by the guidelines of the<em>European Society of Cardiology</em>and the<em>North American Society of Pacing and Electrophysiology.</em>Examples for heart rate variability are given for different physiological states along with an analysis and evaluation by the system described.</p></div>","PeriodicalId":75733,"journal":{"name":"Computers and biomedical research, an international journal","volume":"32 1","pages":"Pages 77-92"},"PeriodicalIF":0.0,"publicationDate":"1999-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1006/cbmr.1998.1502","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"20939831","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A general model for the evaluation of colorectal cancer screening has been implemented in the microsimulation program MISCAN-COLON. A large number of fictitious individual life histories are simulated, in each of which several colorectal lesions can emerge. Next, screening for colorectal cancer is simulated, which changes some of the life histories. The demographic characteristics, the epidemiology and natural history of the disease, and the characteristics of screening are defined in the input. All kinds of assumptions about the natural history of colorectal cancer and about screening and surveillance strategies can easily be incorporated in the model. MISCAN-COLON gives detailed output of incidence, prevalence, and mortality, and of the results and effects of screening. It can be used to test hypotheses about the natural history of colorectal cancer, such as the duration of progressive adenomas, and about screening characteristics, such as test sensitivity, against empirical data. In decision making about screening, the model can be used to evaluate screening policies and to choose between competing policies by comparing their simulated incremental costs and effectiveness outcomes.
{"title":"The MISCAN-COLON Simulation Model for the Evaluation of Colorectal Cancer Screening","authors":"F. Loeve, R. Boer, G.J. van Oortmarssen, M. van Ballegooijen, J.D.F. Habbema","doi":"10.1006/cbmr.1998.1498","DOIUrl":"10.1006/cbmr.1998.1498","url":null,"abstract":"<div><p>A general model for evaluation of colorectal cancer screening has been implemented in the microsimulation program MISCAN-COLON. A large number of fictitious individual life histories are simulated in each of which several colorectal lesions can emerge. Next, screening for colorectal cancer is simulated, which will change some of the life histories. The demographic characteristics, the epidemiology and natural history of the disease, and the characteristics of screening are defined in the input. All kinds of assumptions on the natural history of colorectal cancer and screening and surveillance strategies can easily be incorporated in the model. MISCAN-COLON gives detailed output of incidence, prevalence and mortality, and the results and effects of screening. It can be used to test hypotheses about the natural history of colorectal cancer, such as the duration of progressive adenomas, and screening characteristics, such as sensitivity of tests, against empirical data. In decision making about screening, the model can be used for evaluation of screening policies, and for choosing between competing policies by comparing their simulated incremental costs and effectiveness outcomes.</p></div>","PeriodicalId":75733,"journal":{"name":"Computers and biomedical research, an international journal","volume":"32 1","pages":"Pages 13-33"},"PeriodicalIF":0.0,"publicationDate":"1999-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1006/cbmr.1998.1498","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"20939827","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A similarity measurement method for the classification of architecturally differentiated image sections is described. The strength of the method is demonstrated by performing the complex task of assigning a severity grade (Gleason grading) to histological slides of prostate cancer. As shown, all that is required to employ the method is a small set of preclassified images. The images can be real-world images acquired by means of a camera, computed tomography, etc., or schematic drawings representing samples of different classes. The schematic option allows a quick test of the method for a particular classification problem.
{"title":"Similarity Measurement Method for the Classification of Architecturally Differentiated Images","authors":"Yoav Smith , Gershom Zajicek , Michael Werman , Galina Pizov , Yoav Sherman","doi":"10.1006/cbmr.1998.1500","DOIUrl":"10.1006/cbmr.1998.1500","url":null,"abstract":"<div><p>A similarity measurement method for the classification of architecturally differentiated image sections is described. The strength of the method is demonstrated by performing the complex task of assigning severity grading (Gleason grading) to histological slides of prostate cancer. As shown, all that is required to employ the method is a small set of preclassified images. The images can be real world images acquired by means of a camera, computer tomography, etc., or schematic drawings representing samples of different classes. The schematic option allows a quick test of the method for a particular classification problem.</p></div>","PeriodicalId":75733,"journal":{"name":"Computers and biomedical research, an international journal","volume":"32 1","pages":"Pages 1-12"},"PeriodicalIF":0.0,"publicationDate":"1999-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1006/cbmr.1998.1500","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"20939883","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A new method for the analysis of heart rate variability in short-term recordings is presented, consisting of a time-domain analysis of respiratory sinus arrhythmia by means of a polar representation. Its main advantage is that it is applicable in experiments in which the respiration of the subject is not controlled. The algorithm is applied to data recorded from two astronauts during the Euromir-95 space mission. Statistical hypothesis tests demonstrate that the presence of a mouthpiece induces an increase in the amplitude of the respiratory sinus arrhythmia.
{"title":"A Novel Algorithm for the Heart Rate Variability Analysis of Short-Term Recordings: Polar Representation of Respiratory Sinus Arrhythmia","authors":"Pierre-Franois Migeotte, Yves Verbandt","doi":"10.1006/cbmr.1998.1495","DOIUrl":"10.1006/cbmr.1998.1495","url":null,"abstract":"<div><p>A new method for the analysis of heart rate variability in short-term recordings is presented which consists of an analysis of the respiratory sinus arrhythmia in the time domain by means of a polar representation. Its main advantage is that it is applicable in experiments in which the respiration of the subject is not controlled. The algorithm is applied to data recorded on two astronauts during the Euromir-95 space mission. Statistical hypothesis tests demonstrate that the presence of a mouthpiece induces an increase of the respiratory sinus arrhythmia amplitude.</p></div>","PeriodicalId":75733,"journal":{"name":"Computers and biomedical research, an international journal","volume":"32 1","pages":"Pages 56-66"},"PeriodicalIF":0.0,"publicationDate":"1999-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1006/cbmr.1998.1495","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"20939829","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}