Title: A Leaf Sequencing Software for Intensity-Modulated Radiation Therapy
Authors: S. Luan, Chao Wang, D. Chen, X. Hu
DOI: 10.1109/CBMS.2006.14
Abstract: This paper presents a leaf sequencing software package called SLS (static leaf sequencing) for intensity-modulated radiation therapy (IMRT). SLS seeks to produce improved clinical IMRT treatment plans by (1) shortening their treatment times and (2) minimizing their machine delivery errors. Our SLS software is implemented in the C programming language on Linux workstations and is designed as a separate module that complements current commercial treatment planning systems. The input to SLS is the discrete radiation intensity maps computed by current planning systems, and its output is (modified) optimized control sequences for the radiotherapy machines. Our SLS approach differs from the planning methods commonly used in the medical literature in that it is based on graph algorithms and computational geometry techniques. Comparisons of SLS with the CORVUS commercial planning system indicated that, for the same set of discrete radiation intensity maps, our SLS plans can shorten treatment times by over 30% while maintaining the same treatment quality. We have used SLS in clinical applications at two cancer treatment centers. This paper discusses the implementation, installation, commissioning, and testing of our SLS software system.
Title: Using Time Decompositions to Analyze PubMed Abstracts
Authors: Rui Zhang, P. Chundi
DOI: 10.1109/CBMS.2006.168
Abstract: Constructing time decompositions of time-stamped documents is an important step for uncovering temporal relationships and trends of keywords and topics contained in the document set. This paper describes the use of time decompositions to extract temporal information from a small set of PubMed abstracts related to the Wnt signaling pathway. A time decomposition of the document set is constructed to identify temporal information such as keywords/topics significant in some time interval, and to identify the temporal progression of the significant keywords. Keywords were assigned temporal significance values using two different measure functions based on notions of entropy and ratio. It is shown how optimal lossy decompositions of the document set are effective in reducing noise, both in terms of the number of keywords and in terms of smoothing out the temporal progressions of keywords. Several optimal lossy decompositions for the document set are constructed, and it is shown that the temporal information captured by an optimal lossy decomposition increases as its size (number of intervals) increases.
Title: A Monte Carlo Simulation Model to Assess Volunteer Response Times in a Public Access Defibrillation Scheme in Northern Ireland
Authors: A. Marshall, K. Cairns, F. Kee, M. Moore, A. Hamilton, A. Adgey
DOI: 10.1109/CBMS.2006.19
Abstract: This paper describes the development of a model to assess the distribution of response times for mobile volunteers of a public access defibrillation (PAD) scheme in Northern Ireland. Using parameters based on a trial period, the model predicts that a PAD volunteer would arrive before the emergency medical services (EMS) at 18.8% of the events to which they are paged in a given one-year period. This agrees with what was actually observed during the trial period, where volunteers reached 15% of events before the EMS, and thus helps to validate the model. Results from this model illustrate how ongoing volunteer commitment is key to the success of the scheme.
Title: Analysis and Visualization of Proteomic Data by Fuzzy Labeled Self-Organizing Maps
Authors: Frank-Michael Schleif, T. Elssner, M. Kostrzewa, T. Villmann, B. Hammer
DOI: 10.1109/CBMS.2006.44
Abstract: We extend the self-organizing map, in the variant proposed by Heskes, to a supervised fuzzy classification method. This leads to a robust classifier in which efficient learning with fuzzy labeled or partially contradictory data is possible. Further, the integration of labeling into the location of prototypes in a self-organizing map leads to a visualization of those parts of the data relevant for the classification. The method is incorporated into a clinical proteomics toolkit dedicated to biomarker search, which allows the necessary preprocessing and further data analysis with additional visualizations.
Title: Risk Stratification in Assessing Risk in Coronary Artery Bypass Surgery
Authors: Mike Rees, Jitesh Dineschandra
DOI: 10.1109/CBMS.2006.141
Abstract: We present the need for risk stratification in the monitoring of cardiac surgical practice and review the frequentist and Bayesian approaches to the problem. Developments in the available databases are described. Enhancements to the Parsonnet and EuroSCORE systems are reviewed. We argue that, in the UK, the use of the Parsonnet system is inappropriate and that, although the EuroSCORE system is a clear improvement, there are advantages in adopting a system based on a Bayesian model for risk assessment.
Title: A Framework for Data and Web Services Semantic Mediation in Peer-to-Peer Based Medical Information Systems
Authors: M. Barhamgi, D. Benslimane, Pierre-Antoine Champin
DOI: 10.1109/CBMS.2006.11
Abstract: Interoperability is a crucial issue in the health care domain. It must be addressed at two tightly coupled levels: the "data" level and the "application" level. The former is about bridging heterogeneous medical data sources; the latter is concerned with enabling medical applications to understand each other when they interact through Web services composition. In this paper we present a peer-to-peer framework for ensuring interoperability at both levels. We then present a semantic mediation model that deals with the "context heterogeneity" of data exchanged between Web services within a composition process.
Title: Class Noise and Supervised Learning in Medical Domains: The Effect of Feature Extraction
Authors: Mykola Pechenizkiy, A. Tsymbal, S. Puuronen, Oleksandr Pechenizkiy
DOI: 10.1109/CBMS.2006.65
Abstract: Inductive learning systems have been successfully applied in a number of medical domains. It is generally accepted that the highest accuracy an inductive learning system can achieve depends on the quality of the data and on the appropriate selection of a learning algorithm for the data. In this paper we analyze the effect of class noise on supervised learning in medical domains. We review the related work on learning from noisy data and propose the use of feature extraction as a preprocessing step to diminish the effect of class noise on the learning process. Our experiments with eight medical datasets show that feature extraction indeed helps to deal with class noise: it clearly results in higher classification accuracy of the learnt models without separate explicit elimination of noisy instances.
Title: Symptoms Ontology for Mapping Diagnostic Knowledge Systems
Authors: R. Minchin, F. Porto, C. Vangenot, Sven Hartmann
DOI: 10.1109/CBMS.2006.152
Abstract: This paper describes a methodology for increasing the scope and precision of diagnostic knowledge-based (KB) systems. It has been stated that medical KB systems are either highly specialised, lacking in accuracy, or just too simple. To resolve this problem of scope we propose a phased approach to diagnosis. The first phase is the querying of a symptoms ontology to direct diagnostic systems to the most appropriate domain or class reference given the input symptoms. Additional symptoms can then be targeted, extracted, and analysed with a domain-specific set of KB systems. This process allows us to forecast key symptoms and patient characteristics and to increase the value of available data in decision making. In addition, this approach could allow a system to dynamically correct an inappropriate domain decision. Such an approach also has the potential to build a bridge between existing specialised medical KB systems.
Title: Optimization of Fabrication Process for a PDMS-SOG-Silicon Based PCR Micro Chip through System Identification Techniques
Authors: V. Korampally, S. Bhattacharya, Yuanfang Gao, S. Grant, S. Kleiboeker, K. Gangopadhyay, Jinglu Tan, S. Gangopadhyay
DOI: 10.1109/CBMS.2006.125
Abstract: A polymerase chain reaction (PCR) micro-chip with integrated thin-film heaters and temperature detectors has been realized on a silicon-SOG-PDMS (poly-di(methyl)siloxane) platform. Accurate temperature sensing and control is important for a PCR reaction. This precludes placing the temperature sensor anywhere but within the PCR chamber, which can, in certain microchip designs, complicate the fabrication methodology. This paper presents the design and optimal placement of a thin-film resistance temperature detector (RTD) for sensing the temperature response on the bottom of the chip (heater side) and predicting the temperature response on the top of the chip (PCR chamber side). Thermal modeling of the system has been performed using a parametric black-box approach based on input-output data. From the steady-state response of the system, pseudo-random binary sequences (PRBS) have been generated and used to excite it. Second- and fourth-order ARX (autoregressive with exogenous inputs) models have been derived for optimal control, and their performances have been compared. A reduction of fabrication complexity with regard to the optimal placement of the temperature sensor is proposed.
Title: Classification of Cervix Lesions Using Filter Bank-Based Texture Mode
Authors: Yeshwanth Srinivasan, B. Nutter, S. Mitra, B. Phillips, E. Sinzinger
DOI: 10.1109/CBMS.2006.66
Abstract: This paper explores the classification of texture patterns observed in digital images of the cervix. In particular, the problem of identifying and segmenting punctation and mosaic patterns is considered. First, the ability of large-scale filter banks to characterize punctation and mosaic structures is studied using texton models. However, texton-based models fail to consistently classify punctation and mosaic sections obtained from cervix images of different subjects. We present a novel method to segment punctations that combines matched filtering using a Gaussian template with Gaussian mixture models. Features extracted from the objects detected by this method on punctation and mosaic sections are shown to provide excellent discrimination between punctation and mosaicism. Results demonstrate the effectiveness of our approach in detecting punctations and separating punctation sections from mosaic sections.