DOI: 10.1109/ICCCSP.2017.7944102
K. Rajaram, C. Babu, Arun Adiththan
Service orientation is gaining momentum in distributed software applications, mainly because it facilitates interoperability. Service composition has been acknowledged as a promising approach to meeting user demands whenever a single service cannot fulfill them. Because the component services are offered by multiple providers from different organizations, it is essential to ensure the reliability of the composed service; the behavioral, or transactional, properties of the component services determine that reliability. To avoid unpredictable performance in service provisioning, the guaranteed values of every service's transactional properties must be recorded in a contract agreed upon by consumer and provider. The approach proposed in this paper generates contracts, along with transactional guarantees, at runtime, based on the frequently changing business requirements of service consumers. The approach is evaluated with a case study of Scan Report Generation in the healthcare domain.
Title: Dynamic generation of transactional contracts for hierarchical workflows
Published in: 2017 International Conference on Computer, Communication and Signal Processing (ICCCSP)
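A minimal sketch of the contract idea in Python: a record of a service's guaranteed transactional properties plus a check that a provider's offer meets a consumer's requirement. The field names (`retriable`, `compensatable`) follow the common transactional-property taxonomy for composed services and are assumptions, not the paper's actual contract schema.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TransactionalContract:
    """Hypothetical runtime-generated contract for one component service."""
    service: str
    provider: str
    retriable: bool        # can the service be re-invoked after a failure?
    compensatable: bool    # can its effects be undone by a compensator?
    max_retries: int = 0

def negotiate(requirement_retriable: bool, offer: TransactionalContract) -> bool:
    """Accept the provider's offer only if it meets the consumer's requirement."""
    return offer.retriable or not requirement_retriable

offer = TransactionalContract("ScanReport", "LabA",
                              retriable=True, compensatable=False, max_retries=3)
print(negotiate(True, offer))  # a retriable offer satisfies a retriable requirement
```

In a runtime-generation setting, such records would be built from the consumer's current requirements at composition time rather than fixed in advance.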
DOI: 10.1109/ICCCSP.2017.7944076
T. Shanmuganatham, Deepanshu Kaushal
This paper presents the structure and simulated characteristics of a multi-band microstrip patch antenna whose shape closely resembles the Microsoft Calculator accessory logo. The antenna is intended for several applications. The substrate is FR4 epoxy with a relative permittivity of 4.4, a dielectric loss tangent of 0.002, and a thickness of 1.6 mm. The design uses a probe (coaxial) feed owing to the numerous advantages it offers. Simulations were carried out in HFSS (High Frequency Structure Simulator). The structure resonates at six frequencies: 1.2 GHz with a reflection coefficient of −24.9 dB and a bandwidth of 47 MHz for aeronautical radio navigation; 1.53 GHz with −16.9 dB and 74.2 MHz for satellite communication; 2.56 GHz with −29.7 dB and 121.7 MHz for wireless communication; 3.27 GHz (with a gain of 1.962 dB) with −12.3 dB and 62.7 MHz for private land mobile devices; 3.89 GHz with −13.4 dB and 68.4 MHz for fixed microwave devices; and 5.91 GHz with −17.3 dB and 340 MHz for ISM equipment, personal land mobile, personal radio, and amateur radio.
Title: Design of multi utility multi band microstrip calculator shaped patch antenna using coaxial feed
DOI: 10.1109/ICCCSP.2017.7944069
A. Beulah, T. Sharmila
Image segmentation partitions a digital image into several segments. Lower back pain has become increasingly common, making lumbar spine pathology detection a predominant research area in Computer-Aided Diagnosis (CAD) systems. In lumbar spine pathology detection, segmentation of the Intervertebral Disc (IVD) is the major step, as it identifies the IVDs, or their boundaries, whether normal or abnormal. Given an axial or sagittal view of a lumbar spine MR image as input, the proposed work segments the IVD in both views. The segmentation is a four-stage process. First, Expectation-Maximization (EM) segmentation is performed on the MR image; EM offers an advantage over K-means with respect to cluster size. Second, morphological operators are applied; third, an edge detection method extracts the edges. The final stage removes unwanted objects from the resulting image. If this segmentation is incorporated into a CAD system, it will help experts localize the IVD and diagnose IVD disease.
Title: EM algorithm based intervertebral disc segmentation on MR images
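The first stage of the pipeline above can be sketched as a two-component, one-dimensional EM fit on pixel intensities. The synthetic intensities stand in for MR data, and stages two to four (morphology, edge detection, object removal) are omitted.

```python
import numpy as np

def em_two_class(pixels, iters=50):
    """Cluster 1-D intensities into two classes with EM on a 2-component GMM."""
    mu = np.array([np.percentile(pixels, 25), np.percentile(pixels, 75)])
    sigma = np.full(2, pixels.std() + 1e-6)
    pi = np.array([0.5, 0.5])
    resp = np.full((len(pixels), 2), 0.5)
    for _ in range(iters):
        # E-step: responsibility of each Gaussian for each pixel
        lik = pi * np.exp(-0.5 * ((pixels[:, None] - mu) / sigma) ** 2) / sigma
        resp = lik / lik.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and standard deviations
        n = resp.sum(axis=0)
        pi = n / len(pixels)
        mu = (resp * pixels[:, None]).sum(axis=0) / n
        sigma = np.sqrt((resp * (pixels[:, None] - mu) ** 2).sum(axis=0) / n) + 1e-6
    return resp.argmax(axis=1)   # hard label per pixel

# two well-separated intensity populations, e.g. disc vs. background
rng = np.random.default_rng(0)
pixels = np.concatenate([rng.normal(40, 5, 500), rng.normal(160, 10, 500)])
labels = em_two_class(pixels)
```

On a real MR slice the same fit would run on the flattened intensity array, and the label image would then go through the morphological and edge-detection stages.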
DOI: 10.1109/ICCCSP.2017.7944082
J. Julina, T. Sharmila
Face recognition is widely used in computer vision and in many other biometric applications where security is a major concern. The most common problems in recognizing a face arise from pose variations, differing illumination conditions, and so on. The main focus of this paper is to recognize whether a given face input corresponds to a person registered in the database. Face recognition is performed using the Histogram of Oriented Gradients (HOG) technique on the AT&T database, with a real-time subject included to evaluate the performance of the algorithm. The feature vectors generated by the HOG descriptor are used to train Support Vector Machines (SVMs), and results are verified against a given test input. The proposed method checks whether a test image in different pose and lighting conditions is matched correctly against the trained images of the facial database. The results show minimal false positives and improved detection accuracy.
Title: Facial recognition using histogram of gradients and support vector machines
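The HOG step above can be illustrated with a single-cell orientation histogram. A full pipeline (e.g. `skimage.feature.hog`) uses many cells with block normalization and feeds the concatenated vectors to an SVM; both are simplified away here.

```python
import numpy as np

def hog_cell(patch, bins=9):
    """Magnitude-weighted histogram of gradient orientations for one cell."""
    gy, gx = np.gradient(patch.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180        # unsigned orientation
    idx = (ang / (180 / bins)).astype(int) % bins      # bin each pixel's angle
    hist = np.zeros(bins)
    np.add.at(hist, idx, mag)                          # magnitude-weighted votes
    return hist / (np.linalg.norm(hist) + 1e-9)        # L2-normalize

# a vertical-edge patch and a horizontal-edge patch fall in different bins
vert = np.tile(np.arange(8.0), (8, 1))   # intensity ramp along x -> gx dominant
horz = vert.T                             # ramp along y -> gy dominant
print(hog_cell(vert).argmax(), hog_cell(horz).argmax())  # → 0 4
```

In the full method, one such descriptor per cell, stacked over the whole face image, becomes the feature vector the SVM is trained on.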
DOI: 10.1109/ICCCSP.2017.7944073
L. Pavithra, T. Sharmila
This paper proposes a new chrominance feature extraction method in the HMMD color space. Image-dependent multi-level thresholding is performed in the HMMD color space to obtain 64-level quantized images. The occurrence count of each color pixel represents the color information of those quantized images. The technique is tested on Wang's database of images from 10 different categories. The distance between the feature vectors of the query and each database image is calculated, and performance is evaluated using average precision and recall. Benchmarked against state-of-the-art color feature extraction methods, the proposed method yields approximately 6.3% to 18.05% higher precision and 7.54% to 14.52% higher recall than the conventional techniques.
Title: Image retrieval based on chrominance feature of the HMMD color space
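The quantize-and-count stage can be sketched as follows. Uniform 4-levels-per-channel RGB quantization stands in for the paper's image-dependent multi-level thresholding in HMMD space, and an L1 histogram distance stands in for its distance measure; both substitutions are assumptions for illustration.

```python
import numpy as np

def color_histogram_64(img):
    """Normalized 64-bin color-occurrence histogram of an RGB uint8 image."""
    q = (img // 64).astype(int)                          # 4 levels per channel
    codes = q[..., 0] * 16 + q[..., 1] * 4 + q[..., 2]   # color code in 0..63
    hist = np.bincount(codes.ravel(), minlength=64)
    return hist / hist.sum()

def l1_distance(h1, h2):
    """City-block distance between two histograms (0 = identical)."""
    return np.abs(h1 - h2).sum()

red  = np.zeros((8, 8, 3), np.uint8); red[..., 0] = 255
blue = np.zeros((8, 8, 3), np.uint8); blue[..., 2] = 255
print(l1_distance(color_histogram_64(red), color_histogram_64(red)))   # 0.0
print(l1_distance(color_histogram_64(red), color_histogram_64(blue)))  # 2.0
```

Retrieval then amounts to ranking the database images by this distance from the query's histogram.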
DOI: 10.1109/ICCCSP.2017.7944064
H. Habiba, T. Prashanth, S. Keerthipriya, L. N. A. Sayeed, R. Sandhya
A compact full-mode SIW UWB band-pass filter using a novel input/output transmission-line structure is proposed in this paper. This wideband SIW resonator can be evolved from a conventional SIW transmission line or a two-conductor transmission line. The filter has a wide passband of 3–8 GHz with a return loss of nearly −20 dB.
Title: A compact full mode SIW UWB Band pass filter using novel input/output transmission-line-structure
DOI: 10.1109/ICCCSP.2017.7944061
N. Moratanch, S. Chitrakala
Text summarization is the process of obtaining salient information from an authentic text document; the extracted information is produced as a summarized report and presented to the user as a concise summary. Understanding and describing the content of a text is crucial for humans. Text summarization techniques are classified into abstractive and extractive summarization. Extractive summarization focuses on choosing which paragraphs, important sentences, and other units reproduce the original document in precise form; the importance of sentences is determined from linguistic and statistical features. This work presents a comprehensive review of extractive text summarization methods, covering the various techniques, popular benchmarking datasets, and challenges of extractive summarization. It examines extractive methods with respect to producing summaries that are less redundant, highly cohesive, coherent, and informative.
Title: A survey on extractive text summarization
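A minimal example of the statistical-feature family the survey covers: score each sentence by the average corpus frequency of its words and extract the top k. Real systems add linguistic features, position cues, and redundancy control on top of this.

```python
import re
from collections import Counter

def extractive_summary(text, k=1):
    """Return the k sentences whose words are most frequent in the document."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    freq = Counter(re.findall(r"[a-z]+", text.lower()))   # document word frequencies

    def score(sentence):
        toks = re.findall(r"[a-z]+", sentence.lower())
        return sum(freq[t] for t in toks) / (len(toks) or 1)

    return sorted(sentences, key=score, reverse=True)[:k]

text = "Cats eat fish. Cats sleep. Dogs bark loudly sometimes."
print(extractive_summary(text))  # → ['Cats sleep.']
```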
DOI: 10.1109/ICCCSP.2017.7944084
J. Jennifer, T. Sharmila
In the information age, the whole earth sits in our hands thanks to advances in computers, smartphones, and related devices. The use of computers in day-to-day activities has increased enormously, with both positive and negative effects on our lives; among the negative effects are health problems such as Computer Vision Syndrome (CVS). Prolonged computer use significantly reduces the spontaneous eye-blink rate owing to the high visual demand of the screen and concentration on work. The proposed system develops a prototype that uses blink detection to help prevent CVS. The first part of the work captures video frames using a web camera mounted on the computer or laptop. These frames are processed dynamically by cropping only the eyes. The algorithms applied to the eye frames are direct pixel count, gradient, Canny edge, and Laplacian of Gaussian (LoG); these determine the eye status based on a threshold value and on the proposed measure, the difference between the upper and lower eye frames. Experiments comparing the algorithms conclude that the proposed algorithm yields 99.95% accuracy.
Title: Edge based eye-blink detection for computer vision syndrome
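The upper/lower difference cue can be sketched as follows. A plain gradient-magnitude threshold stands in for the Canny/LoG detectors compared in the paper, and the threshold value and decision rule are assumptions for illustration.

```python
import numpy as np

def eye_open(eye_frame, thresh=0.5):
    """Guess eye status from edge counts in the upper vs. lower eye frame.

    An open eye concentrates strong edges (iris, eyelid contour) in the
    upper half of the cropped eye region; a closed eye does not.
    """
    gy, gx = np.gradient(eye_frame.astype(float))
    edges = np.hypot(gx, gy) > thresh          # simple edge map
    half = eye_frame.shape[0] // 2
    upper, lower = edges[:half].sum(), edges[half:].sum()
    return bool(upper - lower > 0)             # difference between halves

# synthetic "open" eye: strong horizontal structure in the upper half
frame = np.zeros((10, 10)); frame[2:4] = 2.0
print(eye_open(frame), eye_open(np.zeros((10, 10))))  # → True False
```

Blinks would then be counted as transitions of this status over consecutive frames.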
DOI: 10.1109/ICCCSP.2017.7944107
D. V. Ranganathan, R. Vishal, V. Krishnamurthy, Prashant Mahesh, Roopeshwar Devarajan
Many multiplayer card games have interesting structures that can be exploited when designing a computer application to simulate them. Critical features most such games share are that they are turn-based and use common data types like Cards and Decks. This paper demonstrates design patterns for implementing multiplayer card games that scale well and are easily understandable and maintainable. Two card games, 'Ace' and 'Literature', were developed, from which the patterns were extracted. The patterns explained in this paper can be applied to any turn-based multiplayer card game and, in some cases, to any multiplayer game in general. The two patterns discussed are set in small caps for clarity.
Title: Design patterns for multiplayer card games
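A sketch of the shared structure the paper exploits: common `Card`/`Deck` types plus a round-robin turn loop. The class names are illustrative, not the patterns as published.

```python
from dataclasses import dataclass
from random import Random

@dataclass(frozen=True)
class Card:
    rank: str
    suit: str

class Deck:
    """Standard 52-card deck with shuffle-on-construction and dealing."""
    def __init__(self, seed=None):
        ranks = [str(n) for n in range(2, 11)] + list("JQKA")
        suits = ["clubs", "diamonds", "hearts", "spades"]
        self.cards = [Card(r, s) for s in suits for r in ranks]
        Random(seed).shuffle(self.cards)

    def deal(self, n):
        dealt, self.cards = self.cards[:n], self.cards[n:]
        return dealt

class TurnBasedGame:
    """Round-robin turn order shared by turn-based card games."""
    def __init__(self, players):
        self.players, self.turn = players, 0

    def current_player(self):
        return self.players[self.turn]

    def end_turn(self):
        self.turn = (self.turn + 1) % len(self.players)

game = TurnBasedGame(["alice", "bob", "carol"])
game.end_turn()
print(game.current_player())  # → bob
```

Game-specific rules ('Ace', 'Literature') would subclass or compose these types, which is what makes the structure reusable across games.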
DOI: 10.1109/ICCCSP.2017.7944097
M. R. Ram, K. Sivani, K. Reddy
The pulse oximeter (PO) employed in critical care units is crucial equipment for measuring vital parameters such as blood oxygen saturation and the patient's heart rate. The PO acquires photoplethysmographic (PPG) data through PPG sensors attached to the patient's forehead, finger, or earlobe, and a ratio parameter (R) is computed from the amplitudes of the acquired red and IR PPG signals. R is then used to estimate the oxygen saturation level with the help of a calibration curve. Subject movement during recording can corrupt the estimated parameter and, in turn, lead to wrong diagnosis by the clinician. Reducing the motion artifact (MA) component in the recorded raw PPG data helps guarantee error-free measurement of the blood oxygen saturation level (SpO2). MAs can be partially removed from the corrupted raw PPG signal by band-pass filtering, but the persisting in-band noise component cannot. In this paper, the authors propose a filtering method based on the tunable Q-factor wavelet transform (TQWT) to remove MA components. The advantage of TQWT stems from the fact that a practical narrow band-pass filter with a specific Q-factor value can be realized, which motivated its use for this application. Experimental results show good acceptance of the proposed method, as the MA-reduced PPG signals retain their morphological features. SpO2 is estimated from the MA-reduced PPGs using the calibration curve. The superiority of the proposed technique is demonstrated by comparing its results with those of the basic least mean squares (LMS) method. Signals acquired with different MA components (bending, horizontal, and vertical movements of the patient's finger) are considered in the experimental analysis. The SpO2 calculations obtained confirm the efficacy of the estimation technique in reliable, accurate SpO2 measurement, useful for medical diagnosis.
Title: Reduction of motion artifacts from pulse oximeter signals using tunable Q-factor wavelet transform technique