Detection of sleep apnea from multiparameter monitor signals using empirical mode decomposition
K. V. Madhav, E. Krishna, K. Reddy
2017 International Conference on Computer, Communication and Signal Processing (ICCCSP). DOI: 10.1109/ICCCSP.2017.7944095
Polysomnography (PSG) is the gold standard for diagnosing obstructive sleep apnea (OSA). This research aims to detect sleep apnea from physiological signals that are more commonly available in simple bedside multiparameter monitors, namely the electrocardiogram (ECG) and the photoplethysmogram (PPG). Respiratory activity extracted from the ECG and PPG signals is used to detect apnea episodes, which is useful when PSG recording is not possible, or as a preliminary screening test for suspected OSA. In the present work, ECG-derived respiration (EDR) and PPG-derived respiration (PDR) signals, obtained using the empirical mode decomposition (EMD) method, are used to detect OSA episodes. Signals from the MIMIC database were used for experimentation. The test results show that the proposed method efficiently extracts respiratory information from ECG and PPG signals for the detection of obstructive sleep apnea syndrome (OSAS), and similarity parameters computed in both the time and frequency domains confirm this. High sensitivity and positive predictivity indicate a high degree of correctness.
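As an illustrative aside (not the authors' implementation), the core idea behind EMD-based respiration extraction is to decompose the signal into intrinsic mode functions (IMFs) and keep the component whose dominant frequency lies in the respiratory band. The sketch below assumes the IMFs are already available; `dominant_freq`, `select_respiratory_imf`, the band limits, and the synthetic stand-in "IMFs" are all hypothetical:

```python
import numpy as np

def dominant_freq(x, fs):
    # Frequency (Hz) of the largest spectral peak, ignoring DC.
    spec = np.abs(np.fft.rfft(x))
    spec[0] = 0.0
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return freqs[np.argmax(spec)]

def select_respiratory_imf(imfs, fs, band=(0.15, 0.45)):
    # Return the first IMF whose dominant frequency falls in the
    # respiratory band (roughly 9-27 breaths/min); None otherwise.
    for imf in imfs:
        f = dominant_freq(imf, fs)
        if band[0] <= f <= band[1]:
            return imf
    return None

fs = 10.0
t = np.arange(0, 60, 1.0 / fs)
# Stand-ins for IMFs: a "cardiac" component (~1.2 Hz) and a
# "respiratory" component (~0.3 Hz, i.e. 18 breaths/min).
imfs = [np.sin(2 * np.pi * 1.2 * t), np.sin(2 * np.pi * 0.3 * t)]
resp = select_respiratory_imf(imfs, fs)
print(round(dominant_freq(resp, fs), 2))
```

In practice the IMFs would come from an EMD routine applied to the ECG or PPG signal; the band-selection step shown here is the same either way.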
Investigation of variable discretization resolution for CD-EKFs in space object tracking
Ashiv Dhondea, A. Mishra, M. Inggs
DOI: 10.1109/ICCCSP.2017.7944067
Space object (satellite or space debris) tracking has been identified as a key component of Space Situational Awareness. It is a continuous-discrete filtering problem, whereas the conventional extended Kalman filter (EKF) and unscented Kalman filter (UKF) are formulated for discrete-discrete filtering problems. New versions of the EKF have recently been engineered for continuous-discrete filtering. We first discuss the dynamic and observation models of the space object tracking problem, on which we then test five filters in the continuous-discrete EKF (CD-EKF) framework. Our results show that, for space object tracking with radar, solutions that discretize the Langevin equation give results comparable to moment-matching-based EKFs at very high discretization resolutions.
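As a hedged aside, the reason discretization resolution matters can be seen even in the noise-free part of a Langevin-type model: a forward Euler step only approaches the exact continuous-time propagation of the state mean as the step count grows. The decay rate `a`, horizon `T`, and step counts below are illustrative, not the paper's orbital models:

```python
import math

def euler_mean(x0, a, T, steps):
    # Noise-free Euler discretization of dx = -a*x dt: the finer
    # the resolution, the closer to the exact exp(-a*T) decay.
    dt = T / steps
    x = x0
    for _ in range(steps):
        x = x + (-a * x) * dt
    return x

exact = math.exp(-2.0)                  # true mean decay for a=2, T=1
coarse = euler_mean(1.0, 2.0, 1.0, 4)
fine = euler_mean(1.0, 2.0, 1.0, 4096)
print(abs(coarse - exact) > abs(fine - exact))
```

The same trade-off drives the paper's question: how fine the between-measurement discretization must be before a discretized-SDE filter matches a moment-matching one.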
Attendance monitoring system using facial recognition with audio output and gender classification
S. Poornima, N. Sripriya, B. Vijayalakshmi, P. Vishnupriya
DOI: 10.1109/ICCCSP.2017.7944103
Taking and maintaining class attendance manually is not very effective: bunking classes and giving proxies for absentees have become common among the current generation of students, and manual logbook entries are tedious and easily manipulated. This paper therefore presents an automated attendance system, AUDACE, which detects each student in the classroom and marks attendance by recognizing their face. The system captures human faces in the class in real time, matches the detected faces against reference faces in a dataset, and marks attendance for the attendees. The absentee list is then read aloud through a voice conversion system for confirmation. In addition, the system is trained to classify the gender of the students present in the class.
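A minimal sketch of the matching step, assuming face embeddings have already been produced by some face recognition front end (not described in the abstract); the names, toy 3-D "embeddings", and the 0.6 distance threshold are all hypothetical:

```python
import numpy as np

def mark_attendance(detected, references, threshold=0.6):
    # Match each detected face embedding to its nearest reference
    # embedding (Euclidean distance); a below-threshold match marks
    # that student present. Everyone unmatched is absent.
    present = set()
    for emb in detected:
        names = list(references)
        dists = [np.linalg.norm(emb - references[n]) for n in names]
        i = int(np.argmin(dists))
        if dists[i] < threshold:
            present.add(names[i])
    absent = sorted(set(references) - present)
    return sorted(present), absent

# Toy 3-D vectors standing in for real face descriptors.
refs = {"asha": np.array([1.0, 0.0, 0.0]),
        "ravi": np.array([0.0, 1.0, 0.0]),
        "mala": np.array([0.0, 0.0, 1.0])}
seen = [np.array([0.9, 0.1, 0.0]), np.array([0.1, 0.05, 0.95])]
present, absent = mark_attendance(seen, refs)
print(present, absent)
```

The `absent` list is what a text-to-speech stage would read aloud for confirmation.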
A performance comparison of document oriented NoSQL databases
K. Kumar, Srividya, S. Mohanavalli
DOI: 10.1109/ICCCSP.2017.7944071
Owing to the exponential growth of data over the years, data is now available in many formats, and much of it no longer conforms to a fixed schema. Effective storage and processing of such data is not possible with existing RDBMSs, and the emergence of NoSQL databases has proved to be one of the best solutions for handling schema-less data. This work surveys the main characteristics of NoSQL databases and analyses the performance of two widely used document-oriented NoSQL databases, MongoDB and CouchDB. Both qualitative and quantitative features are considered, and the two databases are compared on these features in the context of streaming applications.
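A quantitative comparison of this kind typically reduces to timing a fixed workload against each backend. The harness below is a generic sketch with an in-memory dict standing in for a document store; in a real comparison the operation passed in would call the MongoDB or CouchDB client instead:

```python
import time

def benchmark(op, n):
    # Time n repetitions of op(i) and return throughput in ops/second.
    start = time.perf_counter()
    for i in range(n):
        op(i)
    elapsed = time.perf_counter() - start
    return n / elapsed if elapsed > 0 else float("inf")

# In-memory stand-in for a document store; a real run would swap in
# e.g. an insert call and a key lookup against each database.
store = {}
insert_rate = benchmark(lambda i: store.update({i: {"doc": i}}), 10_000)
read_rate = benchmark(lambda i: store[i], 10_000)
print(len(store) == 10_000 and insert_rate > 0 and read_rate > 0)
```

Running the identical workload against both backends is what makes the resulting throughput numbers comparable.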
Denoising of fingerprint images by exploring external and internal correlations
K S Krishnapriya
DOI: 10.1109/ICCCSP.2017.7944056
Fingerprints are an important means of identifying an unknown victim, suspect, or witness, and play a major role in linking a suspect to a crime. They are also used for security purposes, such as entrance control at important buildings. However, the quality of fingerprint images is easily degraded by skin dryness, wetness, wounds, and other sources of noise, so denoising is a necessary step in automatic fingerprint recognition systems. This paper proposes a three-stage process for removing noise from fingerprint images by exploring external and internal correlations with the help of a set of correlated images. For each noisy patch, internal and external data cubes are built by finding similar patches in the noisy image and in internet-sourced images, respectively. In the first stage, external denoising is performed by a graph-based optimization method and internal denoising by a frequency-truncation process; the two results are combined to obtain a preliminary denoised image. The second stage filters the external and internal cubes, and the fused result is passed to the third stage, where an image enhancement technique yields the final denoised result. The method is compared with existing algorithms, and the experimental results, in terms of PSNR (peak signal-to-noise ratio) and SSIM (structural similarity measure) values, show that it is more efficient than all of them.
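As a simplified sketch of the "data cube" idea (not the paper's graph-based or frequency-truncation stages), stacking the k patches most similar to a reference patch and averaging them already attenuates independent noise. Patch size, k, and the noise level below are illustrative:

```python
import numpy as np

def patch_cube(patches, ref, k):
    # Stack the k patches most similar (L2 distance) to the
    # reference patch into a 3-D "data cube".
    d = [np.linalg.norm(p - ref) for p in patches]
    idx = np.argsort(d)[:k]
    return np.stack([patches[i] for i in idx])

rng = np.random.default_rng(0)
clean = np.ones((4, 4))
# 20 noisy observations of the same underlying clean patch.
noisy = [clean + 0.3 * rng.standard_normal((4, 4)) for _ in range(20)]
cube = patch_cube(noisy, noisy[0], k=8)
denoised = cube.mean(axis=0)          # simple collaborative estimate
err_noisy = np.abs(noisy[0] - clean).mean()
err_denoised = np.abs(denoised - clean).mean()
print(err_denoised < err_noisy)
```

Averaging is the crudest possible collaborative filter; the paper's stages replace it with graph optimization, frequency truncation, and fusion, but they operate on the same kind of cube.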
An IoT based remote HRV monitoring system for hypertensive patients
R. Kirtana, Y. V. Lokeswari
DOI: 10.1109/ICCCSP.2017.7944086
Heart rate variability (HRV) is a measure of the variation in the time interval between consecutive heartbeats. HRV analysis is highly sensitive to risks linked with cardiovascular disease, diabetes mellitus, disease states associated with autonomic dysrhythmia such as hypertension, and a large array of chronic degenerative medical conditions. This sensitivity accounts for its increasing use by doctors as a diagnostic and prognostic tool and as a means of evaluating the effectiveness of treatment. Borderline hypertensive patients, with or without a history of a cardiac event, are at high risk of stroke and cardiac mortality, and monitoring their HRV parameters helps provide adequate medical care when it is needed. In this paper, the authors propose a low-cost, easy-to-use remote HRV monitoring system for borderline hypertensive patients, based on Internet of Things (IoT) technology. In the proposed system, HRV parameters are derived from a wireless Zigbee-based pulse sensor, and an Arduino transmits the patient data to a server using the MQTT protocol. The application server collects the HRV data and plots graphs. In an emergency, the caretaker and doctor are notified through the Short Message Service (SMS) so that adequate medical help can be provided. While no current HRV analysis system alerts a remote doctor at times of high risk for hypertensive patients, the proposed system aims to do so. By combining the benefits of Zigbee and WiFi, it fulfils the ideal traits of a remote health monitoring system: low cost, long range, security, promptness, and ease of use.
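The time-domain HRV parameters such a monitor might compute can be sketched as follows. The SDNN and RMSSD definitions are standard, but the 50 ms alert threshold and the sample RR series are hypothetical, and the transport layers (MQTT, SMS) are omitted:

```python
import math

def hrv_features(rr_ms):
    # Time-domain HRV: SDNN (sample std of RR intervals) and RMSSD
    # (root mean square of successive differences), both in ms.
    n = len(rr_ms)
    mean = sum(rr_ms) / n
    sdnn = math.sqrt(sum((r - mean) ** 2 for r in rr_ms) / (n - 1))
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return sdnn, rmssd

def needs_alert(sdnn, low=50.0):
    # Hypothetical rule: an abnormally low SDNN triggers the alert.
    return sdnn < low

rr = [812, 790, 805, 798, 810, 795, 802, 808]  # sample RR series (ms)
sdnn, rmssd = hrv_features(rr)
print(needs_alert(sdnn))
```

In the proposed system these features would be published over MQTT and the alert delivered by SMS; any clinically meaningful threshold would come from medical guidance, not from this sketch.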
OFDM channel estimation using novel LMS adaptive algorithm
E. Krishna, K. Sivani, K. Reddy
DOI: 10.1109/ICCCSP.2017.7944100
Spectrally efficient orthogonal frequency division multiplexing (OFDM) offers an ideal solution to inter-symbol interference (ISI) by maintaining orthogonality among the subcarriers. In a multi-carrier (MC) communication system, the wireless channel impulse response (CIR) varies rapidly under severe fading conditions. Fixed step size (FSS) adaptive filtering (AF) provides a way to perform channel estimation and equalization in slowly varying fading channels. In this work, an adaptive step size (ASS) least mean squares (LMS) adaptive filter is used for OFDM channel equalization. The performance of the proposed ASS-LMS filter is compared with the variable step size (VSS) LMS and normalized LMS (NLMS) algorithms, and its superiority is demonstrated by improvements in the bit error rate (BER) and mean square error (MSE).
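As an illustrative sketch (not necessarily the paper's exact update rule), an adaptive step size LMS can grow its step with the squared error and let it decay as the filter converges. Here it is written in a Kwong-Johnston-style form applied to noise-free channel identification; the channel taps and every hyperparameter are assumptions:

```python
import numpy as np

def ass_lms(x, d, taps=4, mu0=0.05, alpha=0.97, gamma=1e-3,
            mu_min=1e-4, mu_max=0.1):
    # LMS with an adaptive step size: mu is inflated by the squared
    # error and otherwise decays, clipped to [mu_min, mu_max].
    w = np.zeros(taps)
    mu = mu0
    errs = []
    for n in range(taps - 1, len(x)):
        u = x[n - taps + 1:n + 1][::-1]      # newest sample first
        e = d[n] - w @ u
        mu = float(np.clip(alpha * mu + gamma * e * e, mu_min, mu_max))
        w = w + mu * e * u
        errs.append(e * e)
    return w, errs

rng = np.random.default_rng(1)
h = np.array([0.8, -0.4, 0.2, 0.1])          # unknown "channel"
x = rng.standard_normal(4000)
d = np.convolve(x, h)[:len(x)]               # noise-free observation
w, errs = ass_lms(x, d)
early = float(np.mean(errs[:200]))
late = float(np.mean(errs[-200:]))
print(late < early)
```

A large step while the error is big gives fast acquisition; the decay toward `mu_min` gives low misadjustment once converged, which is the trade-off fixed-step LMS cannot make.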
Keynote speakers: Social interactions and system vulnerabilities
Yong Meng Teo
DOI: 10.1109/icccsp.2017.7944051
Dr. TEO Yong Meng is an Associate Professor of Computer Science at the National University of Singapore (NUS) and an Affiliate Professor at the NUS Business Analytics Centre. At NUS, he is the technical leader for Systems Research and leads the Computer Systems Research Group. He was a Visiting Professor at the Chinese Academy of Sciences in China from 2010 to 2014. He received his PhD and MSc in Computer Science from the University of Manchester. His research interests include parallel computing, systems modeling and simulation, and performance analysis. His recent work focuses on modeling the performance of heterogeneous parallel systems and on emergent properties in complex systems, among other topics. He has over 150 journal and conference publications and a number of best paper awards, including the Best Applied Paper Award at the annual WinterSim Conference in 2015 and the Best Paper Award at the 10th International Conference on Algorithms and Architectures for Parallel Processing in 2010; another paper, co-authored with his PhD student, won the ACM SIGSIM Best PhD Student Paper Award in 2009. He has received research grants from sources including the European Commission, Fujitsu Computers (Singapore) Pte Ltd, Fujitsu Laboratories Ltd (Japan), Sun Microsystems/Oracle (USA), NVIDIA, and PSA Corporation.
Survey of pre-processing techniques for mining big data
Jayaram Hariharakrishnan, S. Mohanavalli, Srividya, K. Kumar
DOI: 10.1109/ICCCSP.2017.7944072
Big data analytics has become important as many administrations, organizations, and companies, both public and private, collect and analyze huge amounts of domain-specific information, which can contain useful insights into problems such as national intelligence, cyber security, fraud detection, marketing, and medical informatics. As more and more data is generated, its ever-changing size, scale, diversity, and complexity demand new architectures, techniques, algorithms, and analytics to manage it and extract value from it. Progress and innovation are no longer hindered by the ability to collect data, but by the ability to manage, analyze, summarize, visualize, and discover knowledge from the collected data in a timely and scalable fashion, starting from credible, clean, and noise-free data sets. This paper attempts to understand the different problems that arise during data preprocessing; to survey the problems of data cleaning, the application of cleaning and noise removal techniques in big data analytics, and the mitigation of imperfect data, together with techniques to solve them; to identify the shortcomings of existing data reduction techniques in their respective application areas; and to assess the effectiveness of current big data preprocessing proposals on various data sets.
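A toy cleaning pipeline illustrating three of the preprocessing problems surveyed here: duplicates, missing values, and outliers. The median imputation and 3-sigma capping choices are illustrative defaults, not recommendations from the paper:

```python
import numpy as np

def preprocess(rows):
    # Minimal cleaning pipeline for one numeric column:
    # deduplicate, impute missing values with the median,
    # and cap outliers at 3 standard deviations.
    rows = list(dict.fromkeys(rows))              # exact duplicates out
    col = np.array([np.nan if v is None else float(v) for v in rows])
    med = np.nanmedian(col)
    col = np.where(np.isnan(col), med, col)       # imputation
    mu, sd = col.mean(), col.std()
    col = np.clip(col, mu - 3 * sd, mu + 3 * sd)  # outlier capping
    return col

data = (5.0, 5.0, 6.0, None, 7.0, 500.0)
clean = preprocess(data)
print(len(clean), float(np.isnan(clean).sum()))
```

Even this toy version shows why ordering matters: imputing before capping means the imputed value is itself influenced by any extreme entries that survive deduplication.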
Image defogging using enhancement techniques
N. Sangeetha, K. Anusudha
DOI: 10.1109/ICCCSP.2017.7944087
Digital image processing techniques are commonly used to enhance an image so that useful information can be extracted from it. Images acquired by a vision system are seriously degraded in hazy and foggy weather, which affects the detection, tracking, and recognition of targets: the degraded images have reduced contrast, and local information is lost. Restoring the true scene from such a foggy image is therefore of significance. This paper focuses on enhancing the contrast and visibility of foggy images using various enhancement techniques, whose performance is analyzed using standard parameters.
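One of the simplest enhancement techniques applicable to a foggy image is a linear contrast stretch; the synthetic "fog" model below (contrast compression plus a brightness offset) is only an illustration, not a claim about the paper's techniques:

```python
import numpy as np

def contrast_stretch(img):
    # Linear contrast stretch of a float image onto the full [0, 1]
    # range: a basic defogging-by-enhancement step.
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo) if hi > lo else img

rng = np.random.default_rng(2)
scene = rng.uniform(0.0, 1.0, (32, 32))
foggy = 0.3 * scene + 0.6        # fog: compressed contrast, brighter
enhanced = contrast_stretch(foggy)
print(float(enhanced.max() - enhanced.min()) >
      float(foggy.max() - foggy.min()))
```

Histogram equalization and other enhancement techniques compared in such studies follow the same pattern: expand the dynamic range that fog has compressed, then judge the result with standard quality parameters.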