Logging framework for cloud computing forensic environments
Alecsandru Patrascu, V. Patriciu
2014 10th International Conference on Communications (COMM) | Pub Date: 2014-05-29 | DOI: 10.1109/ICCOMM.2014.6866662
Cloud computing has emerged as a paradigm that attracts more and more researchers. In this context, knowing where, how, and under what conditions data is processed or stored in datacenters becomes a prime concern for the continuously developing field of cloud computing forensics. In this paper we describe in detail an essential part of our cloud forensics framework that can be built on top of both new and existing datacenters: the logging component. We discuss the problems that such architectures must address, detail our proposed solutions, and explain how our architecture and findings can help forensic investigators conducting investigations in a cloud environment.
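The abstract does not detail the logging mechanism itself. As an assumed illustration of one standard building block for forensically sound logs (not necessarily the authors' design), the sketch below chains entries with SHA-256 hashes so that any after-the-fact edit is detectable; the function names are hypothetical.

```python
import hashlib
import json

def append_entry(chain, event):
    """Append a log event, chaining it to the previous entry's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"event": event, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})

def verify_chain(chain):
    """Recompute every hash; any edited entry breaks the chain."""
    prev_hash = "0" * 64
    for entry in chain:
        body = {"event": entry["event"], "prev": entry["prev"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["prev"] != prev_hash or entry["hash"] != digest:
            return False
        prev_hash = entry["hash"]
    return True

chain = []
for ev in ("vm_start", "disk_attach", "vm_stop"):
    append_entry(chain, ev)
intact = verify_chain(chain)            # untouched log verifies
chain[1]["event"] = "edited later"      # simulate tampering by an attacker
tampered_detected = not verify_chain(chain)
```

Because each hash covers the previous one, an investigator who trusts only the final hash can detect edits anywhere earlier in the log.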
Using OFDM pilot tone information to detect active 4G LTE transmissions
A. Temtam, D. Popescu, O. Popescu
2014 10th International Conference on Communications (COMM) | Pub Date: 2014-05-29 | DOI: 10.1109/ICCOMM.2014.6866752
The paper studies the use of pilot tone information for detecting active transmissions from Long Term Evolution (LTE) wireless systems. LTE employs Orthogonal Frequency Division Multiplexing (OFDM) at the physical layer and, according to the LTE standard specifications, the transmitted signals contain periodic pilot information placed on specific subcarriers for synchronization and channel estimation. This pilot information can also be used to detect the presence of active LTE signals through a cross-correlation approach. The method presented in the paper uses the Time-Domain Symbol Cross-correlation (TDSC) technique and exploits the fact that the mean of the cumulative correlation of distinct symbols with the same pilot tone positions is constant while its variance changes, allowing it to detect LTE transmissions in environments with low signal-to-noise ratios (SNRs). Application of the TDSC method to LTE systems is presented analytically and illustrated with numerical results from simulations that include additive white Gaussian noise (AWGN) as well as the Rice and Rayleigh channel models outlined in the LTE standard.
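As a rough sketch of the pilot-based cross-correlation idea (not the authors' exact TDSC implementation), the following compares the cumulative correlation at assumed pilot positions for symbol pairs with and without a shared pilot sequence. The subcarrier count, pilot spacing, and noise level are illustrative assumptions.

```python
import random

random.seed(0)
N = 256                      # subcarriers per OFDM symbol (illustrative size)
PILOT_POS = range(0, N, 6)   # every 6th subcarrier carries a pilot (assumption)
PILOTS = [complex(1, 0) if random.getrandbits(1) else complex(-1, 0)
          for _ in PILOT_POS]

def ofdm_symbol(active, sigma=1.0):
    """Random frequency-domain data, the fixed pilot sequence at the known
    positions when a cell is active, plus additive Gaussian noise."""
    sym = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(N)]
    if active:
        for k, p in zip(PILOT_POS, PILOTS):
            sym[k] = p
    return [s + complex(random.gauss(0, sigma), random.gauss(0, sigma))
            for s in sym]

def tdsc_metric(pairs):
    """Cumulative cross-correlation of symbol pairs at the pilot positions:
    identical pilots add coherently, while data and noise average out."""
    total = 0j
    for a, b in pairs:
        total += sum(a[k] * b[k].conjugate() for k in PILOT_POS)
    return abs(total)

active = tdsc_metric([(ofdm_symbol(True), ofdm_symbol(True)) for _ in range(8)])
idle = tdsc_metric([(ofdm_symbol(False), ofdm_symbol(False)) for _ in range(8)])
```

When pilots are present, every pair contributes roughly the number of pilot subcarriers to the cumulative sum, so `active` grows linearly with the number of accumulated pairs while `idle` only grows like a random walk; a threshold between the two detects the transmission.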
Automatic detection of skin melanoma from images using natural computing approaches
I. Dumitrache, Alina Sultana, R. Dogaru
2014 10th International Conference on Communications (COMM) | Pub Date: 2014-05-29 | DOI: 10.1109/ICCOMM.2014.6866748
Medical imaging is an area of great interest in terms of accuracy, speed, and capacity of integration. To improve results and ease the physicians' task, feature enhancement and image processing should be performed automatically, producing features that allow automatic classification of the images. This paper presents an original approach to building an automatic melanoma detection system based on natural computing methods for image preprocessing, feature extraction, and classification. Among these methods we rely on cellular automata, reaction-diffusion cellular neural networks, and nonlinear time-series analysis.
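The abstract names cellular automata among the preprocessing methods without specifying the rule. A minimal, assumed example of a cellular-automaton preprocessing step is a majority-vote rule that fills isolated holes in a binary lesion mask (this particular rule and the toy mask are ours, not the paper's):

```python
def ca_step(grid):
    """One majority-vote cellular-automaton step over the 3x3 neighborhood
    (toroidal boundary): a cell becomes 1 if at least 5 of 9 cells are 1."""
    h, w = len(grid), len(grid[0])
    out = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            votes = sum(grid[(i + di) % h][(j + dj) % w]
                        for di in (-1, 0, 1) for dj in (-1, 0, 1))
            out[i][j] = 1 if votes >= 5 else 0
    return out

lesion = [[0, 0, 0, 0, 0],
          [0, 1, 1, 1, 0],
          [0, 1, 0, 1, 0],   # isolated hole inside the lesion mask
          [0, 1, 1, 1, 0],
          [0, 0, 0, 0, 0]]
smoothed = ca_step(lesion)
```

The hole at the center is surrounded by 8 lesion pixels, so the vote fills it in; stray background pixels far from the lesion stay background.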
On reducing the artifacts of sonar images with image fusion technique
D. Aiordachioaie
2014 10th International Conference on Communications (COMM) | Pub Date: 2014-05-29 | DOI: 10.1109/ICCOMM.2014.6866687
This work considers artifact compensation in the context of airborne ultrasonic image generation. Almost all airborne ultrasonic images used in robotics show artifacts and distortions when the real objects are compared with those presented or identified in the ultrasonic images. The differences between the real and reference images can have various causes, such as asymmetries in the directivity of the ultrasonic transducers, reflections in the explored environment, and nonlinearities of the signal processing blocks. A fusion-based method for reducing the artifacts of sonar images is proposed and partially implemented. The object image from the ultrasonic image is fused with a reference image selected from a database after an identification procedure. The selection rule used in image fusion is treated as a supervised process. The results are encouraging and suggest a trade-off between the complexity of the fusion technique and the quality of the result.
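The abstract does not specify the fusion rule itself. A minimal stand-in for the fusion step is a pixel-wise weighted combination of the sonar image with the selected reference image; the weight `alpha` and the toy 2x2 images are assumptions for illustration.

```python
def fuse(sonar, reference, alpha=0.6):
    """Pixel-wise weighted fusion of a sonar image with a reference image.
    alpha close to 1 trusts the measured sonar image; alpha close to 0
    trusts the database reference (a simple rule, assumed for illustration)."""
    return [[alpha * s + (1 - alpha) * r
             for s, r in zip(srow, rrow)]
            for srow, rrow in zip(sonar, reference)]

sonar = [[0.9, 0.1],       # measured intensities, with artifacts
         [0.2, 0.8]]
reference = [[1.0, 0.0],   # clean reference selected from the database
             [0.0, 1.0]]
fused = fuse(sonar, reference)
```

Pulling each pixel toward the reference suppresses artifact energy at the cost of some measured detail, which mirrors the complexity/quality trade-off the paper reports.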
DHCP server authentication using digital certificates
D. Dinu, Mihai Togan
2014 10th International Conference on Communications (COMM) | Pub Date: 2014-05-29 | DOI: 10.1109/ICCOMM.2014.6866756
In this paper we give an overview of DHCP security issues and the related work done to secure the protocol. We then propose a method based on public key cryptography and digital certificates to authenticate the DHCP server and its responses, thereby preventing rogue DHCP server attacks. We implemented and tested the proposed solution with different key and certificate types in order to measure the packet overhead and the time consumed by the newly added authentication option.
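The exact option layout is not given in the abstract. The sketch below only illustrates the general shape of an authentication option appended to a DHCP response: the option code is a hypothetical value from the site-specific range, and HMAC-SHA256 stands in for the paper's certificate-based signature, since the Python standard library has no asymmetric crypto.

```python
import hashlib
import hmac
import struct

# Hypothetical option code for illustration only; DHCP options use a
# one-byte code | one-byte length | value layout (RFC 2132 style).
AUTH_OPTION_CODE = 224   # site-specific option range

def sign_response(payload: bytes, key: bytes) -> bytes:
    """Append an authentication option (code | length | signature) to a
    DHCP message. HMAC-SHA256 is a stand-in for a certificate signature."""
    sig = hmac.new(key, payload, hashlib.sha256).digest()
    return payload + struct.pack("BB", AUTH_OPTION_CODE, len(sig)) + sig

def verify_response(message: bytes, key: bytes) -> bool:
    """Check the trailing authentication option against the payload."""
    code, length = struct.unpack("BB", message[-34:-32])
    if code != AUTH_OPTION_CODE or length != 32:
        return False
    payload, sig = message[:-34], message[-32:]
    return hmac.compare_digest(sig, hmac.new(key, payload, hashlib.sha256).digest())

key = b"shared-demo-key"
msg = sign_response(b"DHCPOFFER yiaddr=10.0.0.42", key)
ok = verify_response(msg, key)                  # genuine server response
bad = verify_response(b"X" + msg[1:], key)      # rogue/modified response
```

A rogue server that cannot produce a valid signature over the payload fails verification at the client, which is the attack the paper's certificate scheme prevents; the 34 bytes added here also make the packet-overhead measurement concrete.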
S-box design based on chaotic maps combination
C. Rîncu, Vasile-Gabriel Iana
2014 10th International Conference on Communications (COMM) | Pub Date: 2014-05-29 | DOI: 10.1109/ICCOMM.2014.6866741
Ensuring good confusion and diffusion through substitution is very important in symmetric cryptosystems. Beginning with the attacks against DES, the construction of good S-boxes has gained interest because of the need for nonlinearity. In recent years, the features of dynamical chaotic systems have been exploited to design cryptographic techniques as an alternative to classic methods. In this paper we propose a solution that combines three simple chaotic maps to obtain an S-box with good results on most of the validation tests.
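The three maps used by the authors are not named in the abstract. As an assumed single-map illustration of the general technique, the sketch below derives a bijective 8-bit S-box from the ranking of a logistic-map orbit; the parameters are illustrative.

```python
def chaotic_sbox(x0=0.7, r=3.99):
    """Build a bijective 8-bit S-box from a chaotic orbit: iterate the
    logistic map 256 times, then use the rank of each orbit value as the
    substitution output. One map stands in for the paper's three maps."""
    xs = []
    x = x0
    for _ in range(256):
        x = r * x * (1 - x)          # logistic map iteration
        xs.append(x)
    order = sorted(range(256), key=lambda i: xs[i])
    sbox = [0] * 256
    for rank, idx in enumerate(order):
        sbox[idx] = rank             # sbox[i] = rank of the i-th orbit value
    return sbox

sbox = chaotic_sbox()
```

Using the ranking permutation guarantees the S-box is bijective by construction, so the usual invertibility check always passes; nonlinearity and the other validation tests still depend on the map and parameters chosen.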
Self-recovery of unauthentic images using a new digital watermarking approach in the wavelet domain
R. Preda
2014 10th International Conference on Communications (COMM) | Pub Date: 2014-05-29 | DOI: 10.1109/ICCOMM.2014.6866744
This paper proposes a new watermarking-based self-recovery system for unauthentic images in the wavelet domain. The LL sub-band of the second wavelet decomposition of the original image is used as a recovery watermark and is embedded in the horizontal, vertical, and diagonal sub-bands of the first wavelet decomposition using a quantization approach. The algorithm achieves good image quality, with mean PSNR values above 37 dB for 100 watermarked images. The method protects the entire image and can reliably extract the recovery image even after the watermarked image has been tampered with to a degree of 25%.
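The abstract mentions a quantization approach without giving details. A common choice for embedding one watermark bit per coefficient is quantization index modulation (QIM), sketched here with an assumed step size; the paper's actual quantizer may differ.

```python
import math

DELTA = 8.0   # quantization step (assumed; trades robustness for distortion)

def qim_embed(coeff, bit):
    """Move a wavelet coefficient to the nearest point of one of two
    interleaved lattices: integer multiples of DELTA encode bit 0,
    half-odd multiples encode bit 1."""
    if bit == 0:
        return round(coeff / DELTA) * DELTA
    return (math.floor(coeff / DELTA) + 0.5) * DELTA

def qim_extract(coeff):
    """The fractional position within a DELTA-wide cell decides the bit."""
    frac = coeff / DELTA - math.floor(coeff / DELTA)
    return 1 if 0.25 <= frac < 0.75 else 0

# the embedded bit survives perturbations smaller than DELTA/4,
# e.g. mild noise added after watermarking
recovered = [qim_extract(qim_embed(c, b) + 1.5)
             for c in (3.2, -7.9, 12.5) for b in (0, 1)]
```

This robustness margin is what allows the recovery watermark to survive in untampered regions and rebuild the 25% of the image that was modified.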
The analysis of gradiometer signal in magnetic field measurement with fluxgate transducer
Georgiana Marin, S. Radu, Gheorghe Samoilescu, O. Baltag
2014 10th International Conference on Communications (COMM) | Pub Date: 2014-05-29 | DOI: 10.1109/ICCOMM.2014.6866724
Gradiometer techniques are generally employed to detect nearby sources of weak magnetic fields, especially in biomagnetic measurements, but they are also effective in detecting the magnetic signature of surface vessels and submarines. In this paper, the recorded signal of a fluxgate gradiometer consisting of two coaxial transducers is analyzed based on the rejection coefficient, and the error sources affecting the gradiometer signal are discussed. The object of analysis is the magnetic signature gradient of a ship model. The measurement setup consists of a shielded room surrounded by a triaxial Helmholtz coil with an automatic magnetic field compensation system. The measuring device is a triaxial fluxgate magnetometer that picks up the values of the magnetic signature components for the ship model. The magnetic field measurements were processed with the time-derivative gradiometer technique, determining the spatial variation of the magnetic signature components and of the total field signature along the three axes of the rectangular coordinate system.
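As a numerical illustration of why the rejection coefficient matters (our example, with illustrative numbers), the sketch below models a first-order gradiometer: a perfectly matched sensor pair cancels the uniform ambient field, while a small gain mismatch lets part of it leak into the gradient reading.

```python
def gradiometer_reading(b1, b2, baseline, mismatch=0.0):
    """First-order axial gradiometer: difference of two sensor readings
    over the baseline distance. A uniform background field cancels; a
    nearby source does not. `mismatch` models the gain imbalance that
    the rejection coefficient quantifies."""
    return ((1 + mismatch) * b1 - b2) / baseline

ambient = 48000.0            # nT, uniform field seen by both sensors
b1 = ambient + 120.0         # sensor 1 is closer to the ship model
b2 = ambient + 30.0          # sensor 2 sees a weaker source field
perfect = gradiometer_reading(b1, b2, baseline=0.5)
imbalanced = gradiometer_reading(b1, b2, baseline=0.5, mismatch=1e-4)
```

With matched sensors the 48000 nT ambient field cancels exactly and only the source gradient (180 nT/m here) remains; a 0.01% gain mismatch injects an error comparable to several percent of the signal, which is why the rejection coefficient is the central error source analyzed in the paper.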
A practical solution for the regularization of the affine projection algorithm
C. Paleologu, J. Benesty, S. Ciochină
2014 10th International Conference on Communications (COMM) | Pub Date: 2014-05-29 | DOI: 10.1109/ICCOMM.2014.6866732
The regularization of the affine projection algorithm (APA) is of great importance in echo cancellation applications. The regularization parameter, which depends on the level of the near-end signal, is added to the main diagonal of the input signal correlation matrix to ensure the stability of the APA. In this paper, we propose a practical way of evaluating the power of the near-end signal or, equivalently, the signal-to-noise ratio, which is explicitly related to the regularization parameter. Simulation results obtained in the context of acoustic echo cancellation support the appealing performance of the proposed solution.
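The abstract's central point, adding a regularization term to the main diagonal of the input correlation matrix, can be sketched for projection order P = 2 with an explicit 2x2 solve. The fixed `DELTA` below is a simplification; the paper's contribution is precisely how to derive it from the near-end signal power, which is not reproduced here.

```python
import random

random.seed(1)
L, DELTA, MU = 8, 1e-2, 0.5   # filter length, regularization, step size

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def apa2_update(w, x1, x2, d1, d2):
    """One APA update of order P=2. DELTA is added to the diagonal of the
    2x2 correlation matrix X^T X before inverting, ensuring stability even
    when the two input vectors are nearly collinear."""
    e1, e2 = d1 - dot(x1, w), d2 - dot(x2, w)     # a-priori errors
    a = dot(x1, x1) + DELTA                        # regularized diagonal
    c = dot(x2, x2) + DELTA
    b = dot(x1, x2)
    det = a * c - b * b                            # > 0 thanks to DELTA
    g1 = (c * e1 - b * e2) / det                   # solve (X^T X + dI) g = e
    g2 = (a * e2 - b * e1) / det
    return [wi + MU * (g1 * xi1 + g2 * xi2)
            for wi, xi1, xi2 in zip(w, x1, x2)]

# identify an unknown echo path w_true from noisy observations
w_true = [random.gauss(0, 1) for _ in range(L)]
w = [0.0] * L
for _ in range(300):
    x1 = [random.gauss(0, 1) for _ in range(L)]
    x2 = [random.gauss(0, 1) for _ in range(L)]
    d1 = dot(x1, w_true) + random.gauss(0, 1e-3)   # near-end noise
    d2 = dot(x2, w_true) + random.gauss(0, 1e-3)
    w = apa2_update(w, x1, x2, d1, d2)
err = sum((a - b) ** 2 for a, b in zip(w, w_true)) ** 0.5
```

After a few hundred updates the misalignment `err` settles near the noise floor; choosing `DELTA` from the near-end signal level, as the paper proposes, keeps this steady state low without destabilizing the update.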
Software system for data extraction and communication in virtual environment for pilots' performances optimization
P. Matei, C. Rotaru, Mihai Mihăilă-Andres, I. Edu
2014 10th International Conference on Communications (COMM) | Pub Date: 2014-05-29 | DOI: 10.1109/ICCOMM.2014.6866668
The main goal of this paper is to present the latest developments in the high-level integration of two different sides of a software system for enhancing pilots' performance, focusing on the data extraction and communication aspects of assessing the contribution of the lateral component of virtual flight to the optimization process. To achieve this goal, two main sides are considered: the first deals with the data structures of the acquisition component of the system that carries out the performance-enhancement task, and the second presents the results, exposing the contribution of the lateral component of virtual flight to the communication between different stages of the enhancement process. All optimizations are based on an entirely new, specially designed intelligent system for high-precision assessment of aircraft piloting abilities, assisted by a multi-stream data acquisition and processing system that integrates the simulated flight data with the physiological and behavioral data. As a direct result, a more solid basis is provided for the decision-making process of ranking pilots and candidates for admittance to specific flight training programs.