Pub Date: 2011-11-10 | DOI: 10.1109/IDAACS.2011.6072741
P. Borovska, M. Lazarova
The paper presents BlueVision, a software package for parallel processing of multispectral data from remote sensing of the Earth. The purpose is to provide a software framework for high-performance analysis aimed at detection and monitoring of different natural hazards. The package consists of five modules implementing parallel computations for fire detection, deforestation, soil salinity, water pollution, and flooding. An additional module visualizes the detected areas on a geographic map. The performance analysis is based on experimental results on a heterogeneous multicomputer cluster and the Bulgarian supercomputer BlueGene/P.
Title: Parallel software framework for high-performance multispectral analysis for earth monitoring
Journal: Proceedings of the 6th IEEE International Conference on Intelligent Data Acquisition and Advanced Computing Systems
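Hazard-detection modules of this kind typically compute per-pixel spectral indices from the multispectral bands. As an illustrative sketch only (not the BlueVision package's actual code), a vegetation index such as NDVI can flag deforestation or burn scars; the band layout and threshold below are assumptions.

```python
# Hypothetical per-pixel spectral index for vegetation/hazard screening.
# NDVI = (NIR - Red) / (NIR + Red); the 0.3 threshold is an assumption.

def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index for one pixel."""
    denom = nir + red
    return 0.0 if denom == 0 else (nir - red) / denom

def classify_pixel(nir: float, red: float, veg_threshold: float = 0.3) -> str:
    """Label a pixel as vegetated or sparse based on the assumed threshold."""
    return "vegetation" if ndvi(nir, red) >= veg_threshold else "sparse"

# The computation is independent per pixel (embarrassingly parallel), which is
# what makes a cluster- or BlueGene-style row decomposition straightforward.
def classify_image(nir_band, red_band):
    return [[classify_pixel(n, r) for n, r in zip(nrow, rrow)]
            for nrow, rrow in zip(nir_band, red_band)]
```

Because each pixel is processed independently, the image can be split row-wise across worker processes with no communication until the final map is assembled.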
Pub Date: 2011-11-10 | DOI: 10.1109/IDAACS.2011.6072867
Albert H Carlson, R. Hiromoto, R. Wells
The security of block and product ciphers is considered using a set theoretic estimation (STE) approach to decryption. Known-ciphertext attacks are studied using permutation (P) and substitution (S) keys. The blocks are formed from three (3) alphabetic characters (metacharacters), where the applications of P and S to the ASCII byte representation of each of the three characters are allowed to cross byte boundaries. The application of STE forgoes the need to employ chosen-plaintext or known-plaintext attacks.
Title: Breaking block and product ciphers applied across byte boundaries
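The "across byte boundaries" setup can be made concrete with a toy example (this is not the paper's construction, and the key material is hypothetical): treat three ASCII characters as one 24-bit block and apply a bit permutation P whose moves freely cross the original 8-bit character boundaries.

```python
import random

# Toy illustration: a 24-bit block built from three ASCII characters, with a
# bit permutation P that crosses byte boundaries. The seed/key is made up.

BLOCK_BITS = 24

def make_permutation(seed: int):
    rng = random.Random(seed)
    perm = list(range(BLOCK_BITS))
    rng.shuffle(perm)
    return perm  # perm[src] = destination bit position

def chars_to_bits(s):
    assert len(s) == 3
    v = int.from_bytes(s.encode("ascii"), "big")
    return [(v >> i) & 1 for i in range(BLOCK_BITS)]

def bits_to_chars(bits):
    v = sum(b << i for i, b in enumerate(bits))
    return v.to_bytes(3, "big").decode("ascii")

def encrypt(s, perm):
    bits = chars_to_bits(s)
    out = [0] * BLOCK_BITS
    for src, dst in enumerate(perm):
        out[dst] = bits[src]       # bit may land in a different byte
    return out

def decrypt(bits, perm):
    out = [0] * BLOCK_BITS
    for src, dst in enumerate(perm):
        out[src] = bits[dst]       # invert the permutation
    return bits_to_chars(out)
```

A real product cipher would compose such permutations with substitutions; the point here is only that a single bit can migrate between the three characters' byte positions.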
Pub Date: 2011-11-10 | DOI: 10.1109/IDAACS.2011.6072719
J. Kominek, Juergen Straub, J. Zidek
The task of programming instruments in a test system has always been a concern for end users and a major cost for the overall system development. Many users know that programming can often be the most time-consuming part of developing a system. The developer spends much valuable time learning the specific programming requirements of each instrument in the system. Almost all instruments are designed for interactive use through a physical front panel and also offer remote control capability via a communication port on the back of the instrument. An instrument driver, in the simplest definition, is a set of software routines that handles the programmatic details of controlling and communicating with a specific instrument. The most successful instrument driver concepts have always distributed instrument drivers in source code and provided end users with access to the same tools developers use to write drivers. With this philosophy, new instrument drivers were often easily developed by end users through modifying an existing driver for another instrument.
Title: Attribute based instrument drivers
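The driver-as-routine-set idea can be sketched as follows. This is a hedged illustration of the general concept, not the authors' design: the SCPI-style command strings and attribute names are invented, and a real driver would also handle queries, errors, and status.

```python
# Sketch of an instrument driver: named attributes hide the instrument's
# programming syntax. Command templates below are illustrative assumptions.

class AttributeDriver:
    """Maps high-level attributes onto instrument command strings."""

    # attribute -> command template; end users edit this table to port the
    # driver to a new instrument instead of rewriting the control logic.
    COMMANDS = {
        "voltage": "SOUR:VOLT {value}",
        "frequency": "SOUR:FREQ {value}",
    }

    def __init__(self, port):
        self.port = port          # any object exposing write(str)

    def set_attribute(self, name, value):
        template = self.COMMANDS[name]
        self.port.write(template.format(value=value))

class FakePort:
    """Stand-in for a communication port, recording what was sent."""
    def __init__(self):
        self.sent = []
    def write(self, cmd):
        self.sent.append(cmd)
```

Distributing such a driver as source, as the abstract advocates, means a user porting it to a new instrument only changes the command table.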
Pub Date: 2011-11-10 | DOI: 10.1109/IDAACS.2011.6072824
Gerold Bausch, H. Beikirch
A large amount of neurophysiological research is based on the study of biological neural populations. The data are gathered from extra-cellular recordings with multi-electrode arrays (MEAs). The signal is a stream containing an unknown number of neural sources superposed with non-stationary noise components. In order to analyze the recorded spike trains, a prior separation into the individual components is required. The increasing number of sensors on a MEA surface demands an automatic spike sorting procedure. In this article, the proposed spike sorting method replaces the manual steps with artificial neural networks to enable an unsupervised separation of mixed signal components. Furthermore, the artificial neural network based feature extraction allows realtime processing and ensures high flexibility and adaptivity for unknown signals with non-stationary noise components.
Title: An unsupervised method for realtime spike sorting
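A minimal unsupervised sketch of the spike-sorting pipeline (not the authors' ANN-based method): detect spikes by an amplitude threshold, then assign each spike waveform to the nearest cluster centroid online, creating a new cluster when no centroid is close enough. The thresholds and distances are assumptions.

```python
# Threshold-crossing spike detection plus online nearest-centroid clustering.
# new_cluster_dist and rate are assumed tuning parameters, not from the paper.

def detect_spikes(signal, threshold):
    """Indices where the signal crosses the threshold upward."""
    return [i for i in range(1, len(signal))
            if signal[i - 1] < threshold <= signal[i]]

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def sort_spikes(waveforms, new_cluster_dist=2.0, rate=0.2):
    """Streaming clustering: centroids adapt with each assigned spike."""
    centroids, labels = [], []
    for w in waveforms:
        if centroids:
            d, k = min((distance(w, c), k) for k, c in enumerate(centroids))
        else:
            d, k = float("inf"), -1
        if d > new_cluster_dist:
            centroids.append(list(w))          # open a new unit
            labels.append(len(centroids) - 1)
        else:
            # move the winning centroid toward the spike (competitive update)
            centroids[k] = [c + rate * (x - c) for c, x in zip(centroids[k], w)]
            labels.append(k)
    return labels, centroids
```

The streaming update is what makes the approach realtime-capable: each spike is classified once, with no batch re-clustering.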
Pub Date: 2011-11-10 | DOI: 10.1109/IDAACS.2011.6072799
M. Kubinyi, R. Smid
Enhancing a signal of interest over a noise level is a common task for signal processing methods used in measurement systems. Digital filters are a common enhancement tool for frequency-band-limited signals. If the information is localised not only in the frequency domain but also in a limited time window within the signal, then wavelet processing offers a possible denoising solution with higher denoising efficiency than a common digital filter. Development of wavelet methods has focused on signal thresholding and adjustment to new sensor systems. We present a fast algorithm for enhancing wavelet denoising performance. This is done by implementing an ideal band-pass filter over the common wavelet thresholding process. We propose a new signal denoising technique focused on the described ultrasonic signals. The proposed algorithm was verified on ultrasonic sensors, which are widely used in measuring the dimensions and quality of a product. Results showed a significant improvement in wavelet denoising performance over previously published wavelet denoising algorithms.
Title: Ultrasonic denoising with a modified wavelet filter
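The generic wavelet-thresholding step the paper builds on can be sketched as below. This shows only the standard component (one assumed Haar level plus soft thresholding), not the paper's modified filter with the ideal band-pass layer.

```python
# One-level Haar wavelet transform with soft thresholding of the detail
# coefficients. The threshold value is an assumed tuning parameter.

def haar_forward(x):
    """One-level Haar DWT: (approximation, detail) for an even-length signal."""
    a = [(x[2*i] + x[2*i+1]) / 2 for i in range(len(x)//2)]
    d = [(x[2*i] - x[2*i+1]) / 2 for i in range(len(x)//2)]
    return a, d

def haar_inverse(a, d):
    x = []
    for ai, di in zip(a, d):
        x += [ai + di, ai - di]
    return x

def soft_threshold(coeffs, t):
    """Shrink each coefficient toward zero by t; small ones become zero."""
    return [max(abs(c) - t, 0.0) * (1 if c >= 0 else -1) for c in coeffs]

def denoise(x, t):
    a, d = haar_forward(x)
    return haar_inverse(a, soft_threshold(d, t))
```

Noise spreads thinly over many small detail coefficients while a localized echo concentrates in a few large ones, which is why shrinking the small coefficients removes noise with less signal distortion than a plain frequency filter.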
Pub Date: 2011-11-10 | DOI: 10.1109/IDAACS.2011.6072802
N. Chuong
This paper describes work on selecting a sentence set for designing an audio-visual speech corpus for the Vietnamese language, as part of developing an audio-visual speech recognition system. An iterative procedure combined with additional conditions was applied to an original set of 540,744 sentences to select a minimal sentence set.
Title: Selection of sentence set for vietnamese audiovisual corpus design
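One common strategy for this kind of corpus selection (a hedged sketch, not the paper's exact iterative procedure) is greedy set cover: repeatedly pick the sentence that contributes the most not-yet-covered units (phonemes, syllables, visemes) until every unit occurring in the source set is covered.

```python
# Greedy coverage-based sentence selection. units_of is a caller-supplied
# function mapping a sentence to its set of units (e.g. phonemes).

def greedy_select(sentences, units_of):
    target = set()
    for s in sentences:
        target |= units_of(s)
    covered, chosen = set(), []
    while covered != target:
        # pick the sentence with the largest marginal coverage gain
        best = max(sentences, key=lambda s: len(units_of(s) - covered))
        gain = units_of(best) - covered
        if not gain:
            break
        chosen.append(best)
        covered |= gain
    return chosen
```

With characters standing in for phonemes, `greedy_select(["abc", "ab", "cd"], lambda s: set(s))` first takes "abc" (three new units) and then "cd" (one new unit), covering everything with two sentences.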
Pub Date: 2011-11-10 | DOI: 10.1109/IDAACS.2011.6072912
Katarzyna Jakowska-Suwalska, A. Sojda, M. Wolny
Evolutionary algorithms are an excellent optimization tool, performing well in tasks where classical methods are of no use. In this paper, the use of such algorithms in stock control is presented for a problem with a stochastic cost-criterion function. Such functions arise in the case of random demand with a constant delivery period. The operation of the algorithm is demonstrated on an example closely based on a real stock-control problem. The algorithm is recommended for implementing changes in the information system supporting the planning of material needs within a research project.
Title: Evolutional algorithm in stock control
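A minimal sketch of the setup, under assumptions not taken from the paper: the decision variable is an order-up-to stock level, the stochastic cost criterion is approximated by averaging holding and shortage costs over sampled demand scenarios, and a simple truncation-selection evolutionary loop searches for a good level. All cost coefficients and parameters are invented.

```python
import random

# Evolutionary search for a stock level under random demand.
# holding/shortage costs, population size, etc. are illustrative assumptions.

def expected_cost(level, demands, holding=1.0, shortage=5.0):
    """Average holding + shortage cost over sampled demand scenarios."""
    total = 0.0
    for d in demands:
        total += holding * max(level - d, 0) + shortage * max(d - level, 0)
    return total / len(demands)

def evolve(demands, pop_size=20, generations=50, seed=0):
    rng = random.Random(seed)
    pop = [rng.uniform(0, 200) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda x: expected_cost(x, demands))
        parents = pop[:pop_size // 2]                 # truncation selection
        children = [max(0.0, rng.choice(parents) + rng.gauss(0, 5))
                    for _ in range(pop_size - len(parents))]  # Gaussian mutation
        pop = parents + children
    return min(pop, key=lambda x: expected_cost(x, demands))
```

Because shortages are penalized more heavily than holding in this toy model, the evolved level tends toward the upper range of the sampled demands.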
Pub Date: 2011-11-10 | DOI: 10.1109/IDAACS.2011.6072829
Tomas Brychcin, Miloslav Konopík
This paper presents a method to improve language modeling for inflectional languages such as Czech and Slovak. The methods are based on the principle of class-based language models, where word classes are derived from morphological information. Our experiments show that linear interpolation with the class-based language models outperforms a stand-alone word N-gram language model by about 10–30%.
Title: Morphological based language models for inflectional languages
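The interpolation can be written as P(w|h) = λ·P_word(w|h) + (1−λ)·P(c(w)|h)·P(w|c(w)), where c(w) is the morphological class of w. The sketch below uses invented toy probability tables; the lookup structure and λ value are assumptions for illustration.

```python
# Linear interpolation of a word n-gram model with a class-based model.
# All probability tables here are toy values, not trained estimates.

def interpolated_prob(w, history, p_word, p_class_given_hist,
                      p_word_given_class, word_class, lam=0.7):
    c = word_class[w]
    word_part = p_word.get((history, w), 0.0)
    class_part = (p_class_given_hist.get((history, c), 0.0)
                  * p_word_given_class.get((c, w), 0.0))
    return lam * word_part + (1 - lam) * class_part
```

The class-based term is what helps with inflectional languages: a rare inflected form with sparse word n-gram counts still gets probability mass through its (much better estimated) morphological class.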
Pub Date: 2011-11-10 | DOI: 10.1109/IDAACS.2011.6072783
O. Boumbarov, Stanislav Panev, I. Paliy, P. Petrov, L. Dimitrov
This paper presents a framework for determining the orientation of human faces with a fixed monocular camera, which can subsequently be used for gaze tracking. We use the homography relation between two views/frames to cope with the lack of depth information. To compensate for the missing depth in the relationship between 2D image-plane points and the 3D Euclidean space, we present a complete vision-based approach to pose estimation. The homography relates corresponding points captured at two different locations of the face and determines the relationship between the two locations using pixel information and the intrinsic parameters of the camera. To determine the mapping between the two images, it is assumed that in each frame of the video sequence we are able to locate, extract, and label four feature points of the face lying on a virtual plane attached to the face. Face detection and facial feature extraction are performed with the Viola-Jones method. The verification stage of face detection uses a combined cascade of neural network classifiers based on a convolutional neural network.
Title: Homography-based face orientation determination from a fixed monocular camera
Pub Date: 2011-11-10 | DOI: 10.1109/IDAACS.2011.6072830
Veska Gancheva, Bogdan Shishedjiev, Elena Kalcheva-Yovkova
Data obtained from scientific experiments are heterogeneous and distributed. They are stored in different file formats, which makes them difficult to integrate, analyze, process, and visualise. Scientific experiments require effective and efficient data management. In this paper, an approach to converting raw data obtained from scientific experiments into a relational database using an XML-based description is proposed. The approach is tested and verified by database and web service implementations. The implemented web service provides authorized access to the experimental data. To provide secure data transmission and data access, the database is encrypted through an asymmetric 128-bit encryption method based on a public and private key. A number of experiments using specific scientific data obtained from various experimental instruments are conducted to illustrate the usage of the proposed approach.
Title: An approach to convert scientific data description
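The conversion idea can be sketched with the standard library: an XML description names the fields of a raw experimental record, and the rows are loaded into a relational table built from that description. The XML schema below is invented for illustration and is not the paper's actual description format.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical XML description of one experimental dataset.
DESCRIPTION = """
<dataset name="spectra">
  <field name="wavelength" type="REAL"/>
  <field name="intensity" type="REAL"/>
</dataset>
"""

def load(description_xml, raw_rows, conn):
    """Create a table from the XML description and insert the raw rows."""
    root = ET.fromstring(description_xml)
    table = root.attrib["name"]
    fields = [(f.attrib["name"], f.attrib["type"])
              for f in root.findall("field")]
    cols = ", ".join(f"{name} {sqltype}" for name, sqltype in fields)
    conn.execute(f"CREATE TABLE {table} ({cols})")
    placeholders = ", ".join("?" for _ in fields)
    conn.executemany(f"INSERT INTO {table} VALUES ({placeholders})", raw_rows)
    return table

conn = sqlite3.connect(":memory:")
load(DESCRIPTION, [(400.0, 1.2), (410.0, 1.5)], conn)
```

Once the raw files are described this way, the same loader handles any instrument's output, and the relational layer takes over querying and integration.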