Pub Date: 2015-10-01 | DOI: 10.1109/ICDIPC.2015.7323037
T. Marty
Point-to-multipoint multicast (MC) packet delivery without optimal distribution trees is prone to unnecessary retransmissions that adversely affect narrow-band VHF networks. A bandwidth-efficient MC packet delivery path has to be calculated that best reaches all destinations by exploiting the inherent broadcast nature of VHF radio subnets, avoiding retransmissions of the same message/packet in the same radio subnet caused by parallel gateways or parallel paths to destinations. Optimal source trees for point-to-multipoint routing in a mixture of broadcast (VHF) radio subnets and unicast radio links (line-of-sight links) are proposed. These trees avoid unnecessary retransmissions by exploiting the broadcast nature of VHF radios.
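The abstract does not spell out how the source trees are built; as a rough, hypothetical illustration, even a plain per-source shortest-path tree (Dijkstra over assumed link costs, not the paper's algorithm) gives every destination exactly one delivery path, ruling out duplicate retransmissions caused by parallel paths:

```python
import heapq

def source_tree(adj, src):
    """Dijkstra shortest-path tree rooted at src.

    adj: {node: [(neighbor, cost), ...]} -- cost could model the per-hop
    airtime of a narrow-band VHF link (hypothetical weights).
    Returns {node: parent} for every reachable node.
    """
    dist = {src: 0}
    parent = {src: None}
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in adj[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                parent[v] = u
                heapq.heappush(pq, (nd, v))
    return parent
```

Extending this toward the paper's setting would mean charging one transmission per broadcast subnet rather than one per link, so that a single VHF transmission covers all subnet members at once.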
{"title":"Optimal source tree point to multipoint routing for mobile radio networks","authors":"T. Marty","doi":"10.1109/ICDIPC.2015.7323037","DOIUrl":"https://doi.org/10.1109/ICDIPC.2015.7323037","url":null,"abstract":"Point to multipoint multicast (MC) packets delivery without optimal distribution trees is prone to unnecessary retransmissions that adversely influences narrow-band VHF networks. The bandwidth efficient MC packet delivery path has to be calculated to best reach all destinations using the inherent broadcast nature of VHF radio subnets by avoiding retransmissions of the same message/packet in the same radio subnet due to parallel gateways or parallel paths to destinations. Optimal source trees for point to multipoint routing in a mixture of broadcast (VHF) radio subnets and unicast radio links (line of sight links) are proposed. These trees avoid unnecessary retransmissions using the broadcast nature of VHF radios.","PeriodicalId":339685,"journal":{"name":"2015 Fifth International Conference on Digital Information Processing and Communications (ICDIPC)","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130949536","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2015-10-01 | DOI: 10.1109/ICDIPC.2015.7323036
Mahmoud E. A. Abdel-Hadi, Reda A. El-Khoribi, M. Shoman, M. M. Refaey
Motivated by the need to deal with critical disorders that involve the death of neurons, such as Amyotrophic Lateral Sclerosis (ALS) and brainstem stroke, interpretation of the brain's Motor Imagery (MI) activity is highly needed, since brain signals can be translated into control commands. Electroencephalography (EEG), a low-cost, non-invasive technique, is considered in this work. A major challenge is the poor signal-to-noise ratio of EEG signals. The dataset used in this work is based on an asynchronous, or self-paced, motor imagery problem. This self-paced Brain-Computer Interface (BCI) problem poses a considerable challenge by introducing an additional class of relax, or non-intentional-control, periods that are not included in the training set but must nevertheless be classified. In this work, a number of subject-dependent parameters and their values are determined: the best frequency range, the best Common Spatial Pattern (CSP) channels, and the number of these CSP channels. System parameters are determined dynamically in the offline training phase. Energy-based features are then extracted from the best selected signals, and the Least-Squares Support Vector Machine (LS-SVM) classifier is used as the classification back end. Results of the proposed system show superiority over previously introduced systems in terms of Mean Square Error (MSE) when tested on the Berlin BCI (BBCI) Competition IV dataset 1.
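The abstract names CSP filtering and energy-based features without giving formulas; a standard CSP computation (the whitening-plus-eigendecomposition variant, which is an assumption about the authors' exact method) looks like:

```python
import numpy as np

def csp_filters(X1, X2, n_pairs=2):
    """Common Spatial Pattern filters for a two-class MI problem.

    X1, X2: arrays of shape (trials, channels, samples), one per class.
    Returns a (2*n_pairs, channels) matrix whose row projections
    maximize the variance ratio between the two classes.
    """
    def mean_cov(X):
        # trace-normalized average covariance over trials
        return np.mean([t @ t.T / np.trace(t @ t.T) for t in X], axis=0)

    C1, C2 = mean_cov(X1), mean_cov(X2)
    # whiten the composite covariance, then diagonalize class 1
    d, U = np.linalg.eigh(C1 + C2)
    P = (U / np.sqrt(d)).T            # P (C1+C2) P^T = I
    lam, V = np.linalg.eigh(P @ C1 @ P.T)   # lam ascending
    W = V.T @ P                        # rows are spatial filters
    pick = np.r_[np.arange(n_pairs), np.arange(-n_pairs, 0)]
    return W[pick]

def log_var_features(W, trial):
    """Energy-based features: normalized log-variance per projection."""
    v = np.var(W @ trial, axis=1)
    return np.log(v / v.sum())
```

The resulting feature vectors would then feed an LS-SVM back end, as in the paper.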
{"title":"Classification of motor imagery tasks with LS-SVM in EEG-based self-paced BCI","authors":"Mahmoud E. A. Abdel-Hadi, Reda A. El-Khoribi, M. Shoman, M. M. Refaey","doi":"10.1109/ICDIPC.2015.7323036","DOIUrl":"https://doi.org/10.1109/ICDIPC.2015.7323036","url":null,"abstract":"Motivated by the need to deal with critical disorders that involve death of neurons, such as Amyotrophic Lateral Sclerosis (ALS) and brainstem stroke, interpretation of the brain's Motor Imagery (MI) activities is highly needed. Brain signals can be translated into control commands. Electroencephalography (EEG) is considered in this work, EEG is a low-cost non-invasive technique. A big challenge is faced due to the poor signal-to-noise ratio of EEG signals. The dataset used in this work is based on asynchronous or self-paced motor imagery problem. The used self-paced Brain Computer Interface (BCI) problem poses a considerable challenge by introducing an additional class, a relax class, or non-intentional control periods that are not included in the training set and should be classified. In this work, a number of subject dependent parameters and their values are determined. These parameters are: the best frequency range, the best Common Spatial Pattern (CSP) channels, and the number of these CSP channels. System parameters are determined dynamically in the offline training phase. Energy based features are extracted afterwards from the best selected signals. The Least-Squares Support Vector Machine (LS-SVM) classifier is used as a classification back end. 
Results of the proposed system show superiority over the previously introduced systems in terms of the Mean Square Error (MSE) when tested on the Berlin BCI (BBCI) competition IV dataset 1.","PeriodicalId":339685,"journal":{"name":"2015 Fifth International Conference on Digital Information Processing and Communications (ICDIPC)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128862137","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2015-10-01 | DOI: 10.1109/ICDIPC.2015.7323041
M. Hassan, Jongsuk Lee
This techno-policy paper presents the policymakers' perspective on the relative importance of the critical success factors (CSFs) that are paramount for e-Government (e-Gov) success in Pakistan. We propose a novel policy framework for e-Gov success in Pakistan by deploying the CSFs and the Analytic Hierarchy Process (AHP) approach. We conducted an official survey of the policymakers and stakeholders engaged in consulting, developing, implementing, promoting, and using e-Gov programs in Pakistan. The empirical results indicate that the CSF main categories Governance, Management, and Resources are relatively more important than Socio-Economics. Further, the CSF sub-categories Political, Managerial, Legislative, Non-Technical, and Technical are relatively more important than Social, Economic, and Scope. Finally, the CSFs Political Stability, Managerial Strategy, ICT Policies, Funding, Portal Technology, Education & Skills, Cost, and Autonomy are the most important factors for achieving their respective CSF sub-categories, which ultimately affect e-Gov success in Pakistan. This empirical study presents valuable policy implications and recommendations to the GOP and to the policymakers and stakeholders who have been striving for e-Gov success in Pakistan since 2000. Our study can pave the way toward attaining Good Governance in Pakistan.
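AHP derives relative-importance weights like those reported above from pairwise comparison matrices. A minimal sketch of Saaty's principal-eigenvector method (the example matrix values are invented, not taken from the survey):

```python
import numpy as np

def ahp_weights(M):
    """Priority vector of an AHP pairwise-comparison matrix.

    M must be a positive reciprocal matrix (M[j][i] == 1/M[i][j]).
    Returns the normalized principal eigenvector (Saaty's method).
    """
    evals, evecs = np.linalg.eig(np.asarray(M, dtype=float))
    k = np.argmax(evals.real)          # principal eigenvalue
    w = np.abs(evecs[:, k].real)
    return w / w.sum()
```

In a full AHP study, weights computed at each level of the hierarchy are multiplied down the tree, and a consistency ratio is checked before the judgments are accepted.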
{"title":"Policymakers' perspective towards e-Gov success: A potent technology for attaining Good Governance in Pakistan","authors":"M. Hassan, Jongsuk Lee","doi":"10.1109/ICDIPC.2015.7323041","DOIUrl":"https://doi.org/10.1109/ICDIPC.2015.7323041","url":null,"abstract":"This techno-policy paper presents the policymakers' perspective towards the relative importance of the critical success factors (CSFs) that are paramount for the e-Government (e-Gov) success in Pakistan. We proposed a novel policy framework for the e-Gov success in Pakistan by deploying the CSFs and Analytic Hierarchy Process (AHP) approach. We conducted an official survey of all the policymakers and stakeholders, who are engaged in consulting, developing, implementing, promoting, and using the e-Gov programs in Pakistan. The empirical results indicate that the CSFs main-categories: Governance, Management, and Resources are relatively more important than the Socio-Economics. Further, the CSFs sub-categories: Political, Managerial, Legislative, Non-Technical, and Technical are relatively more important than Social, Economic, and Scope. Finally, the CSFs: Political Stability, Managerial Strategy, ICT Policies, Funding, Portal Technology, Education & Skills, Cost, and Autonomy are the most important factors to achieve their respective CSFs sub-categories, which ultimately affect the e-Gov success in Pakistan. This empirical study presents valuable policy implications and recommendations to the GOP, its policymakers and stakeholders - who are striving hard for the e-Gov success in Pakistan since 2000. 
Our study can pave the way forward to attain the Good Governance in Pakistan.","PeriodicalId":339685,"journal":{"name":"2015 Fifth International Conference on Digital Information Processing and Communications (ICDIPC)","volume":"73 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124189315","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2015-10-01 | DOI: 10.1109/ICDIPC.2015.7323045
Oommen Mathews, Hakduran Koc, Muberra N. Akcaman
In this work, we propose a technique based on task recomputation to improve reliability without incurring any performance degradation in embedded systems. The technique recomputes tasks in the slack available on idle processors, thereby maximizing the utilization of the processing elements. In conjunction with task recomputation, we employ two metrics, Fault Propagation Scope (FPS) and Degree of Criticality (DoC). Our technique, named Hybrid Recomputation, improves reliability while reducing the scope for fault propagation: the fault propagation scope of the task graph is reduced by incorporating each task's fault propagation scope and degree of criticality into the scheduling algorithm. Hybrid Recomputation is evaluated on a variety of task graphs automatically generated with TGFF. The experimental results clearly indicate the viability of the proposed approach under different latency constraints.
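The abstract gives no algorithmic detail; as a toy sketch only (inputs and policy are hypothetical, not the paper's scheduler), the slack-filling idea can be caricatured as a greedy assignment of the most critical tasks to idle-processor slack:

```python
def fill_slack_with_recomputation(idle_slots, tasks):
    """Greedy sketch: re-execute the most critical tasks inside
    idle-processor slack without extending the schedule.

    idle_slots: slack durations, one per idle interval.
    tasks: (criticality, duration) tuples.
    Returns indices of tasks chosen for recomputation.
    """
    order = sorted(range(len(tasks)), key=lambda i: -tasks[i][0])
    slots = sorted(idle_slots, reverse=True)
    chosen = []
    for i in order:
        dur = tasks[i][1]
        for j, s in enumerate(slots):
            if dur <= s:           # recomputation fits this slack
                slots[j] = s - dur
                chosen.append(i)
                break
    return chosen
```

The paper's actual method additionally weighs FPS and DoC when deciding which tasks are worth duplicating.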
{"title":"Improving reliability through fault propagation scope in embedded systems","authors":"Oommen Mathews, Hakduran Koc, Muberra N. Akcaman","doi":"10.1109/ICDIPC.2015.7323045","DOIUrl":"https://doi.org/10.1109/ICDIPC.2015.7323045","url":null,"abstract":"In this work, we propose a technique based on task recomputation in order to improve reliability without incurring any performance degradation in embedded systems. The technique focuses on recomputing the task using the slack available on idle processors thereby maximizing the usage of the processing elements. In conjunction with task recomputation, we employed two metrics called as Fault Propagation Scope (FPS) and Degree of Criticality (DoC). Our technique, named as Hybrid Recomputation, improves the reliability even as the scope for fault propagation is reduced. The fault propagation scope of the task graph is reduced by incorporating the fault propagation scope of each task and its degree of criticality into the scheduling algorithm. Our technique of hybrid recomputation is analyzed using various automatically generated task graphs using TGFF. The experimental results clearly indicate the viability of the proposed approach under different latency constraints.","PeriodicalId":339685,"journal":{"name":"2015 Fifth International Conference on Digital Information Processing and Communications (ICDIPC)","volume":"71 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124708073","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2015-10-01 | DOI: 10.1109/ICDIPC.2015.7323042
Enrico De Giovanni, S. Flesca, A. Folino, R. Guarasci, Elisa Sorrentino
The aim of this paper is to illustrate a significant issue introduced by the current Italian normative framework for E-Government concerning digital documents. We focus on the probative value of paper and digital documents, and in particular on the problem of digital document copies and duplicates as defined by the digital administration code, by analysing their status from both a juridical and a technical point of view and by hypothesizing possible solutions to overcome their inherent vulnerabilities.
{"title":"Digital document copies and duplicates","authors":"Enrico De Giovanni, S. Flesca, A. Folino, R. Guarasci, Elisa Sorrentino","doi":"10.1109/ICDIPC.2015.7323042","DOIUrl":"https://doi.org/10.1109/ICDIPC.2015.7323042","url":null,"abstract":"The aim of this paper is to illustrate a significant issue introduced by the current Italian normative framework in the field of E-Government concerning digital documents. We focus on the probative value of paper and digital documents and in particular on the problem of digital document copies and duplicates, as defined by the digital administration code, by analysing their status both from a juridical and technical point of view and by hypothesizing possible solutions to overcome their inherent vulnerabilities.","PeriodicalId":339685,"journal":{"name":"2015 Fifth International Conference on Digital Information Processing and Communications (ICDIPC)","volume":"49 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133371875","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2015-10-01 | DOI: 10.1109/ICDIPC.2015.7323044
Mohamed Soliman Halawa, M. E. Shehab, Essam M. Ramzy Hamed
E-learning has become an essential factor in the modern educational system. With today's diverse student population, E-learning must recognize differences in student personalities to make the learning process more personalized. The objective of this study is to create a data model that identifies both the student's personality type and the dominant preference, based on the Myers-Briggs Type Indicator (MBTI) theory. The proposed model uses data from student engagement with the learning management system (Moodle) and the social network Facebook. The model helps students become aware of their personality, which in turn makes them more efficient in their study habits. It also provides vital information for educators, equipping them with a better understanding of each student's personality; with this knowledge, educators are better able to match students with their respective learning styles. The proposed model was applied to a sample of data collected from the Business College at the German University in Cairo, Egypt (240 students), and was tested using ten data mining classification algorithms: NaiveBayes, BayesNet, KStar, Random Forest, J48, OneR, JRip, KNN/IBk, RandomTree, and Decision Table. The results showed that OneR achieved the best accuracy (97.40%), followed by Random Forest (93.23%) and J48 (92.19%).
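OneR, the best performer in this comparison, is a notably simple baseline: it keeps only the single feature whose value-to-majority-class rule best fits the training data. A self-contained sketch (not the Weka implementation the study presumably used):

```python
from collections import Counter, defaultdict

def one_r(X, y):
    """OneR classifier over discrete features.

    X: rows of feature values; y: class labels.
    Returns (training accuracy, best feature index, value->class rule).
    """
    best = None
    for f in range(len(X[0])):
        # tally class counts per value of feature f
        counts = defaultdict(Counter)
        for row, label in zip(X, y):
            counts[row[f]][label] += 1
        # rule: each value predicts its majority class
        rule = {v: c.most_common(1)[0][0] for v, c in counts.items()}
        acc = sum(rule[row[f]] == label for row, label in zip(X, y)) / len(y)
        if best is None or acc > best[0]:
            best = (acc, f, rule)
    return best
```

In practice, continuous features (e.g. engagement counts from Moodle logs) are discretized into bins before OneR is applied.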
{"title":"Predicting student personality based on a data-driven model from student behavior on LMS and social networks","authors":"Mohamed Soliman Halawa, M. E. Shehab, Essam M. Ramzy Hamed","doi":"10.1109/ICDIPC.2015.7323044","DOIUrl":"https://doi.org/10.1109/ICDIPC.2015.7323044","url":null,"abstract":"E-learning has become an essential factor in the modern educational system. In today's diverse student population, E-learning must recognize the differences in student personalities to make the learning process more personalized. The objective of this study is to create a data model to identify both the student personality type and the dominant preference based on the Myers-Briggs Type Indicator (MBTI) theory. The proposed model utilizes data from student engagement with the learning management system (Moodle) and the social network, Facebook. The model helps students become aware of their personality, which in turn makes them more efficient in their study habits. The model also provides vital information for educators, equipping them with a better understanding of each student's personality. With this knowledge, educators will be more capable of matching students with their respective learning styles. The proposed model was applied on a sample data collected from the Business College at the German university in Cairo, Egypt (240 students). The model was tested using 10 data mining classification algorithms which were NaiveBayes, BayesNet, Kstar, Random forest, J48, OneR, JRIP, KNN /IBK, RandomTree, Decision Table. 
The results showed that OneR had the best accuracy percentage of 97.40%, followed by Random forest 93.23% and J48 92.19%.","PeriodicalId":339685,"journal":{"name":"2015 Fifth International Conference on Digital Information Processing and Communications (ICDIPC)","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115703649","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2015-10-01 | DOI: 10.1109/ICDIPC.2015.7323038
A. Rassaki, A. Nel
We present an analysis of the routing and bandwidth allocation problem in packet-switched networks. In this problem, we identify a route for every pair of communicating nodes and then assign a capacity to each link in the network so as to minimize the total line capacity and delay costs. We have developed a mathematical programming formulation that yields an efficient solution; computational results across a variety of networks indicate that the procedure is effective.
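The paper's own formulation is not given in the abstract; the classical baseline for this problem is Kleinrock's square-root capacity assignment, which minimizes mean M/M/1 link delay subject to a linear cost budget. A sketch under those assumptions:

```python
import math

def sqrt_capacity_assignment(flows, costs, budget):
    """Kleinrock square-root capacity assignment (a classical baseline,
    not necessarily the paper's model).

    flows: link flows lambda_i; costs: per-unit-capacity prices d_i.
    Picks capacities C_i minimizing mean M/M/1 delay subject to
    sum(d_i * C_i) == budget.
    """
    base = sum(d * f for d, f in zip(costs, flows))
    excess = budget - base
    assert excess > 0, "budget must exceed the minimum feasible cost"
    denom = sum(math.sqrt(f * d) for f, d in zip(flows, costs))
    # each link gets its flow plus a share of the excess proportional
    # to sqrt(lambda_i * d_i)
    return [f + (excess / d) * math.sqrt(f * d) / denom
            for f, d in zip(flows, costs)]
```

The square-root share is what distinguishes this from naive proportional allocation: lightly loaded links get relatively more headroom, which lowers the delay average.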
{"title":"Optimal capacity assignment in IP networks","authors":"A. Rassaki, A. Nel","doi":"10.1109/ICDIPC.2015.7323038","DOIUrl":"https://doi.org/10.1109/ICDIPC.2015.7323038","url":null,"abstract":"We present an analysis of the problem of routing and bandwidth allocation problem in packet switched networks. In this problem, we identify a route for every pair of communicating nodes and then assign a capacity to each link in the network in order to minimize the total line capacity and delay costs. We have developed a mathematical programming formulation which is an efficient solution. This formulation is indicated to be effective procedure based on computational results across a variety of networks.","PeriodicalId":339685,"journal":{"name":"2015 Fifth International Conference on Digital Information Processing and Communications (ICDIPC)","volume":"214 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116161928","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2015-10-01 | DOI: 10.1109/ICDIPC.2015.7323031
D. Tomtsis, Sotirios Kontogiannis, G. Kokkonis, I. Kazanidis, S. Valsamidis
A wearable health monitoring system is described that uses low-cost, off-the-shelf sensors and computing components, along with custom-made software, to provide an economical solution to personalized health care monitoring problems while retaining all the functionality and flexibility of more expensive systems. Cloud technology is used to store sensor measurements, which are categorized based on the criticality of the bio-signals. Furthermore, a new session protocol for medical sensor data transmission is proposed. In a prototype system, tests based on different transmission architectures have so far yielded useful and favorable results in comparison with existing protocols and usually more expensive systems.
{"title":"Proposed cloud infrastructure of wearable and ubiquitous medical services","authors":"D. Tomtsis, Sotirios Kontogiannis, G. Kokkonis, I. Kazanidis, S. Valsamidis","doi":"10.1109/ICDIPC.2015.7323031","DOIUrl":"https://doi.org/10.1109/ICDIPC.2015.7323031","url":null,"abstract":"A Wearable Health Monitoring system is described that uses low-cost off the shelf sensors and computing components, along with custom made software, to provide an economical solution to personalized health care monitoring problems, while retaining all the functionality and flexibility of more expensive systems. Cloud technology is used to store sensor measurements which are categorized based on the criticality of bio-signals. Furthermore, a new session protocol for medical sensor data transmission is proposed. In a prototype system, tests based on different transmission architectures have so far yielded useful and favorable results in comparison with existing protocols and usually more expensive systems.","PeriodicalId":339685,"journal":{"name":"2015 Fifth International Conference on Digital Information Processing and Communications (ICDIPC)","volume":"56 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124527571","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2015-10-01 | DOI: 10.1109/ICDIPC.2015.7323048
Taha Aljadir, Omar Mohyaldeen, Mouath Abdalrahman
Digital watermarking is an active field of investigation aimed at preventing unauthorized copying and duplication. In this paper, a comparative analysis is conducted between embedding the watermark on the edges of a cover image in the spatial domain and in the frequency domain. The Sobel edge detection algorithm is used to find edges, and the watermark is embedded on the edges in each domain using DWT and DCT. Results demonstrate that the embedding process in the frequency domain is more accurate and effective than in the spatial domain. Furthermore, the use of the four DWT frequency sub-bands makes the edge-embedding algorithm stronger and more robust, with low average MSE.
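As a sketch of the spatial-domain variant only (the DWT/DCT embedding and the paper's exact parameters are not reproduced here), Sobel edges can be located and watermark bits written into the least significant bits of the edge pixels:

```python
import numpy as np

def sobel_edges(img, thresh=100):
    """Boolean edge mask from Sobel gradient magnitude (pure NumPy)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])
    ky = kx.T
    h, w = img.shape
    gx = np.zeros((h, w)); gy = np.zeros((h, w))
    p = np.pad(img.astype(float), 1, mode="edge")
    for i in range(3):
        for j in range(3):
            win = p[i:i + h, j:j + w]
            gx += kx[i, j] * win
            gy += ky[i, j] * win
    return np.hypot(gx, gy) > thresh

def embed_on_edges(img, bits):
    """Write watermark bits into the LSB of edge pixels, scan order."""
    out = img.copy()
    ys, xs = np.nonzero(sobel_edges(img))
    for (y, x), b in zip(zip(ys, xs), bits):
        out[y, x] = (out[y, x] // 2) * 2 + b   # replace LSB
    return out
```

A frequency-domain version would apply the same bit replacement to selected DWT sub-band coefficients at edge locations instead of to raw pixels.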
{"title":"Comparison between embedding on edges in spatial and frequency domains","authors":"Taha Aljadir, Omar Mohyaldeen, Mouath Abdalrahman","doi":"10.1109/ICDIPC.2015.7323048","DOIUrl":"https://doi.org/10.1109/ICDIPC.2015.7323048","url":null,"abstract":"Practically, digital watermarking is considered as an advanced field of investigating to avoid unauthorized copying and duplication. In this paper, a comparative analysis is conducted among embedding the watermark on the edges of a cover image in two domains, spatial and frequency domains. In this work, the Sobel edge detection algorithm is used to find edges and embed the watermark on the edge in each domain using DWT and DCT. Results demonstrated that the embedding process in frequency domain is more accurate and effective than it in spatial domain. Furthermore, the use of the four bands of frequency in DWT makes embedding on edge algorithm stronger and more robust with low MSE rates averagely.","PeriodicalId":339685,"journal":{"name":"2015 Fifth International Conference on Digital Information Processing and Communications (ICDIPC)","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125499599","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2015-10-01 | DOI: 10.1109/ICDIPC.2015.7323028
Vladimír Hanušniak, Marian Svalec, Juraj Branický, L. Takac, M. Zábovský
The entire surface of the Slovak Republic is planned to be scanned, which created a need to store the resulting data and make it publicly available. For this purpose, a scalable file-based database system for storing and accessing large amounts of geographic point cloud data was developed. The principle of the system was tested and proved sufficient in most situations, but under certain circumstances a single-computer solution was not satisfactory. The system was therefore re-implemented using the Hadoop framework, and experiments with many configurations were performed. The results of these experiments are presented in this paper along with our conclusions.
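The abstract does not describe the storage layout; one plausible file-based scheme (an assumption, not the paper's design) partitions points into fixed-size spatial tiles, each of which maps to its own file or HDFS block:

```python
from collections import defaultdict

def tile_key(x, y, tile_size=1000.0):
    """Grid tile containing point (x, y); tile_size in map units."""
    return (int(x // tile_size), int(y // tile_size))

def partition(points, tile_size=1000.0):
    """Group (x, y, z) points by tile.

    With one file per tile, a window query only needs to open the
    tiles that intersect the query rectangle.
    """
    tiles = defaultdict(list)
    for x, y, z in points:
        tiles[tile_key(x, y, tile_size)].append((x, y, z))
    return tiles
```

Under Hadoop, the tile key would serve as the shuffle key, so each reducer writes one tile file and ingestion parallelizes naturally.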
{"title":"Exploitation of Hadoop framework for point cloud geographic data storage system","authors":"Vladimír Hanušniak, Marian Svalec, Juraj Branický, L. Takac, M. Zábovský","doi":"10.1109/ICDIPC.2015.7323028","DOIUrl":"https://doi.org/10.1109/ICDIPC.2015.7323028","url":null,"abstract":"It has been planned that the whole region of Slovak Republic's surface would be scanned, and there arose a need for storing the resulting data and making it publicly available. For this purpose, a scalable file-based database system for storing and accessing a large amount of geographic point cloud data was developed. The principle of the system was tested and proved to be sufficient in most situations, but under certain circumstances, single-computer solution was not satisfactory. So, the system was re-implemented using the Hadoop framework and experiments with many configurations were done. The results of the experiments are presented in this paper along with our conclusions.","PeriodicalId":339685,"journal":{"name":"2015 Fifth International Conference on Digital Information Processing and Communications (ICDIPC)","volume":"294 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126022357","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}