Low Power Wireless Sensor Node Platform for Agriculture Monitoring in Argentina
Pub Date: 2018-10-01 | DOI: 10.1109/CYBERC.2018.00029
A. Valenzuela, Mauro Schwab, Adolfo A. Silnik, Alfredo F. Debattista, Roberto A. Kiessling
We present the development and evaluation of a basic building block for a future wireless sensor network for agriculture monitoring in Argentina. The module consists of a compact battery-powered wireless sensor node capable of monitoring the ambient air parameters of temperature, humidity, gas and air pressure in the agriculture industry of Argentina's Pampa region. Further inputs and outputs allow the system to be extended flexibly by adding more sensors. Throughout the development, a simple, low-cost and open-source-based approach together with a lightweight communication protocol was pursued. The sensor nodes cover ranges of over 400 metres and can be operated on two AAA alkaline batteries for several years. Detailed current consumption values, range limits and battery life estimates are presented.
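As a rough illustration of how such a battery-life estimate follows from duty-cycled current consumption, the sketch below uses assumed placeholder figures rather than the paper's measured values:

```python
# Back-of-the-envelope battery life estimate for a duty-cycled sensor node.
# All numeric values are illustrative assumptions, not measurements from the paper.

CAPACITY_MAH = 1200.0      # usable capacity of two AAA alkaline cells (assumed)
SLEEP_CURRENT_MA = 0.005   # deep-sleep current (assumed)
ACTIVE_CURRENT_MA = 20.0   # current while measuring and transmitting (assumed)
ACTIVE_TIME_S = 1.0        # awake time per measurement cycle (assumed)
CYCLE_PERIOD_S = 900.0     # one measurement every 15 minutes (assumed)

# Average current is the duty-cycle-weighted mean of active and sleep currents.
duty = ACTIVE_TIME_S / CYCLE_PERIOD_S
avg_current_ma = duty * ACTIVE_CURRENT_MA + (1 - duty) * SLEEP_CURRENT_MA

lifetime_hours = CAPACITY_MAH / avg_current_ma
print(f"average current: {avg_current_ma:.4f} mA")
print(f"estimated lifetime: {lifetime_hours / 24 / 365:.1f} years")
```

With these assumed figures the estimate lands around five years, in line with the "several years" the abstract claims; the paper's own numbers would shift the result.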
{"title":"Low Power Wireless Sensor Node Platform for Agriculture Monitoring in Argentina","authors":"A. Valenzuela, Mauro Schwab, Adolfo A. Silnik, Alfredo F. Debattista, Roberto A. Kiessling","doi":"10.1109/CYBERC.2018.00029","DOIUrl":"https://doi.org/10.1109/CYBERC.2018.00029","url":null,"abstract":"We present the development and evaluation of a basic building block for a future wireless sensor network for agriculture monitoring in Argentina. The module consists of a compact battery-powered wireless sensor node capable of monitoring the ambient air parameters of temperature, humidity, gas and air pressure in the agriculture industry of Argentina's Pampa region. Further in-and outputs allow the system to be extended flexibly by adding more sensors. Throughout the development, a simple, low-cost and open-source-based approach together with a lightweight communication protocol was pursued. The sensor nodes cover ranges of over 400 metres and can be operated on two AAA alkaline batteries for several years. Detailed current consumption values, range limits and battery life estimates are presented.","PeriodicalId":282903,"journal":{"name":"2018 International Conference on Cyber-Enabled Distributed Computing and Knowledge Discovery (CyberC)","volume":"50a 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120858343","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
An Efficient High Speed AES Implementation Using Traditional FPGA and LabVIEW FPGA Platforms
Pub Date: 2018-10-01 | DOI: 10.1109/CYBERC.2018.00028
M. Rao, Admir Kaknjo, E. Omerdic, D. Toal, T. Newe
The LabVIEW FPGA platform is based on a graphical programming approach that simplifies FPGA programming and I/O interfacing, significantly improving design productivity and helping to reduce time to market. The traditional FPGA flow, on the other hand, helps achieve an efficient, optimized design by providing bit-level control through HDL programming languages. This work uses both the traditional and the LabVIEW FPGA platforms to obtain an optimized high-speed design of AES (Advanced Encryption Standard). AES is considered a secure and reliable cryptographic algorithm and is used worldwide to provide encryption services that hide information during communication over untrusted networks such as the Internet. Here, the AES core is proposed to secure the communication between an ROV (Remotely Operated Vehicle) and its control station in a marine environment, but the core can fit into any other high-speed electronic communication. The design encrypts 128-byte, 256-byte and 512-byte input sets (individually and simultaneously) using a 128-bit key. In the simultaneous implementation, all of the above input sets are encrypted in parallel, resulting in throughput in the Gbps range.
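For reference, a minimal software sketch of AES-128 over 128/256/512-byte buffers is shown below using the Python cryptography package; the cipher mode and key handling are assumptions, and the paper's actual designs run on traditional and LabVIEW FPGA hardware rather than in software:

```python
# Software reference for AES-128 encryption of 128/256/512-byte buffers.
# Illustrative only; mode (ECB) and key generation are assumptions, not the paper's design.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(16)  # 128-bit key, as in the paper

def aes128_encrypt(data: bytes, key: bytes) -> bytes:
    # ECB processes each 16-byte block independently, which is what lets block
    # computations be laid out in parallel on an FPGA.
    encryptor = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
    return encryptor.update(data) + encryptor.finalize()

for size in (128, 256, 512):
    plaintext = os.urandom(size)          # each size is a multiple of the 16-byte AES block
    ciphertext = aes128_encrypt(plaintext, key)
    print(size, len(ciphertext))
```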
{"title":"An Efficient High Speed AES Implementation Using Traditional FPGA and LabVIEW FPGA Platforms","authors":"M. Rao, Admir Kaknjo, E. Omerdic, D. Toal, T. Newe","doi":"10.1109/CYBERC.2018.00028","DOIUrl":"https://doi.org/10.1109/CYBERC.2018.00028","url":null,"abstract":"The LabVIEW FPGA platform is based on graphical programming approach, which makes easy the FPGA programming and the I/O interfacing. The LabVIEW FPGA significantly improves the design productivity and helps to reduce the time to market. On the other hand, traditional FPGA platform is helpful to get an efficient/optimized design by providing control over each bit using HDL programming languages. This work utilized traditional as well as LabVIEW FPGA platforms to get an optimized high speed design of AES (Advanced Encryption Standard). The AES is considered to be a secure and reliable cryptographic algorithm that is used worldwide to provide encryption services, which hide the information during communication over untrusted networks, like Internet. Here, AES core is proposed to secure the communication between ROV (Remotely Operated Vehicle) and control station in a marine environment; but this core can be fit in any other high speed electronic communications. This work provides encryption of 128-bytes, 256-bytes and 512-bytes set of inputs (individually and simultaneously) using a 128-bit key. In case of simultaneous implementation, all the above mentioned set of inputs is encrypted in parallel. This simultaneous implementation is resulted in throughput of Gbps range.","PeriodicalId":282903,"journal":{"name":"2018 International Conference on Cyber-Enabled Distributed Computing and Knowledge Discovery (CyberC)","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121921114","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Detecting Spammer on Micro-blogs Base on Fuzzy Multi-class SVM
Pub Date: 2018-10-01 | DOI: 10.1109/CYBERC.2018.00016
Guangxia Xu, G. Gao, Mengxiao Hu
Micro-blogs have become an important platform for information dissemination and exchange in people's social lives. Massive micro-blog data contains a great deal of valuable information, but in recent years micro-blog platforms have suffered from widespread spam behavior by spammers and spam micro-blogs. This spam not only degrades micro-blog data mining and decision analysis, but also seriously harms the healthy development of the micro-blog platform and the user experience. In this paper, a new spammer detection method for micro-blogs based on fuzzy multi-class support vector machines (FMCSVM) is proposed; it combines an SVM multi-class classifier with fuzzy mathematics theory for spammer detection. Current research on micro-blog spammers analyzes the characteristics of spammers globally, so these analyses lack strength and omit feature analysis for specific types of spammer; as a result, spammers can evade the detection system. In this paper, we divide spammers into three categories by analyzing the features of micro-blog spammers and then construct a one-versus-rest SVM multi-class classifier. A fuzzy clustering method is used to handle the mixed samples generated by the multi-class classifier, yielding a combined classifier that improves detection accuracy.
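A minimal sketch of the one-versus-rest SVM stage is given below on synthetic data; the features, the three category labels, and the data are assumptions, and the paper's fuzzy clustering step for mixed samples is not reproduced:

```python
# One-versus-rest SVM over three assumed spammer categories on synthetic features.
import numpy as np
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Assumed per-account features: posts/day, follower ratio, URL ratio, repost ratio.
X = rng.random((300, 4))
y = rng.integers(0, 3, size=300)  # three assumed spammer categories

clf = OneVsRestClassifier(SVC(kernel="rbf", probability=True)).fit(X, y)
print(clf.predict(X[:5]))
print(clf.predict_proba(X[:5]).round(2))  # mixed samples show no dominant class probability
```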
{"title":"Detecting Spammer on Micro-blogs Base on Fuzzy Multi-class SVM","authors":"Guangxia Xu, G. Gao, Mengxiao Hu","doi":"10.1109/CYBERC.2018.00016","DOIUrl":"https://doi.org/10.1109/CYBERC.2018.00016","url":null,"abstract":"Micro-blog has become an important information dissemination and exchange platform in people's social lives. Massive micro-blog data contains a large number of valuable information, but the micro-blog platform appears to have a lot of spam behavior problems in recent years; behavior consistent with spammers and spam micro-blogs. The spam not only affects the impact of micro-blog's data mining and decision analysis, but also seriously affects the healthy development of micro-blog platform and user experience. In this paper, a new spammer detection method based on fuzzy multi-class support vector machines (FMCSVM) is proposed in micro-blog, it combines the SVM multi-class classifier with the fuzzy mathematics theory in spammer detection. Current researches on micro-blog spammers is to analyze the characteristics of the global spammers, so that the strength of these analyses is not enough, and these researches lack the feature analysis for a certain type spammer. As a result, this will enable the spammer to escape the spam detection system. In this paper, we divide spammers into three categories by analyzing the features of micro-blog spammers, and then construct one-versus-rest SVM multi-class classifier. The fuzzy clustering method is used to deal with the mixed samples generated by the multi class classifier, and the combination classifier is obtained, which improves the detection accuracy.","PeriodicalId":282903,"journal":{"name":"2018 International Conference on Cyber-Enabled Distributed Computing and Knowledge Discovery (CyberC)","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128854441","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A Dynamic Proxy Based Crawler Strategy for Data Collection on CyberGIS
Pub Date: 2018-10-01 | DOI: 10.1109/CYBERC.2018.00094
Shumiao Yu, Weifeng Sun, Minghan Jia
With the development of geographic information systems, digital earth and digital city applications play increasingly important roles in daily life. The data generated by sensors and other edge nodes, such as the GIS data in CyberGIS, need to be collected by crawlers in distributed IoT systems. In some edge networks, operators have adopted anti-crawler measures such as blocking requesting IP addresses and requiring login verification codes to avoid disturbance to their servers. To collect data from web servers in these types of edge networks, a dynamic-IP-address-based strategy, DP-crawler, is proposed to overcome these anti-crawler strategies. DP-crawler dynamically obtains suitable IP addresses from a security-aware list and selects the best available proxies. The security-aware list is built on a blockchain, which provides both security and dynamic storage. In the experiments, DP-crawler is used to crawl the web and obtain detailed information on Douban movies. The experimental results show that more information can be collected using DP-crawler.
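A minimal sketch of rotating requests through a proxy list is shown below; the proxy addresses and target URL are placeholders, and the blockchain-backed security-aware list is omitted, so this is an illustration of the general idea rather than the paper's system:

```python
# Rotate through a list of proxies until one request succeeds.
from typing import Optional
import requests

proxy_list = [
    "http://203.0.113.10:8080",   # placeholder proxy addresses (assumed)
    "http://203.0.113.11:8080",
]

def fetch_with_proxies(url: str) -> Optional[str]:
    for proxy in proxy_list:
        try:
            resp = requests.get(url, proxies={"http": proxy, "https": proxy}, timeout=10)
            if resp.status_code == 200:
                return resp.text      # success: this proxy was not blocked
        except requests.RequestException:
            continue                   # blocked or unreachable: try the next proxy
    return None

html = fetch_with_proxies("https://movie.douban.com/")
print(html is not None)
```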
{"title":"A Dynamic Proxy Based Crawler Strategy for Data Collection on CyberGIS","authors":"Shumiao Yu, Weifeng Sun, Minghan Jia","doi":"10.1109/CYBERC.2018.00094","DOIUrl":"https://doi.org/10.1109/CYBERC.2018.00094","url":null,"abstract":"With the development of geographic information system, digital earth and digital city play more and more important roles in life. The data generated by sensors or other edge nodes need to be collected by crawlers in the distributed systems in IoT, such as the GIS data in CyberGIS. In some edge networks, network operators have adopted methods to limit crawlers, such as blocking the request IP addresses, requiring logging in verification codes and other measures to avoid disturbance to servers. To collect data from web servers in these types of edge networks, a dynamic IP address based strategy DP-crawler is proposed to solve the anti-crawler strategies in the edge networks. DP-crawler can dynamic get proper IP addresses from a security-aware list and select the best available proxies. The security-aware list is designed to use the block-chain. Security and dynamic storage can be achieved by this method. DP-crawler is used to crawler webs, and the detailed information of Douban movies are obtained in the experiments. The experiment results show that the DP-Crawler can get more information by using the DP-Crawler.","PeriodicalId":282903,"journal":{"name":"2018 International Conference on Cyber-Enabled Distributed Computing and Knowledge Discovery (CyberC)","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128230690","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A Binary Feature Extraction Based Data Provenance System Implemented on Flink Platform
Pub Date: 2018-10-01 | DOI: 10.1109/CYBERC.2018.00045
Yang Wang, Lan Li, Lei Fan
Data protection and control of information flow are basic requirements for the secure operation of enterprises and organizations. Document data provenance is a function that records the transmission of a specific document so that its provenance can be traced afterwards. As an important function of enterprise information security control, it has long been burdened by high management costs. This paper therefore attempts to recover document content by proactively monitoring the enterprise's internal traffic, restore the document, and accurately find its parent document through the proposed algorithm, thereby escaping the constraints of traditional document tracing. To ensure the flexibility and scalability of streaming data restoration, the algorithm modules are built on Flink, a stream processing platform, by migrating the key computing services to it. In this process, capture agents are placed at key nodes to collect traffic data, which is fed into the stream processing system through a message queue. The stream processing system restores the file using the document restoration algorithm and then hands it over to the feature extraction module. After the feature extraction module completes its analysis, the file is stored on a file system or structured data storage system to await document tracking requests. The whole solution is completely separated from the daily business of the enterprise, and the load it places on internal network traffic is very small. Relying on Flink's excellent distributed features, the experiments show that the data provenance results are satisfactory.
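As one hedged illustration of how a restored file might be matched to a parent document by binary features, the sketch below compares hashes of fixed-size chunks; this is an assumed simplification, not the paper's exact feature extraction algorithm:

```python
# Match a restored document to a candidate parent by overlapping chunk hashes.
# The chunking and similarity measure are assumptions for illustration only.
import hashlib

def chunk_hashes(data: bytes, chunk_size: int = 4096) -> set:
    return {
        hashlib.sha256(data[i:i + chunk_size]).hexdigest()
        for i in range(0, len(data), chunk_size)
    }

def similarity(child: bytes, parent: bytes) -> float:
    c, p = chunk_hashes(child), chunk_hashes(parent)
    return len(c & p) / max(len(c), 1)   # fraction of child chunks found in the parent

# A restored file would be attributed to the stored document with the highest similarity.
```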
{"title":"A Binary Feature Extraction Based Data Provenance System Implemented on Flink Platform","authors":"Yang Wang, Lan Li, Lei Fan","doi":"10.1109/CYBERC.2018.00045","DOIUrl":"https://doi.org/10.1109/CYBERC.2018.00045","url":null,"abstract":"Data protection and the control of information flow are basic requirements for the security operation of enterprises or organizations. The data provenance of documents is a function that records the transmission of a specific document and provenance afterwards. As an important function of enterprise information security control, it has been confronted with the trouble of high management costs. Therefore, this paper attempts to recover the document content by proactively monitoring the internal traffic data of the enterprise and restore the document and find the parent document accurately through the proposed algorithm, thereby getting rid of the shackle of traditional document tracing. In order to ensure the flexibility and scalability of the streaming data restoration, this paper tries to build algorithm modules based on Flink, a streaming process platform, by migrating key computing services to its platform. In the process, the capture agent is set at the key node to collect traffic data, which is put into the stream processing system through the message queue. The stream processing system restores the file using document restoration algorithm, and finally the file is handed over to the feature extraction module. After the feature extraction module completes the file analysis, it is stored on file systems or structed data storage systems and waits for document tracking requests. The entire system solution achieved above and the daily business of the enterprise are completely seperated, while the load on the internal network flow is also very small. On the other hand, relying on the advantages of Flink's excellent distributed features, the experiments show that the data provenance results are satisfactory.","PeriodicalId":282903,"journal":{"name":"2018 International Conference on Cyber-Enabled Distributed Computing and Knowledge Discovery (CyberC)","volume":"103 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127979672","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Improving Word Representation with Word Pair Distributional Asymmetry
Pub Date: 2018-10-01 | DOI: 10.1109/CYBERC.2018.00024
Chuan Tian, Wenge Rong, Y. Ouyang, Zhang Xiong
Distributed word representations have demonstrated impressive improvements on numerous natural language processing applications. However, most existing word representation learning methods rarely make use of word order information, which leads to confusion between similarity and relevance. Targeting this problem, we propose a general learning approach, DAV (Distributional Asymmetry Vector), that builds better word representations by utilizing word pair distributional asymmetry, which contains word order information. An experimental study on two large benchmarks with several state-of-the-art word representation learning models has shown the potential of the proposed method.
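As a toy illustration of word-pair distributional asymmetry, the sketch below derives an asymmetry score from ordered co-occurrence counts; the measure is an assumed example, and the DAV construction in the paper may differ:

```python
# Ordered co-occurrence counts and a simple asymmetry score on a toy corpus.
from collections import Counter

corpus = ["the cat sat on the mat".split(), "the dog sat on the rug".split()]

ordered = Counter()          # counts of (w1, w2) with w1 occurring before w2
window = 2
for sent in corpus:
    for i, w1 in enumerate(sent):
        for w2 in sent[i + 1:i + 1 + window]:
            ordered[(w1, w2)] += 1

def asymmetry(w1: str, w2: str) -> float:
    fwd, bwd = ordered[(w1, w2)], ordered[(w2, w1)]
    total = fwd + bwd
    return 0.0 if total == 0 else (fwd - bwd) / total   # +1 means w1 always precedes w2

print(asymmetry("sat", "on"), asymmetry("on", "sat"))   # 1.0 and -1.0 on this corpus
```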
{"title":"Improving Word Representation with Word Pair Distributional Asymmetry","authors":"Chuan Tian, Wenge Rong, Y. Ouyang, Zhang Xiong","doi":"10.1109/CYBERC.2018.00024","DOIUrl":"https://doi.org/10.1109/CYBERC.2018.00024","url":null,"abstract":"Distributed word representation has demonstrated impressive improvements on numerous natural language processing applications. However, most existing word representation learning methods rarely consider use of word order information, and lead to confusion of similarity and relevance. Targeting on this problem we propose a general learning approach DAV (Distributional Asymmetry Vector) to build better word representation by utilizing word pair distributional asymmetry, which contains word order information. Experimental study on two large benchmarks with several state-of-art word representation learning models has shown the potential of the proposed method.","PeriodicalId":282903,"journal":{"name":"2018 International Conference on Cyber-Enabled Distributed Computing and Knowledge Discovery (CyberC)","volume":"58 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116438071","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The Optimization of Big Data Platform Under the Internet of Things
Pub Date: 2018-10-01 | DOI: 10.1109/CYBERC.2018.00034
Suzhen Wang, Yanpiao Zhang, Lu Zhang, Ning Cao
The development of the Internet of Things (IoT) has produced a huge diversity of data. In view of the IoT's need for massive data processing and application, big data service platforms have emerged. This paper studies the optimization of a big data platform in the context of the IoT and combines the IoT with the big data platform Spark. We propose an improved Spark job scheduling scheme based on genetic and tabu search algorithms. By optimizing Spark's job scheduling algorithm, the scheme provides better technical support for data processing in the Internet of Things.
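A toy sketch of a genetic algorithm with a tabu-style restriction for assigning jobs to executors is given below; the job costs, population settings, and makespan fitness model are assumptions for illustration and are not taken from the paper:

```python
# Genetic algorithm with a tabu-style mutation restriction for job-to-executor assignment.
import random

random.seed(1)
JOBS = [random.randint(1, 20) for _ in range(12)]   # assumed job costs
EXECUTORS = 3

def makespan(assign):
    loads = [0] * EXECUTORS
    for cost, ex in zip(JOBS, assign):
        loads[ex] += cost
    return max(loads)                                # fitness: lower is better

def crossover(a, b):
    cut = random.randrange(len(a))
    return a[:cut] + b[cut:]

def mutate(assign, tabu):
    child = list(assign)
    i = random.randrange(len(child))
    if (i, child[i]) in tabu:                        # tabu list blocks recently revisited moves
        return child
    tabu.append((i, child[i]))
    if len(tabu) > 10:
        tabu.pop(0)
    child[i] = random.randrange(EXECUTORS)
    return child

population = [[random.randrange(EXECUTORS) for _ in JOBS] for _ in range(20)]
tabu = []
for _ in range(200):
    population.sort(key=makespan)
    parents = population[:10]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)), tabu)
                for _ in range(10)]
    population = parents + children
print("best makespan:", makespan(min(population, key=makespan)))
```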
{"title":"The Optimization of Big Data Platform Under the Internet of Things","authors":"Suzhen Wang, Yanpiao Zhang, Lu Zhang, Ning Cao","doi":"10.1109/CYBERC.2018.00034","DOIUrl":"https://doi.org/10.1109/CYBERC.2018.00034","url":null,"abstract":"The development of the Internet of things(IOT) has produced huge diversity of data. In view of the need for massive data processing and application by the Internet of things, the big data service platform arises at the historic moment. Our paper mainly studies the optimization of the big data platform in the background of Internet of things, and combines the Internet of things with the big data platform–Spark. In this paper, we proposed an improved Spark job scheduling scheme based on the genetic and tabu search algorithm. By optimizing the job scheduling algorithm of Spark, it will provide the better technical support for data processing in the Internet of things.","PeriodicalId":282903,"journal":{"name":"2018 International Conference on Cyber-Enabled Distributed Computing and Knowledge Discovery (CyberC)","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121071434","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
QAM Division Based Space-Time Modulation for Two-User Uplink Massive MIMO Systems
Pub Date: 2018-10-01 | DOI: 10.1109/CYBERC.2018.00065
G. Han, Linxin Zhang, X. Mu, Dalong Zhang, Yi Sun
A two-user uplink massive MIMO system is considered in this paper, where each user has a single antenna and the base station (BS) is equipped with a large number of antennas. The small-scale channel fading is assumed to be Rayleigh, and the channel fading coefficients remain quasi-static over two consecutive slots and then change independently to other values in the next two slots. For such a massive MIMO uplink system, a QAM-division-based space-time modulation scheme is proposed to enable simultaneous communication of the two users on the same frequency, and four detectors are given to suit different conditions. In addition, the channel coefficients can be recovered once the signals are correctly detected. Computer simulations demonstrate that the proposed scheme performs well and needs fewer than 100 BS antennas to bring the average BER below 10^-3.
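For context, the generic two-user uplink signal model over the two quasi-static slots described above can be written as follows (this is the standard model, not the paper's specific QAM-division construction):

\[
\mathbf{y}_t = \mathbf{h}_1 x_{1,t} + \mathbf{h}_2 x_{2,t} + \mathbf{n}_t , \qquad t = 1, 2,
\]

where \(\mathbf{y}_t \in \mathbb{C}^{M}\) is the received vector at the \(M\)-antenna BS, \(\mathbf{h}_k\) is the Rayleigh-fading channel of user \(k\) (constant over the two slots), \(x_{k,t}\) is its transmitted QAM symbol, and \(\mathbf{n}_t\) is noise.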
{"title":"QAM Division Based Space-Time Modulation for Two-User Uplink Massive MIMO Systems","authors":"G. Han, Linxin Zhang, X. Mu, Dalong Zhang, Yi Sun","doi":"10.1109/CYBERC.2018.00065","DOIUrl":"https://doi.org/10.1109/CYBERC.2018.00065","url":null,"abstract":"A two-user uplink massive MIMO system is considered in this paper, where each user has a single antenna and the base station (BS) is equipped with a large number of antennas. It is assumed that the small scale channel fading is Rayleigh fading and the channel fading coefficients keep quasi-static in two consecutive slots, and then, change to other values independently in the next two slots. For such a massive MIMO uplink system, a QAM division based space-time modulation scheme is proposed to execute the simultaneous communication of the two users with the same frequency, and four detectors are given to adapt to different conditions. In addition, the channel coefficients can also be figured out after the signals are correctly detected. Computer simulations demonstrate that the proposed scheme performs well and need less than 100 BS antennas to make the average BER below 10^3.","PeriodicalId":282903,"journal":{"name":"2018 International Conference on Cyber-Enabled Distributed Computing and Knowledge Discovery (CyberC)","volume":"46 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126581357","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A Parafac-Based Blind Channel Estimation and Symbol Detection Scheme for Massive MIMO Systems
Pub Date: 2018-10-01 | DOI: 10.1109/CYBERC.2018.00069
Lingxiao Zhao, Shuangzhi Li, Jiankang Zhang, X. Mu
In this paper, a multi-user massive multiple-input multiple-output (MIMO) uplink system is considered, in which multiple single-antenna users communicate with a target BS equipped with a large antenna array. We assume that neither the BS nor the K users have any channel state information. For such a system, by utilizing the unique factorization of three-way tensors, we propose a PARAFAC-based blind channel estimation and symbol detection scheme for the massive MIMO system; the proposed scheme ensures unique identification of the channel matrix and the symbol matrix in the noise-free case. In the noisy case, a novel fitting algorithm called constrained bilinear alternating least squares is proposed to efficiently estimate the channel matrix and the symbols. Numerical simulation results illustrate that the proposed scheme achieves better bit error ratio and normalized mean square error performance than the traditional least squares method. In addition, it converges faster than the typical alternating least squares fitting algorithm.
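For context, the trilinear (PARAFAC) decomposition underlying such unique-factorization arguments expresses each entry of a three-way tensor as

\[
x_{i,j,k} = \sum_{r=1}^{R} a_{ir}\, b_{jr}\, c_{kr},
\]

where \(\mathbf{A}\), \(\mathbf{B}\), \(\mathbf{C}\) are the factor matrices, and uniqueness up to scaling and permutation is guaranteed under Kruskal's condition \(k_{\mathbf{A}} + k_{\mathbf{B}} + k_{\mathbf{C}} \ge 2R + 2\). How the paper maps the channel, symbols, and slots onto these factors is not detailed in the abstract.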
{"title":"A Parafac-Based Blind Channel Estimation and Symbol Detection Scheme for Massive MIMO Systems","authors":"Lingxiao Zhao, Shuangzhi Li, Jiankang Zhang, X. Mu","doi":"10.1109/CYBERC.2018.00069","DOIUrl":"https://doi.org/10.1109/CYBERC.2018.00069","url":null,"abstract":"In this paper, a multi-user massive multiple-input and multiple-output (MIMO) uplink system is considered, in which multiple single antenna users communicate with a target BS equipped with a large antenna array. We assume both the BS and K users have no knowledge of channel statement information. For such a system, by utilizing the unique factorization of three-way tensors, we proposed a parafac-based blind channel estimation and symbol detection scheme for the massive MIMO system, the proposed system can ensure the unique identification of the channel matrix and symbol matrix in a noise-free case. In a noisy case, a novel fitting algorithm called constrained bilinear alternating least squares is proposed to efficiently estimate the channel matrix and symbols. Numerical simulation results illustrate that the proposed scheme has a superior bit error ratio and normalized mean square error performance than traditional least square method. In addition, it has a faster convergence speed than typical alternation least square fitting algorithm.","PeriodicalId":282903,"journal":{"name":"2018 International Conference on Cyber-Enabled Distributed Computing and Knowledge Discovery (CyberC)","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125609324","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Design and Implementation of Video Analytics System Based on Edge Computing
Pub Date: 2018-10-01 | DOI: 10.1109/CYBERC.2018.00035
Yuejun Chen, Yinghao Xie, Yihong Hu, Yaqiong Liu, Guochu Shou
Real-time video analytics, whose applications range from safety and public security to smart cities, is a typical use case of the Internet of Things (IoT). However, uploading the video stream to the cloud for analytics cannot meet the requirements of low latency and efficient bandwidth usage. Edge video analytics, which uploads the stream only as far as the edge node, is key to solving this problem. This paper proposes an intelligent video analytics system on an edge computing platform. Combining edge computing and video analytics, the system can analyze the video stream through face recognition, indoor positioning and semantic analytics in real time, and archive the videos automatically. Specifically, applied in a conference room, the video analytics system analyzes the conference room scenario and files the conference videos, which reduces the cost of manual recording and promotes data sharing. The implementation results prove that our system can operate smoothly on the edge computing platform to provide real-time and efficient video analytics services.
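A minimal edge-side sketch of the capture-detect-archive loop is shown below using OpenCV; the Haar cascade face detector, camera index, output path, and frame limit are assumptions standing in for the system's face recognition and archiving modules:

```python
# Edge-side loop: capture frames, detect faces, archive frames containing people.
# Detector, camera index, output file, and frame limit are illustrative assumptions.
import cv2

cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)            # local camera on the edge node (assumed)
writer = None

for _ in range(1000):                # bounded run for the sketch
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if writer is None:
        h, w = frame.shape[:2]
        writer = cv2.VideoWriter("meeting.avi", cv2.VideoWriter_fourcc(*"XVID"), 20.0, (w, h))
    if len(faces) > 0:               # archive only frames where people are present
        writer.write(frame)

cap.release()
if writer is not None:
    writer.release()
```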
{"title":"Design and Implementation of Video Analytics System Based on Edge Computing","authors":"Yuejun Chen, Yinghao Xie, Yihong Hu, Yaqiong Liu, Guochu Shou","doi":"10.1109/CYBERC.2018.00035","DOIUrl":"https://doi.org/10.1109/CYBERC.2018.00035","url":null,"abstract":"Real-Time video analytics, whose applications range from safety, public security to smart cities, is a typical use case of Internet of Things (IoT). However, uploading the video stream to the cloud for analytics cannot meet the requirements of low latency and efficient bandwidth usage. Edge video analytics, which uploads the stream at the edge node, is a key to solve the abovementioned problem. This paper proposes an intelligent video analytics system on edge computing platform. Combining the edge computing and video analytics, this system can analyze the video stream by face recognition, indoor positioning, and semantic analytics in real time and archive the videos automatically. Specifically, applied in conference room, the video analytics system analyzes the conference room scenario and files the conference videos, which reduces the cost of manual recording and promotes the data sharing. The implementation results prove that our system can operate smoothly on the edge computing platform to provide real-time and efficient video analytics services.","PeriodicalId":282903,"journal":{"name":"2018 International Conference on Cyber-Enabled Distributed Computing and Knowledge Discovery (CyberC)","volume":"79 290 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125965047","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}