
Latest publications from 2019 Big Data, Knowledge and Control Systems Engineering (BdKCSE)

Investigation of Strong Geomagnetic Storms Using Multidisciplinary Big Data Sets
Pub Date : 2019-11-01 DOI: 10.1109/BdKCSE48644.2019.9010611
Lyubka Pashova, B. Srebrov, O. Kounchev
The paper contains an overview of world data centres such as INETMAGNET, SWS, DIAS, IGS, and EUREF, which are repositories of scientific Big Data sets for studying geomagnetic storms by means of the available geomagnetic, ionospheric and GNSS data. As an example, the paper presents the results of a study based on wavelet analysis of the local manifestation of the geomagnetic storm of September 7–8, 2017, using time series of geophysical parameters observed at different stations in the Balkans and obtained from satellite observations. These data include global geomagnetic indices as well as local data: the H-component of the geomagnetic field, the critical frequency foF2 of the ionospheric F2 layer, and VTEC derived from GPS observations at a separate measuring station. Based on the performed joint analysis and comparison of the geophysical parameters, specific features of the local manifestation of this storm event are outlined.
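The paper itself includes no code; as a hedged illustration of the kind of wavelet analysis described above, the minimal sketch below applies a continuous wavelet transform to a synthetic one-minute H-component series with PyWavelets. The signal, sampling step and choice of Morlet wavelet are assumptions for demonstration, not material from the paper.

```python
# Hedged sketch: continuous wavelet transform of a synthetic geomagnetic
# H-component series; signal and parameters are illustrative only.
import numpy as np
import pywt

dt = 60.0                                  # assumed 1-minute sampling [s]
t = np.arange(0, 48 * 3600, dt)            # two days of synthetic data
# Quiet diurnal variation plus a storm-like depression of the H-component.
h = 20 * np.sin(2 * np.pi * t / 86400.0) - 80 * np.exp(-((t - 1.2e5) / 1.5e4) ** 2)

scales = np.arange(1, 128)
coeffs, freqs = pywt.cwt(h, scales, 'morl', sampling_period=dt)

# Wavelet power; its maxima indicate the dominant periods during the storm.
power = np.abs(coeffs) ** 2
print(power.shape, freqs.min(), freqs.max())
```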
Citations: 1
Using Big Data for Data Leak Prevention
Pub Date : 2019-11-01 DOI: 10.1109/BdKCSE48644.2019.9010596
Ivan Gaidarski, P. Kutinchev
The paper presents our approach to protecting sensitive data using the methods of Big Data. To effectively protect the valuable information within the organization, the following steps are needed: employing a holistic approach to data classification, identifying the sensitive data of the organization, identifying the critical exit points (communication channels, applications and connected devices), and protecting the sensitive data by controlling those exit points. Our approach is based on creating a component-based architecture framework for ISS, conceptual models for data protection, and an implementation with COTS IT security products such as Data Leak Prevention (DLP) solutions. Our approach is data-centric and thus holistic by nature, protecting the meaningful data of the organization.
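As a hedged sketch of the "control the critical exit points" step above, the snippet below scans an outbound payload against a few illustrative sensitive-data patterns before it leaves the organization. The pattern set and helper name are assumptions; a real DLP deployment would derive them from the data classification step.

```python
# Hedged sketch: scanning an outbound payload at an exit point for patterns
# marked as sensitive by a (hypothetical) data classification step.
import re

SENSITIVE_PATTERNS = {                      # illustrative patterns only
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email":       re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def check_outbound(payload: str) -> list[str]:
    """Return the names of sensitive-data patterns found in an outbound payload."""
    return [name for name, rx in SENSITIVE_PATTERNS.items() if rx.search(payload)]

if __name__ == "__main__":
    msg = "Please invoice card 4111 1111 1111 1111 to jane.doe@example.com"
    hits = check_outbound(msg)
    if hits:
        print("blocked at exit point, matched:", hits)   # block or alert here
```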
Citations: 2
Graph Processing with Different Data Structures
Pub Date : 2019-11-01 DOI: 10.1109/BdKCSE48644.2019.9010608
M. Chernoskutov
The paper describes the performance of graph algorithms when different types of data structures are used. To achieve that, we developed a multi-level graph processing system, which allows graph applications to be created independently of implementation details such as the graph data structure or the underlying computational architecture. We measure the performance of breadth-first search, max-flow and random graph building algorithms when using compressed sparse row and adjacency matrix data structures. The experiments reveal different graph processing rates for different data structures, which indicates the need to use specific data structures for specific algorithms to achieve the highest performance.
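As a hedged illustration of one of the measured combinations (breadth-first search over a compressed sparse row layout), the sketch below stores a small graph in CSR form and runs BFS over it. The example graph and function name are assumptions, not code from the paper's system.

```python
# Minimal sketch: breadth-first search over a graph stored in CSR form
# (row_ptr / col_idx arrays), one of the compared data-structure variants.
from collections import deque

# Tiny undirected example graph with 5 vertices; edges: 0-1, 0-2, 1-3, 2-3, 3-4.
row_ptr = [0, 2, 4, 6, 9, 10]               # index range into col_idx per vertex
col_idx = [1, 2, 0, 3, 0, 3, 1, 2, 4, 3]    # neighbour lists, concatenated

def bfs_csr(row_ptr, col_idx, source):
    """Return the BFS level (distance in edges from source) of every vertex."""
    n = len(row_ptr) - 1
    level = [-1] * n
    level[source] = 0
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in col_idx[row_ptr[u]:row_ptr[u + 1]]:
            if level[v] == -1:
                level[v] = level[u] + 1
                queue.append(v)
    return level

print(bfs_csr(row_ptr, col_idx, 0))         # -> [0, 1, 1, 2, 3]
```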
Citations: 0
Application of the InterCriteria Analysis Method to a Dataset of Malignant Neoplasms of the Digestive Organs for the Burgas Region for 2014–2018
Pub Date : 2019-11-01 DOI: 10.1109/BdKCSE48644.2019.9010609
E. Sotirova, V. Vasilev, G. Bozova, H. Bozov, S. Sotirov
The aim of this paper is to analyze statistical data on the registered patients with malignant neoplasms of the digestive organs in the Burgas region for the period 2014–2018. The InterCriteria Analysis method is applied. The results are commented on from different points of view: the relations between the age of the patients with malignant neoplasms of the digestive organs, and the relations between marital status, year of registration of the patient, and type of malignant neoplasm of the digestive organs. The results obtained with the InterCriteria Analysis method are compared with statistical analyses according to Pearson, Kendall and Spearman.
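As a hedged sketch of the comparison step mentioned above, the snippet below computes the Pearson, Kendall and Spearman coefficients on two invented criterion columns with SciPy; the data are illustrative, and the InterCriteria Analysis itself is not reproduced here.

```python
# Hedged sketch: the three classical correlation coefficients used for the
# comparison (Pearson, Kendall, Spearman) on two invented criterion columns.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
criterion_a = np.arange(10, dtype=float)                # e.g. age group index
criterion_b = 3 * criterion_a + rng.normal(0, 2, 10)    # e.g. registrations per group

for name, func in [("Pearson", stats.pearsonr),
                   ("Kendall", stats.kendalltau),
                   ("Spearman", stats.spearmanr)]:
    coef, p_value = func(criterion_a, criterion_b)
    print(f"{name}: coefficient = {coef:.3f}, p-value = {p_value:.3g}")
```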
Citations: 1
Building Lepida ScpA BigData Infrastructure
Pub Date : 2019-11-01 DOI: 10.1109/BdKCSE48644.2019.9010604
G. P. Jesi, Elisabetta Gori, Stefano Micocci, G. Mazzini
This paper describes the design and implementation of the Lepida ScpA BigData infrastructure. Our goal is to provide the Regional PA with proper tools to address future challenges such as planning the allocation of resources and creating new business models involving public and private organizations. Our first design of the infrastructure starts from a specific scenario and addresses a particular aspect: ingesting the Regional public WiFi data traffic and gathering interesting analytics. We describe the challenges we faced, the choices we made during the process, and the final results we achieved.
Citations: 2
Optimal Data Traffic and Computer Processing by a Generalized Network Flow Model with Gains and Losses
Pub Date : 2019-11-01 DOI: 10.1109/BdKCSE48644.2019.9010613
V. Sgurev, L. Doukovska, S. Drangajov
The use of a generalized network flow model with gains and losses is proposed for solving the problem of optimal network traffic and data processing from sources to consumers. The problem of finding the traffic of minimal cost is reduced to solving a linear network flow problem. A method for network flow optimization is proposed for finding the maximal data flow from sources to consumers, and it is shown that this flow is equal to the minimal cut of the network between sources and consumers. The theoretical results obtained in the present work are confirmed by two appropriate numerical examples.
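The generalized model with gain and loss coefficients on the arcs is not reproduced here; as a hedged sketch of the plain linear minimal-cost flow problem it is reduced to, the snippet below solves a toy instance with NetworkX. The graph, capacities and costs are invented for illustration.

```python
# Hedged sketch: a plain linear min-cost network flow instance (no gain/loss
# factors) solved with NetworkX; all data are invented for illustration.
import networkx as nx

G = nx.DiGraph()
# demand < 0 marks a supply node, demand > 0 a consumption node.
G.add_node("source",   demand=-10)
G.add_node("consumer", demand=10)
G.add_edge("source", "relay1",   capacity=7, weight=2)   # weight = cost per unit
G.add_edge("source", "relay2",   capacity=6, weight=3)
G.add_edge("relay1", "consumer", capacity=7, weight=1)
G.add_edge("relay2", "consumer", capacity=6, weight=1)

flow = nx.min_cost_flow(G)                       # arc-by-arc optimal flow
print(flow)
print("total cost:", nx.cost_of_flow(G, flow))   # 7*3 + 3*4 = 33 for this toy case
```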
Citations: 0
Effect of In-Drilling Alignment with General Dynamic Error Model on Azimuth Estimation
Pub Date : 2019-11-01 DOI: 10.1109/BdKCSE48644.2019.9010651
Kelly Ursenbach, M. Mintchev
Directional drilling is common in oil and natural gas extraction, offering numerous advantages. However, directional drilling requires accurate trajectory measurements. Azimuth is a critical part of the trajectory measurement, but industry standard methods of determining azimuth have drawbacks. Inertial navigation systems (INS) have been proposed as a solution but require periodic calibration. In-drilling alignment (IDA) is a calibration method which uses measured motion of the inertial measurement unit (IMU) to overcome the observability problems that come with a static calibration. This paper presents a novel model which converts IDA motion and inertial measurements into an azimuth estimate. The model is validated using a device designed to carry out IDA under laboratory conditions.
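The paper's general dynamic error model is not reproduced here; as a much simpler, hedged illustration of how gyro measurements relate to azimuth in the first place, the sketch below computes a classic static gyrocompassing estimate from leveled horizontal readings of the Earth-rotation rate. The latitude and the true azimuth used to synthesize the readings are invented, and this is not the IDA procedure itself.

```python
# Hedged sketch: classic *static* gyrocompassing from leveled horizontal gyro
# readings of the Earth rotation rate (not the paper's dynamic IDA model).
import math

EARTH_RATE = 7.292115e-5             # Earth rotation rate [rad/s]
latitude = math.radians(51.0)         # assumed site latitude

true_azimuth = math.radians(35.0)     # ground truth used to synthesize readings
omega_h = EARTH_RATE * math.cos(latitude)      # horizontal Earth-rate component
gyro_x = omega_h * math.cos(true_azimuth)      # leveled body-x gyro reading
gyro_y = -omega_h * math.sin(true_azimuth)     # leveled body-y gyro reading

azimuth_est = math.atan2(-gyro_y, gyro_x)      # static gyrocompassing formula
print(f"estimated azimuth: {math.degrees(azimuth_est):.2f} deg")   # ~35.00
```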
Citations: 2
Thermoelectric Cooling Driver for Laser Projection Systems
Pub Date : 2019-11-01 DOI: 10.1109/BdKCSE48644.2019.9010606
S. Ilchev, Z. Ilcheva
In this paper, we present our new design of a thermoelectric cooling driver for laser projection systems. It is a reliable, effective cooling solution for semiconductor laser diodes that are used instead of traditional displays or projectors when higher brightness or contrast is required. Typical applications are multimedia shows, quality improvement in production facilities or marketing in public areas. The laser animations usually require rapid changes in the output power that are difficult to predict. Our driver is capable of reacting to them by actively pumping the excess power generated by the diodes into the main heatsink of the system. The heat transfer speed is regulated by the driver in real time to maintain an optimum working temperature.
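The paper does not publish its control law; as a hedged sketch of the real-time regulation described above, the snippet below runs a simple PID loop that adjusts the thermoelectric-cooler drive from a measured diode temperature. The gains, setpoint and the sensor/driver helper functions are assumptions for illustration only.

```python
# Hedged sketch: a simple PID loop adjusting the TEC drive from the measured
# diode temperature; gains, setpoint and I/O helpers are hypothetical.
import time

SETPOINT_C = 25.0                  # assumed target diode temperature [°C]
KP, KI, KD = 4.0, 0.5, 0.1         # assumed PID gains
DT = 0.1                           # control period [s]

def read_diode_temperature() -> float:       # hypothetical sensor access
    return 27.3

def set_tec_drive(duty: float) -> None:      # hypothetical driver output (0..1)
    print(f"TEC duty set to {duty:.2f}")

integral, prev_error = 0.0, 0.0
for _ in range(3):                           # a few iterations for demonstration
    error = read_diode_temperature() - SETPOINT_C
    integral += error * DT
    derivative = (error - prev_error) / DT
    duty = max(0.0, min(1.0, KP * error + KI * integral + KD * derivative))
    set_tec_drive(duty)
    prev_error = error
    time.sleep(DT)
```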
Citations: 1
Company Industry Classification with Neural and Attention-Based Learning Models
Pub Date : 2019-11-01 DOI: 10.1109/BdKCSE48644.2019.9010667
S. Slavov, Andrey Tagarev, Nikola Tulechki, S. Boytcheva
This paper compares different solutions for the task of classifying companies according to an industry classification scheme. Recent advances in deep learning methods show better performance on the text classification task. The dataset consists of short textual descriptions of companies and their economic activities. The target classification schemes are built by mapping related open data in a semi-controlled manner, and the target classes are built from the bottom up from DBpedia. The experiments use modifications of BERT, XLNet, GloVe and ULMFiT with pre-trained models for English, and two simple models with a perceptron architecture are used as the baseline. The results show that the best performance for multi-label classification of DBpedia company abstracts is achieved by the BERT and XLNet models, even for unbalanced classes.
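As a hedged sketch of the multi-label setup the comparison above refers to, the snippet below loads a BERT checkpoint configured for multi-label classification with the Hugging Face transformers library. The checkpoint name, label count and example text are assumptions, and nothing here reproduces the paper's training or data.

```python
# Hedged sketch: BERT configured for multi-label text classification with
# Hugging Face transformers; checkpoint, label count and text are illustrative.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

NUM_LABELS = 5                                   # assumed number of industry labels
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased",
    num_labels=NUM_LABELS,
    problem_type="multi_label_classification",   # sigmoid outputs + BCE loss
)

text = "The company manufactures industrial pumps and provides maintenance services."
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

probs = torch.sigmoid(logits)[0]                 # independent per-label probabilities
predicted = (probs > 0.5).nonzero(as_tuple=True)[0].tolist()
print(probs.tolist(), predicted)
```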
Citations: 1
Analysis of the Functionalities of a Shared ICS Security Operations Center
Pub Date : 2019-11-01 DOI: 10.1109/BdKCSE48644.2019.9010607
Willian Dimitrov, Svetlana Syarova
The basic step in the design of a security operations center (SOC) is identifying the necessary functions it needs to perform. The article offers an analysis of ICS SOC functionalities and focuses on creating part of the concept of operations before the actual design of a Shared ICS SOC. We propose a set of Shared ICS SOC functionalities and analyze their effectiveness. The survey is based on a review of the legal framework, ICS security incidents, and research on the gaps between cybersecurity products and the real needs of the ICS and SCADA community. A Shared SOC performs the role of a community service hub with integrated experience, supplying security services for multiple ICS. By outsourcing these services, a company can reduce its security staff and focus on its core business.
Citations: 2