
Latest publications: 2018 14th International Conference on Semantics, Knowledge and Grids (SKG)

2018 14th International Conference on Semantics, Knowledge and Grids
Pub Date : 2018-09-01 DOI: 10.1109/skg.2018.00001
Citations: 0
Requirement Analysis of Operational Network Organization Based on PDOA
Pub Date : 2018-09-01 DOI: 10.1109/SKG.2018.00043
Jianfeng Hou, Ruicheng Yan
Requirement analysis of network organization is critical for both network efficiency and operational efficiency in operational planning. Currently, most researchers pay close attention to the organization itself but little to the requirement analysis of the organization. In this paper, we propose a novel approach to operational network organization based on the Problem Domains Oriented Analysis (PDOA) model, aiming to meet the need for precise operational network organizing. We first introduce the PDOA model and the concrete analysis process. Afterwards, a list of network capability requirements is proposed and classified using the PDOA model. Finally, based on the context diagram and problem diagram analysis, we divide the problem domain in detail.
Citations: 0
Provenance Collection Platform for the Weather Research and Forecasting Model
Pub Date : 2018-09-01 DOI: 10.1109/SKG.2018.00009
Alper Tufek, A. Gurbuz, Omer Faruk Ekuklu, M. Aktaş
Loss of life and property, disruptions to transportation and trading operations, and other consequences of meteorological events increasingly highlight the importance of fast and accurate weather forecasting. For this reason, various Numerical Weather Prediction (NWP) models worldwide are run on either a local or a global scale. Depending on the input parameters and the size of the forecast domain, however, NWP models typically take hours to finish a complete run. Provenance information is of central importance for detecting unexpected events that may develop during model execution, and for taking necessary action as early as possible. In addition, the need to share scientific data and results between researchers highlights the importance of data quality and reliability, which can only be achieved through provenance information collected during the entire lifecycle of the data of interest. The Weather Research and Forecasting (WRF) Model is an open-source Numerical Weather Prediction model. In this study, we develop a framework for tracking the WRF model and for generating, storing and analyzing provenance data. The proposed system enables easy management and understanding of numerical weather forecast workflows by providing provenance graphs. By analyzing these graphs, potential faulty situations that may occur during the execution of WRF can be traced to their root causes. Our proposed system has been evaluated and shown to perform well even under a high-frequency provenance information flow.
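The lineage-tracing idea behind such a platform can be sketched as a minimal provenance store. The stage names below (geogrid, metgrid, wrf) are real WRF preprocessing/run stages, but the record schema and helper names are illustrative, not the paper's implementation:

```python
from dataclasses import dataclass, field

# Each record links an output artifact to the activity and inputs that
# produced it; tracing lineage walks these links backwards.
@dataclass
class ProvRecord:
    output: str                                  # artifact produced
    activity: str                                # processing step that made it
    inputs: list = field(default_factory=list)   # artifacts consumed

class ProvStore:
    def __init__(self):
        self.records = {}

    def add(self, rec: ProvRecord):
        self.records[rec.output] = rec

    def lineage(self, artifact):
        """Return all ancestors of `artifact`, nearest first."""
        ancestors = []
        rec = self.records.get(artifact)
        if rec is None:
            return ancestors
        for inp in rec.inputs:
            ancestors.append(inp)
            ancestors.extend(self.lineage(inp))
        return ancestors

store = ProvStore()
store.add(ProvRecord("met_em", "metgrid.exe", ["geogrid_out", "gfs_grib"]))
store.add(ProvRecord("wrfout", "wrf.exe", ["met_em"]))
print(store.lineage("wrfout"))  # ['met_em', 'geogrid_out', 'gfs_grib']
```

A faulty forecast file can then be traced to the inputs and stages that produced it, which is the root-cause analysis the abstract describes.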
Citations: 12
Multi-Attribute Query Processing Through In-Network Aggregation in Edge Computing
Pub Date : 2018-09-01 DOI: 10.1109/SKG.2018.00027
Xiaocui Li, Zhangbing Zhou
This paper proposes a multi-attribute aggregation query mechanism in the context of edge computing, where an energy-aware IR-tree is constructed to handle query processing in single edge networks, while an edge-node routing graph is established to facilitate query processing for marginal smart things contained in contiguous edge networks. This in-network and localized strategy has shown its efficiency and applicability for query processing in IoT sensing networks, and experimental evaluation demonstrates that this technique outperforms its rivals in reducing network traffic and energy consumption.
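The core idea of in-network aggregation is that each node combines its children's partial results before forwarding a single value upward, instead of shipping every raw reading to the sink. A toy sketch with an average query (the tree layout is illustrative; the paper's energy-aware IR-tree construction is not reproduced here):

```python
def aggregate(tree, node, readings):
    """Return (sum, count) for `node`'s subtree.

    Only one (sum, count) tuple travels over each link, regardless of how
    many sensors sit below it -- that is the traffic saving.
    """
    total = readings.get(node, 0)
    count = 1 if node in readings else 0
    for child in tree.get(node, []):
        s, c = aggregate(tree, child, readings)
        total += s
        count += c
    return total, count

# sink -> {a, b}; a -> {a1, a2}; only leaf/edge nodes hold readings here
tree = {"sink": ["a", "b"], "a": ["a1", "a2"], "b": []}
readings = {"a1": 10, "a2": 20, "b": 6}
s, c = aggregate(tree, "sink", readings)
print(s / c)  # network-wide average: 12.0
```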
Citations: 0
Matrix Factorization Recommendation Algorithm Based on User Characteristics
Pub Date : 2018-09-01 DOI: 10.1109/SKG.2018.00012
Hongtao Liu, Ouyang Mao, Chen Long, Xueyan Liu, Zhenjia Zhu
Matrix factorization is a popular and successful method, and has become a common model for collaborative filtering in recommendation systems. Because most scoring matrices are sparse and their dimensions grow rapidly, the prediction accuracy and computation time of current matrix decomposition methods are limited. In this paper, a matrix decomposition model based on user characteristics is proposed, which effectively improves the accuracy of predicted scores and reduces the number of iterations. Tests on real data, compared with an existing recommendation algorithm, show that the proposed method predicts users' scores well.
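The factorization core that such a model builds on can be sketched with plain SGD: each rating r is approximated by the dot product of a user vector and an item vector, updated against the squared error. Hyperparameters are illustrative, and the paper's user-characteristic extension is omitted:

```python
import random

def train_mf(ratings, n_users, n_items, k=2, lr=0.05, reg=0.02,
             epochs=500, seed=0):
    """Learn user factors P and item factors Q so that P[u].Q[i] ~ r."""
    rng = random.Random(seed)
    P = [[rng.uniform(0, 0.1) for _ in range(k)] for _ in range(n_users)]
    Q = [[rng.uniform(0, 0.1) for _ in range(k)] for _ in range(n_items)]
    for _ in range(epochs):
        for u, i, r in ratings:
            pred = sum(P[u][f] * Q[i][f] for f in range(k))
            err = r - pred
            for f in range(k):
                pu, qi = P[u][f], Q[i][f]
                # regularized SGD step on both factor vectors
                P[u][f] += lr * (err * qi - reg * pu)
                Q[i][f] += lr * (err * pu - reg * qi)
    return P, Q

ratings = [(0, 0, 5), (0, 1, 3), (1, 0, 4), (1, 1, 2)]  # (user, item, score)
P, Q = train_mf(ratings, n_users=2, n_items=2)
pred = sum(P[0][f] * Q[0][f] for f in range(2))
print(round(pred, 1))  # close to the observed rating of 5
```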
Citations: 2
3D Hyper-Dense Connected Convolutional Neural Network for Brain Tumor Segmentation
Pub Date : 2018-09-01 DOI: 10.1109/SKG.2018.00024
Saqib Qamar, Hai Jin, Ran Zheng, Parvez Ahmad
Glioma is one of the most widespread and aggressive forms of primary brain tumor. Accurate subcortical brain segmentation is essential in the evaluation of gliomas; it helps to monitor their growth and assists in assessing medication effects. Manual segmentation of Magnetic Resonance Imaging (MRI) data requires substantial human resources. Deep learning methods have become a powerful tool for learning features automatically in medical imaging applications, including brain tissue segmentation, liver segmentation, and brain tumor segmentation. The shape, structure, and location of gliomas differ among individual patients, which makes developing a model a challenge. In this paper, a 3D hyper-dense Convolutional Neural Network (CNN) is developed to segment tumors; it captures global and local contextual information from two scales of global and local patches along with two scales of receptive field. Densely connected blocks are used to exploit the benefit of a CNN to boost the model's segmentation performance on Enhancing Tumor (ET), Non-Enhancing Tumor (NET), and Peritumoral Edema (PE). This dense architecture adopts a 3D Fully Convolutional Network (FCN) architecture for end-to-end volumetric prediction. The dense connectivity offers deep supervision and improves gradient flow during learning. The network is trained hierarchically on global and local patches. In this scenario, both patches are processed in their separate paths, and dense connections occur not only between layers of the same path but also between layers of different paths. Our approach is validated on the BraTS 2018 dataset with Dice scores of 0.87, 0.81 and 0.84 for the complete tumor, enhancing tumor, and tumor core, respectively. These outcomes are very close to reported state-of-the-art results, and our approach is preferable to existing 3D-based approaches in terms of compactness, time, and parameter efficiency for MRI brain tumor segmentation.
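The Dice scores reported above compare a predicted mask A with a ground-truth mask B as 2|A∩B| / (|A| + |B|); a minimal sketch on flattened binary voxel labels:

```python
def dice(pred, truth):
    """Dice coefficient of two equally sized 0/1 label sequences."""
    inter = sum(p * t for p, t in zip(pred, truth))
    total = sum(pred) + sum(truth)
    return 2 * inter / total if total else 1.0  # both empty: treat as perfect

pred  = [1, 1, 0, 1, 0, 0]
truth = [1, 0, 0, 1, 1, 0]
print(dice(pred, truth))  # 2*2 / (3+3) = 0.666...
```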
Citations: 17
Study on Community Discovery Algorithm from the Perspection of Label Influence Propagation
Pub Date : 2018-09-01 DOI: 10.1109/SKG.2018.00025
Jie Jian, Xiaoming Yu, Jinchi Zhu, Xueyan Liu, Hongtao Liu
To better discover overlapping communities, this paper proposes an overlapping community detection method (INFELPA) based on label influence and the propagation of edge labels. We use node influence to initialize the labels on edges and sort edges by influence to avoid random factors during the edge-label updating process. To retain multiple communities, we keep multiple labels on the edges and map the finished edge labels back to the nodes. The experimental results show that this algorithm has certain competitive advantages.
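INFELPA propagates label influence along edges; as a simplified node-level illustration (not the paper's algorithm), here is seeded label propagation, where unlabeled nodes repeatedly adopt the majority label among their already-labeled neighbours:

```python
from collections import Counter

def propagate(adj, seeds, max_iter=10):
    """Spread fixed seed labels through an adjacency dict until stable."""
    labels = dict(seeds)
    for _ in range(max_iter):
        changed = False
        for v in sorted(adj):
            if v in seeds:
                continue  # seed labels are fixed
            counts = Counter(labels[u] for u in adj[v] if u in labels)
            if not counts:
                continue  # no labeled neighbour yet
            best = counts.most_common(1)[0][0]
            if labels.get(v) != best:
                labels[v], changed = best, True
        if not changed:
            break
    return labels

# two triangles joined by the bridge edge 2-3
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
labels = propagate(adj, seeds={0: "A", 4: "B", 5: "B"})
print(labels)  # nodes 1, 2 join community A; node 3 joins B
```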
Citations: 0
Semantic and Heuristic Based Approach for Paraphrase Identification
Pub Date : 2018-09-01 DOI: 10.1109/SKG.2018.00037
Muhidin A. Mohamed, M. Oussalah
In this paper, we propose a semantic-based paraphrase identification approach. The core idea is to identify paraphrases when sentences contain a set of named entities and common words. The developed approach separates the computation of the semantic similarity of named-entity tokens from that of the rest of the sentence text. More specifically, it integrates word semantic similarity derived from WordNet taxonomic relations with named-entity semantic relatedness inferred from the crowd-sourced knowledge in the Wikipedia database. In addition, we improve the WordNet similarity measure by nominalizing verbs, adjectives and adverbs with the aid of the Categorial Variation database (CatVar). The paraphrase identification system is then evaluated on two different datasets, namely the Microsoft Research Paraphrase Corpus (MSRPC) and the TREC-9 Question Variants. Experimental results on these datasets show that our system outperforms the baselines on the paraphrase identification task.
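The two-channel scoring (named entities scored separately from the remaining words) can be illustrated with a toy stand-in: the paper uses WordNet and Wikipedia-based similarity measures, whereas this sketch substitutes exact-match Jaccard overlap for both channels, and the channel weight is illustrative:

```python
def jaccard(a, b):
    """Set-overlap similarity of two token lists."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

def paraphrase_score(sent1, sent2, entities, w_ne=0.5):
    """Weighted combination of named-entity overlap and common-word overlap."""
    t1, t2 = sent1.lower().split(), sent2.lower().split()
    ne1 = [t for t in t1 if t in entities]
    ne2 = [t for t in t2 if t in entities]
    common1 = [t for t in t1 if t not in entities]
    common2 = [t for t in t2 if t not in entities]
    return w_ne * jaccard(ne1, ne2) + (1 - w_ne) * jaccard(common1, common2)

entities = {"london", "paris"}
score = paraphrase_score("london is very rainy", "london is quite rainy",
                         entities)
print(round(score, 2))  # 0.5*1.0 (entities match) + 0.5*0.5 (words) = 0.75
```

A pair would then be labeled a paraphrase when the score clears a tuned threshold.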
Citations: 0
Towards Efficient for Learning Model Image Retrieval
Pub Date : 2018-09-01 DOI: 10.1109/SKG.2018.00020
M. J. J. Ghrabat, Guangzhi Ma, Chih Cheng
Image mining is widely used in processing geo-tagged landmark images from alphanumeric and real-time satellites. Loss of useful information during feature extraction may result in inappropriate image categorization, so preserving useful information through feature extraction and reduction is highly challenging and critical. This work utilizes hybrid features such as Local Binary Patterns (LBP), colour moments, and statistical features to enhance categorization accuracy. The k-means classification technique is then used to determine the class labels used for model training. To mitigate overfitting and increase overall classification precision, the Component Reduced Naive Bayesian (CRNB) model is proposed. The physical landmarks of the geo-tagged images are also located using the Hybrid Feature Extraction based Naive Bayesian (HFE-NB) approach. In the experiments, two different datasets are used to test the proposed model, and several existing models are used for comparison. The results show that the proposed method significantly improves the precision, recall, and accuracy of image retrieval. Compared with existing techniques, using the texture and colour features yields the best results, with sensitivity and specificity increased by 3.36% and 0.1%, respectively.
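Local Binary Patterns, one of the hybrid features mentioned above, encode each pixel by thresholding its eight neighbours against the centre value and reading the resulting bits as an integer. A single-pixel sketch (the bit ordering is a convention choice):

```python
def lbp_code(img, y, x):
    """8-neighbour LBP code for the pixel at (y, x) of a 2D grayscale grid."""
    center = img[y][x]
    # clockwise from the top-left neighbour
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dy, dx) in enumerate(offsets):
        if img[y + dy][x + dx] >= center:  # neighbour at least as bright
            code |= 1 << bit
    return code

img = [[5, 4, 3],
       [6, 4, 2],
       [7, 8, 1]]
print(lbp_code(img, 1, 1))  # bits 0,1,5,6,7 set -> 227
```

A histogram of these codes over an image region is the texture descriptor used for classification.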
Citations: 1
A Hybrid Deep Learning Model for Text Classification
Pub Date : 2018-09-01 DOI: 10.1109/SKG.2018.00014
Xianglong Chen, Chunping Ouyang, Yongbin Liu, Lingyun Luo, Xiaohua Yang
Deep learning has shown its effectiveness in many tasks, such as text classification and computer vision. Most text classification work concentrates on using convolutional neural networks and recurrent neural networks to obtain text feature representations, and an attention mechanism is often adopted to improve classification accuracy. Targeting Task 6 of NLP&CC 2018, a hybrid deep learning model combining BiGRU, CNN and an attention mechanism is proposed to improve text classification. The experimental results show that the F1-score of the proposed model exceeds that of the task's baseline model. Moreover, this hybrid model achieves higher precision, recall and F1-score than several other popular deep learning models, with an F1-score 5.4% higher than that of a single CNN model.
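The comparison above is in terms of precision, recall and F1-score; their standard definitions from true-positive, false-positive and false-negative counts:

```python
def prf1(tp, fp, fn):
    """Precision, recall, and their harmonic mean (F1)."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

p, r, f1 = prf1(tp=8, fp=2, fn=4)
print(p, r, round(f1, 3))  # 0.8 0.666... 0.727
```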
Citations: 3