
Latest publications from the 2013 11th International Symposium on Programming and Systems (ISPS)

SLAM-ICP with a Boolean method applied on a car-like robot
Pub Date : 2013-04-22 DOI: 10.1109/ISPS.2013.6581476
M. Djehaich, H. Ziane, N. Achour, R. Tiar, N. Ouadah
Scan matching is a popular way of recovering a mobile robot's motion and constitutes the basis of many localization and mapping approaches. Consequently, a variety of scan matching algorithms have been proposed in the past. All these algorithms share one common attribute: they match pairs of scans to obtain the spatial relation between two robot poses. The work presented in this paper consists of the implementation of a SLAM (Simultaneous Localization and Mapping) algorithm on a car-like vehicle. Our algorithm is based on a measurement alignment method called Iterative Closest Point (ICP) with a binary (Boolean) weighting scheme, which finds the rigid transformation that minimizes the distance between two point clouds. The developed algorithm (SLAM-ICP) has been implemented and tested on the mobile robot. Experimental results given at the end of this paper are compared to a classical localization technique (odometry) and to SLAM-ICP with the recursive method already implemented on the Robucar.
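The abstract does not spell out the Boolean weighting, but the idea can be sketched: during nearest-neighbour matching, each pair receives weight 1 if its distance is below a rejection threshold and 0 otherwise, and only the surviving pairs feed the closed-form rigid alignment. A minimal 2D sketch in Python (the function name and threshold are ours, not the paper's):

```python
import math

def icp_step(source, target, reject_dist=1.0):
    """One ICP iteration: Boolean-weighted nearest-neighbour matching,
    then the closed-form 2D rigid transform (rotation + translation)
    that best aligns the matched pairs."""
    # Nearest-neighbour matching with a Boolean weight: a pair is kept
    # (weight 1) only if its distance is below reject_dist, else dropped.
    pairs = []
    for p in source:
        q = min(target, key=lambda t: math.dist(p, t))
        if math.dist(p, q) < reject_dist:
            pairs.append((p, q))
    # Centroids of the matched point sets.
    cpx = sum(p[0] for p, _ in pairs) / len(pairs)
    cpy = sum(p[1] for p, _ in pairs) / len(pairs)
    cqx = sum(q[0] for _, q in pairs) / len(pairs)
    cqy = sum(q[1] for _, q in pairs) / len(pairs)
    # Closed-form 2D rotation (Kabsch): theta = atan2(sum cross, sum dot).
    s_cross = sum((p[0] - cpx) * (q[1] - cqy) - (p[1] - cpy) * (q[0] - cqx)
                  for p, q in pairs)
    s_dot = sum((p[0] - cpx) * (q[0] - cqx) + (p[1] - cpy) * (q[1] - cqy)
                for p, q in pairs)
    theta = math.atan2(s_cross, s_dot)
    c, s = math.cos(theta), math.sin(theta)
    # Translation maps the rotated source centroid onto the target centroid.
    tx = cqx - (c * cpx - s * cpy)
    ty = cqy - (s * cpx + c * cpy)
    return theta, (tx, ty)
```

With exact correspondences the closed form recovers the applied rotation and translation in one step; in a full SLAM loop this step would be iterated until the transform converges.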
Citations: 3
The impact of ECC's scalar multiplication on wireless sensor networks
Pub Date : 2013-04-22 DOI: 10.1109/ISPS.2013.6581488
Merad Boudia Omar Rafik, F. Mohammed
Security in wireless sensor networks (WSNs) has become an attractive area of research, especially during the last few years, owing to the large number of applications in which sensors are deployed and to their security needs. Elliptic Curve Cryptography (ECC) is a public-key approach that represents one solution, and even a serious candidate, for providing security in WSNs. Unfortunately, the execution time of its somewhat complex operations limits the suitability of ECC to a small number of applications. The most expensive operation in ECC is scalar point multiplication (SPM), and ECC-based schemes depend strongly on its performance. Side-channel attacks (SCA) on ECC exploit the information that leaks during SPM execution in order to recover the secret key. In this paper, we focus on scalar point multiplication, its efficiency, and its security against SCA (in particular, simple power analysis). The experimental results were obtained on the 16-bit TelosB mote.
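For readers unfamiliar with SPM, the classic left-to-right double-and-add loop below (on a small textbook curve, not the paper's parameters) shows both the operation and why simple power analysis works: the conditional addition executes only for 1-bits of the secret scalar, so an attacker who can distinguish doublings from additions in a power trace reads off the key.

```python
# Toy curve y^2 = x^3 + 2x + 2 over F_17 (a common textbook example);
# P = (5, 1) generates a group of order 19. Not the paper's parameters.
P_MOD, A = 17, 2
O = None  # point at infinity

def point_add(P, Q):
    """Affine point addition/doubling on the toy curve."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return O  # P + (-P) = O
    if P == Q:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (lam * lam - x1 - x2) % P_MOD
    return (x3, (lam * (x1 - x3) - y1) % P_MOD)

def scalar_mult(k, P):
    """Left-to-right double-and-add. The branch on each key bit is the
    leak exploited by simple power analysis (SPA): the addition runs
    only for 1-bits, so the power trace spells out the scalar."""
    R = O
    for bit in bin(k)[2:]:
        R = point_add(R, R)        # always double
        if bit == '1':
            R = point_add(R, P)    # add only on 1-bits -> SPA leak
    return R
```

SPA countermeasures such as the Montgomery ladder remove this key-dependent branch by performing the same operations for every bit.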
Citations: 12
Fuzzy Alpha-cuts to capture customer requirements in improving product development
Pub Date : 2013-04-22 DOI: 10.1109/ISPS.2013.6581493
F. Bencherif, L. Mouss, S. Benaicha
Quality Function Deployment (QFD) is a tool for developing and designing quality into a product and improving competitive advantage in the market. In developing new products and projects, we receive needs from the customer, pass them around a corporate communication circle, and eventually return them to the customer in the form of the new product. First, the needs and language received from customers are often ambiguous, imprecise, and uncertain, which biases the resulting studies and disregards the voice of the customer. Second, to improve quality and handle this uncertainty, numerous researchers have tried to apply fuzzy set theory to product development; their models usually focus only on customer requirements or on engineering characteristics, and the subsequent stages of product design are rarely addressed. Third, the correlation between engineering characteristics and benchmarking analysis is disregarded in most QFD-related research, which commonly leads to delayed and failed project development. Aiming at these three issues, the objective of this paper is to improve the accuracy of Quality Function Deployment and to optimize and develop the customer-requirements approach, so as to attenuate risks in subsequent phases and in the manufacturing process and increase industrial performance. The approach is based on fuzzy set theory and alpha-cut operations, the pairwise comparison method, fuzzy ranking and clustering, and the theory of inventive problem solving (TRIZ).
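As background for the alpha-cut machinery the paper builds on: an alpha-cut of a fuzzy set is the crisp set of elements whose membership degree is at least alpha. For a triangular fuzzy number (a, b, c), commonly used to encode an imprecise customer rating, the cut is a closed interval with a simple closed form. A small sketch (function names are ours):

```python
def tri_mu(a, b, c, x):
    """Membership degree of x in the triangular fuzzy number (a, b, c):
    0 outside [a, c], rising linearly to 1 at the peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def tri_alpha_cut(a, b, c, alpha):
    """Alpha-cut of (a, b, c): the interval of values whose membership
    degree is at least alpha. At alpha = 1 it collapses to the peak b."""
    assert 0 < alpha <= 1
    return (a + alpha * (b - a), c - alpha * (c - b))
```

For example, a customer importance rating of "about 5, between 2 and 9" cut at alpha = 0.5 yields the crisp interval [3.5, 7.0], which downstream QFD arithmetic can then process interval-wise.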
Citations: 3
A new method for dimensionality reduction of multi-dimensional data using Copulas
Pub Date : 2013-04-22 DOI: 10.1109/ISPS.2013.6581491
Rima Houari, A. Bounceur, Mohand Tahar Kechadi
A new technique for the dimensionality reduction of multi-dimensional data is presented in this paper. The technique employs the theory of copulas to estimate the multivariate joint probability distribution without constraints on the types of marginal distributions of the random variables that represent the dimensions of the data. A copula-based model provides a complete and scale-free description of dependence that is well suited to modeling with well-known multivariate parametric laws. The model can readily be used to compare the dependence of random variables by estimating the parameters of the copula, and to better expose the relationships within the data. This dependence is then used to detect redundant values and noise, in order to clean the original data, reduce them (eliminate redundant attributes), and obtain representative samples of good quality. We compared the proposed approach with the singular value decomposition (SVD) technique, one of the most efficient data mining methods.
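A copula separates dependence from the margins; the usual first step when fitting one to data without assuming marginal types is the rank transform to pseudo-observations on (0, 1)^d. A hedged sketch of that step only (not the paper's full estimation procedure; ties are handled naively):

```python
def pseudo_observations(sample):
    """Rank-transform each margin to (0, 1): u_ij = rank_ij / (n + 1).
    The transformed points discard the marginal scales and keep only the
    dependence structure, which is what a copula models."""
    n = len(sample)
    dims = len(sample[0])
    cols = [sorted(col) for col in zip(*sample)]  # per-dimension sort
    out = []
    for row in sample:
        # rank = 1-based position of the value in its sorted column
        out.append(tuple((cols[j].index(row[j]) + 1) / (n + 1)
                         for j in range(dims)))
    return out
```

Because ranks are invariant under monotone rescaling of any dimension, the pseudo-observations (and any copula fitted to them) are the "scale-free description of dependence" the abstract refers to.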
Citations: 6
Information retrieval techniques for knowledge discovery in biomedical literature
Pub Date : 2013-04-22 DOI: 10.1109/ISPS.2013.6581479
Sabrina Cherdioui, Fatiha Boubekeur
This paper presents our contribution to enhancing literature-based discovery with information retrieval techniques. We propose the joint use of a flexible information retrieval model and MeSH concepts for knowledge discovery in biomedical literature. The information retrieval model filters the MEDLINE biomedical literature down to the most relevant documents, while MeSH concepts allow quick identification of candidate concepts that could potentially validate a hypothesis. We tested our approach by replicating Swanson's first discovery, the correlation between fish oil and Raynaud's disease. The results obtained show the effectiveness of our approach.
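The Swanson replication follows the classic ABC pattern of literature-based discovery: a start concept A (fish oil) co-occurs with intermediate concepts B (e.g. blood viscosity), which in turn co-occur with candidate concepts C (Raynaud's disease) that are never directly linked to A. A toy sketch of that pattern over an illustrative co-occurrence table (the data and function are ours; the paper's actual pipeline uses an IR model over MEDLINE plus MeSH concepts):

```python
def open_discovery(cooc, a):
    """Swanson's ABC model: return candidate C terms linked to the start
    term A only indirectly, mapped to the B terms that connect them."""
    b_terms = cooc.get(a, set())
    candidates = {}
    for b in b_terms:
        for c in cooc.get(b, set()):
            # keep C only if it never co-occurs with A directly
            if c != a and c not in b_terms:
                candidates.setdefault(c, set()).add(b)
    return candidates

# Toy co-occurrence table (illustrative only, not Swanson's corpora).
cooc = {
    "fish oil": {"blood viscosity", "platelet aggregation"},
    "blood viscosity": {"fish oil", "raynaud"},
    "platelet aggregation": {"fish oil", "raynaud"},
    "raynaud": {"blood viscosity", "platelet aggregation"},
}
```

On this toy table, `open_discovery(cooc, "fish oil")` surfaces "raynaud" as a candidate, supported by both intermediate terms, mirroring the hypothesis Swanson originally proposed.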
Citations: 7
Fast simplification with sharp feature preserving for 3D point clouds
Pub Date : 2013-04-22 DOI: 10.1109/ISPS.2013.6581492
H. Benhabiles, O. Aubreton, H. Barki, Hedi Tabia
This paper presents a fast point cloud simplification method that preserves sharp edge points. The method combines clustering with a coarse-to-fine simplification approach. It first creates a coarse cloud using a clustering algorithm. Each point of the resulting coarse cloud is then assigned a weight that quantifies its importance and allows it to be classified as either a sharp point or a simple point. Finally, both kinds of points are used to refine the coarse cloud, producing a new simplified cloud characterized by a high density of points in sharp regions and a low density in flat regions. Experiments show that our algorithm is much faster than the most recently proposed simplification algorithm [1] that preserves sharp edge points, while producing similar results.
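The abstract leaves the clustering algorithm unspecified; one common choice for the coarse-cloud step is grid (voxel) clustering, which replaces each occupied cell by the centroid of its points. A minimal sketch under that assumption (cell size and names are ours):

```python
from collections import defaultdict

def coarse_cloud(points, cell):
    """Clustering step of a coarse-to-fine pipeline: bucket points into
    axis-aligned grid cells of side `cell` and replace each cell's
    points by their centroid, yielding the coarse cloud."""
    buckets = defaultdict(list)
    for p in points:
        key = tuple(int(c // cell) for c in p)  # integer cell index
        buckets[key].append(p)
    # centroid of each occupied cell, dimension by dimension
    return [tuple(sum(cs) / len(cs) for cs in zip(*pts))
            for pts in buckets.values()]
```

In the paper's scheme, the refinement pass would then add back original points around coarse points classified as sharp, restoring density exactly where edges live.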
Citations: 28
Improvement of scalar multiplication time for elliptic curve cryptosystems
Pub Date : 2013-04-22 DOI: 10.1109/ISPS.2013.6581494
M. Lehsaini, M. Feham, Chifaa Tabet Hellel
Sensor nodes have limited computing power and memory. They are sometimes used in applications that require rapidly sending secure data to a remote control center, and therefore need lightweight techniques to accomplish this task. In this paper, we used Elliptic Curve Cryptography (ECC) for data encryption, because ECC can create smaller and more efficient cryptographic keys than other cryptographic techniques such as RSA. We used specific algorithms to improve scalar multiplication time while taking energy consumption into account. Moreover, we proposed a distributed scheme that further improves the data delivery time from a source node to the base station by involving neighbors in the computation. Experiments on TelosB motes showed a considerable improvement in data delivery time.
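The abstract does not name the specific algorithms used; one classical technique for reducing scalar multiplication time is recoding the scalar in non-adjacent form (NAF), which minimises the number of non-zero digits and hence the number of point additions in the double-and-add loop. A sketch of the recoding (offered as background, not as the paper's method):

```python
def naf(k):
    """Non-adjacent form of k: signed digits in {-1, 0, 1}, least
    significant first, with no two adjacent non-zeros. On average only
    one digit in three is non-zero, versus one in two for plain binary,
    so fewer point additions are needed."""
    digits = []
    while k > 0:
        if k % 2:
            d = 2 - (k % 4)   # +1 if k = 1 (mod 4), -1 if k = 3 (mod 4)
            k -= d            # make k divisible by 2 (in fact by 4)
        else:
            d = 0
        digits.append(d)
        k //= 2
    return digits
```

A digit of -1 costs a point subtraction, which on elliptic curves is as cheap as addition (negate the y-coordinate), which is why signed recodings pay off there.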
Citations: 2
TOP-SKY: Top-down algorithm for computing the skycube
Pub Date : 2013-04-22 DOI: 10.1109/ISPS.2013.6581483
Samiha Brahimi, M. Kholladi, Amina Hamerelain
Users benefit greatly from obtaining the result of a skyline query without waiting for it to be processed; this can be achieved by retrieving a precomputed version of the query. This paper focuses on pre-computing the skylines of all possible nonempty subsets of a given set of dimensions, which we call the skycube. We develop an efficient top-down approach called TOP-SKY for skycube computation, which derives the skyline objects of each subspace from one of its parents and adopts several techniques that help achieve better performance. To evaluate the effectiveness of the approach, TOP-SKY is compared with the best algorithm known to us, Orion, and with computing the cuboids of the skycube individually using the BNL algorithm.
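The BNL (block-nested-loops) baseline mentioned above keeps a window of mutually incomparable points and discards any point dominated by a window entry. A minimal sketch, assuming smaller is better on every dimension (the simplification of one window that fits in memory is ours):

```python
def bnl_skyline(points):
    """Block-nested-loops skyline: p dominates q if p is <= q on every
    dimension and strictly < on at least one. The window holds the
    skyline candidates seen so far."""
    def dominates(p, q):
        return (all(a <= b for a, b in zip(p, q))
                and any(a < b for a, b in zip(p, q)))

    window = []
    for p in points:
        if any(dominates(w, p) for w in window):
            continue                                  # p is dominated: drop it
        window = [w for w in window if not dominates(p, w)]  # p evicts losers
        window.append(p)
    return window
```

A skycube precomputes this result for every nonempty subset of dimensions; TOP-SKY's contribution is deriving each subspace skyline from a parent cuboid instead of rescanning the data as the per-cuboid BNL baseline does.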
Citations: 2
Road traffic congestion estimation with macroscopic parameters
Pub Date : 2013-04-22 DOI: 10.1109/ISPS.2013.6581489
Asmâa Ouessai, K. Mokhtar, Ouamri Abdelaziz
In this paper we propose an algorithm for road traffic density estimation using macroscopic parameters extracted from a video sequence. The macroscopic parameters are estimated directly by analyzing the global motion in the video scene, without motion detection or tracking. The extracted parameters are fed to an SVM classifier to classify road traffic into three categories: light, medium, and heavy. The performance of the proposed algorithm is compared, on the same database, with that of a texture-dynamics-based traffic classification method.
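The paper's classifier is an SVM over the extracted macroscopic parameters; as a purely illustrative stand-in (the feature names and thresholds below are made up, not from the paper), the mapping from two such features to the three classes might look like:

```python
def classify_density(occupancy, mean_speed, t_light=0.3, t_heavy=0.6):
    """Illustrative threshold rule, NOT the paper's SVM: map two
    macroscopic features -- scene occupancy in [0, 1] and mean motion
    speed -- to the classes light / medium / heavy."""
    if occupancy < t_light:
        return "light"
    if occupancy > t_heavy or mean_speed < 1.0:
        return "heavy"       # dense or nearly stopped traffic
    return "medium"
```

An SVM learns such decision boundaries from labeled examples instead of hand-picked thresholds, which is what makes it preferable when the feature space is less cleanly separable than this sketch suggests.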
Citations: 3
L-P2DSA: Location-based privacy-preserving detection of Sybil attacks
Pub Date : 2013-04-22 DOI: 10.1109/ISPS.2013.6581485
Kenza Mekliche, S. Moussaoui
Security and privacy are two major concerns in VANETs. Regrettably, most privacy-preserving schemes are prone to Sybil attacks, in which a malicious user pretends to be multiple vehicles. In this paper we propose an approach that uses infrastructure and node localization to detect Sybil attacks. L-P2DSA improves on C-P2DAP [3] in that it detects Sybil attacks while reducing the load on the DMV, thanks to cooperation between adjacent RSUs to determine the location of suspicious nodes and to measure a distinguishability degree between the positions of these malicious nodes. Detection in this manner does not require any vehicle to disclose its identity, thus preserving privacy. The applicability of our contribution is validated through simulation of a realistic test case.
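The distinguishability idea can be sketched as a position check: identities whose reported positions are closer together than two distinct physical vehicles could be are flagged as a possible Sybil group. A toy sketch (the threshold, names, and pairwise check are ours, not the paper's metric):

```python
import math

def sybil_suspects(claims, d_min=2.0):
    """Position-based distinguishability check (sketch): `claims` maps a
    claimed identity to its reported (x, y) position. Any two identities
    reported closer than d_min -- closer than two real vehicles can
    physically be -- are flagged as a possible Sybil group."""
    suspects = set()
    ids = list(claims)
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            if math.dist(claims[a], claims[b]) < d_min:
                suspects.update((a, b))
    return suspects
```

In L-P2DSA this kind of check runs on the infrastructure side, with adjacent RSUs cooperating to localize the claims, so no vehicle has to reveal its identity.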
Citations: 19