Term rewriting for describing constrained policy graph and conflict detection
Nima Khairdoost, N. Ghahraman
2010 IEEE International Conference on Progress in Informatics and Computing | Pub Date: 2010-12-01 | DOI: 10.1109/PIC.2010.5687864
Abstract: A constrained policy graph (CPG) is an abstract graph model that provides a higher-level view than pure logic. In this model, policies can be described in constrained form according to the related system. Besides describing access control policies (ACPs), the CPG model can combine policies and analyze them to detect conflicts arising from their combination. Term rewriting systems are practical tools used in fields such as automatic theorem proving and the development of computational models, and they can help in the formal description and verification of ACPs and access control models. In this article, after explaining how policies are described, combined, and checked for conflicts in the CPG model, we describe these operations using term rewriting rules. Such rules are well suited to the automatic analysis of policies and to conflict detection after their combination.
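The kind of combination conflict the CPG model detects can be illustrated with a minimal sketch (our own toy encoding, not the authors' CPG or term rewriting rules): each rule rewrites a matching request to a decision, combining policies concatenates their rules, and a conflict is flagged when one request is rewritten to both permit and deny.

```python
# Toy sketch (not the paper's formalism): a rule is a (predicate, decision)
# pair; a conflict is any request that the combined rule set rewrites to
# both "permit" and "deny".
def decide(rules, request):
    return {decision for predicate, decision in rules if predicate(request)}

def conflicts(rules, requests):
    return [r for r in requests if {"permit", "deny"} <= decide(rules, r)]

# Hypothetical policies: doctors may access resources; billing is denied to all.
policy_a = [(lambda r: r["role"] == "doctor", "permit")]
policy_b = [(lambda r: r["resource"] == "billing", "deny")]
combined = policy_a + policy_b

requests = [{"role": "doctor", "resource": "records"},
            {"role": "doctor", "resource": "billing"}]
print(conflicts(combined, requests))
```

The second request matches both policies with opposite decisions, so the combination is flagged; the first matches only `policy_a` and passes.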
Realistic real-time rendering for ocean waves on GPU
Wenhui Zhang, Huan Zhou, L. Tang, X. Zhou
2010 IEEE International Conference on Progress in Informatics and Computing | Pub Date: 2010-12-01 | DOI: 10.1109/PIC.2010.5688013
Abstract: Creating and rendering a realistic ocean is one of the most challenging tasks in computer graphics. An efficient algorithm renders ocean waves by exploiting the parallelism and programmability of the GPU and the new vertex texture sampling capability of Shader Model 3.0. The ocean model is optimized with level-of-detail (LOD) technology. Real-time wave simulation is achieved with a time-indexed animated texture: 250 precomputed height maps of the ocean surface are stored and used to displace the grid vertices. Illumination effects of the water, such as reflection, refraction and the Fresnel effect, are also rendered. Experimental results show that the method meets photorealism and real-time requirements (>60 fps) and can be applied to generate real-time water in virtual reality.
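The animated-texture idea — cycling through a fixed set of precomputed height maps and displacing grid vertices each frame — can be sketched on the CPU (a simplification: the paper samples the maps per vertex on the GPU; the grid, maps and frame rate below are illustrative, not the paper's data).

```python
def displaced_grid(base_grid, height_maps, t, fps=25.0):
    """Pick the height map for time t and add it to the base grid vertices."""
    frame = height_maps[int(t * fps) % len(height_maps)]
    return [[h + frame[r][c] for c, h in enumerate(row)]
            for r, row in enumerate(base_grid)]

# Tiny example: a flat 2x2 grid and two alternating "height maps".
flat = [[0.0, 0.0], [0.0, 0.0]]
maps = [[[0.1, 0.2], [0.3, 0.4]],
        [[0.4, 0.3], [0.2, 0.1]]]
```

Because the map index wraps with the modulo, the animation loops seamlessly over however many frames are stored (250 in the paper).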
Automatic generation of geometric base sequences
Rui Ling, Yuan-jun He, Kairen Deng
2010 IEEE International Conference on Progress in Informatics and Computing | Pub Date: 2010-12-01 | DOI: 10.1109/PIC.2010.5687899
Abstract: How to use computers to solve geometric computation problems effectively is an important focus in the development of geometry. In this paper, we introduce a new method that solves geometric problems geometrically. We establish a set of geometric bases and automatically generate sequences of these bases by forward reasoning. The geometric base sequence is a new description of the solution of a geometric problem, one that is more readable than solutions produced by algebraic methods. Moreover, we modify the hidden Markov model to avoid information explosion. Experimental results indicate that our method generates the sequences efficiently.
Saddle point detection for connecting objects in 2D images based on mathematic programming restraints
Ken Chen, Yicong Wang, G. Jiang, L. Banta
2010 IEEE International Conference on Progress in Informatics and Computing | Pub Date: 2010-12-01 | DOI: 10.1109/PIC.2010.5688018
Abstract: Saddle points, formed by morphological erosion between adjacent touching objects in 2D images, have been used for segmentation. This article presents a new approach that searches for saddle points in 2D images using mathematical programming restraints, with the ultimate goal of separating the touching objects. By combining the pixel distribution information of the 3D topographic image with the mathematical programming restraints for a saddle point, the saddle points in the image can be identified. In addition, the relation between the step size chosen in the algorithm and the detection rate is explored. Experimental results on real particle images suggest that the saddle point detection algorithm is robust, laying a practical and theoretical basis for segmenting touching objects in 2D images.
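The geometric intuition — a saddle sits on the "neck" between two touching blobs, a minimum along the ridge and a maximum across it — can be sketched with a simple neighbor test (our illustration of the concept, not the paper's mathematical programming formulation):

```python
# Sketch: a pixel is a saddle candidate when it is a local extremum of
# opposite kind along the two axes (min along one, max along the other).
def saddle_points(img):
    """Return (row, col) pairs of saddle candidates in a 2D list."""
    rows, cols = len(img), len(img[0])
    found = []
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            v = img[r][c]
            min_vert = v < img[r - 1][c] and v < img[r + 1][c]
            max_vert = v > img[r - 1][c] and v > img[r + 1][c]
            min_horz = v < img[r][c - 1] and v < img[r][c + 1]
            max_horz = v > img[r][c - 1] and v > img[r][c + 1]
            if (min_vert and max_horz) or (max_vert and min_horz):
                found.append((r, c))
    return found

# Two bright blobs (value 9) joined by a dimmer neck (value 5):
grid = [
    [0, 0, 0, 0, 0],
    [0, 9, 5, 9, 0],
    [0, 0, 0, 0, 0],
]
print(saddle_points(grid))
```

The neck pixel at (1, 2) is a maximum vertically but a minimum horizontally, so it is reported; cutting the image there would separate the two blobs.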
Feature preserving consolidation for unorganized point clouds
Bao Li, Wei Jiang, Zhi-Quan Cheng, Gang Dang, Shiyao Jin
2010 IEEE International Conference on Progress in Informatics and Computing | Pub Date: 2010-12-01 | DOI: 10.1109/PIC.2010.5687891
Abstract: We introduce a novel method for the consolidation of unorganized point clouds with noise, outliers, non-uniformities and sharp features. The method is feature preserving in the sense that, given an initial normal estimate, it can recover the sharp features of the original geometry, which are usually contaminated during acquisition. The key ingredient of our approach is a weighting term in normal space that effectively complements recently proposed consolidation techniques. Moreover, a normal mollification step is employed during consolidation to obtain, besides the position of each point, normal information that respects sharp features. Experiments on both synthetic and real-world scanned models validate the ability of our approach to produce denoised, evenly distributed and feature-preserving point clouds, which most surface reconstruction methods prefer.
Application of tabu search heuristic algorithms for the purpose of energy saving in optimal load distribution strategy for multiple chiller water units
J. Zhang, Kanyu Zhang
2010 IEEE International Conference on Progress in Informatics and Computing | Pub Date: 2010-12-01 | DOI: 10.1109/PIC.2010.5687444
Abstract: The tabu search algorithm is applied to the optimal load distribution problem for a cooling system composed of multiple chiller water units, a problem characterized by complexity, constraints, nonlinearity and modeling difficulty. Tabu search, based on neighborhood search, can escape local optima and has an artificial-intelligence memory mechanism. In this paper, two chiller water units working in parallel are studied with the tabu search algorithm. Compared with the conventional method, the results indicate that tabu search yields much lower power consumption and is well suited to air conditioning system operation.
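The two-unit load split can be sketched as a tabu search over a one-dimensional decision (how much load the first chiller carries). The quadratic power curves and step sizes below are made-up illustrations, not the paper's plant models:

```python
def power(x, total):
    # Assumed (illustrative) power curves: unit 1 carries load x,
    # unit 2 carries the remainder.
    p1 = 0.010 * x * x + 2.0 * x + 50.0
    p2 = 0.015 * (total - x) ** 2 + 1.5 * (total - x) + 40.0
    return p1 + p2

def tabu_search(total, steps=100, tabu_len=5):
    x = total / 2.0                       # start from an even split
    best_x, best_p = x, power(x, total)
    tabu = []                             # short-term memory of visited loads
    for _ in range(steps):
        moves = [x + d for d in (-5.0, -1.0, 1.0, 5.0) if 0.0 <= x + d <= total]
        allowed = [m for m in moves if m not in tabu] or moves
        x = min(allowed, key=lambda m: power(m, total))  # best non-tabu move
        tabu.append(x)
        if len(tabu) > tabu_len:
            tabu.pop(0)
        if power(x, total) < best_p:
            best_x, best_p = x, power(x, total)
    return best_x, best_p
```

The tabu list lets the search accept non-improving moves without cycling back, which is how it climbs out of local optima that a greedy even-split heuristic would be stuck in.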
Perceptual image hash for tampering detection using Zernike moments
Yan Zhao, Weimin Wei
2010 IEEE International Conference on Progress in Informatics and Computing | Pub Date: 2010-12-01 | DOI: 10.1109/PIC.2010.5687938
Abstract: In this paper, a new image hashing method using Zernike moments is proposed. The method is based on the rotation invariance of the magnitudes and corrected phases of Zernike moments. First, the input image is divided into overlapping blocks. The Zernike moments of these blocks are calculated, and the magnitude and phase of each modified Zernike moment are encoded into three bits to form the intermediate hash. Finally, the final hash sequence is obtained by pseudo-randomly permuting the intermediate hash sequence. Similarity between hashes is measured with the Hamming distance. Experimental results show that the method is robust against most content-preserving attacks, while the Hamming distance between the hashes of two different images exceeds the threshold. The method can be used to detect image tampering and to locate the tampered region.
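Computing Zernike moments is too long for a short sketch, but the surrounding pipeline — per-block features quantized to three bits, a key-driven pseudo-random permutation, and Hamming-distance comparison — can be illustrated with block means standing in for the moment magnitudes (our substitution, not the paper's feature):

```python
import random

def block_means(img, bs=2):
    """Mean intensity per bs-by-bs block (stand-in for Zernike magnitudes)."""
    means = []
    for r in range(0, len(img), bs):
        for c in range(0, len(img[0]), bs):
            block = [img[r + i][c + j] for i in range(bs) for j in range(bs)]
            means.append(sum(block) / len(block))
    return means

def to_bits(features, lo=0.0, hi=255.0):
    """Quantize each feature to 3 bits, as the paper does per moment value."""
    bits = []
    for f in features:
        level = min(7, int(8 * (f - lo) / (hi - lo)))
        bits.extend(int(b) for b in format(level, "03b"))
    return bits

def final_hash(bits, key=42):
    """Pseudo-randomly permute the intermediate hash with a secret key."""
    order = list(range(len(bits)))
    random.Random(key).shuffle(order)
    return [bits[i] for i in order]

def hamming(h1, h2):
    return sum(a != b for a, b in zip(h1, h2))
```

Because the permutation is seeded by the same key on both sides, identical images hash identically, while a tampered block changes its 3-bit code and raises the Hamming distance; mapping the differing bit positions back through the permutation is what allows the tampered region to be located.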
Certificate-based proxy signature
Jianneng Chen, Zhenjie Huang
2010 IEEE International Conference on Progress in Informatics and Computing | Pub Date: 2010-12-01 | DOI: 10.1109/PIC.2010.5687580
Abstract: Certificate-based public key cryptography was introduced to remove the certificates used to authenticate users' public keys in traditional cryptography and to overcome the key escrow problem of identity-based public key cryptography. Proxy signature schemes allow a proxy signer to sign messages on behalf of an original signer. Combining the concept of certificate-based signature with that of proxy signature, this paper presents a notion of certificate-based proxy signature based on bilinear pairings and proposes a scheme whose security assumes the hardness of the Computational Diffie-Hellman problem.
A collaborative filtering recommendation algorithm based on improved similarity measure method
Y. Wu, Jianguo Zheng
2010 IEEE International Conference on Progress in Informatics and Computing | Pub Date: 2010-12-01 | DOI: 10.1109/PIC.2010.5687455
Abstract: Collaborative filtering is one of the most successful technologies in e-commerce recommendation systems. As e-commerce develops, the numbers of users and commodities grow rapidly and the performance of traditional recommendation algorithms degrades. We therefore propose a new similarity measure that automatically generates a weighting factor to dynamically combine item attribute similarity and rating similarity into a single item similarity, from which an item's nearest neighbors are found and its rating is predicted for recommendation. Experimental results show that the algorithm improves the stability and precision of recommendation and alleviates the cold-start problem.
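A weighted blend of attribute and rating similarity can be sketched as follows. The specific weighting rule (trust rating similarity more as the number of co-raters grows, which is also why it helps cold-start items with no ratings) is our assumption about how such a factor might be generated, not the paper's formula:

```python
def jaccard(a, b):
    """Item attribute similarity: overlap of two attribute sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def cosine(r1, r2):
    """Rating similarity over users who rated both items (dicts user -> score)."""
    common = set(r1) & set(r2)
    if not common:
        return 0.0
    dot = sum(r1[u] * r2[u] for u in common)
    n1 = sum(r1[u] ** 2 for u in common) ** 0.5
    n2 = sum(r2[u] ** 2 for u in common) ** 0.5
    return dot / (n1 * n2)

def item_similarity(attrs1, attrs2, ratings1, ratings2, shrink=5):
    # Assumed weighting rule: more co-raters -> trust ratings over attributes.
    n = len(set(ratings1) & set(ratings2))
    w = n / (n + shrink)
    return w * cosine(ratings1, ratings2) + (1 - w) * jaccard(attrs1, attrs2)
```

With no co-raters the weight is zero and the similarity falls back entirely to item attributes, which is the cold-start behavior the abstract claims.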
An approach to analyse similarity of business process variants
Noor Mazlina Mahmod, Syafeeza Ahmad Radzi
2010 IEEE International Conference on Progress in Informatics and Computing | Pub Date: 2010-12-01 | DOI: 10.1109/PIC.2010.5687872
Abstract: This paper addresses the problem of managing variance in process instances for business process modeling. In business work practice, variance is a valuable source of organizational intellectual capital that should be captured and capitalized on, as it represents preferred and successful work practices. It is therefore important to provide an effective method for analyzing the similarity between these variants, since doing so benefits organizational productivity and provides consistency. In this paper, we propose a systematic approach that deals with the complexity of business process variants by analyzing their structural relationships and execution constructs in order to measure the variants' degree of similarity.
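One simple way to blend structural and execution similarity, in the spirit of the approach described (our own toy measure, not the authors' method): represent each variant as its set of control-flow edges, then mix the overlap of activities (structure) with the overlap of edges (execution order).

```python
def variant_similarity(edges_a, edges_b, alpha=0.5):
    """Blend activity overlap (structure) with edge overlap (execution order)."""
    def jaccard(x, y):
        x, y = set(x), set(y)
        return len(x & y) / len(x | y) if x | y else 1.0
    nodes_a = {n for e in edges_a for n in e}
    nodes_b = {n for e in edges_b for n in e}
    return (alpha * jaccard(nodes_a, nodes_b)
            + (1 - alpha) * jaccard(edges_a, edges_b))

# Two hypothetical order-handling variants that share a common prefix.
a = [("receive", "check"), ("check", "approve")]
b = [("receive", "check"), ("check", "reject")]
```

Here the two variants score higher on activity overlap than on edge overlap, reflecting that they use similar tasks but diverge in execution after the shared prefix.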