Normalized Expected Utility-Entropy investment decision model and its application in stock selection
Jiping Yang, Lijian Zhang, Xiaoxuan Chen
2010 IEEE International Conference on Progress in Informatics and Computing | Pub Date: 2010-12-01 | DOI: 10.1109/PIC.2010.5687461
We first introduce the normalized Expected Utility-Entropy (EU-E) decision model, a weighted linear average of normalized expected utility and information entropy. Based on this model, we establish a normalized EU-E investment decision model and apply it to stock selection among the 40 sample stocks of the Shenzhen component index. The results show that, under a relatively general utility function, portfolios of four stocks selected by the normalized EU-E model with a larger tradeoff coefficient λ are more efficient than those selected with a smaller λ. This demonstrates that a decision maker should account not only for the expected utility of a risky action itself but also for the information entropy, which measures the uncertainty of the states of nature, further verifying the usefulness of information entropy.
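As a rough illustration of such a weighted linear average, the sketch below scores a risky action by combining normalized expected utility with normalized Shannon entropy. The function name, the min-max normalization of utility, and the sign convention (entropy as a penalty) are assumptions for illustration, not the paper's exact formulation.

```python
import math

def normalized_eu_e(probs, utilities, lam):
    """Score a risky action by a weighted average of normalized expected
    utility and normalized Shannon entropy, with tradeoff lam in [0, 1]."""
    eu = sum(p * u for p, u in zip(probs, utilities))
    # Normalize expected utility to [0, 1] over the utility range
    # (one plausible normalization choice).
    u_min, u_max = min(utilities), max(utilities)
    eu_norm = (eu - u_min) / (u_max - u_min) if u_max > u_min else 0.0
    # Normalized entropy: H(p) / log(n) lies in [0, 1].
    h = -sum(p * math.log(p) for p in probs if p > 0)
    h_norm = h / math.log(len(probs)) if len(probs) > 1 else 0.0
    # Larger lam weights expected utility more; entropy penalizes uncertainty.
    return lam * eu_norm - (1 - lam) * h_norm
```

Under this sketch, a larger λ makes the score lean toward expected utility, while a smaller λ favors actions with less uncertain outcomes.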
Formal semantics and verification of AADL modes in Timed Abstract State Machine
Zhibin Yang, Kai Hu, Dian-fu Ma, Lei Pi, J. Bodeveix
Pub Date: 2010-12-01 | DOI: 10.1109/PIC.2010.5687996
AADL (Architecture Analysis & Design Language) is an architecture description language standard for embedded real-time systems, widely used in aerospace and other safety-critical applications. However, the AADL standard currently lacks a formal semantics. This paper proposes a formal semantics and a verification framework for AADL models with regard to mode change. The precise semantics of the AADL mode-change protocol is defined by a translation into the TASM (Timed Abstract State Machine) formalism. This translational semantics is automated in the AADL2TASM tool, which provides model checking and simulation for AADL models. Finally, the approach is validated with a case study of an automotive cruise control system.
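The paper's actual translation rules are not reproduced here, but the flavor of a timed abstract state machine can be sketched: guarded update rules, each consuming a stated amount of logical time. Everything below, including the cruise-control rule and its 2-unit duration, is a hypothetical toy, not the AADL2TASM encoding.

```python
import dataclasses

# Each rule: a guard on the state, an update, and a duration (time the step takes).
@dataclasses.dataclass
class Rule:
    name: str
    guard: callable
    update: callable
    duration: float

def run_tasm(state, rules, max_steps=10):
    """Execute a tiny TASM-like machine: at each step, fire the first
    enabled rule, apply its update, and advance logical time by its duration."""
    t = 0.0
    trace = []
    for _ in range(max_steps):
        enabled = [r for r in rules if r.guard(state)]
        if not enabled:
            break
        r = enabled[0]
        r.update(state)
        t += r.duration
        trace.append((t, r.name, dict(state)))
    return trace

# Hypothetical mode change: a cruise request arrives, then the mode switches.
state = {"mode": "off", "request": "cruise"}
rules = [
    Rule("accept_request",
         lambda s: s["request"] is not None and s["mode"] != s["request"],
         lambda s: s.update(mode=s["request"], request=None),
         duration=2.0),  # assume the mode change takes 2 time units
]
trace = run_tasm(state, rules)
```

A model checker over such machines can then ask timed questions, e.g. whether every requested mode change completes within a deadline.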
Iterative method for a class of nonnegative linear least squares problems
Longquan Yong, Fang'an Deng, Shemin Zhang
Pub Date: 2010-12-01 | DOI: 10.1109/PIC.2010.5687990
An iterative method for solving a class of nonnegative linear least squares problems is presented. First, the nonnegative least squares problem is transformed into a monotone linear complementarity problem. We then present an iterative algorithm for the monotone linear complementarity problem based on the fixed-point principle, and prove that the method converges to an optimal solution of the original problem after finitely many iterations. Finally, numerical examples indicate that the method is feasible and effective.
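The paper's specific algorithm is not reproduced here, but the reduction it describes can be sketched: the KKT conditions of nonnegative least squares form a monotone linear complementarity problem, whose solution is a fixed point of a projected iteration. The step-size choice below is one standard option, assumed for illustration.

```python
import numpy as np

def nnls_fixed_point(A, b, alpha=None, iters=5000):
    """Solve min ||Ax - b||^2 s.t. x >= 0 via a projected fixed-point iteration.
    The KKT conditions give a monotone LCP with M = A^T A and q = -A^T b;
    a solution x* is a fixed point of x -> max(0, x - alpha * (M x + q))."""
    M = A.T @ A
    q = -A.T @ b
    if alpha is None:
        # A step size at most 1 / ||M||_2 keeps the projected iteration stable.
        alpha = 1.0 / np.linalg.norm(M, 2)
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = np.maximum(0.0, x - alpha * (M @ x + q))
    return x
```

On a small overdetermined system the iteration recovers the usual least squares solution whenever it is already nonnegative, and clamps the active components to zero otherwise.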
Composition-oriented autonomous web service aggregation and web service selection method
Rong-hua Ye, Shanshan Wei
Pub Date: 2010-12-01 | DOI: 10.1109/PIC.2010.5687586
Compared with classical SOA, the autonomous web service (AWS) is emerging as a promising approach, in which an AWS is an autonomous service entity that can search for service requestors on its own. In this paper, we introduce a requirement-driven model of AWS aggregation and propose a collect-select mechanism for optimally selecting web services. Experimental results show that this approach outperforms the "First Come First Served" approach.
3D model retrieval from multiple photographic images
Q. Bian, Yuan-jun He, Hongming Cai
Pub Date: 2010-12-01 | DOI: 10.1109/PIC.2010.5687876
Due to recent significant advances in game entertainment, CAD, CAE and other 3D fields, the number of 3D models is increasing rapidly, creating a growing need for procedures that support automatic search for 3D objects in databases. In this paper, we present a novel framework that retrieves 3D models in three steps. First, the user supplies multiple ordered images of an object, captured under controlled photographic conditions, from which a voxel model is generated automatically. Second, we provide a tool for modifying the voxel model, which helps eliminate defects caused by uncertain photographic conditions; if the model produced in the first step is already reasonable, this step is optional. Finally, the modified model is matched by a method called Solid-D2, a fast and discriminating descriptor for 3D shapes chosen to favor retrieval speed while maintaining adequate precision, and related models from the database are listed. Experimental results indicate that the proposed method is easy to use, efficient, and applicable to 3D model retrieval.
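Solid-D2 itself is not specified in the abstract; as background, the classic D2 shape signature it presumably builds on can be sketched as a histogram of distances between random point pairs on the model. All parameter values below are illustrative assumptions.

```python
import numpy as np

def d2_descriptor(points, n_pairs=10000, n_bins=32, rng=None):
    """Classic D2 shape signature: a normalized histogram of distances
    between randomly sampled point pairs on the model surface."""
    rng = np.random.default_rng(0) if rng is None else rng
    i = rng.integers(0, len(points), n_pairs)
    j = rng.integers(0, len(points), n_pairs)
    d = np.linalg.norm(points[i] - points[j], axis=1)
    d = d / d.max()  # scale-normalize so similar shapes of different sizes match
    hist, _ = np.histogram(d, bins=n_bins, range=(0.0, 1.0))
    return hist / hist.sum()

def d2_distance(h1, h2):
    # L1 distance between normalized histograms; smaller means more similar.
    return np.abs(h1 - h2).sum()
```

Because the descriptor is a fixed-length vector, database retrieval reduces to fast nearest-neighbor search over histograms, which matches the abstract's emphasis on retrieval speed.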
Fast mesh similarity measuring based on CUDA
Jie Tang, Gangshan Wu, Boping Xu, Zhongliang Gong
Pub Date: 2010-12-01 | DOI: 10.1109/PIC.2010.5687883
This paper presents a fast algorithm that measures the similarity between two meshes at interactive rates. The algorithm is based on CUDA (Compute Unified Device Architecture). To fully utilize the computing power of the GPU, we developed a parallel method for constructing a uniform grid for fast spatial indexing of triangles. A special data structure was designed on the device side to work around CUDA's lack of support for dynamic memory allocation. Extensive experiments verify the effectiveness and efficiency of our algorithm.
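The CUDA-specific data structures are not described in the abstract; as a language-neutral sketch of the uniform-grid idea, the code below hashes triangles into grid cells by centroid and answers neighborhood queries from the 27 surrounding cells. Binning by centroid (rather than by triangle overlap) is a simplifying assumption.

```python
import numpy as np
from collections import defaultdict

def build_uniform_grid(triangles, cell_size):
    """Index triangles by the grid cell containing their centroid.
    triangles: (n, 3, 3) array of per-triangle vertex coordinates."""
    centroids = triangles.mean(axis=1)
    grid = defaultdict(list)
    for idx, c in enumerate(centroids):
        key = tuple((c // cell_size).astype(int))
        grid[key].append(idx)
    return grid

def query_neighbors(grid, point, cell_size):
    """Return indices of triangles in the 27 cells around a query point."""
    base = (np.asarray(point, dtype=float) // cell_size).astype(int)
    out = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for dz in (-1, 0, 1):
                out.extend(grid.get((base[0] + dx, base[1] + dy, base[2] + dz), []))
    return out
```

For mesh-to-mesh similarity, each sample point on one mesh only needs to test triangles returned by such a query instead of every triangle of the other mesh, which is what makes interactive rates plausible.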
Privacy-preserving SVM classification on arbitrarily partitioned data
Yunhong Hu, G. He, Liang Fang, Jingyong Tang
Pub Date: 2010-12-01 | DOI: 10.1109/PIC.2010.5687397
With the development of information science and modern technology, protecting private information has become increasingly important. In this paper, a novel privacy-preserving support vector machine (SVM) classifier is put forward for arbitrarily partitioned data. The proposed SVM classifier, which is public but does not reveal the privately held data, has accuracy comparable to that of an ordinary SVM classifier trained on the original data. We prove the feasibility of our algorithms using matrix factorization theory and show their security.
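The paper's matrix-factorization construction is not reproduced here; as background on why such schemes can preserve accuracy, the sketch below shows one standard ingredient: masking data with a random orthogonal matrix leaves all inner products, and hence a linear kernel matrix, unchanged. This is an illustrative assumption about the family of techniques, not the authors' exact method.

```python
import numpy as np

def random_orthogonal(d, rng):
    # QR decomposition of a Gaussian matrix yields a random orthogonal matrix.
    q, _ = np.linalg.qr(rng.standard_normal((d, d)))
    return q

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))   # privately held data (hypothetical)
Q = random_orthogonal(3, rng)     # secret transform kept by the data owner
X_masked = X @ Q                  # what could be shared publicly
# Inner products (hence the linear-kernel Gram matrix) are unchanged:
# (XQ)(XQ)^T = X Q Q^T X^T = X X^T
```

An SVM trained on the masked data therefore produces the same kernel matrix, and the same decision values, as one trained on the originals, while individual feature values stay hidden behind the secret transform.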
Nonlocal image denoising algorithm based on image statistic
Lei Wang, Xue-qing Li
Pub Date: 2010-12-01 | DOI: 10.1109/PIC.2010.5687885
In this paper, we propose a robust image denoising method based on image statistics. An image statistic based on the Weibull distribution is applied to analyze the content of image patches, which are then classified into three types: smooth, edge and texture. Different patch similarity measures and measurement window sizes are then applied to denoise patches of each type. Results on a variety of images show that our content-based NL-means algorithm outperforms the traditional NL-means algorithm in both PSNR and visual quality.
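The Weibull-based classification step is not detailed in the abstract; as background, the baseline NL-means computation it adapts can be sketched for a single pixel as below. The `patch`, `search`, and `h` parameters are the knobs a content-adaptive variant would choose per patch type; their default values here are illustrative.

```python
import numpy as np

def nlmeans_pixel(image, y, x, patch=3, search=7, h=10.0):
    """Denoise one pixel by averaging pixels in a search window, weighting
    each candidate by the Gaussian similarity of its surrounding patch.
    A content-adaptive variant would pick `patch`/`search`/`h` per patch type."""
    r, s = patch // 2, search // 2
    pad = np.pad(image.astype(float), r + s, mode='reflect')
    yy, xx = y + r + s, x + r + s
    ref = pad[yy - r:yy + r + 1, xx - r:xx + r + 1]
    num = den = 0.0
    for dy in range(-s, s + 1):
        for dx in range(-s, s + 1):
            cand = pad[yy + dy - r:yy + dy + r + 1, xx + dx - r:xx + dx + r + 1]
            w = np.exp(-np.sum((ref - cand) ** 2) / (h * h))  # patch similarity
            num += w * pad[yy + dy, xx + dx]
            den += w
    return num / den
```

Intuitively, smooth patches tolerate large search windows (many similar patches exist), while edge and texture patches benefit from smaller windows and stricter similarity, which is the adaptation the paper's classification enables.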
Defocus deblurring with a coded aperture
Lin Mei, Xuan Cai, Weihao Liu
Pub Date: 2010-12-01 | DOI: 10.1109/PIC.2010.5687888
A conventional camera captures blurred images of scene content away from the focal plane. Deblurring such images is an important and challenging task in many fields, including optics, astronomy, computer vision and computer graphics. Since the defocus kernel plays a key role in both image degradation and image recovery, it has been studied extensively. In this paper, we insert a broadband mask into the aperture of the camera lens to preserve more high-frequency information, facilitating defocus deblurring. We propose a genetic algorithm that computes the defocus kernel for a fixed distance from the focal plane, given a sharp image and a defocused one. Finally, extensive experiments show that the computed defocus kernel produces better deblurred images.
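The paper's genetic algorithm is not specified in the abstract; the toy below shows the general shape of the idea on a 1-D signal: evolve a nonnegative, sum-to-one kernel whose re-blurring of the sharp input best matches the defocused observation. Population size, mutation scale, and the selection scheme are all illustrative assumptions.

```python
import numpy as np

def fitness(kernel, sharp, defocused):
    # Lower is better: residual between re-blurred sharp signal and observation.
    return np.sum((np.convolve(sharp, kernel, mode='same') - defocused) ** 2)

def ga_estimate_kernel(sharp, defocused, ksize=5, pop=40, gens=200, rng=None):
    """Toy genetic algorithm: evolve a nonnegative, sum-to-one 1-D kernel
    that best explains the defocused signal given the sharp one."""
    rng = np.random.default_rng(0) if rng is None else rng

    def normalize(k):
        k = np.abs(k)
        return k / k.sum()

    popu = [normalize(rng.random(ksize)) for _ in range(pop)]
    for _ in range(gens):
        popu.sort(key=lambda k: fitness(k, sharp, defocused))
        elite = popu[:pop // 4]            # keep the fittest quarter
        children = []
        while len(elite) + len(children) < pop:
            a, b = rng.choice(len(elite), 2)
            # Crossover (average of two elites) plus Gaussian mutation.
            child = (elite[a] + elite[b]) / 2 + rng.normal(0, 0.02, ksize)
            children.append(normalize(child))
        popu = elite + children
    popu.sort(key=lambda k: fitness(k, sharp, defocused))
    return popu[0]
```

With the kernel in hand, the defocused image can be deconvolved; the coded aperture's role is to keep the kernel's frequency response broadband so that deconvolution stays well conditioned.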
The analysis of classified protection compliance detection based on Dempster-Shafer theory
Jia Liu, Guoai Xu, Yixian Yang, Yang Gao
Pub Date: 2010-12-01 | DOI: 10.1109/PIC.2010.5687467
Classified protection compliance detection is an important method in information system safety control. To address the uncertainty in the compliance detection process, this paper proposes an analysis method for classified protection detection based on Dempster-Shafer evidence theory for rank-4 information systems. First, the method establishes a scientific and reasonable index system of classified protection compliance detection for the target system. Second, the results of all detection items at each level of the index system are combined using an improved Dempster's rule of combination, yielding the final compliance detection result level by level. Finally, the feasibility of the method is verified on an example information system. Applying Dempster-Shafer evidence theory reduces the uncertainty in compliance detection and resolves conflicts among the various detection results.
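The paper's improved combination rule is not specified in the abstract; as background, the plain Dempster's rule it modifies can be sketched as below, with focal elements represented as frozensets of hypotheses. This is the textbook rule, not the authors' improved variant.

```python
def dempster_combine(m1, m2):
    """Combine two mass functions (dicts: frozenset -> mass) by Dempster's
    rule: intersect focal elements, then renormalize by 1 - K, where K is
    the total mass falling on empty intersections (the conflict)."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence; rule undefined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}
```

For example, combining two detection items that each partly support "pass" and partly remain undecided concentrates mass on "pass" while the renormalization by 1 − K absorbs any conflict between the items, which is what makes level-by-level fusion of detection results possible.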