The overall objective of this program is to build an effective, practical, and efficient campus network. First, it clarifies the needs of the existing campus network, such as online office, school finance, e-mail, and other related network information services. Then it provides a method to meet current needs, namely the use of Gigabit Ethernet. Finally, it outlines a solution for the larger scale expected in the future, again based on Gigabit networking.
{"title":"A Network Construction Solution in Colleges or Universities","authors":"Kailiang Liang","doi":"10.1109/IPTC.2011.55","DOIUrl":"https://doi.org/10.1109/IPTC.2011.55","url":null,"abstract":"The overall objective of this program is to build an effective and practical and efficient campus network. First, it clarifys the needs of the existing campus network, such as online office, school finance, e-mail and other related network information services. Then it provides a method to meet current, namely the use of Gigabit Ethernet. Finally, the network indicates a greater in the future state solution: using Gigabit network.","PeriodicalId":388589,"journal":{"name":"2011 2nd International Symposium on Intelligence Information Processing and Trusted Computing","volume":"132 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127094866","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Nowadays, local features are widely used for content-based image retrieval, and effective feature selection is very important for improving retrieval performance. Among the various local feature extraction methods, the Scale Invariant Feature Transform (SIFT) has proven to be the most robust local invariant feature descriptor. However, the algorithm often generates hundreds of thousands of features per image, which seriously limits the application of SIFT in content-based image retrieval. This paper addresses the problem and proposes a novel method for selecting salient and distinctive local features using integrated visual saliency analysis. With our method, all of the SIFT features in an image are ranked by their integrated visual saliency, and only the most distinctive features are retained. Experiments demonstrate that the feature selection algorithm based on integrated visual saliency analysis provides significant benefits in both retrieval accuracy and speed.
{"title":"Integrated Visual Saliency Based Local Feature Selection for Image Retrieval","authors":"Han-ping Gao, Zu-qiao Yang","doi":"10.1109/IPTC.2011.19","DOIUrl":"https://doi.org/10.1109/IPTC.2011.19","url":null,"abstract":"nowadays, local features are widely used for content-based image retrieval. Effective feature selection is very important for the improvement of retrieval performance. Among various local feature extraction methods, Scale Invariant Feature Transform (SIFT) has been proven to be the most robust local invariant feature descriptor. However, the algorithm often generates hundreds of thousands of features per image, which has seriously affected the application of SIFT in content-based image retrieval. Therefore, this paper addresses this problem and proposes a novel method to select salient and distinctive local features using integrated visual saliency analysis. Based on our method, all of the SIFT features in an image are ranked with their integrated visual saliency, and only the most distinctive features will be reserved. The experiments demonstrate that the integrated visual saliency analysis based feature selection algorithm provides significant benefits both in retrieval accuracy and speed.","PeriodicalId":388589,"journal":{"name":"2011 2nd International Symposium on Intelligence Information Processing and Trusted Computing","volume":"41 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124833574","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
In this paper, the convergence in credibility measure and the convergence in credibility distribution of fuzzy variables are discussed based on uncertainty theory.
{"title":"Some Remarks on Convergence in Credibility Measure and Convergence in Credibility Distribution of Fuzzy Variable","authors":"Sheng-Ming Ma","doi":"10.1109/IPTC.2011.32","DOIUrl":"https://doi.org/10.1109/IPTC.2011.32","url":null,"abstract":"In the paper, the convergence in credibility measure and the convergence in credibility distribution of fuzzy variable is discussed based on uncertainty theory.","PeriodicalId":388589,"journal":{"name":"2011 2nd International Symposium on Intelligence Information Processing and Trusted Computing","volume":"165 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115190641","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
In this paper, the solution uses wireless APs to deploy the campus WLAN in order to achieve wide network coverage, security, and high-speed data transmission. To meet the future needs of the campus network, we adopt the 802.11n standard and a new management technique in which a single hardware device carries two or more SSIDs, achieving efficient management of the campus network, more extensive coverage, and even higher-speed data transmission.
{"title":"Research on Thin AP Architecture to Deploy Campus WLAN","authors":"Kailiang Liang","doi":"10.1109/IPTC.2011.56","DOIUrl":"https://doi.org/10.1109/IPTC.2011.56","url":null,"abstract":"In this paper, the program uses the wireless AP to deploy campus wlan, in order to achieve a wide network coverage and security, high-speed data transmission. To meet the future needs of the campus network, we use the 802.11n standard, and adopt a new technique in management: a hardware device by carrying two or more ssid, to achieve efficient management of the campus network, the more extensive coverage and further high-speed data transmission.","PeriodicalId":388589,"journal":{"name":"2011 2nd International Symposium on Intelligence Information Processing and Trusted Computing","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122309601","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The purpose of a focused crawler is to crawl the more topical portions of the Internet precisely. How to predict the visit priorities of candidate URLs whose corresponding pages have yet to be fetched is the determining factor in a focused crawler's ability to obtain more relevant pages. This paper introduces a comprehensive prediction method to address this problem. In this method, a page partition algorithm that divides a page into smaller blocks and inter-class rules that statistically capture linkage relationships among the topic classes are adopted to help the focused crawler cross tunnels and enlarge its coverage; the URL address, anchor text, and block content are used to predict visit priority more precisely. Experiments are carried out on the target topic of tennis, and the results show that a crawler based on this method outperforms a rule-based crawler in harvest ratio.
{"title":"A Comprehensive Prediction Method of Visit Priority for Focused Crawler","authors":"Xueming Li, Minling Xing, Jiapei Zhang","doi":"10.1109/IPTC.2011.14","DOIUrl":"https://doi.org/10.1109/IPTC.2011.14","url":null,"abstract":"The purpose of a focused crawler is to crawl more topical portions of the Internet precisely. How to predict the visit priorities of candidate URLs whose corresponding pages have yet to be fetched is the determining factor in the focused crawler's ability of getting more relevant pages. This paper introduces a comprehensive prediction method to address this problem. In this method, a page partition algorithm that partitions the page into smaller blocks and interclass rules that statistically capture linkage relationships among the topic classes are adopted to help the focused crawler cross tunnel and to enlarge the focused crawler's coverage, URL's address, anchor text and block content are used to predict visit priority more precisely. Experiments are carried out on the target topic of tennis and the results show that crawler based on this method is more effective than a rule-based crawler on harvest ratio.","PeriodicalId":388589,"journal":{"name":"2011 2nd International Symposium on Intelligence Information Processing and Trusted Computing","volume":"33 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126204399","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Drawing on the teaching practice of the "Cryptography" course in our school's Information and Computer Science program, this paper discusses the feasibility of several teaching methods, mainly from the two aspects of theoretical teaching and laboratory teaching, and presents the design of the teaching content. Teaching practice in recent years shows that these methods achieve good teaching results. Since our school first offered the "Cryptography" course in 2004, we have found that each year, when thesis topics are chosen, many students select cryptography-related topics for their graduation projects and pass their defenses, which further demonstrates that the teaching methods discussed in this article work well.
{"title":"Teaching and Research of Cryptography in Information and Computer Science","authors":"Kuobin Dai","doi":"10.1109/IPTC.2011.64","DOIUrl":"https://doi.org/10.1109/IPTC.2011.64","url":null,"abstract":"With my school of Information and Computer Science \"Cryptography\" course teaching practice, this mainly from the theoretical teaching, laboratory teaching two aspects of cryptography in some of the feasibility of teaching methods are discussed, and teaching content design. Teaching practice in recent years show that the method can receive better teaching results. Since 2004 our school opened, \"Cryptography\" course, in practice, found that each year thesis topics of time, there were many students choose the topics and cryptography-related topics as a graduation project, and can pass defense, which can also prove that the teaching methods discussed in this article can receive good results.","PeriodicalId":388589,"journal":{"name":"2011 2nd International Symposium on Intelligence Information Processing and Trusted Computing","volume":"29 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130062425","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Seismic data is generated by a sharp pulse that travels into the Earth and is reflected at the layer boundaries; the data may be 2D or 3D. Because of propagation through the Earth, seismic data has a wide main lobe and strong side lobes, unlike the original sharp pulse. To recover a character similar to that of the sharp pulse, 2D and 3D seismic data are usually processed by deconvolution. The data consist mainly of down-going and up-going components. In the general procedure, the deconvolution is designed from the down-going data; in this paper, it is instead designed statistically from the up-going data. The new method obtains a stable deconvolution operator and a stable deconvolution result, and it can be used to process data contaminated by high-frequency noise. The approach delivers good data for the subsequent processing flow and brings better results for interpretation and application.
{"title":"A Deconvolution Approach of 2D Data Statistical","authors":"Xiang'e Sun, Y. Ling, Jun Gao","doi":"10.1109/IPTC.2011.25","DOIUrl":"https://doi.org/10.1109/IPTC.2011.25","url":null,"abstract":"Seismic data is generated by a sharp pulse, which transforms into to Earth and is reflected by the layer status in the Earth. The Data is 3D or 2D. Because of transforming through the Earth, Seismic data has wide main lobe and strong side lobe, which is different from that of the sharp pulse. In order to recover the character that is similar to that of sharp pulse, the 2D and 3D seismic data is usually processed by the deconvolution. The data includes down going data and up going data mainly. The deconvolution is done by down going data in the general procedure. In this paper, it is done by up going data statistical distinctively. The new methods can get stable deconvolution operator and deconvolution result. It can be used to process the data that is contaminated high-frequency noise. The new approach can get well data to be used in the following process flow and bring a better result to interpretation and application.","PeriodicalId":388589,"journal":{"name":"2011 2nd International Symposium on Intelligence Information Processing and Trusted Computing","volume":"393 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134522371","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
This paper introduces the results of a study on NDIS driver development techniques for the Windows operating system and designs an NDIS-based system to capture and classify network packets. To improve the efficiency of packet classification and filtering, a hash-function-based scheme is proposed for classifying and filtering the packets.
{"title":"Research and Implementation of Packet Classification Based on NDIS Intermediate Layer under Windows Platform","authors":"Yin Hu, Pei Lin","doi":"10.1109/IPTC.2011.47","DOIUrl":"https://doi.org/10.1109/IPTC.2011.47","url":null,"abstract":"This paper introduces the study result of technique of NDIS driver development in Windows operating system, and designs a system based on NDIS to catch and classify the network packets. To improve the efficiency of packet classification and filter, the idea of Hash function is propose to classify and filter the packets.","PeriodicalId":388589,"journal":{"name":"2011 2nd International Symposium on Intelligence Information Processing and Trusted Computing","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134396241","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
This paper presents a comprehensive trust model aimed mainly at the problems existing in the current assessment of trust in online auctions. The credibility of an auction participant can usually be assessed from multiple trust attributes, such as recent credibility, time weight, and transaction value. Different participants in an online auction also hold different viewpoints on trust. By evaluating the credibility assessments from the multiple viewpoints of different participants, we calculate the credibility of each participant separately. The comprehensive model greatly improves the accuracy of the calculation and is therefore more effective for analyzing the credibility of participants in online auctions.
{"title":"A Comprehensive Trust Model with Multi-participant Viewpoints in Online Auctions","authors":"Wenjia Wang, Dingwei Wang","doi":"10.1109/IPTC.2011.30","DOIUrl":"https://doi.org/10.1109/IPTC.2011.30","url":null,"abstract":"This paper presents a comprehensive trust model mainly for the problems existed in the current assessment of trust in online auctions. The credibility of an auction participant usually can be assessed by multiply attributes of trust such as recent credibility, time weight, value of the transaction, etc. Different participants in online auction have different viewpoints on the trust. By means of evaluating the assessments of credibility with multi-viewpoints of different participants, we calculate credibility of participants separately. The comprehensive model is able to greatly improve the accuracy of calculation. Thus, it is more effective for analyzing the credibility of participants in online auction.","PeriodicalId":388589,"journal":{"name":"2011 2nd International Symposium on Intelligence Information Processing and Trusted Computing","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131835275","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
This paper studies histogram equalization. In particular, examples of histogram equalization applied to images show the difference between two different mapping methods.
{"title":"Study on Histogram Equalization","authors":"Wu Zhihong, X. Xiaohong","doi":"10.1109/IPTC.2011.52","DOIUrl":"https://doi.org/10.1109/IPTC.2011.52","url":null,"abstract":"This paper does some study on histogram equalization. Especially, the examples of histogram equalization on the image show the difference of using two different mapping methods respectively.","PeriodicalId":388589,"journal":{"name":"2011 2nd International Symposium on Intelligence Information Processing and Trusted Computing","volume":"516 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115481361","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}