
Latest publications: 2011 2nd International Symposium on Intelligence Information Processing and Trusted Computing

A Network Construction Solution in Colleges or Universities
Kailiang Liang
The overall objective of this program is to build an effective, practical, and efficient campus network. First, it clarifies the needs of the existing campus network, such as online office work, school finance, e-mail, and other related network information services. Then it provides a method to meet current needs, namely the use of Gigabit Ethernet. Finally, it indicates a solution for the network's larger future state: the use of a Gigabit network.
Citations: 0
Integrated Visual Saliency Based Local Feature Selection for Image Retrieval
Han-ping Gao, Zu-qiao Yang
Nowadays, local features are widely used in content-based image retrieval, and effective feature selection is crucial for improving retrieval performance. Among the various local feature extraction methods, the Scale Invariant Feature Transform (SIFT) has proven to be the most robust local invariant feature descriptor. However, the algorithm often generates hundreds of thousands of features per image, which seriously limits the application of SIFT in content-based image retrieval. This paper therefore addresses this problem and proposes a novel method for selecting salient and distinctive local features using integrated visual saliency analysis. With this method, all of the SIFT features in an image are ranked by their integrated visual saliency, and only the most distinctive features are retained. Experiments demonstrate that the feature selection algorithm based on integrated visual saliency analysis provides significant benefits in both retrieval accuracy and speed.
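The selection step described above — rank every feature by a saliency score, keep only the top fraction — can be sketched as follows. This is a minimal illustration, not the paper's method: the cue names (`contrast`, `center_dist`, `scale`) and the combination weights are hypothetical stand-ins for the paper's integrated visual saliency measure.

```python
def select_salient_features(features, keep_ratio=0.1):
    """Rank features by a combined saliency score and keep the top fraction.

    `features` is a list of dicts with hypothetical cue keys standing in
    for the paper's saliency components, each normalized to [0, 1].
    """
    def saliency(f):
        # Illustrative weighted combination; the real weights are the paper's.
        return (0.5 * f["contrast"]
                + 0.3 * (1.0 - f["center_dist"])   # closer to center = more salient
                + 0.2 * f["scale"])

    ranked = sorted(features, key=saliency, reverse=True)
    k = max(1, int(len(ranked) * keep_ratio))  # always keep at least one feature
    return ranked[:k]
```

Keeping, say, 10% of hundreds of thousands of SIFT features shrinks both the index and the matching cost by roughly an order of magnitude, which is where the speed benefit reported in the abstract comes from.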
Citations: 8
Some Remarks on Convergence in Credibility Measure and Convergence in Credibility Distribution of Fuzzy Variable
Sheng-Ming Ma
In this paper, the convergence in credibility measure and the convergence in credibility distribution of fuzzy variables are discussed based on uncertainty theory.
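For reference, assuming the paper follows the standard definitions of credibility theory (in the sense of Liu), the two convergence modes can be stated as follows. A sequence of fuzzy variables $\{\xi_n\}$ converges to $\xi$ in credibility if, for every $\varepsilon > 0$,

```latex
\lim_{n \to \infty} \mathrm{Cr}\{\, |\xi_n - \xi| \ge \varepsilon \,\} = 0,
```

and converges in credibility distribution if, at every continuity point $x$ of $\Phi$,

```latex
\lim_{n \to \infty} \Phi_n(x) = \Phi(x),
```

where $\Phi_n$ and $\Phi$ denote the credibility distributions of $\xi_n$ and $\xi$ respectively.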
Citations: 1
Research on Thin AP Architecture to Deploy Campus WLAN
Kailiang Liang
In this paper, the proposed scheme uses wireless APs to deploy a campus WLAN in order to achieve wide network coverage, security, and high-speed data transmission. To meet the future needs of the campus network, we use the 802.11n standard and adopt a new management technique: a single hardware device carrying two or more SSIDs. This achieves efficient management of the campus network, more extensive coverage, and higher-speed data transmission.
Citations: 0
A Comprehensive Prediction Method of Visit Priority for Focused Crawler
Xueming Li, Minling Xing, Jiapei Zhang
The purpose of a focused crawler is to precisely crawl the more topical portions of the Internet. How to predict the visit priorities of candidate URLs whose corresponding pages have yet to be fetched is the determining factor in a focused crawler's ability to retrieve more relevant pages. This paper introduces a comprehensive prediction method to address this problem. The method adopts a page partition algorithm that divides each page into smaller blocks, together with interclass rules that statistically capture the linkage relationships among topic classes, to help the focused crawler cross tunnels and enlarge its coverage; the URL address, anchor text, and block content are then used to predict visit priority more precisely. Experiments carried out on the target topic of tennis show that a crawler based on this method achieves a better harvest ratio than a rule-based crawler.
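The mechanism the abstract describes — score each candidate URL from several cues and always fetch the highest-priority one next — is typically implemented with a priority queue. A minimal sketch follows; the linear combination and its weights are illustrative assumptions, not the paper's actual scoring formula.

```python
import heapq


def visit_priority(url_score, anchor_score, block_score, w=(0.3, 0.3, 0.4)):
    # Hypothetical linear combination of the three relevance cues the
    # abstract names: URL address, anchor text, and block content.
    return w[0] * url_score + w[1] * anchor_score + w[2] * block_score


class Frontier:
    """Crawl frontier: pop always returns the highest-priority URL."""

    def __init__(self):
        self._heap = []
        self._count = 0  # insertion counter breaks ties deterministically

    def push(self, url, score):
        # heapq is a min-heap, so negate the score for max-priority behavior.
        heapq.heappush(self._heap, (-score, self._count, url))
        self._count += 1

    def pop(self):
        return heapq.heappop(self._heap)[2]
```

In use, the crawler pushes every extracted link with its predicted priority and pops the frontier in a loop, so the most promising unfetched page is always visited first.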
Citations: 2
Teaching and Research of Cryptography in Information and Computer Science
Kuobin Dai
Based on the teaching practice of the "Cryptography" course in our School of Information and Computer Science, this paper discusses the feasibility of several teaching methods for cryptography, mainly from two aspects — theoretical teaching and laboratory teaching — and designs the corresponding teaching content. Teaching practice in recent years shows that these methods achieve good teaching results. Since our school first offered the "Cryptography" course in 2004, we have found that at thesis-topic selection time each year, many students choose cryptography-related topics for their graduation projects and are able to pass their defenses, which further demonstrates that the teaching methods discussed in this article work well.
Citations: 1
A Deconvolution Approach of 2D Data Statistical
Xiang'e Sun, Y. Ling, Jun Gao
Seismic data is generated by a sharp pulse that propagates into the Earth and is reflected by the layered structure within it. The data is 2D or 3D. Because of propagation through the Earth, seismic data has a wide main lobe and strong side lobes, unlike the original sharp pulse. To recover a character similar to that of the sharp pulse, 2D and 3D seismic data is usually processed by deconvolution. The data mainly consists of downgoing and upgoing components. In the general procedure, deconvolution is performed on the downgoing data; in this paper, it is instead performed statistically on the upgoing data. The new method yields a stable deconvolution operator and a stable deconvolution result, and it can be used to process data contaminated by high-frequency noise. The new approach produces well data suitable for the subsequent processing flow and brings better results for interpretation and application.
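The general idea of stabilized deconvolution — dividing out the wavelet's spectrum while guarding against near-zero frequencies — can be sketched in the frequency domain. This is a generic water-level (regularized) deconvolution, not the paper's statistical operator; the wavelet, the regularization constant `eps`, and the circular-convolution setup are all assumptions for illustration.

```python
import numpy as np


def deconvolve(trace, wavelet, eps=1e-3):
    """Water-level frequency-domain deconvolution of `trace` by `wavelet`.

    The eps term keeps the division stable where the wavelet spectrum
    is weak, which is what gives a stable operator in noisy bands.
    """
    n = len(trace)
    T = np.fft.fft(trace, n)
    W = np.fft.fft(wavelet, n)
    # Regularized spectral division: T * conj(W) / (|W|^2 + water level).
    denom = (W * np.conj(W)).real + eps * np.max(np.abs(W)) ** 2
    R = T * np.conj(W) / denom
    return np.real(np.fft.ifft(R))
```

With a clean synthetic trace the output collapses back to the spike series, recovering the sharp-pulse character the abstract refers to; raising `eps` trades resolution for robustness against high-frequency noise.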
Citations: 0
Research and Implementation of Packet Classification Based on NDIS Intermediate Layer under Windows Platform
Yin Hu, Pei Lin
This paper presents a study of NDIS driver development techniques in the Windows operating system and designs an NDIS-based system to capture and classify network packets. To improve the efficiency of packet classification and filtering, a hash-function approach is proposed for classifying and filtering the packets.
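The hash-based classification idea — map each packet's 5-tuple to a rule in constant time instead of scanning a rule list — can be illustrated in a few lines. This is a language-level sketch (Python's `dict` plays the role of the hash table), not the kernel-mode NDIS implementation; the packet field names and the default `"pass"` action are assumptions.

```python
class PacketClassifier:
    """Exact-match 5-tuple classifier backed by a hash table.

    Lookup is O(1) on average, versus O(rules) for a linear rule scan,
    which is the efficiency gain the hash-function idea targets.
    """

    def __init__(self, default_action="pass"):
        self._rules = {}          # 5-tuple -> action
        self._default = default_action

    def add_rule(self, five_tuple, action):
        # five_tuple = (src_ip, dst_ip, src_port, dst_port, protocol)
        self._rules[five_tuple] = action

    def classify(self, pkt):
        key = (pkt["src"], pkt["dst"], pkt["sport"], pkt["dport"], pkt["proto"])
        return self._rules.get(key, self._default)
```

An in-kernel version would compute an explicit hash over the header fields into a fixed bucket array, but the lookup structure is the same.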
Citations: 2
A Comprehensive Trust Model with Multi-participant Viewpoints in Online Auctions
Wenjia Wang, Dingwei Wang
This paper presents a comprehensive trust model aimed mainly at the problems that exist in the current assessment of trust in online auctions. The credibility of an auction participant can usually be assessed through multiple attributes of trust, such as recent credibility, time weight, and transaction value. Different participants in an online auction hold different viewpoints on trust. By evaluating credibility assessments from the viewpoints of different participants, we calculate the credibility of each participant separately. The comprehensive model greatly improves the accuracy of the calculation, making it more effective for analyzing the credibility of participants in online auctions.
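A multi-attribute credibility score of the kind the abstract describes — weighting recent, high-value transactions more heavily — might be sketched as a time-decayed, value-weighted average. The exponential half-life decay and the rating format are illustrative assumptions, not the paper's formula.

```python
def credibility(ratings, half_life=30.0):
    """Time-decayed, value-weighted average credibility in [0, 1].

    Each rating is (score, transaction_value, age_in_days):
    recent and high-value transactions dominate the result,
    matching the 'recent credibility', 'time weight', and
    'transaction value' attributes named in the abstract.
    """
    num = den = 0.0
    for score, value, age in ratings:
        weight = value * 0.5 ** (age / half_life)  # halve influence every half_life days
        num += weight * score
        den += weight
    return num / den if den else 0.0
```

The multi-viewpoint part of the model would then evaluate such scores separately per participant role (e.g. buyers vs. sellers) before combining them.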
Citations: 2
Study on Histogram Equalization
Wu Zhihong, X. Xiaohong
This paper presents a study of histogram equalization. In particular, examples of histogram equalization applied to images show the difference between two different mapping methods.
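Histogram equalization maps each gray level through the cumulative distribution function (CDF) of the image histogram. As a hedged illustration of how two mapping methods can differ, the sketch below contrasts rounding versus flooring the scaled CDF — a common pair of variants, though which two mappings the paper actually compares is an assumption.

```python
def equalize(pixels, levels=256, mapping="round"):
    """Histogram-equalize a flat list of integer gray levels in [0, levels).

    `mapping` selects how the scaled CDF is quantized to an output level:
    "round" (nearest level) or "floor" (truncation). The two can assign
    the same input level to different outputs, which changes the result.
    """
    # Build the histogram and the cumulative distribution.
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    n = len(pixels)
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total / n)

    # Quantize the scaled CDF into a lookup table, then remap every pixel.
    if mapping == "round":
        lut = [round(c * (levels - 1)) for c in cdf]
    else:  # "floor"
        lut = [int(c * (levels - 1)) for c in cdf]
    return [lut[p] for p in pixels]
```

On a uniform 4-level test image the two mappings already disagree: rounding shifts the lower levels upward, while flooring leaves them in place.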
Citations: 4