
Computer Science Review: Latest Publications

Random spanning trees and forests: a geometric focus
IF 12.7 | CAS Zone 1, Computer Science | Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS | Pub Date: 2025-11-22 | DOI: 10.1016/j.cosrev.2025.100857
Lyuben Lichev , Dieter Mitsche , Xavier Pérez-Giménez
The current article surveys the rich literature on spanning trees with a special focus on geometric graph models.
Citations: 0
On the generalized coloring numbers
IF 12.7 | CAS Zone 1, Computer Science | Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS | Pub Date: 2025-11-14 | DOI: 10.1016/j.cosrev.2025.100855
Sebastian Siebertz
The coloring number col(G) of a graph G, which is equal to the degeneracy of G plus one, provides a very useful measure for the uniform sparsity of G. The coloring number is generalized by three series of measures, the generalized coloring numbers. These are the r-admissibility admr(G), the strong r-coloring number colr(G) and the weak r-coloring number wcolr(G), where r is an integer parameter. The generalized coloring numbers measure the edge density of bounded-depth minors and thereby provide an even more uniform measure of sparsity of graphs. They have found many applications in graph theory and in particular play a key role in the theory of bounded expansion and nowhere dense graph classes introduced by Nešetřil and Ossona de Mendez. We overview combinatorial and algorithmic applications of the generalized coloring numbers, emphasizing new developments in this area. We also present a simple proof for the existence of uniform orders and improve known bounds, e.g., for the weak coloring numbers on graphs with excluded topological minors.
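The greedy characterization behind these definitions is easy to make concrete. As an illustrative sketch (mine, not code from the survey), the plain coloring number col(G), i.e. degeneracy plus one, can be computed by repeatedly deleting a vertex of minimum degree:

```python
def coloring_number(adj):
    """Compute col(G) = degeneracy(G) + 1 for an undirected graph.

    adj maps each vertex to the set of its neighbours. The degeneracy is
    the largest minimum degree encountered while repeatedly deleting a
    vertex of minimum degree from the remaining graph.
    """
    adj = {v: set(nbrs) for v, nbrs in adj.items()}  # work on a copy
    degeneracy = 0
    while adj:
        v = min(adj, key=lambda u: len(adj[u]))      # min-degree vertex
        degeneracy = max(degeneracy, len(adj[v]))
        for u in adj[v]:                             # delete v from G
            adj[u].discard(v)
        del adj[v]
    return degeneracy + 1

# The 4-cycle has degeneracy 2, hence coloring number 3.
c4 = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
print(coloring_number(c4))  # 3
```

The generalized numbers refine this quantity by ranking vertices with respect to reachability at distance up to r; for r = 1 the strong and weak r-coloring numbers both coincide with the ordinary coloring number.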
Citations: 0
Critical insights into runtime scheduling, image, storage, and networking challenges in modern Kubernetes environments
IF 12.7 | CAS Zone 1, Computer Science | Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS | Pub Date: 2025-11-14 | DOI: 10.1016/j.cosrev.2025.100851
Bablu Kumar , Anshul Verma , Pradeepika Verma
Kubernetes has become the de-facto standard for orchestrating containerized workloads across cloud and edge environments. Despite its modular and extensible architecture, the growing complexity of runtime behaviors, scheduling demands, and evolving application requirements has revealed persistent challenges in scalability, performance, and operational resilience. This paper presents an in-depth review of recent advancements in Kubernetes, with an emphasis on version 1.33, structured around three core problem domains: (1) runtime and scheduling inefficiencies, (2) container image and storage bottlenecks, and (3) event-driven processing and networking limitations. Across all three domains, we examine how the evolution of communication infrastructure, such as changing network protocols, traffic patterns from edge to cloud, and service coordination mechanisms, impacts orchestration reliability and system design. We explore recent feature enhancements such as JobSet, In-place Pod Resizing, improved autoscalers, and nftables-based kube-proxy, analyzing their relevance to modern workloads including distributed machine learning and high-performance computing. Beyond feature evaluation, we highlight unresolved challenges, such as device-aware workload orchestration, adaptive resource provisioning, and scalable event management, and discuss their implications in emerging scenarios. Finally, we outline future research directions and architectural strategies aimed at achieving intelligent, resilient, and workload-aware orchestration in Kubernetes. This study serves as both a state-of-the-art review and a guidepost for advancing Kubernetes-based systems.
Citations: 0
Navigating the quantum computing threat landscape for blockchains: A comprehensive survey
IF 12.7 | CAS Zone 1, Computer Science | Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS | Pub Date: 2025-11-12 | DOI: 10.1016/j.cosrev.2025.100846
Hassan Khodaiemehr, Khadijeh Bagheri, Chen Feng
Quantum computers pose a significant threat to blockchain technology’s security, which heavily relies on public-key cryptography and hash functions. The cryptographic algorithms used in blockchains, based on large odd prime numbers and discrete logarithms, can be easily compromised by quantum computing algorithms like Shor’s algorithm and its future qubit variations. This survey paper comprehensively examines the impact of quantum computers on blockchain security and explores potential mitigation strategies. The survey focuses on the quantum security of blockchain’s fundamental building blocks, including digital signatures, hash functions, consensus algorithms, and smart contracts. We analyze the vulnerabilities introduced by quantum computers and discuss potential countermeasures and enhancements to ensure the integrity and confidentiality of blockchain systems. Furthermore, we investigate the quantum attack surface of blockchains, identifying potential avenues for exploiting quantum computing to strengthen existing attacks. We emphasize the need for developing quantum-resistant defenses and explore solutions for mitigating the threat of quantum computers to blockchains, including the adoption of quantum and post-quantum blockchain architectures. The paper also discusses the limitations of current efforts, such as the computational overhead of post-quantum algorithms and the practical challenges in real-world deployment, illustrated through case studies of Bitcoin and Ethereum. Future research directions include developing scalable quantum-resistant blockchain protocols, optimizing cryptographic implementations for embedded devices, and establishing standardized security frameworks to mitigate emerging quantum attacks.
Citations: 0
State-of-the-art image and video-based steganalysis techniques: A comprehensive review, challenges and future recommendations for digital forensic experts
IF 12.7 | CAS Zone 1, Computer Science | Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS | Pub Date: 2025-11-12 | DOI: 10.1016/j.cosrev.2025.100852
Payal Mittal , Ravinder Kaur , Mukesh Dalal
In this age of rapid technological progress, digital communication through multimedia such as images and videos has become one of the most prominent ways of sharing and exchanging data. However, digital communication is now often used to obscure confidential or malicious information via steganography, posing significant risks to national security and digital safety. This challenge highlights the crucial role of steganalysis for digital forensic experts, which aims to detect and retrieve concealed data encoded in multimedia files. This paper presents a comprehensive review of state-of-the-art steganalysis techniques based on images and videos, incorporating classical Machine Learning (ML) and advanced Deep Learning (DL) approaches, unlike previous reviews that focus on specific domains or algorithms. The paper thoroughly examines current detection frameworks, existing image and video benchmark datasets, and steganalysis tools available to digital forensic experts. The paper provides a comparative analysis of existing techniques by highlighting their advantages and disadvantages. Experimental evaluation is performed using the widely adopted BOSSBase dataset, on which two existing steganalysis techniques are evaluated to demonstrate practical insights. The findings emphasize that advanced deep learning architectures surpass conventional machine learning approaches, while also recognizing persistent issues such as dataset imbalance and limited generalization. The novelty of the presented work lies in its unified coverage of image and video steganalysis techniques from a forensic expert’s viewpoint. The paper also offers future research recommendations to enhance multimedia security and aid forensic professionals in developing next-generation detection techniques.
Citations: 0
A comprehensive review of recommender systems: Transitioning from theory to practice
IF 12.7 | CAS Zone 1, Computer Science | Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS | Pub Date: 2025-11-11 | DOI: 10.1016/j.cosrev.2025.100849
Shaina Raza , Mizanur Rahman , Safiullah Kamawal , Armin Toroghi , Ananya Raval , Farshad Navah , Amirmohammad Kazemeini
Recommender Systems (RS) play an integral role in enhancing user experiences by providing personalized item suggestions. This survey reviews the progress in RS inclusively from 2017 to 2024, effectively connecting theoretical advances with practical applications. We explore the development from traditional RS techniques like content-based and collaborative filtering to advanced methods involving deep learning, graph-based models, reinforcement learning, and large language models. We also discuss specialized systems such as context-aware, review-based, and fairness-aware RS. The primary goal of this survey is to bridge theory with practice. It addresses challenges across various sectors, including e-commerce, healthcare, and finance, emphasizing the need for scalable, real-time, and trustworthy solutions. Through this survey, we promote stronger partnerships between academic research and industry practices. The insights offered by this survey aim to guide industry professionals in optimizing RS deployment and to inspire future research directions, especially in addressing emerging technological and societal trends. The survey resources are available in the public GitHub repository https://github.com/VectorInstitute/Recommender-Systems-Survey.
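To ground the "traditional RS techniques" end of that spectrum, here is a minimal user-based collaborative-filtering sketch (my own illustration, including the toy ratings matrix; it is not code from the survey or its repository):

```python
import numpy as np

def recommend(ratings, user, top_n=2):
    """User-based collaborative filtering via cosine similarity.

    ratings: (n_users, n_items) matrix where 0 means "unrated".
    Unseen items are scored by the similarity-weighted ratings of the
    other users; the top_n highest-scoring items are returned.
    """
    r = np.asarray(ratings, dtype=float)
    target = r[user]
    norms = np.linalg.norm(r, axis=1)
    norms[norms == 0] = 1.0                       # avoid division by zero
    sims = r @ target / (norms * max(np.linalg.norm(target), 1e-12))
    sims[user] = 0.0                              # ignore self-similarity
    scores = sims @ r                             # weighted item scores
    scores[target > 0] = -np.inf                  # hide already-rated items
    return np.argsort(scores)[::-1][:top_n]

ratings = [
    [5, 4, 0, 0],   # target user: likes items 0 and 1
    [5, 5, 1, 0],   # very similar taste, also rated item 2
    [0, 0, 4, 5],   # dissimilar user
]
print(recommend(ratings, user=0, top_n=1))  # [2]
```

The survey's deep-learning, graph-based, and LLM-based methods replace this fixed similarity with learned user/item representations, but the score-then-rank skeleton is the same.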
Citations: 0
Graph diffusion models: A comprehensive survey of methods and applications
IF 12.7 | CAS Zone 1, Computer Science | Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS | Pub Date: 2025-11-10 | DOI: 10.1016/j.cosrev.2025.100854
Yuntao Shou , Wei Ai , Tao Meng , Keqin Li
Diffusion models have rapidly emerged as a new paradigm in generative modeling. Therefore, we aim to provide a comprehensive review of graph diffusion models. We introduce various forms of diffusion models (i.e., DDPMs, SDEs, and SGMs), their working mechanisms, and how they can be extended to graph data. Specifically, graph diffusion models follow the modeling process of diffusion models, implement the diffusion process in graph data, and gradually denoise and generate new graph structures through reverse steps. The application of graph diffusion models is mainly focused on the application scenarios of generating molecules and proteins, but graph diffusion models also show potential in recommendation systems and other fields. We explore the performance and advantages of graph diffusion models in these specific applications, such as using them to discover new drugs and predict protein structures. Furthermore, we also discuss the problem of evaluating graph diffusion models and their existing challenges. Due to the complexity and diversity of graph data, the authenticity of generated samples is an important and challenging task. We analyze their limitations and propose potential improvement directions to better measure the effectiveness of graph diffusion models. The summary of existing methods mentioned is in our Github: https://github.com/yuntaoshou/Graph-Diffusion-Models.
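The forward process these models share can be stated compactly. The sketch below (my own illustration of the standard DDPM forward kernel, not code from the survey) noises an adjacency matrix in closed form, q(A_t | A_0) = N(sqrt(alpha_bar_t) A_0, (1 - alpha_bar_t) I) with alpha_bar_t the cumulative product of (1 - beta_s):

```python
import numpy as np

def forward_diffuse(A0, t, betas, rng):
    """Sample A_t ~ q(A_t | A_0) for a DDPM-style Gaussian diffusion
    applied to a (symmetric) adjacency matrix, using the closed form
    A_t = sqrt(alpha_bar_t) * A_0 + sqrt(1 - alpha_bar_t) * eps.
    """
    alpha_bar = np.prod(1.0 - betas[: t + 1])
    eps = rng.standard_normal(A0.shape)
    eps = (eps + eps.T) / 2.0                # keep the noise symmetric
    return np.sqrt(alpha_bar) * A0 + np.sqrt(1.0 - alpha_bar) * eps

rng = np.random.default_rng(0)
A0 = np.array([[0.0, 1.0], [1.0, 0.0]])     # a single undirected edge
betas = np.linspace(1e-4, 0.02, 1000)       # common linear noise schedule
At = forward_diffuse(A0, t=999, betas=betas, rng=rng)
# at t = 999 the signal coefficient sqrt(alpha_bar_t) is near zero,
# so At is essentially symmetric Gaussian noise for the reverse model
# to denoise step by step.
```

Discrete graph diffusions replace this Gaussian kernel with transition matrices over edge states, but the forward/reverse structure is identical.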
Citations: 0
Intra-node transaction parallelism in blockchains: Models, solutions, and trends
IF 12.7 | CAS Zone 1, Computer Science | Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS | Pub Date: 2025-11-08 | DOI: 10.1016/j.cosrev.2025.100853
Bin Yu , Tong Zhou , He Zhao , Xiaoyan Li , Yuhui Fan , Lei Chen
Blockchain technology has been widely adopted across diverse domains, yet its scalability remains a critical bottleneck. Traditional serial transaction execution limits throughput to a low level, failing to meet the demands of high-frequency trading. While parallelization solutions exist, the existing literature predominantly focuses on “broad parallelism” such as sharding and cross-chain, while overlooking “intra-node parallelism”: a lightweight approach that optimizes transaction execution within single nodes without altering core protocols. We aim to fill this gap by conducting a focused, systematic analysis of intra-node transaction parallelism in blockchains. The research methods include: (1) Categorizing intra-node parallelism into three core models (deterministic, optimistic, emerging) based on conflict-handling mechanisms and architectural paradigms. (2) Analyzing ∼20 representative solutions to evaluate their core mechanisms, performance trade-offs, and applicable scenarios. (3) Investigating critical practical considerations, including conflict density-based applicability, overhead trade-offs, and synergy with consensus/network layers. (4) Comparing models across dimensions (conflict handling, performance, complexity) to identify strengths and limitations. Key results show that: (1) Deterministic models achieve ∼2–3x serial throughput with negligible rollbacks, making them ideal for high-conflict environments. (2) Optimistic models reach ∼5–10x serial throughput in low-conflict scenarios but suffer from rollback overhead in high-conflict settings. (3) Emerging models offer breakthrough scalability but require ecosystem changes. (4) No single model dominates. Optimal selection depends on conflict density, contract complexity, and compatibility needs. This study provides a foundational framework for researchers to navigate intra-node parallelism and for practitioners to select or design solutions balancing scalability, consistency, and decentralization.
It advances blockchain scalability research by highlighting lightweight, backward-compatible optimizations that complement broad parallelism, enabling the development of high-performance blockchain systems for real-world applications.
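The speculate-then-validate cycle that defines the optimistic model can be sketched in a few lines. This is a hypothetical toy (the function and transaction names are invented for illustration, not taken from any surveyed system): transactions are callables that return their read-key set and a write map, the whole batch first executes against one pre-batch snapshot (the phase that is safe to parallelize), and at commit time any transaction that read a key written by an earlier transaction in commit order is rolled back and re-executed serially.

```python
def optimistic_execute(state, txs):
    """Optimistic intra-node execution, sketched sequentially: every tx
    speculates against the same pre-batch snapshot; at commit time a tx
    that read a key written by an earlier tx is rolled back and re-run."""
    snapshot = dict(state)
    # Speculative phase: all txs see one immutable snapshot, so in a real
    # engine these calls could run on separate worker threads.
    speculative = [tx(snapshot) for tx in txs]
    rollbacks, dirty = 0, set()
    for tx, (read_keys, writes) in zip(txs, speculative):
        if read_keys & dirty:              # stale read detected: conflict
            read_keys, writes = tx(state)  # rollback + serial re-execution
            rollbacks += 1
        state.update(writes)               # commit in batch order
        dirty |= set(writes)
    return rollbacks

def make_transfer(src, dst, amount):
    """A toy balance-transfer transaction with an explicit read set."""
    def tx(s):
        return {src, dst}, {src: s.get(src, 0) - amount,
                            dst: s.get(dst, 0) + amount}
    return tx

state = {"a": 100, "b": 0, "c": 0}
# The b -> c transfer conflicts with a -> b (it reads "b", which the
# first transaction writes), so it is rolled back and re-executed.
rollbacks = optimistic_execute(
    state, [make_transfer("a", "b", 10), make_transfer("b", "c", 5)])
```

Under low conflict the speculative phase does all the work and no rollbacks occur; under high conflict most transactions re-execute serially, which is exactly the trade-off the abstract attributes to optimistic models.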
Bin Yu, Tong Zhou, He Zhao, Xiaoyan Li, Yuhui Fan, Lei Chen. "Intra-node transaction parallelism in blockchains: Models, solutions, and trends." Computer Science Review, vol. 59, Article 100853 (2025-11-08). DOI: 10.1016/j.cosrev.2025.100853
Cited by: 0
Time-sensitive data analytics: A survey of anytime techniques, applications and challenges
IF 12.7 Tier 1 (Computer Science) Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS Pub Date : 2025-10-30 DOI: 10.1016/j.cosrev.2025.100850
Jagat Sesh Challa, Aarti, Navneet Goyal, Poonam Goyal
In the era of big data and real-time analytics, there is a growing demand for fast, adaptive, and efficient data-analytics techniques that are not only accurate but also responsive and adaptable to dynamic environments. Anytime algorithms, which can trade computational time for quality of results, have gained significant attention in data analytics: they provide an approximate result at any point in time that improves as computation continues, making them highly suitable for quick decision-making and increasingly critical for applications requiring rapid, adaptive insights. They are widely used in stock market analysis, fraud detection, sentiment analysis, weather forecasting, etc. To the best of our knowledge, there is no literature survey on anytime algorithms that comprehensively reviews the approaches, classifies them, and highlights the open research issues. This paper provides a comprehensive survey of anytime algorithms tailored for data analytics over large datasets, emphasizing their application in time-sensitive decision-making environments. We examine the algorithmic foundations and the state-of-the-art anytime approaches across various data-analytics tasks, including classification, clustering, and frequent itemset mining. A qualitative analysis is also presented for each algorithm described in this paper, based on key aspects such as interruptibility, resource adaptiveness, and solution quality under constrained conditions. This survey also highlights the latest advancements and emerging research trends, providing insights into how anytime algorithms can be further developed to meet the demands of complex and dynamic environments.
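The defining contract of an anytime algorithm, an interruptible computation whose current best answer improves with time, can be illustrated with a deliberately simple Monte Carlo estimator. This is a hypothetical example (the `anytime_pi` name and the fixed sample-count "deadline" are invented for illustration, not drawn from the surveyed algorithms): the generator always has a usable estimate, and the caller, not the algorithm, decides when to stop.

```python
import random

def anytime_pi(seed=0):
    """Anytime-style estimator of pi: yields (samples_so_far, estimate)
    after every sample, so it can be interrupted at any point and the
    most recently yielded value is the current best answer."""
    rng = random.Random(seed)  # deterministic seed for reproducibility
    inside = n = 0
    while True:
        x, y = rng.random(), rng.random()
        inside += (x * x + y * y) <= 1.0  # point falls inside the quarter circle
        n += 1
        yield n, 4.0 * inside / n

# The caller owns the time budget: iterate until a deadline (here, a
# fixed sample count standing in for an interrupt signal) and keep the
# last value. Stopping earlier still yields a valid, if coarser, answer.
estimate = None
for n, estimate in anytime_pi():
    if n >= 10_000:
        break
```

The interruptibility and improving-quality properties that the survey uses to characterize anytime algorithms correspond here to the `yield` after every sample and to the shrinking Monte Carlo error as `n` grows.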
Jagat Sesh Challa, Aarti, Navneet Goyal, Poonam Goyal. "Time-sensitive data analytics: A survey of anytime techniques, applications and challenges." Computer Science Review, vol. 59, Article 100850 (2025-10-30). DOI: 10.1016/j.cosrev.2025.100850
Cited by: 0
The impact of large language models on medical research and patient care: A systematic review of current trends, challenges, and future innovations
IF 12.7 Tier 1 (Computer Science) Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS Pub Date : 2025-10-29 DOI: 10.1016/j.cosrev.2025.100847
Sohaib Asif, Fazal Hadi, Qurrat-ul-ain, Yuqi Yan, Vicky Yang Wang, Dong Xu
Large language models (LLMs) are gaining recognition for their sophisticated language processing abilities, which enable them to provide informed responses to open-ended queries. These models are proving to be highly beneficial in the healthcare sector, contributing to areas such as medical communication, optimization of patient data, and surgical planning. The rapid advancement of LLMs has generated extensive research, making it challenging to evaluate their overall impact. A concise review of recent developments is essential to provide clarity in this evolving field. This paper outlines the datasets used in various LLM studies and provides a detailed review of the advancements in medical LLMs, particularly focusing on the requirements and applications in the healthcare domain. It explores 198 relevant publications to assist practitioners and researchers, offering insights into the latest trends in LLM applications across medicine. The paper starts by covering the fundamental aspects of LLMs, including their history, architectures, transformers, and impacts. It then delves into specific medical applications such as medical writing, examinations, education, diagnosis, decision-making, nursing, clinical report generation, and mental health support. The study identifies challenges in deploying LLMs in real-world medical scenarios and provides recommendations for future technical integration. Lastly, it highlights potential research directions for the development of medical LLMs, aiming to meet the evolving needs of the healthcare sector and improve patient outcomes. This review serves as a key resource for future studies, offering pathways to enhance LLM utility in clinical practice.
Sohaib Asif, Fazal Hadi, Qurrat-ul-ain, Yuqi Yan, Vicky Yang Wang, Dong Xu. "The impact of large language models on medical research and patient care: A systematic review of current trends, challenges, and future innovations." Computer Science Review, vol. 59, Article 100847 (2025-10-29). DOI: 10.1016/j.cosrev.2025.100847
Cited by: 0