Fine-tuning and Prompt Engineering with Cognitive Knowledge Graphs for Scholarly Knowledge Organization

Gollam Rabby, Sören Auer, Jennifer D'Souza, Allard Oelen
{"title":"Fine-tuning and Prompt Engineering with Cognitive Knowledge Graphs for Scholarly Knowledge Organization","authors":"Gollam Rabby, Sören Auer, Jennifer D'Souza, Allard Oelen","doi":"arxiv-2409.06433","DOIUrl":null,"url":null,"abstract":"The increasing amount of published scholarly articles, exceeding 2.5 million\nyearly, raises the challenge for researchers in following scientific progress.\nIntegrating the contributions from scholarly articles into a novel type of\ncognitive knowledge graph (CKG) will be a crucial element for accessing and\norganizing scholarly knowledge, surpassing the insights provided by titles and\nabstracts. This research focuses on effectively conveying structured scholarly\nknowledge by utilizing large language models (LLMs) to categorize scholarly\narticles and describe their contributions in a structured and comparable\nmanner. While previous studies explored language models within specific\nresearch domains, the extensive domain-independent knowledge captured by LLMs\noffers a substantial opportunity for generating structured contribution\ndescriptions as CKGs. Additionally, LLMs offer customizable pathways through\nprompt engineering or fine-tuning, thus facilitating to leveraging of smaller\nLLMs known for their efficiency, cost-effectiveness, and environmental\nconsiderations. Our methodology involves harnessing LLM knowledge, and\ncomplementing it with domain expert-verified scholarly data sourced from a CKG.\nThis strategic fusion significantly enhances LLM performance, especially in\ntasks like scholarly article categorization and predicate recommendation. Our\nmethod involves fine-tuning LLMs with CKG knowledge and additionally injecting\nknowledge from a CKG with a novel prompting technique significantly increasing\nthe accuracy of scholarly knowledge extraction. We integrated our approach in\nthe Open Research Knowledge Graph (ORKG), thus enabling precise access to\norganized scholarly knowledge, crucially benefiting domain-independent\nscholarly knowledge exchange and dissemination among policymakers, industrial\npractitioners, and the general public.","PeriodicalId":501285,"journal":{"name":"arXiv - CS - Digital Libraries","volume":"8 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Digital Libraries","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.06433","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

The growing number of published scholarly articles, now exceeding 2.5 million per year, makes it increasingly challenging for researchers to follow scientific progress. Integrating the contributions of scholarly articles into a novel type of cognitive knowledge graph (CKG) will be a crucial element for accessing and organizing scholarly knowledge, surpassing the insights provided by titles and abstracts alone. This research focuses on effectively conveying structured scholarly knowledge by utilizing large language models (LLMs) to categorize scholarly articles and describe their contributions in a structured and comparable manner. While previous studies explored language models within specific research domains, the extensive domain-independent knowledge captured by LLMs offers a substantial opportunity for generating structured contribution descriptions as CKGs. Additionally, LLMs offer customizable pathways through prompt engineering or fine-tuning, thus facilitating the use of smaller LLMs known for their efficiency, cost-effectiveness, and lower environmental impact. Our methodology harnesses LLM knowledge and complements it with domain-expert-verified scholarly data sourced from a CKG. This strategic fusion significantly enhances LLM performance, especially in tasks such as scholarly article categorization and predicate recommendation. Our method fine-tunes LLMs with CKG knowledge and additionally injects knowledge from a CKG through a novel prompting technique, significantly increasing the accuracy of scholarly knowledge extraction. We integrated our approach into the Open Research Knowledge Graph (ORKG), thus enabling precise access to organized scholarly knowledge and crucially benefiting domain-independent scholarly knowledge exchange and dissemination among policymakers, industrial practitioners, and the general public.
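As a rough sketch of the two customization pathways the abstract mentions, the Python example below illustrates how expert-verified statements from a CKG such as the ORKG might be injected into a predicate-recommendation prompt, and how the same statements could be serialized into instruction-tuning records for fine-tuning. Everything here is an assumption for illustration: `fetch_ckg_statements`, the sample triples, the prompt wording, and the JSONL schema are hypothetical stand-ins, not the paper's actual implementation or the real ORKG API.

```python
import json

# Hypothetical stand-in for a query against a cognitive knowledge graph
# such as the ORKG; the triples below are invented for illustration.
def fetch_ckg_statements(research_field: str) -> list[dict]:
    return [
        {"subject": "Contribution 1", "predicate": "has method",
         "object": "transformer fine-tuning"},
        {"subject": "Contribution 1", "predicate": "has evaluation metric",
         "object": "accuracy"},
    ]

def build_injected_prompt(title: str, abstract: str, field: str) -> str:
    """Assemble a prompt that injects expert-verified CKG knowledge
    (here: predicates already used in the field) before asking the LLM
    to recommend predicates for a new article."""
    known_predicates = sorted({s["predicate"] for s in fetch_ckg_statements(field)})
    context = "\n".join(f"- {p}" for p in known_predicates)
    return (
        f"Research field: {field}\n"
        f"Predicates already verified by domain experts in this field:\n{context}\n\n"
        f"Article title: {title}\n"
        f"Abstract: {abstract}\n\n"
        "Task: recommend predicates that describe this article's contribution, "
        "preferring the verified predicates above where they apply."
    )

def to_finetune_record(title: str, abstract: str, field: str) -> str:
    """Serialize the same CKG knowledge as one instruction-tuning example
    in a common prompt/completion JSONL format (assumed, not the paper's)."""
    completion = json.dumps([s["predicate"] for s in fetch_ckg_statements(field)])
    return json.dumps({"prompt": build_injected_prompt(title, abstract, field),
                       "completion": completion})

print(build_injected_prompt("Example title", "Example abstract ...", "Digital Libraries"))
```

The design intuition, as the abstract describes it, is that grounding the model on predicates already verified by domain experts constrains its output vocabulary, which is what makes the resulting contribution descriptions structured and comparable across articles.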