Generation and human-expert evaluation of interesting research ideas using knowledge graphs and large language models

Xuemei Gu, Mario Krenn
{"title":"Generation and human-expert evaluation of interesting research ideas using knowledge graphs and large language models","authors":"Xuemei Gu, Mario Krenn","doi":"arxiv-2405.17044","DOIUrl":null,"url":null,"abstract":"Advanced artificial intelligence (AI) systems with access to millions of\nresearch papers could inspire new research ideas that may not be conceived by\nhumans alone. However, how interesting are these AI-generated ideas, and how\ncan we improve their quality? Here, we introduce SciMuse, a system that uses an\nevolving knowledge graph built from more than 58 million scientific papers to\ngenerate personalized research ideas via an interface to GPT-4. We conducted a\nlarge-scale human evaluation with over 100 research group leaders from the Max\nPlanck Society, who ranked more than 4,000 personalized research ideas based on\ntheir level of interest. This evaluation allows us to understand the\nrelationships between scientific interest and the core properties of the\nknowledge graph. We find that data-efficient machine learning can predict\nresearch interest with high precision, allowing us to optimize the\ninterest-level of generated research ideas. This work represents a step towards\nan artificial scientific muse that could catalyze unforeseen collaborations and\nsuggest interesting avenues for scientists.","PeriodicalId":501285,"journal":{"name":"arXiv - CS - Digital Libraries","volume":"15 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-05-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Digital Libraries","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2405.17044","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Advanced artificial intelligence (AI) systems with access to millions of research papers could inspire new research ideas that may not be conceived by humans alone. However, how interesting are these AI-generated ideas, and how can we improve their quality? Here, we introduce SciMuse, a system that uses an evolving knowledge graph built from more than 58 million scientific papers to generate personalized research ideas via an interface to GPT-4. We conducted a large-scale human evaluation with over 100 research group leaders from the Max Planck Society, who ranked more than 4,000 personalized research ideas based on their level of interest. This evaluation allows us to understand the relationships between scientific interest and the core properties of the knowledge graph. We find that data-efficient machine learning can predict research interest with high precision, allowing us to optimize the interest-level of generated research ideas. This work represents a step towards an artificial scientific muse that could catalyze unforeseen collaborations and suggest interesting avenues for scientists.
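The abstract does not give implementation details, but the pipeline it outlines — build a concept knowledge graph from paper annotations, score candidate concept pairs with a data-efficient classifier, and hand promising pairs to an LLM to phrase as research suggestions — can be illustrated with a minimal sketch. Everything below is an illustrative assumption (toy data, networkx co-occurrence graph, ad-hoc feature set, scikit-learn classifier), not the authors' SciMuse implementation:

```python
# Hypothetical sketch of the pipeline the abstract describes:
# (1) build a concept co-occurrence knowledge graph from papers,
# (2) derive simple graph features for a candidate concept pair,
# (3) train a small classifier to predict expert interest in the pair.
import itertools
import networkx as nx
from sklearn.ensemble import RandomForestClassifier

# Toy corpus: each paper reduced to the scientific concepts it mentions.
papers = [
    {"quantum computing", "error correction", "machine learning"},
    {"machine learning", "knowledge graph", "recommendation"},
    {"knowledge graph", "link prediction", "quantum computing"},
    {"error correction", "link prediction"},
]

# (1) Concepts are nodes; co-occurrence within a paper is a weighted edge.
G = nx.Graph()
for concepts in papers:
    for u, v in itertools.combinations(sorted(concepts), 2):
        w = G.edges[u, v]["weight"] + 1 if G.has_edge(u, v) else 1
        G.add_edge(u, v, weight=w)

# (2) Illustrative structural features for a candidate pair of concepts.
def pair_features(g, u, v):
    return [
        g.degree(u),                              # popularity of concept u
        g.degree(v),                              # popularity of concept v
        len(list(nx.common_neighbors(g, u, v))),  # shared context
        1 if g.has_edge(u, v) else 0,             # already connected?
    ]

# (3) Train on pairs labelled with (hypothetical) expert interest ratings.
labelled_pairs = [
    (("quantum computing", "machine learning"), 1),
    (("error correction", "recommendation"), 0),
    (("knowledge graph", "error correction"), 1),
    (("link prediction", "machine learning"), 0),
]
X = [pair_features(G, u, v) for (u, v), _ in labelled_pairs]
y = [label for _, label in labelled_pairs]
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Score a new candidate pair; high-scoring pairs would be passed to an
# LLM (e.g. GPT-4) to be phrased as a personalized research suggestion.
candidate = ("recommendation", "quantum computing")
print(clf.predict_proba([pair_features(G, *candidate)])[0][1])
```

This matches the abstract's claim only in spirit: the paper reports that interest can be predicted from core properties of the knowledge graph with data-efficient machine learning, while the specific features and model above are placeholders.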