British education research and its quality: An analysis of Research Excellence Framework submissions

British Educational Research Journal · Pub Date: 2024-06-05 · DOI: 10.1002/berj.4040 · IF 3.0 · JCR Q1 (Education & Educational Research) · CAS Region 3 (Education)
Matthew Inglis, Colin Foster, Hugues Lortie-Forgues, Elizabeth Stokoe
Citations: 0

Abstract

We analysed the full text of all journal articles returned to the education subpanel of the 2021 Research Excellence Framework (REF2021). Using a latent Dirichlet allocation topic model, we identified 35 topics that collectively summarise the journal articles that research units, typically schools of education, selected for submission. We found that the topics which units wrote about in their submitted articles collectively explained a large proportion (84.1%) of the variance in the quality assessments they received from the REF's expert peer review process. Further, with the important caveat that we cannot attribute causality, we found that there were strong associations between what the subpanel perceived to be excellent research and the adoption of particular methods or approaches. Most notably, units that returned more interview-based work typically received lower scores, and those which returned more analyses of large-scale data and meta-analyses typically received higher scores. Finally, we applied our 2021 model to articles submitted to the previous exercise, REF2014. We found that education research seems to have become less qualitative and more quantitative over time, and that our 2021 model could successfully predict the scores assigned by the REF2014 subpanel, suggesting a reasonable degree of between-exercise consistency.
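The abstract's pipeline — fit a latent Dirichlet allocation topic model to article full texts, then use the resulting topic proportions to explain variance in quality scores — can be illustrated in miniature. The sketch below uses scikit-learn on toy documents; the texts, the two-topic setting, and the score values are illustrative stand-ins, not the paper's actual corpus, 35-topic model, or REF data.

```python
# Minimal sketch of "LDA topics as predictors of quality scores".
# All data below is synthetic and hypothetical.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LinearRegression

docs = [
    "interview transcript qualitative coding themes",
    "interview participants qualitative narrative analysis",
    "randomised trial effect size meta analysis",
    "large scale survey regression meta analysis",
]
scores = np.array([2.4, 2.5, 3.3, 3.4])  # hypothetical quality ratings

# 1. Build a document-term matrix and fit an LDA topic model.
vec = CountVectorizer()
X = vec.fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
theta = lda.fit_transform(X)  # per-document topic proportions (rows sum to 1)

# 2. Regress scores on topic proportions; R^2 plays the role of the
#    "variance explained" figure the abstract reports (84.1% in the paper).
reg = LinearRegression().fit(theta, scores)
r2 = reg.score(theta, scores)
print(f"topic proportions:\n{theta.round(2)}\nR^2 = {r2:.2f}")
```

At the paper's scale the same idea applies with many more documents and 35 topics, and (per the abstract) the fitted 2021 model can then score out-of-sample documents, such as the REF2014 submissions.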


Source journal

British Educational Research Journal (Education & Educational Research)

CiteScore: 4.70 · Self-citation rate: 8.70% · Articles per year: 71
About the journal: The British Educational Research Journal is an international peer-reviewed medium for the publication of articles of interest to researchers in education and has rapidly become a major focal point for the publication of educational research from throughout the world. For further information on the association, please visit the British Educational Research Association website. The journal is interdisciplinary in approach, and includes reports of case studies, experiments and surveys, discussions of conceptual and methodological issues and of underlying assumptions in educational research, accounts of research in progress, and book reviews.
Latest articles from this journal

- Issue Information
- Making it explicit – Sustained shared thinking dialogue as a way to explore children's perspectives on quality in German early childhood education and care
- Personality-sensitive pedagogies: A study of small group interactive behaviours among 9- to 10-year-olds
- The QAA's subject benchmarks and critical pedagogy: The example of 'gateway to King's'