Semantic analysis of test items through large language model embeddings predicts a-priori factorial structure of personality tests

Current Research in Behavioral Sciences · Q1 (Psychology) · IF 2.0 · Pub Date: 2025-01-01 · Epub Date: 2025-01-25 · DOI: 10.1016/j.crbeha.2025.100168
Nicola Milano, Maria Luongo, Michela Ponticorvo, Davide Marocco

Abstract

In this article, we explore the use of Large Language Models (LLMs) for predicting factor loadings in personality tests through the semantic analysis of test items. By leveraging text embeddings generated from LLMs, we evaluate the semantic similarity of test items and their alignment with hypothesized factorial structures without depending on human response data. Our methodology involves using embeddings from four different personality tests to examine correlations between item semantics and their grouping into principal factors. Our results indicate that LLM-derived embeddings can effectively capture semantic similarities among test items, showing moderate to high correlation with the factorial structure produced by human respondents in all tests, and potentially serving as a valid measure of content validity for initial survey design and refinement. This approach offers valuable insights into the robustness of embedding techniques in psychological evaluations, showing a significant correlation with traditional test structures and providing a novel perspective on test item analysis.
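The core quantity the abstract describes — semantic similarity among test items, compared against a hypothesized factor grouping — can be sketched in a few lines. This is an illustrative sketch, not the authors' actual pipeline: the random unit vectors, the item count, and the two-factor assignment below are placeholders for real LLM embeddings of personality-test items and their a-priori factor labels.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder embeddings: 8 items x 16 dims. In practice each row would
# be an LLM embedding of one test item's text, L2-normalized.
emb = rng.normal(size=(8, 16))
emb /= np.linalg.norm(emb, axis=1, keepdims=True)

# Hypothesized (a-priori) factor assignment for each item: two factors.
factors = np.array([0, 0, 0, 0, 1, 1, 1, 1])

# Cosine similarity between all item pairs (rows are unit vectors).
sim = emb @ emb.T

# Compare mean similarity within a factor vs. across factors; if the
# embeddings track the factorial structure, within > between.
same_factor = factors[:, None] == factors[None, :]
off_diag = ~np.eye(len(factors), dtype=bool)
within = sim[same_factor & off_diag].mean()
between = sim[~same_factor].mean()
print(f"within={within:.3f} between={between:.3f}")
```

With real embeddings, the within/between gap (or a correlation between the similarity matrix and the factor-loading structure) gives the kind of alignment score the abstract reports.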
Source journal: Current Research in Behavioral Sciences (Behavioral Neuroscience)
CiteScore: 7.90 · Self-citation rate: 0.00% · Review time: 40 days