Rodrigo Schames Kreitchmann, Pablo Nájera, Susana Sanz, Miguel A Sorrel
Enhancing Content Validity Assessment With Item Response Theory Modeling
Psicothema, 2024-05-01. DOI: 10.7334/psicothema2023.208
Citations: 0
Abstract
Background: Ensuring the validity of assessments requires a thorough examination of the test content. Subject matter experts (SMEs) are commonly employed to evaluate the relevance, representativeness, and appropriateness of the items. This article proposes using item response theory (IRT) to model the assessments conducted by SMEs. IRT allows the estimation of discrimination and threshold parameters for each SME, providing evidence of their performance in differentiating relevant from irrelevant items. This facilitates the detection of suboptimal SME performance while improving item relevance scores.
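The abstract does not specify the estimation procedure. As an illustrative sketch only (all data and names hypothetical, and a simplification of a full IRT calibration), per-SME discrimination and threshold parameters could be obtained by fitting a two-parameter logistic (2PL) curve to one SME's binary relevance judgments, treating each item's relevance score as a known stand-in for the latent trait:

```python
import numpy as np
from scipy.optimize import minimize

def fit_sme_2pl(theta, y):
    """Fit 2PL parameters for one SME by maximum likelihood:
    P(judged relevant) = 1 / (1 + exp(-a * (theta - b))),
    where a is the SME's discrimination and b their threshold.
    theta: assumed-known relevance score of each item;
    y: the SME's binary relevance judgments (0/1)."""
    theta = np.asarray(theta, dtype=float)
    y = np.asarray(y, dtype=float)

    def neg_loglik(params):
        a, b = params
        p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
        p = np.clip(p, 1e-9, 1 - 1e-9)  # guard the logs
        return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

    res = minimize(neg_loglik, x0=[1.0, 0.0], method="Nelder-Mead")
    return res.x  # (a, b)

# Hypothetical data: 10 items' relevance scores and one SME's judgments.
theta = np.array([-2.0, -1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0, 2.5])
y     = np.array([   0,    0,    0,    0,   1,   0,   1,   1,   1,   1])
a, b = fit_sme_2pl(theta, y)
```

A discriminating SME yields a large positive a; the threshold b locates the relevance level at which their judgments flip from "irrelevant" to "relevant". An SME with a near zero (or negative) would flag suboptimal performance.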
Method: Use of IRT was compared to traditional validity indices (content validity index and Aiken's V) in the evaluation of items. The aim was to assess the SMEs' accuracy in identifying whether items were designed to measure conscientiousness or not, and predicting their factor loadings.
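For reference, the two traditional indices named above have simple closed forms. A minimal sketch (the rating scales and data are hypothetical; the formulas are the standard ones):

```python
import numpy as np

def aikens_v(ratings, lo=1, hi=5):
    """Aiken's V for one item: V = S / (n * (c - 1)), where S is the
    sum of (rating - lo) over the n raters and c = hi - lo + 1 is the
    number of rating categories. Ranges from 0 to 1."""
    r = np.asarray(ratings, dtype=float)
    n = r.size
    c = hi - lo + 1
    return float((r - lo).sum() / (n * (c - 1)))

def item_cvi(ratings, relevant_from=3):
    """Item-level content validity index (I-CVI): the proportion of
    experts rating the item as relevant (>= relevant_from, typically
    3 on a 1-4 relevance scale)."""
    r = np.asarray(ratings)
    return float((r >= relevant_from).mean())

# Hypothetical ratings from 5 SMEs:
print(aikens_v([5, 4, 5, 4, 5], lo=1, hi=5))      # 0.9 (1-5 scale)
print(item_cvi([4, 3, 4, 2, 4], relevant_from=3))  # 0.8 (1-4 scale)
```

Both indices aggregate ratings without modeling rater quality, which is the gap the IRT approach addresses.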
Results: The IRT-based scores effectively identified conscientiousness items (R² = 0.57) and accurately predicted their factor loadings (R² = 0.45). These scores demonstrated incremental validity, explaining 11% more variance than Aiken's V and up to 17% more than the content validity index.
Conclusions: Modeling SME assessments with IRT improves item alignment and provides better predictions of factor loadings, enabling improvement of the content validity of measurement instruments.
About the journal:
The journal Psicothema was founded in Asturias in 1989 and is published jointly by the Faculty and Department of Psychology of the University of Oviedo and the Colegio Oficial de Psicólogos del Principado de Asturias. It publishes four issues per year. It accepts both basic and applied research from any area of psychology; prior to publication, submissions are evaluated anonymously by external reviewers. Psicothema is indexed in the most relevant national and international databases, notably Psychological Abstracts, Current Contents, and MEDLINE/Index Medicus, among others. It also appears in the impact factor lists of the Journal Citation Reports. Psicothema is open to any psychological approach or orientation supported by the strength of its data and arguments, and welcomes all authors able to convince the reviewers that their manuscripts merit publication. Psicothema is an open-access journal, meaning that all content is freely available to any user or institution. Users may read, download, copy, distribute, print, search, or link to the full texts of the journal without requesting prior permission from the publisher or the author, provided the original source is cited. For archives and repositories, coverage via links to Psicothema's own website is preferred. We believe that a firm commitment to quality is the best way to serve our readers, whose suggestions are always welcome.