Mohsen Tavakol, David G O'Brien, Claire C Sharpe, Claire Stewart
Medical Teacher, pages 188-195. DOI: 10.1080/0142159X.2023.2241624. Epub 4 August 2023; published 1 February 2024.
Twelve tips to aid interpretation of post-assessment psychometric reports.
Post-assessment psychometric reports are a vital component of the assessment cycle, helping to ensure that assessments are reliable, valid and fair enough to support appropriate pass-fail decisions. Students' scores can be summarised by examining frequency distributions, measures of central tendency and measures of dispersion. Item discrimination indices, which assess the quality of items and of distractors by how well they differentiate between students who have and have not achieved the learning outcomes, are key. Estimating individual item reliability and item validity indices can maximise test-score reliability and validity. Test accuracy can be evaluated by assessing test reliability, consistency and validity, and the standard error of measurement can be used to quantify the variation in scores. Standard setting, even by experts, may be unreliable; reality checks such as the Hofstee method, P values and correlation analysis can improve validity. The Rasch model of student ability and item difficulty assists in modifying assessment questions and pinpointing areas for additional instruction. We propose 12 tips to support test developers in interpreting structured psychometric reports, including analysing and refining flawed items and ensuring fair assessments with accurate and defensible marks.
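The statistics the abstract refers to — item difficulty (P values), item discrimination, test reliability and the standard error of measurement — can be illustrated with a minimal sketch. This is not code from the article: the 0/1 response matrix is invented, and the choices of point-biserial discrimination and KR-20 reliability are common conventions assumed here, not methods the authors specify.

```python
# Illustrative item-analysis sketch on a hypothetical dichotomous (0/1)
# response matrix: rows = students, columns = items. Data are made up.
import math

responses = [
    [1, 1, 1, 0, 1],
    [1, 0, 1, 1, 1],
    [1, 1, 0, 0, 1],
    [0, 1, 1, 0, 0],
    [1, 0, 0, 0, 1],
    [0, 0, 1, 0, 0],
]

n_students = len(responses)
n_items = len(responses[0])
totals = [sum(row) for row in responses]
mean_total = sum(totals) / n_students
var_total = sum((t - mean_total) ** 2 for t in totals) / n_students

# Item difficulty (P value): proportion of students answering each item correctly.
p_values = [sum(row[i] for row in responses) / n_students for i in range(n_items)]

def point_biserial(item_idx):
    """Discrimination: correlation between item score and total test score."""
    item = [row[item_idx] for row in responses]
    p = sum(item) / n_students
    sd_total = math.sqrt(var_total)
    if sd_total == 0 or p in (0.0, 1.0):
        return 0.0  # undefined when everyone (or no one) gets the item right
    mean_correct = sum(t for t, x in zip(totals, item) if x) / sum(item)
    return (mean_correct - mean_total) / sd_total * math.sqrt(p / (1 - p))

discriminations = [point_biserial(i) for i in range(n_items)]

# KR-20 internal-consistency reliability, then the standard error of
# measurement: SEM = SD * sqrt(1 - reliability).
kr20 = (n_items / (n_items - 1)) * (1 - sum(p * (1 - p) for p in p_values) / var_total)
sem = math.sqrt(var_total) * math.sqrt(max(0.0, 1 - kr20))
```

In a report of this kind, very low or negative discrimination flags an item (or distractor) for review, and the SEM gives a band of uncertainty around each observed score that standard-setters can weigh when marks fall near the cut score.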
Journal description:
Medical Teacher provides accounts of new teaching methods, guidance on structuring courses and assessing achievement, and serves as a forum for communication between medical teachers and those involved in general education. In particular, the journal recognizes the problems teachers have in keeping up-to-date with the developments in educational methods that lead to more effective teaching and learning at a time when the content of the curriculum—from medical procedures to policy changes in health care provision—is also changing. The journal features reports of innovation and research in medical education, case studies, survey articles, practical guidelines, reviews of current literature and book reviews. All articles are peer reviewed.