A Comparison of Subject Matter Experts’ Perceptions and Job Analysis Surveys

Practical Assessment, Research and Evaluation (Q2, Social Sciences) · Pub Date: 2018-01-01 · DOI: 10.7275/7DEY-ZD62
Adam E. Wyse, Ben Babcock
{"title":"主题专家认知与职业分析调查之比较","authors":"Adam E. Wyse, Ben Babcock","doi":"10.7275/7DEY-ZD62","DOIUrl":null,"url":null,"abstract":"Two common approaches for performing job analysis in credentialing programs are committee-based methods, which rely solely on subject matter experts’ judgments, and task inventory surveys. This study evaluates how well subject matter experts’ perceptions coincide with task inventory survey results for three credentialing programs. Results suggest that subject matter expert ratings differ in systematic ways from task inventory survey results and that task lists generated based solely on subject matter experts’ intuitions generally lead to narrower task lists. Results also indicated that there can be key differences for procedures and non-procedures, with subject matter experts’ judgments often tending to exhibit lower agreement levels with task inventory survey results for procedures than for non-procedures. We recommend that organizations performing job analyses think very carefully before relying solely on subject matter experts’ judgments as their primary method of job analysis.","PeriodicalId":20361,"journal":{"name":"Practical Assessment, Research and Evaluation","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2018-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":"{\"title\":\"A Comparison of Subject Matter Experts’ Perceptions and Job Analysis Surveys\",\"authors\":\"Adam E. Wyse, Ben Babcock\",\"doi\":\"10.7275/7DEY-ZD62\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Two common approaches for performing job analysis in credentialing programs are committee-based methods, which rely solely on subject matter experts’ judgments, and task inventory surveys. This study evaluates how well subject matter experts’ perceptions coincide with task inventory survey results for three credentialing programs. Results suggest that subject matter expert ratings differ in systematic ways from task inventory survey results and that task lists generated based solely on subject matter experts’ intuitions generally lead to narrower task lists. Results also indicated that there can be key differences for procedures and non-procedures, with subject matter experts’ judgments often tending to exhibit lower agreement levels with task inventory survey results for procedures than for non-procedures. 
We recommend that organizations performing job analyses think very carefully before relying solely on subject matter experts’ judgments as their primary method of job analysis.\",\"PeriodicalId\":20361,\"journal\":{\"name\":\"Practical Assessment, Research and Evaluation\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"7\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Practical Assessment, Research and Evaluation\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.7275/7DEY-ZD62\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"Social Sciences\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Practical Assessment, Research and Evaluation","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.7275/7DEY-ZD62","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"Social Sciences","Score":null,"Total":0}
Citations: 7

Abstract

Two common approaches for performing job analysis in credentialing programs are committee-based methods, which rely solely on subject matter experts’ judgments, and task inventory surveys. This study evaluates how well subject matter experts’ perceptions coincide with task inventory survey results for three credentialing programs. Results suggest that subject matter expert ratings differ in systematic ways from task inventory survey results and that task lists generated based solely on subject matter experts’ intuitions generally lead to narrower task lists. Results also indicated that there can be key differences for procedures and non-procedures, with subject matter experts’ judgments often tending to exhibit lower agreement levels with task inventory survey results for procedures than for non-procedures. We recommend that organizations performing job analyses think very carefully before relying solely on subject matter experts’ judgments as their primary method of job analysis.
Source journal: Practical Assessment, Research and Evaluation · CiteScore: 2.60 · Self-citation rate: 0.00% · Articles published: 0