How to Assess Mathematics Teachers' TPACK? A Comparison Between Self-Reports and Knowledge Tests

IF 1.9 | CAS Tier 3 (Education) | JCR Q2, Education & Educational Research | International Journal of Science and Mathematics Education | Pub Date: 2024-08-06 | DOI: 10.1007/s10763-024-10490-2
Alina Kadluba, Andreas Obersteiner
Citations: 0

Abstract



Teachers need technology-related knowledge to effectively use technology in the classroom. Previous studies have often used self-reports to assess such knowledge. However, it is questionable whether self-reports are valid measures for this purpose. This study investigates how mathematics teachers’ self-reports correlate with their scores in a paper–pencil knowledge test regarding TPCK (technological pedagogical content knowledge), CK (content knowledge), pedagogical content knowledge (PCK), and technological knowledge (TK). Participants were N = 173 pre- and in-service mathematics teachers. To assess self-reports, we adapted an existing survey from the literature. We also compiled a knowledge test based on items from existing test instruments. To increase comparability between the two instruments, both the self-report and the paper–pencil knowledge test addressed the specific topic of fractions. The four subscales in both instruments had sufficient reliability. The correlations between the self-reports and the paper–pencil test scores were low or very low for all subscales (r = .00–.23), suggesting that the two instruments captured different underlying constructs. While paper–pencil tests seem more suitable for assessing knowledge, self-reports may be influenced more strongly by participants’ personal traits such as self-efficacy. Our findings raise concerns about the validity of self-reports as measures of teachers’ professional knowledge, and the comparability of studies that use distinct assessment instruments. We recommend that researchers should be more cautious when interpreting self-reports as knowledge and rely more strongly on externally assessed tests.

Source journal metrics: CiteScore 5.10 | Self-citation rate 9.10% | Articles published 87
About the journal: The objective of this journal is to publish original, fully peer-reviewed articles on a variety of topics and research methods in both science and mathematics education. The journal welcomes articles that address common issues in mathematics and science education and cross-curricular dimensions more widely. Specific attention will be paid to manuscripts written by authors whose native language is not English and the editors have made arrangements for support in re-writing where appropriate. Contemporary educators highlight the importance of viewing knowledge as context-oriented and not limited to one domain. This concurs with current curriculum reforms worldwide for interdisciplinary and integrated curricula. Modern educational practice also focuses on the use of new technology in assisting instruction which may be easily implemented into such an integrated curriculum. The journal welcomes studies that explore science and mathematics education from different cultural perspectives.
Latest articles in this journal:
- STEM Outside of School: a Meta-Analysis of the Effects of Informal Science Education on Students' Interests and Attitudes for STEM
- Preservice Teachers Learn to Engage in Argument from Evidence through the Science Writing Heuristic
- Enhancing Pre-Service Mathematics Teachers' Competencies in Distance Education: An Empirical Investigation Utilizing Micro-Teaching and Peer Assessment
- The Effectiveness of AI on K-12 Students’ Mathematics Learning: A Systematic Review and Meta-Analysis
- Dimensionality and Invariance of Contemporary Mathematical Instruction Competence across Educational Systems