How to evaluate students’ decisions in a data comparison problem: Correct decision for the wrong reasons?
Authors: Karel Kok, Sophia Chroszczinsky, Burkhard Priemer
Journal: Physical Review Physics Education Research (JCR Q1, Education & Educational Research; Impact Factor 2.6)
DOI: 10.1103/physrevphyseducres.20.010129
Published: 2024-04-26
Citations: 0
Abstract
Data comparison problems are used in teaching and in science education research that focuses on students’ ability to compare datasets and on their conceptual understanding of measurement uncertainties. However, evaluating students’ decisions in these problems can itself pose a problem: for example, students may make a correct decision for the wrong reasons. We revisit three previous studies that share the same context and data comparison problem but in which participants had increasing conceptual knowledge of measurement uncertainties. The comparison shows a troublesome result: increasing conceptual knowledge does not lead to better decision making in the data comparison problem. In this research, we looked into this apparent discrepancy by comparing and reanalyzing the data from these three studies. We analyzed students’ justifications by coding them based on the compared quantity and the deciding criterion, giving highly detailed insight into what students do when comparing the datasets. The results show clear differences in the quality of the justifications across the studies, and by combining these results with the decisions, we could identify four cases of correct and incorrect decisions for right or wrong reasons. This analysis showed a high prevalence of correct decisions for wrong reasons in two of the studies, resolving the discrepancy in the initial comparison of these studies. The implication of our analysis is that simply asking students to make a decision in a data comparison problem is not a suitable probe of their ability to compare datasets or of their conceptual understanding of measurement uncertainties; such a probe should always be complemented by an analysis of the justification.
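Data comparison problems of this kind typically ask whether two sets of repeated measurements agree. As a minimal sketch (our own illustration, not the authors’ instrument or coding scheme), the distinction between a decision based on means alone and one based on an uncertainty-aware criterion can be made concrete; here agreement is judged by overlap of the intervals mean ± standard error, a common textbook criterion:

```python
import statistics

def compare_datasets(a, b):
    """Compare two repeated-measurement datasets two ways.

    Illustrative sketch only: the coding scheme in the paper distinguishes
    WHICH quantity students compare and WHICH criterion decides agreement.
    Here we contrast comparing means alone with a simple uncertainty-aware
    criterion (overlap of mean +/- standard error); this is a generic
    textbook criterion, not necessarily the authors' rubric.
    """
    mean_a, mean_b = statistics.mean(a), statistics.mean(b)
    # Standard error of the mean as the uncertainty estimate
    se_a = statistics.stdev(a) / len(a) ** 0.5
    se_b = statistics.stdev(b) / len(b) ** 0.5
    # Means-only comparison: ignores uncertainty entirely
    means_equal = abs(mean_a - mean_b) < 1e-9
    # Uncertainty-aware comparison: do the error intervals overlap?
    intervals_overlap = abs(mean_a - mean_b) <= se_a + se_b
    return means_equal, intervals_overlap

# Two datasets whose means differ but whose uncertainty intervals overlap:
a = [9.8, 10.1, 10.0, 9.9]
b = [9.9, 10.2, 10.0, 10.1]
print(compare_datasets(a, b))  # means differ, yet the data are compatible
```

A student who answers “the results differ” because the means differ would be making a decision without engaging with the uncertainties at all, which is exactly the kind of reasoning that only an analysis of the justification can reveal.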
Journal description:
PRPER covers all educational levels, from elementary through graduate education. All topics in experimental and theoretical physics education research are accepted, including, but not limited to:
Educational policy
Instructional strategies and materials development
Research methodology
Epistemology, attitudes, and beliefs
Learning environment
Scientific reasoning and problem solving
Diversity and inclusion
Learning theory
Student participation
Faculty and teacher professional development