Learning to evaluate sources of science (mis)information on the internet: Assessing students' scientific online reasoning

Author: Daniel R. Pimentel
Journal: Journal of Research in Science Teaching (official journal of NARST)
DOI: 10.1002/tea.21974 (https://doi.org/10.1002/tea.21974)
Published: 2024-07-27 (Journal Article)
Impact factor: 3.6 · JCR: Q1, Education & Educational Research
Open access: No
Citations: 0
Abstract
Students frequently turn to the internet for information about a range of scientific issues. However, they can find it challenging to evaluate the credibility of the information they find, which may increase their susceptibility to mis- and disinformation. This exploratory study reports findings from an instructional intervention designed to teach high school students to engage in scientific online reasoning (SOR), a set of competencies for evaluating sources of scientific information on the internet. Forty-three ninth-grade students participated in eleven instructional activities. They completed pre- and post-intervention constructed-response tasks designed to assess three constructs: evaluating conflicts of interest, relevant scientific expertise, and alignment with scientific consensus. A subset of students (n = 6) also completed pre- and post-intervention think-aloud tasks in which they evaluated websites of varying credibility. Students' written responses and screen-capture recordings were scored, coded, and analyzed using a mixed-methods approach. Findings from the study demonstrate that after the intervention: (1) students' assessment scores improved significantly on all three tasks, (2) students improved in their ability to distinguish between sources of online scientific information of varying credibility, and (3) more students used online reasoning strategies and outside sources of information. Areas for student growth are also identified, such as improving coordinated use of credibility criteria with online reasoning strategies. These results suggest that teaching criteria for the credibility of scientific information, along with online reasoning strategies, has the potential to help students evaluate scientific information encountered on the internet.
About the journal:
Journal of Research in Science Teaching, the official journal of NARST: A Worldwide Organization for Improving Science Teaching and Learning Through Research, publishes reports for science education researchers and practitioners on issues of science teaching and learning and science education policy. Scholarly manuscripts within the domain of the Journal of Research in Science Teaching include, but are not limited to, investigations employing qualitative, ethnographic, historical, survey, philosophical, case study research, quantitative, experimental, quasi-experimental, data mining, and data analytics approaches; position papers; policy perspectives; critical reviews of the literature; and comments and criticism.