Associations Between Teacher-Student Relationship Quality and Middle and Secondary School Teachers’ Wellbeing: A Systematic Review
Pub Date: 2026-01-09 | DOI: 10.1007/s10648-025-10107-2
Rhoda Lai, Jennifer Southam, Luella Mageean, Sue Roffey, Kelly-Ann Allen
Positive teacher-student relationships have the potential to affect teacher, as well as student, wellbeing. However, in middle and secondary schools, where teachers have contact with more students and less time with each of them, it is less clear how important these relationships are. This study systematically reviewed the literature on the association between positive teacher-student relationships and wellbeing in middle and secondary school teachers. A total of 55 studies were included in the review. Results suggested that positive teacher-student relationships were associated with each of the other aspects of wellbeing outlined in the PERMA model (positive emotions, engagement, meaning, and accomplishment), as well as with overall wellbeing. There was wide variation in how teacher-student relationships were measured and defined, with the majority of studies adopting tools and definitions formulated for primary school teacher-student relationships; future research on what constitutes positive relationships for teachers in middle and secondary school settings would therefore be valuable.
Zooming Out On Education: Making Valid Psychological Inferences From Large-Scale Assessment Data
Pub Date: 2026-01-06 | DOI: 10.1007/s10648-025-10110-7
Denis Dumas, Benjamin Goecke, Sofiia Kagan, Selina Weiss
Openly available datasets from large-scale educational assessments such as PISA, PIRLS, TIMSS, and NAEP are some of the most valuable public resources in the education sciences. Understandably, educational psychologists are interested in analyzing these datasets to advance their research. However, in contrast to the kinds of psychoeducational assessments in which educational psychologists are typically trained, large-scale assessments are not designed to support inferences about the mental attributes of individual students, but about the population distributions of those attributes. This seemingly subtle distinction leads to a host of analytic, epistemological, and interpretative challenges that can cause confusion and dissuade educational psychologists from using these datasets. In this theoretical paper, we seek to clarify the kinds of inferences that can be validly made with large-scale assessment data and to justify those inferences based on the psychometric and score-generation procedures that underpin them. This paper is not intended to be a technical or methodological guide to analyzing large-scale assessment data; instead, it serves as an epistemic and conceptual introduction to the topic. After appropriately accounting for various sources of error in large-scale assessment proficiency estimates, researchers can make interesting inferences about education and psychology, but those inferences can only be validly made at the population level, not about individual students.
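To make this population-level framing concrete, the sketch below illustrates one standard way analysts combine plausible values using Rubin's combining rules, so that the spread across plausible values is carried into the standard error. This is a minimal, hedged illustration rather than the authors' procedure; the variable names and toy numbers are hypothetical, and in practice the sampling variance of each estimate would come from the assessment's replicate weights.

```python
import numpy as np

# Hypothetical example: mean proficiency for a subpopulation, estimated once
# per plausible value (PV). The sampling variances are assumed given here;
# real PISA/TIMSS/NAEP analyses would derive them from replicate weights.
pv_estimates = np.array([502.1, 498.7, 500.9, 501.4, 499.6])  # theta_m, one per PV
sampling_vars = np.array([4.8, 5.1, 4.9, 5.0, 4.7])           # U_m, one per PV
M = len(pv_estimates)

theta_bar = pv_estimates.mean()       # combined point estimate
U_bar = sampling_vars.mean()          # average within-imputation (sampling) variance
B = pv_estimates.var(ddof=1)          # between-imputation (measurement) variance
total_var = U_bar + (1 + 1 / M) * B   # Rubin's rule for total variance
se = np.sqrt(total_var)

print(f"Population mean estimate: {theta_bar:.1f} (SE = {se:.2f})")
# The resulting inference applies to the population distribution of proficiency,
# not to any individual student's score.
```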
Reframing Belonging in Higher Education: An Intersectional Ecological Model for Research, Policy, and Practice
Pub Date: 2026-01-04 | DOI: 10.1007/s10648-025-10104-5
Royel M. Johnson
Variations in Need Supports in Education as a Function of Cultural and Economic Factors: Perspectives from Self-Determination Theory
Pub Date: 2025-12-13 | DOI: 10.1007/s10648-025-10088-2
Richard M. Ryan, Hyungshim Jang, John C. K. Wang, Lennia Matos, Tamara Gordeeva, Haya Kaplan, Behzad Behzadnia, Özge Kantas, Kelly A. Ferber, Bart Soenens, Maarten Vansteenkiste
The Forward Effect of Judgements of Learning on Memory and Transfer in Inductive Learning
Pub Date: 2025-12-11 | DOI: 10.1007/s10648-025-10094-4
Christian Ritter, Hannah Hausman, Robert Gaschler, Simon P. Tiffin-Richards, Veit Kubik
Making judgments of learning (JOLs) can directly affect learning outcomes. The present study investigated whether providing JOLs during inductive learning tasks improves learning of new material (forward effect), and whether feedback moderates these effects. Participants learned the painting styles of different artists (Experiment 1) or different rock types (Experiment 2) across two study phases separated by an interim learning task, and then completed a transfer posttest (classifying new exemplars) and a memory posttest (classifying previously studied exemplars). In Experiment 1, the interim learning tasks of overt retrieval and cue-only JOLs (based on the painting without the artist’s name) improved future inductive learning compared to restudy, whereas cue–target JOLs (painting and artist name shown) did not. Cue-only JOLs also produced response-time patterns consistent with retrieval-based processing, and self-reported retrieval use predicted their forward benefit. Experiment 2 replicated the beneficial forward effect of cue-only JOLs over restudy with different materials and found that providing item-by-item feedback did not change the effect. Our results suggest that cue-only JOLs, but not cue–target JOLs, enhance future inductive learning of natural visual categories, likely through metacognitively controlled, covert retrieval processes.
The Effect of Prequestions on Learning: A Multilevel Meta-Analysis
Pub Date: 2025-12-10 | DOI: 10.1007/s10648-025-10075-7
Quentin W. King-Shepard, Julia Walker, Timothy J. Nokes-Malach, Shana K. Carpenter, Scott H. Fraundorf
Gender Disparity in Computational Thinking Pedagogy and Assessment: A Three-Level Meta-Analysis
Pub Date: 2025-12-02 | DOI: 10.1007/s10648-025-10095-3
Suya Liu, Yun Dai, Oi Lam Ng, Zhihui Cai
Gender disparity is a well-recognized issue in computational thinking (CT) education, yet few meta-analyses have examined how specific pedagogical and assessment contexts influence this gap. This study addresses that limitation by synthesizing 53 empirical studies, comprising 100 effect sizes and 15,454 participants, to explore the extent and moderators of gender differences in CT education. The analysis reveals an overall effect size of g = 0.106 (95% CI [0.024, 0.188], p < 0.05), indicating a very small but statistically significant gender disparity favoring males. Among the three groups of moderators examined, neither general study characteristics (publication type, geographical region, and educational level) nor CT assessment contexts (instrument and measured learning outcome) significantly influenced effect sizes. However, pedagogical strategies moderated gender disparities: mixed and plugged approaches, which integrate technologies, were associated with larger gaps favoring boys, while unplugged approaches tended to reduce or even reverse the disparity, benefiting girls in some cases. In terms of assessment, gender disparities were nonsignificant for CT concepts, but became significant when evaluating authentic practices (e.g., programming tasks) and identity-related perspectives (e.g., motivation, learning interest, and self-efficacy). The findings offer practical implications for advancing educational equity in CT education. Early interventions in K-12 settings, especially those targeting CT practices and perspectives, are critical to prevent disparities from becoming entrenched. Unplugged activities can help build foundational understanding and confidence, especially for girls. Gradually introducing digital and AI tools within supportive environments, such as culturally relevant scenarios, may reduce technology-related anxiety and promote more inclusive learning experiences.
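As a rough illustration of how a pooled effect size and its confidence interval are obtained from study-level effects and sampling variances, the sketch below applies simple DerSimonian-Laird random-effects pooling to hypothetical Hedges' g values. The published review fits a three-level model that additionally handles effect sizes nested within studies, so this is an assumption-laden simplification, not the authors' analysis; all numbers are invented for illustration.

```python
import numpy as np

# Hypothetical study-level effects: Hedges' g (positive = favors males) and
# its sampling variance per study.
g = np.array([0.25, -0.05, 0.30, 0.10, 0.02, 0.18])
v = np.array([0.02, 0.03, 0.04, 0.01, 0.02, 0.03])

w = 1 / v                                   # fixed-effect weights
g_fixed = np.sum(w * g) / np.sum(w)
Q = np.sum(w * (g - g_fixed) ** 2)          # heterogeneity statistic
k = len(g)
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - (k - 1)) / c)          # DerSimonian-Laird between-study variance

w_star = 1 / (v + tau2)                     # random-effects weights
g_pooled = np.sum(w_star * g) / np.sum(w_star)
se = np.sqrt(1 / np.sum(w_star))
ci_low, ci_high = g_pooled - 1.96 * se, g_pooled + 1.96 * se

print(f"Pooled g = {g_pooled:.3f}, 95% CI [{ci_low:.3f}, {ci_high:.3f}], tau^2 = {tau2:.3f}")
```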
Basic Psychological Needs Under Constrained Autonomy: A Substantive–Methodological Reflection and Analysis of School Leaders’ Needs from a Self-Determination Theory Perspective
Pub Date: 2025-11-28 | DOI: 10.1007/s10648-025-10079-3
Herbert W. Marsh, Richard M. Ryan, Theresa Dicke, Reinhard Pekrun, Jiesi Guo, Emma L. Bradshaw, Johnmarshall Reeve, Oliver Lüdtke, Thomas Clarke, Joachim Waterschoot