The Road to Retention Passes through First Year Academic Performance: A Meta-Analytic Path Analysis of Academic Performance and Persistence
Paul A. Westrick, F. Schmidt, Huy Le, S. Robbins, Justine Radunzel
Pub Date: 2020-12-12 | DOI: 10.1080/10627197.2020.1848423 | Educational Assessment, 26(1), 35-51
ABSTRACT This meta-analytic path analysis presents evidence that first-year academic performance (FYAP), measured by first-year grade point average (FYGPA), plays the major role in determining second-year student retention and that socioeconomic status (SES), measured by parental income, plays a negligible role. Based on large-sample data used in a previous study, we conducted additional analyses that included corrections for measurement error and created path models using precollege academic achievement, measured by ACT Composite scores and high school GPA (HSGPA), and SES to predict FYAP and then second-year retention. The precollege academic measures had direct effects on FYAP, and FYAP fully mediated their effects on second-year retention. SES did not contribute to the prediction of FYAP, and it had only a trivial effect on second-year retention. The results of this study point to the criticality of FYAP, and supporting first-year student academic success may serve as the central vehicle for retention efforts.
A Confirmatory Item Factor Analysis of a School Climate Measure for Adolescents with and without Disabilities
Graham G. Rifenbark, Allison R. Lombardi, Jennifer Freeman
Pub Date: 2020-11-19 | DOI: 10.1080/10627197.2020.1841625 | Educational Assessment, 26(1), 52-68
ABSTRACT In this study (n = 5037) we further investigated the construct validity of the Georgia Brief School Climate Inventory (GBS). Despite containing ordinal items (i.e., ordered responses), the GBS has only been examined using linear approaches. To fill this void, we employed confirmatory item factor analysis (IFA) to examine the factor structure, because the observed data are polytomous. Using response data from adolescents with (n = 784) and without (n = 4253) disabilities in urban and suburban high schools, we tested for measurement invariance by estimating a series of multiple-group IFA models and examined the relationship between school climate and academic achievement. The study findings promote the use of non-linear methods for analyzing ordered response data and further support the importance of prioritizing students with disabilities in school climate studies.
Using an Index of Admission Obstacles with Constrained Optimization to Increase the Diversity of College Classes
R. Zwick, A. Blatter, Lei Ye, Steven P. Isham
Pub Date: 2020-11-05 | DOI: 10.1080/10627197.2020.1841626 | Educational Assessment, 26(1), 20-34
ABSTRACT Today, postsecondary institutions in the US typically wish to enroll entering classes that are both academically qualified and diverse. Although the definition of diversity varies from school to school, the challenge is essentially the same: How can academic objectives be combined with goals that involve the composition of the entering class? Many schools have a commitment to facilitating access for under-represented minorities or low-income applicants, or for members of nearby communities. Incorporating these goals while maintaining academic standards can be challenging.
Getting the Emphasis Right: Formative Assessment through Professional Learning
M. Heritage
Pub Date: 2020-10-01 | DOI: 10.1080/10627197.2020.1766959 | Educational Assessment, 25(1), 355-358
ABSTRACT This concluding essay offers a reflection on the set of papers contained in this special issue of the Educational Assessment journal. In it the author situates formative assessment squarely in the realm of teachers’ continuous professional learning and considers the essential nature of formative assessment as centering on three questions that guide the practice for both teachers and students: where are the students going (well-defined learning goals), where are they currently (evidence collected during the course of learning), and how can the gap between the two be closed (instructional adjustments, feedback, and student involvement)? The author closes by noting the importance of professional learning in the context of formative assessment.
Use of Response Process Data to Inform Group Comparisons and Fairness Research
Kadriye Ercikan, Hongwen Guo, Qiwei He
Pub Date: 2020-07-02 | DOI: 10.1080/10627197.2020.1804353 | Educational Assessment, 25(1), 179-197
ABSTRACT Comparing groups is one of the key uses of large-scale assessment results; such comparisons are used to gain insights that inform policy and practice and to examine the comparability of scores and score meaning. These comparisons typically focus on examinees’ final answers and responses to test questions, ignoring differences in the response processes that groups may engage in. This paper discusses and demonstrates the use of response process data in enhancing group comparison and fairness research methodologies. We propose two statistical approaches for identifying differential response processes, which extend differential item functioning (DIF) detection methods, and demonstrate the complementary use of process data in comparing groups in two case studies. Our findings demonstrate the use of response process data in gaining insights about the test-taking behaviors of students from different populations that go beyond what may be identified using response data only.
How Do Proficient and Less Proficient Students Differ in Their Composition Processes?
R. Bennett, Mo Zhang, P. Deane, P. V. van Rijn
Pub Date: 2020-07-02 | DOI: 10.1080/10627197.2020.1804351 | Educational Assessment, 25(1), 198-217
ABSTRACT We evaluate how higher- vs. lower-scoring middle-school students differ in their composition processes when writing persuasive essays from source materials. We first examined differences on four individual process features: time taken before beginning to write, typing speed, total time spent, and number of words started. Next, we examined differences on four aggregated process measures: fluency, local editing, macro editing, and interstitial pausing (suspending text entry at locations associated with planning). Results showed that higher- and lower-scoring students were most consistently differentiated by total time, number of words started, and fluency. These differences persisted across two persuasive subgenres and two proficiency criteria, essay score and English language arts total-test score. The study’s findings give a more complete picture of how the processes employed by more- and less-successful students differ, which contributes to cognitive writing theory and may have eventual implications for education policy and instructional practice.
Test Takers’ Response Tendencies in Alternative Item Formats: A Cognitive Science Approach
J. Moon, M. Keehner, Irvin R. Katz
Pub Date: 2020-07-02 | DOI: 10.1080/10627197.2020.1804350 | Educational Assessment, 25(1), 236-250
ABSTRACT We investigated how item formats influence test takers’ response tendencies under uncertainty. Adult participants solved content-equivalent math items in three formats: multiple-selection multiple-choice, grid with forced-choice (true-false) options, and grid with non-forced-choice options. Participants showed a greater tendency to commit (rather than omit) responses in the grid items, in both forced-choice and non-forced-choice types, compared to the multiple-selection multiple-choice items. These findings relate to the theoretical framework of affordances, which predicts that the design of interactive artifacts can shape one’s perception of action opportunities. The results of a signal detection analysis provided additional evidence that the item formats affected participants’ response bias. The current research suggests that cognitive science principles could provide an in-depth understanding of test takers’ cognition in new item formats.
Implications of Considering Response Process Data for Greater and Lesser Psychometrics
R. Levy
Pub Date: 2020-07-02 | DOI: 10.1080/10627197.2020.1804352 | Educational Assessment, 25(1), 218-235
ABSTRACT This paper characterizes the ways in which increased attention to response process data has implications for psychometrics. To do so, this work draws on two organizing frameworks that have heretofore not been associated: evidence-centered design, and the distinction between greater and lesser statistics. Overlaying these frameworks leads to a conceptualization of greater and lesser psychometrics. This provides a conceptual space for articulating the ways in which response process data have implications for lesser psychometrics through measurement models, as well as greater psychometrics through student models, task models, and the social and personal contexts in which assessment takes place. To illustrate the key points and motivate discussions of where future research is needed, I draw from experiences with three assessments that have involved response process data: a task assessing the measurement of geometric area, a performance-based assessment of computer networking, and an educational video game targeting rational number addition.
Developing a Formative Assessment Protocol to Support Professional Growth
E. Wylie, Christine J. Lyon
Pub Date: 2020-06-16 | DOI: 10.1080/10627197.2020.1766956 | Educational Assessment, 25(1), 314-330
ABSTRACT To promote and support teachers’ professional growth in using formative assessment practices in the classroom, we developed and piloted a suite of materials consisting of a set of rubrics for ten dimensions of formative assessment and six self-reflection/peer-observation tools. We describe the iterative development process, the design decisions and challenges, and the findings from an early pilot in which twenty-four educators from five states used the materials. We also report on a content validation study conducted with seven subject-matter experts, who confirmed the importance of the ten formative assessment dimensions and did not identify any significant omissions. Finally, we present preliminary findings from a group of teachers who engaged with a set of training modules to support their understanding of the rubrics before applying them to classroom practice through peer observation and feedback.
Observing Formative Assessment Practice: Learning Lessons Through Validation
E. Wylie
Pub Date: 2020-06-12 | DOI: 10.1080/10627197.2020.1766955 | Educational Assessment, 25(1), 251-258
ABSTRACT This introductory essay provides a framing for and an overview of the papers contained in this special issue of the Educational Assessment journal. In it the author begins with a brief description of the history of classroom observation tools, and then articulates a rationale for observation protocols that target formative assessment. Examining similarities and differences across the protocols, the author orients the reader to the approaches to classroom observation that are the focus of the special issue.