Motivation to learn by age, education, and literacy skills among working-age adults in the United States
Pub Date: 2022-03-07 | DOI: 10.1186/s40536-022-00119-7
Takashi Yamashita, Thomas J. Smith, Shalini Sahoo, Phyllis A. Cummins
This study highlighted how particular intersections of personal characteristics were related to Motivation to Learn (MtL) among adults. MtL is a prerequisite for adult education and training participation. However, little is known about MtL across subpopulations due to several methodological limitations. This study developed a national profile of MtL for key subpopulations defined by combinations of age, gender, education level, and literacy proficiency in the United States. Data were obtained from the 2012/2014/2017 Program for the International Assessment of Adult Competencies (PIAAC) restricted-use file (N = 8400). The alignment optimization (AO) method was employed to estimate subpopulation means of a PIAAC-based latent MtL construct. Subpopulations with younger age, greater educational attainment, and higher literacy proficiency showed significantly greater MtL.
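The alignment optimization itself is typically estimated in dedicated SEM software and is not reproduced here; the sketch below only illustrates the descriptive step implied by the abstract, forming intersectional subgroups (age by gender by education by literacy) and comparing weighted means of a crude MtL composite. All column names and data are hypothetical, not actual PIAAC variables.

```python
# Minimal sketch (NOT the alignment optimization used in the paper): form
# intersectional subgroups and compare weighted means of a simple MtL
# composite. Column names are hypothetical, not actual PIAAC variable names.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "age_group": rng.choice(["25-34", "35-49", "50-65"], n),
    "gender": rng.choice(["female", "male"], n),
    "education": rng.choice(["HS or less", "Some college", "BA+"], n),
    "literacy_level": rng.choice(["below level 3", "level 3+"], n),
    "weight": rng.uniform(0.5, 2.0, n),          # final sampling weight
    # two Likert-type items standing in for the MtL indicators
    "mtl_item1": rng.integers(1, 6, n),
    "mtl_item2": rng.integers(1, 6, n),
})

# crude composite score; the paper instead models MtL as a latent factor
df["mtl"] = df[["mtl_item1", "mtl_item2"]].mean(axis=1)

def weighted_mean(g):
    return np.average(g["mtl"], weights=g["weight"])

profile = (df.groupby(["age_group", "gender", "education", "literacy_level"])
             .apply(weighted_mean)
             .rename("weighted_mtl_mean")
             .sort_values(ascending=False))
print(profile.head(10))
```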
{"title":"Motivation to learn by age, education, and literacy skills among working-age adults in the United States","authors":"Takashi Yamashita, Thomas J. Smith, Shalini Sahoo, Phyllis A. Cummins","doi":"10.1186/s40536-022-00119-7","DOIUrl":"https://doi.org/10.1186/s40536-022-00119-7","url":null,"abstract":"<p>This study highlighted how particular intersections of personal characteristics were related to Motivation to Learn (MtL) among adults. MtL is a prerequisite for adult education and training participation. However, little is known about MtL across subpopulations due to several methodological limitations. This study developed a national profile of MtL by key subpopulations that are defined by combinations of age, gender, education level, and literacy proficiency in the United States. Data were obtained from 2012/2014/2017 Program for International Assessment of Adult Competencies (PIAAC) restricted use file (<i>N</i> = 8400). The alignment optimization (AO) method was employed to estimate subpopulation means of a PIAAC-based latent MtL construct. Subpopulations with younger age, greater educational attainment, and higher literacy proficiency showed significantly greater MtL.</p>","PeriodicalId":37009,"journal":{"name":"Large-Scale Assessments in Education","volume":"20 12","pages":""},"PeriodicalIF":3.1,"publicationDate":"2022-03-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138496874","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Inequality in remote learning quality during COVID-19: student perspectives and mitigating factors
Pub Date: 2022-01-01 | DOI: 10.1186/s40536-022-00143-7
Alec I Kennedy, Ana María Mejía-Rodríguez, Andrés Strello
Background: Remote learning, or synchronous or asynchronous instruction provided to students outside the classroom, was a common strategy used by schools to ensure learning continuity for their students when many school buildings were forced to shut down due to the COVID-19 pandemic. Differences in technology infrastructures, digital competencies of students and teachers, and home supports for learning likely led to inequalities in the way remote learning reached and was perceived by students. This study seeks to understand how student perspectives on remote learning varied across and within several countries.
Methods: Building off a conceptual framework developed to understand remote learning success and using data from the Responses to Education Disruption Survey (REDS) student questionnaire from seven countries, we construct measures of student perceptions of three essential components of successful remote learning: Access to Suitable Technology, Effective Teachers, and Engaged Students. We then compare values on these scales across and within countries to identify inequalities in remote learning quality during school closures. We also investigate the extent to which schools implemented supports for remote learning across countries.
Results: We find evidence of across-country variation in remote learning quality, with some countries scoring much lower on our remote learning quality scales than others in our sample. We also identify within-country inequalities in access to and confidence in using technology, with low-SES students, girls, and those living in rural areas reporting lower values on these measures, and we find some evidence of within-country inequalities in student engagement across socioeconomic groups. In contrast, we do not find as many inequalities in our measures of effective teachers. In most countries, schools provided several supports to improve remote learning.
Conclusions: While inequalities in remote learning experiences were anticipated and confirmed by our results, we find it promising that, in some countries, inequalities in access to and confidence in using technology as well as student engagement did not extend to inequalities in perceptions of teacher effectiveness and support. Schools' efforts to support remote learning, regardless of student background, should be seen as a positive and illustrate their resilience in the face of many challenges.
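As a rough illustration of the within-country comparisons described above, the sketch below computes the gap between low-SES and other students on three scale scores within each country. Variable names and data are hypothetical and do not correspond to the REDS questionnaire.

```python
# Minimal sketch, assuming hypothetical column names (not the REDS variables):
# compute within-country gaps between low-SES and other students on three
# remote-learning scales, mirroring the kind of comparison described above.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 3000
df = pd.DataFrame({
    "country": rng.choice(["A", "B", "C"], n),
    "low_ses": rng.choice([0, 1], n),
    "access_technology": rng.normal(0, 1, n),
    "effective_teachers": rng.normal(0, 1, n),
    "engaged_students": rng.normal(0, 1, n),
})

scales = ["access_technology", "effective_teachers", "engaged_students"]
gaps = (df.groupby(["country", "low_ses"])[scales].mean()
          .groupby("country")
          .diff()          # low_ses=1 minus low_ses=0 within each country
          .dropna())
print(gaps)  # negative values = low-SES students score lower on that scale
```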
{"title":"Inequality in remote learning quality during COVID-19: student perspectives and mitigating factors.","authors":"Alec I Kennedy, Ana María Mejía-Rodríguez, Andrés Strello","doi":"10.1186/s40536-022-00143-7","DOIUrl":"https://doi.org/10.1186/s40536-022-00143-7","url":null,"abstract":"<p><strong>Background: </strong>Remote learning, or synchronous or asynchronous instruction provided to students outside the classroom, was a common strategy used by schools to ensure learning continuity for their students when many school buildings were forced to shut down due to the COVID-19 pandemic. Differences in technology infrastructures, digital competencies of students and teachers, and home supports for learning likely led to inequalities in the way remote learning reached and was perceived by students. This study seeks to understand how student perspectives on remote learning varied across and within several countries.</p><p><strong>Methods: </strong>Building off a conceptual framework developed to understand remote learning success and using data from the Responses to Education Disruption Survey (REDS) student questionnaire from seven countries, we construct measures of student perceptions of three essential components of successful remote learning: <i>Access to Suitable Technology</i>, <i>Effective Teachers</i>, and <i>Engaged Students</i>. We then compare values on these scales across and within countries to identify inequalities in remote learning quality during school closures. We also investigate the extent to which schools implemented supports for remote learning across countries.</p><p><strong>Results: </strong>We find evidence of across country variation in remote learning quality with certain countries having much lower values on our remote learning quality scales compared to other countries in our sample. Furthermore, we identify within-country inequalities in access to and confidence in using technology with low-SES students, girls, and those living in rural areas having lower values on these measures. Furthermore, we find some evidence of within-country inequalities in student engagement across socioeconomic groups. In contrast, we do not find as many inequalities in our measures of effective teachers. In most countries, schools provided several supports to improve remote learning.</p><p><strong>Conclusions: </strong>While inequalities in remote learning experiences were anticipated and confirmed by our results, we find it promising that, in some countries, inequalities in access to and confidence in using technology as well as student engagement did not extend to inequalities in perceptions of teacher effectiveness and support. Schools' efforts to support remote learning, regardless of student background, should be seen as a positive and illustrate their resilience in the face of many challenges.</p>","PeriodicalId":37009,"journal":{"name":"Large-Scale Assessments in Education","volume":"10 1","pages":"29"},"PeriodicalIF":3.1,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9790815/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10523952","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A call to action: a systematic review of ethical and regulatory issues in using process data in educational assessment
Pub Date: 2021-12-08 | DOI: 10.1186/s40536-021-00115-3
Damian Murchan, Fazilat Siddiq
Analysis of user-generated data (for example, process data from logfiles, learning analytics, and data mining) in computer-based environments has gained much attention in the last decade and is considered a promising evolving field in the learning sciences. In the area of educational assessment, the benefits of such data and how to exploit them are increasingly emphasised. Even though the use of process data in assessment holds significant promise, the ethical and regulatory implications associated with it have not been sufficiently considered. To address this issue and to provide an overview of how ethical and regulatory requirements interface with process data from assessments in primary and secondary education (K-12), we conducted a systematic literature review. Initial results showed that few studies considered ethical, privacy and regulatory issues in K-12 assessment, prompting a widening of the search criteria to also include research in higher education, which identified 22 studies. The literature relevant to our research questions comprised an approximate balance of theoretical and empirical studies. The studies identified as relevant interpret privacy largely in terms of informed consent, and the research pays little attention to ethical and privacy issues in the use of process data in assessment. The implications for the field of educational assessment and the use of process data are discussed. These include the need to develop a specific code of ethics to govern the use of process and logfile data in educational assessment.
{"title":"A call to action: a systematic review of ethical and regulatory issues in using process data in educational assessment","authors":"Murchan, Damian, Siddiq, Fazilat","doi":"10.1186/s40536-021-00115-3","DOIUrl":"https://doi.org/10.1186/s40536-021-00115-3","url":null,"abstract":"<p>Analysis of user-generated data (for example process data from logfiles, learning analytics, and data mining) in computer-based environments has gained much attention in the last decade and is considered a promising evolving field in learning sciences. In the area of educational assessment, the benefits of such data and how to exploit them are increasingly emphasised. Even though the use of process data in assessment holds significant promise, the ethical and regulatory implications associated with it have not been sufficiently considered. To address this issue and to provide an overview of how ethical and regulatory requirements interface with process data from assessments in primary and secondary education (K-12), we conducted a systematic literature review. Initial results showed that few studies considered ethical, privacy and regulatory issues in K-12 assessment, prompting a widening of the search criteria to include research in higher education also, which identified 22 studies. The literature that was relevant to our research questions represented an approximate balance in the number of theoretical and empirical studies. The studies identified as relevant interpret issues of privacy largely in terms of informed consent and the research pays little attention to ethical and privacy issues in the use of process data in assessment. The implications for the field of educational assessment and the use of process data are discussed. This includes the need to develop a specific code of ethics to govern the use of process- and logfile data in educational assessment.</p>","PeriodicalId":37009,"journal":{"name":"Large-Scale Assessments in Education","volume":"21 1","pages":""},"PeriodicalIF":3.1,"publicationDate":"2021-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138496873","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Financial literacy among Finnish adolescents in PISA 2018: the role of financial learning and dispositional factors
Pub Date: 2021-11-18 | DOI: 10.1186/s40536-021-00118-0
Gintautas Silinskas, Arto K. Ahonen, Terhi-Anna Wilska
The aim of the present study was to examine the relative importance of financial education in school and families and of dispositional factors (competitiveness, work mastery, meta-cognition) in predicting financial literacy among Finnish adolescents. Data on 4328 Finnish 15-year-olds were drawn from the PISA 2018 assessment. Financial literacy was measured by tests, and financial education and dispositional factors were assessed by adolescent questionnaires. First, the results showed that financial education in school was positively associated with adolescents’ financial literacy skills, whereas parental involvement in financial matters was unrelated or negatively related to financial literacy scores. Second, dispositional factors, such as competitiveness, work mastery, and meta-cognition (effective strategies to understand/remember information, to summarize information, and to evaluate source credibility), were the strongest positive predictors of the financial literacy scores. Overall, the present study emphasizes that certain social factors (schools and families) and especially dispositional characteristics (personality/motivation and critical thinking/learning strategies) may shape the development of adolescents’ financial skills.
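The abstract does not spell out the estimation method, so the sketch below uses a plain standardized-coefficient comparison as one generic way to gauge the relative importance of predictors; the variable names and data are invented, not the PISA 2018 financial literacy items.

```python
# Minimal sketch with made-up variable names (not the PISA 2018 data):
# z-standardize predictors and outcome, then compare standardized OLS
# coefficients as a rough gauge of relative importance.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 500
X = pd.DataFrame({
    "financial_education_school": rng.normal(0, 1, n),
    "parental_involvement": rng.normal(0, 1, n),
    "competitiveness": rng.normal(0, 1, n),
    "work_mastery": rng.normal(0, 1, n),
    "metacognition": rng.normal(0, 1, n),
})
y = (0.2 * X["financial_education_school"] + 0.4 * X["metacognition"]
     + rng.normal(0, 1, n))

Xz = (X - X.mean()) / X.std()           # standardize predictors
yz = (y - y.mean()) / y.std()           # and the outcome
fit = sm.OLS(yz, sm.add_constant(Xz)).fit()
print(fit.params.sort_values(key=abs, ascending=False))
```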
{"title":"Financial literacy among Finnish adolescents in PISA 2018: the role of financial learning and dispositional factors","authors":"Silinskas, Gintautas, Ahonen, Arto K., Wilska, Terhi-Anna","doi":"10.1186/s40536-021-00118-0","DOIUrl":"https://doi.org/10.1186/s40536-021-00118-0","url":null,"abstract":"<p>The aim or the present study was to examine the relative importance of financial education in school and families and dispositional factors (competitiveness, work mastery, meta-cognition) in predicting financial literacy among Finnish adolescents. The data on the 4328 Finnish 15-year-olds was drawn from the PISA 2018 assessment. Financial literacy was measured by tests, and financial education and dispositional factors were assessed by adolescent questionnaires. First, the results showed that financial education in school was positively associated with adolescents’ financial literacy skills, whereas parental involvement in financial matters did not relate or related negatively to financial literacy scores. Second, dispositional factors, such as competitiveness, work mastery, and meta-cognition (effective strategies to understand/remember information, to summarize information, and to evaluate source credibility) were the strongest positive predictors of the financial literacy scores. Overall, the present study emphasizes that certain social factors (schools and families) and especially dispositional characteristics (personality/motivation and critical thinking/learning strategies) may shape the development of the financial skills of adolescents.</p>","PeriodicalId":37009,"journal":{"name":"Large-Scale Assessments in Education","volume":"21 2","pages":""},"PeriodicalIF":3.1,"publicationDate":"2021-11-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138496872","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The achievement gap in reading competence: the effect of measurement non-invariance across school types
Pub Date: 2021-10-28 | DOI: 10.1186/s40536-021-00116-2
Theresa Rohm, Claus H. Carstensen, Luise Fischer, Timo Gnambs
Background
After elementary school, students in Germany are separated into different school tracks (i.e., school types) with the aim of creating homogeneous student groups in secondary school. Consequently, the development of students’ reading achievement diverges across school types. Findings on this achievement gap have been criticized as depending on the quality of the administered measure. Therefore, the present study examined to what degree differential item functioning affects estimates of the achievement gap in reading competence.
Methods
Using data from the German National Educational Panel Study, reading competence was investigated across three timepoints during secondary school: in grades 5, 7, and 9 (N = 7276). First, using the invariance alignment method, measurement invariance across school types was tested. Then, multilevel structural equation models were used to examine whether a lack of measurement invariance between school types affected the results regarding reading development.
Results
Our analyses revealed some measurement non-invariant items that did not alter the patterns of competence development found among school types in the longitudinal modeling approach. However, misleading conclusions about the development of reading competence in different school types emerged when the hierarchical data structure (i.e., students being nested in schools) was not taken into account.
Conclusions
We assessed the relevance of measurement invariance and accounting for clustering in the context of longitudinal competence measurement. Even though differential item functioning between school types was found for each measurement occasion, taking these differences in item estimates into account did not alter the parallel pattern of reading competence development across German secondary school types. However, ignoring the clustered data structure of students being nested within schools led to an overestimation of the statistical significance of school type effects.
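A minimal sketch of the clustering point made in the conclusions, using simulated data rather than the NEPS sample: the same regression of reading score on school type is fit with naive and with school-clustered standard errors, and ignoring the nesting typically yields smaller standard errors and hence overstated significance.

```python
# Minimal sketch (simulated data, not the NEPS study): naive OLS vs.
# cluster-robust standard errors with students nested in schools.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_schools, n_per_school = 60, 25
school = np.repeat(np.arange(n_schools), n_per_school)
school_type = np.repeat(rng.choice([0, 1], n_schools), n_per_school)
school_effect = np.repeat(rng.normal(0, 1, n_schools), n_per_school)
reading = 0.2 * school_type + school_effect + rng.normal(0, 1, n_schools * n_per_school)
df = pd.DataFrame({"reading": reading, "school_type": school_type, "school": school})

naive = smf.ols("reading ~ school_type", data=df).fit()
clustered = smf.ols("reading ~ school_type", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["school"]})

print("naive SE:    ", naive.bse["school_type"])
print("clustered SE:", clustered.bse["school_type"])  # typically much larger
```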
{"title":"The achievement gap in reading competence: the effect of measurement non-invariance across school types","authors":"Rohm, Theresa, Carstensen, Claus H., Fischer, Luise, Gnambs, Timo","doi":"10.1186/s40536-021-00116-2","DOIUrl":"https://doi.org/10.1186/s40536-021-00116-2","url":null,"abstract":"<h3 data-test=\"abstract-sub-heading\">Background</h3><p>After elementary school, students in Germany are separated into different school tracks (i.e., school types) with the aim of creating homogeneous student groups in secondary school. Consequently, the development of students’ reading achievement diverges across school types. Findings on this achievement gap have been criticized as depending on the quality of the administered measure. Therefore, the present study examined to what degree differential item functioning affects estimates of the achievement gap in reading competence.</p><h3 data-test=\"abstract-sub-heading\">Methods</h3><p>Using data from the German National Educational Panel Study, reading competence was investigated across three timepoints during secondary school: in grades 5, 7, and 9 (<i>N</i> = 7276). First, using the invariance alignment method, measurement invariance across school types was tested. Then, multilevel structural equation models were used to examine whether a lack of measurement invariance between school types affected the results regarding reading development.</p><h3 data-test=\"abstract-sub-heading\">Results</h3><p>Our analyses revealed some measurement non-invariant items that did not alter the patterns of competence development found among school types in the longitudinal modeling approach. However, misleading conclusions about the development of reading competence in different school types emerged when the hierarchical data structure (i.e., students being nested in schools) was not taken into account.</p><h3 data-test=\"abstract-sub-heading\">Conclusions</h3><p>We assessed the relevance of measurement invariance and accounting for clustering in the context of longitudinal competence measurement. Even though differential item functioning between school types was found for each measurement occasion, taking these differences in item estimates into account did not alter the parallel pattern of reading competence development across German secondary school types. However, ignoring the clustered data structure of students being nested within schools led to an overestimation of the statistical significance of school type effects.</p>","PeriodicalId":37009,"journal":{"name":"Large-Scale Assessments in Education","volume":"21 3","pages":""},"PeriodicalIF":3.1,"publicationDate":"2021-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138496871","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The relative effect of student, family and school-related factors on math achievement by location of the school
Pub Date: 2021-10-22 | DOI: 10.1186/s40536-021-00117-1
Ekber Tomul, Emine Önder, Erdal Taslidere
This study aims to examine the relative effects of student, family and school-related characteristics on 4th grade students’ math achievement according to the location of the school in Turkey. Data from 6435 students at 260 primary schools were analyzed using the TIMSS 2015 database. The dependent variable was students’ math scores, and 19 factors constituting the student, family and school-related characteristics were the independent variables. The location of the school was classified as urban, suburban, medium-size city and village. The data were analyzed via single-level multiple linear regression. The results revealed that the full models explained the largest amount of variance (52%) in schools located in villages and the least (44%) in those located in urban areas. Although all of the student, family and school-related characteristic sets were significantly related to achievement, the student-related characteristics explained the largest amount of variance. Students’ confidence in math contributed the largest amount of variance, while early numeracy tasks, school absenteeism, parents’ highest education level, parents’ highest occupation level, and early numeracy activities before school explained small amounts of variance in students’ math achievement across all school locations.
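A minimal sketch of the analysis design described above, with simulated data and only a handful of invented predictors instead of the 19 used in the study: a single-level OLS model is fit separately for each school location and the explained variance is compared.

```python
# Minimal sketch (simulated data, made-up predictors): fit the same OLS model
# separately by school location and compare R squared across locations.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 2000
df = pd.DataFrame({
    "location": rng.choice(["urban", "suburban", "medium-size city", "village"], n),
    "math_confidence": rng.normal(0, 1, n),
    "early_numeracy": rng.normal(0, 1, n),
    "absenteeism": rng.normal(0, 1, n),
    "parent_education": rng.normal(0, 1, n),
})
df["math_score"] = (8 * df["math_confidence"] + 2 * df["early_numeracy"]
                    - 2 * df["absenteeism"] + 3 * df["parent_education"]
                    + rng.normal(0, 10, n))

formula = "math_score ~ math_confidence + early_numeracy + absenteeism + parent_education"
for loc, sub in df.groupby("location"):
    fit = smf.ols(formula, data=sub).fit()
    print(f"{loc:18s} R^2 = {fit.rsquared:.2f}")
```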
{"title":"The relative effect of student, family and school-related factors on math achievement by location of the school","authors":"Tomul, Ekber, Önder, Emine, Taslidere, Erdal","doi":"10.1186/s40536-021-00117-1","DOIUrl":"https://doi.org/10.1186/s40536-021-00117-1","url":null,"abstract":"<p>This study aims to examine the relative effects of student, family and school-related characteristics on 4th grade students’ math achievement according to location of the school in Turkey. The data of 6435 students studying at 260 primary schools were analyzed using TIMSS-2015 database. The dependent variable of the study was students’ math scores and 19 factors constituting the student, family and school-related characteristics were the independent variables. The location of the school was classified as urban, suburban, medium-size city and village. The data was analyzed via single level multiple linear regression. The results revealed that the entire models explained the largest amount of variance (52%) in the schools located in the villages and the least amount of variance (44%) in those located in the urban area. Although all of the student, family and school-related characteristic sets were found to be significantly related with the achievement, the student-related characteristics explained the largest amount of variance in achievement. Students’ confidence in math contributed almost the highest amount of variance, and the early numeracy tasks, absenteeism in school, parents’ highest education level, parents’ highest occupation level, early numeric activities before school explained small amounts of variance in students’ math achievement in the schools of all locations.</p>","PeriodicalId":37009,"journal":{"name":"Large-Scale Assessments in Education","volume":"21 4","pages":""},"PeriodicalIF":3.1,"publicationDate":"2021-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138496870","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
RALSA: the R analyzer for large-scale assessments
Pub Date: 2021-10-04 | DOI: 10.1186/s40536-021-00114-4
Plamen V. Mirazchiyski
{"title":"RALSA: the R analyzer for large-scale assessments","authors":"Plamen V. Mirazchiyski","doi":"10.1186/s40536-021-00114-4","DOIUrl":"https://doi.org/10.1186/s40536-021-00114-4","url":null,"abstract":"","PeriodicalId":37009,"journal":{"name":"Large-Scale Assessments in Education","volume":"9 1","pages":""},"PeriodicalIF":3.1,"publicationDate":"2021-10-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"65692595","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
From byproduct to design factor: on validating the interpretation of process indicators based on log data
Pub Date: 2021-10-01 | DOI: 10.1186/s40536-021-00113-5
Frank Goldhammer, Carolin Hahnel, Ulf Kroehne, Fabian Zehner
International large-scale assessments such as PISA or PIAAC have started to provide public or scientific use files for log data; that is, events, event-related attributes and timestamps of test-takers’ interactions with the assessment system. Log data and the process indicators derived from it can be used for many purposes. However, the intended uses and interpretations of process indicators require validation, which here means a theoretical and/or empirical justification that inferences about (latent) attributes of the test-taker’s work process are valid. This article reviews and synthesizes measurement concepts from various areas, including the standard assessment paradigm, the continuous assessment approach, the evidence-centered design (ECD) framework, and test validation. Based on this synthesis, we address the questions of how to ensure the valid interpretation of process indicators by means of an evidence-centered design of the task situation, and how to empirically challenge the intended interpretation of process indicators by developing and implementing correlational and/or experimental validation strategies. For this purpose, we explicate the process of reasoning from log data to low-level features and process indicators as the outcome of evidence identification. In this process, contextualizing information from log data is essential in order to reduce interpretative ambiguities regarding the derived process indicators. Finally, we show that empirical validation strategies can be adapted from classical approaches investigating the nomothetic span and construct representation. Two worked examples illustrate possible validation strategies for the design phase of measurements and their empirical evaluation.
Exploring the association between occupational complexity and numeracy
Pub Date: 2021-09-01 | DOI: 10.1186/s40536-021-00112-6
M. Billington, Njål Foldnes
{"title":"Exploring the association between occupational complexity and numeracy","authors":"M. Billington, Njål Foldnes","doi":"10.1186/s40536-021-00112-6","DOIUrl":"https://doi.org/10.1186/s40536-021-00112-6","url":null,"abstract":"","PeriodicalId":37009,"journal":{"name":"Large-Scale Assessments in Education","volume":"9 1","pages":""},"PeriodicalIF":3.1,"publicationDate":"2021-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"65692530","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Does the choice of response time threshold procedure substantially affect inferences concerning the identification and exclusion of rapid guessing responses? A meta-analysis
Pub Date: 2021-08-17 | DOI: 10.1186/s40536-021-00110-8
Joseph A. Rios, Jiayi Deng
Background
In testing contexts that are predominantly concerned with power, rapid guessing (RG) has the potential to undermine the validity of inferences made from educational assessments, as such responses are unreflective of the knowledge, skills, and abilities assessed. Given this concern, practitioners and researchers have utilized a multitude of response time threshold procedures that classify RG responses in these contexts based on no empirical data (e.g., an arbitrary time limit), on response time distributions, or on a combination of response time and accuracy information. As there is little understanding of how these procedures compare to each other, this meta-analysis sought to investigate whether threshold typology is related to differences in descriptive, measurement property, and performance outcomes in these contexts.
Methods
Studies were sampled that: (a) employed two or more response time (RT) threshold procedures to identify and exclude RG responses on the same computer-administered low-stakes power test; and (b) evaluated differences between procedures on the proportion of RG responses and responders, measurement properties, and test performance.
Results
Based on as many as 86 effect sizes, our findings indicated non-negligible differences between RT threshold procedures in the proportion of RG responses and responders. The largest differences for these outcomes were observed between procedures using no empirical data and those relying on response time and accuracy information. However, these differences were not related to variability in aggregate-level measurement properties and test performance.
Conclusions
When filtering RG responses to improve inferences concerning item properties and group score outcomes, the actual threshold procedure chosen may be of less importance than the act of identifying such deleterious responses. However, given the conservative nature of RT thresholds that use no empirical data, practitioners may wish to avoid these procedures when making inferences at the individual level, given their potential for underclassifying RG.
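To make the contrast between threshold typologies concrete, the sketch below flags rapid guesses in simulated response times with two of the procedure types named in the Background, a fixed threshold that uses no empirical data and a normative threshold at 10% of each item's median response time, and compares the resulting flags; the thresholds and data are illustrative only.

```python
# Minimal sketch (simulated response times): flag rapid guesses with (a) a
# fixed threshold that uses no empirical data and (b) a normative threshold
# set at 10% of each item's median response time, then compare the flags.
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
n = 5000
rt = pd.DataFrame({
    "item": rng.choice([f"i{k}" for k in range(20)], n),
    # mixture of solution behaviour (log-normal) and rapid guesses (fast, uniform)
    "rt_seconds": np.where(rng.random(n) < 0.07,
                           rng.uniform(0.3, 2.5, n),
                           rng.lognormal(mean=3.0, sigma=0.5, size=n)),
})

fixed = rt["rt_seconds"] < 3.0                                   # arbitrary 3-s rule
normative = rt["rt_seconds"] < 0.10 * rt.groupby("item")["rt_seconds"].transform("median")

print("flagged by fixed 3-s threshold:    ", fixed.mean())
print("flagged by 10%-of-median threshold:", normative.mean())
print("agreement between procedures:      ", (fixed == normative).mean())
```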
{"title":"Does the choice of response time threshold procedure substantially affect inferences concerning the identification and exclusion of rapid guessing responses? A meta-analysis","authors":"Rios, Joseph A., Deng, Jiayi","doi":"10.1186/s40536-021-00110-8","DOIUrl":"https://doi.org/10.1186/s40536-021-00110-8","url":null,"abstract":"<h3 data-test=\"abstract-sub-heading\">Background</h3><p>In testing contexts that are predominately concerned with power, rapid guessing (RG) has the potential to undermine the validity of inferences made from educational assessments, as such responses are unreflective of the knowledge, skills, and abilities assessed. Given this concern, practitioners/researchers have utilized a multitude of response time threshold procedures that classify RG responses in these contexts based on either the use of no empirical data (e.g., an arbitrary time limit), response time distributions, and the combination of response time and accuracy information. As there is little understanding of how these procedures compare to each other, this meta-analysis sought to investigate whether threshold typology is related to differences in descriptive, measurement property, and performance outcomes in these contexts.</p><h3 data-test=\"abstract-sub-heading\">Methods</h3><p>Studies were sampled that: (a) employed two or more response time (RT) threshold procedures to identify and exclude RG responses on the same computer-administered low-stakes power test; and (b) evaluated differences between procedures on the proportion of RG responses and responders, measurement properties, and test performance.</p><h3 data-test=\"abstract-sub-heading\">Results</h3><p>Based on as many as 86 effect sizes, our findings indicated non-negligible differences between RT threshold procedures in the proportion of RG responses and responders. The largest differences for these outcomes were observed between procedures using no empirical data and those relying on response time and accuracy information. However, these differences were not related to variability in aggregate-level measurement properties and test performance.</p><h3 data-test=\"abstract-sub-heading\">Conclusions</h3><p>When filtering RG responses to improve inferences concerning item properties and group score outcomes, the actual threshold procedure chosen may be of less importance than the act of identifying such deleterious responses. However, given the conservative nature of RT thresholds that use no empirical data, practitioners may look to avoid the use of these procedures when making inferences at the individual-level, given their potential for underclassifying RG.</p>","PeriodicalId":37009,"journal":{"name":"Large-Scale Assessments in Education","volume":"27 1","pages":""},"PeriodicalIF":3.1,"publicationDate":"2021-08-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140881944","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}