Pub Date: 2023-06-24 | DOI: 10.1177/07342829231186231
M. A. Küçükaydın
This study adapted the Career-related Teacher Support Scale into Turkish and examined its validity and reliability. Data were collected from a total of 752 high school students in Turkey. Confirmatory factor analysis showed that the scale had an excellent fit. Students’ perceptions of career-related teacher support were also examined across demographic variables. Female students perceived greater teacher support than male students, and students with a high socioeconomic level reported a higher perception of support.
Title: Career-Related Teacher Support in Turkey: Scale Adaptation and Validation
Pub Date: 2023-05-20 | DOI: 10.1177/07342829231175938
Benjamin J. Lovett, Theresa M. Schaberg, Ara Nazmiyal, Laura M. Spenceley
Data collected during psychoeducational evaluations can be compromised by response bias: clients not putting forth sufficient effort on tests, not being motivated to do well, or not being fully honest and careful when completing rating scales and contributing similar self-report data. Some of these problems apply to data from third-party informants as well. In the present study, we surveyed school psychologists about their approach to detecting, preventing, and reacting to apparent response bias. A sample of 297 school psychologists responded to at least one of four open-ended questions. We found that most participants only used informal techniques for detecting response bias (rather than specialized tests and embedded indices), relied on rewards or reinforcements to prevent response bias, and reacted to apparent response bias by noting it in their evaluation reports. However, a wide variety of other strategies were endorsed by smaller proportions of practitioners. We compare these results to results from similar surveys in neuropsychology, and discuss implications for applied practice as well as future research.
Title: How Do School Psychologists Address Issues of Effort, Motivation, and Honesty During Evaluations?
Pub Date: 2023-05-19 | DOI: 10.1177/07342829231177109
Adam Mccrimmon, Abdullah S. Bernier, J. McLeod, Rachel Pagaling, Janine Newton Montgomery, Sydney E. Kingston, David Nordstokke
Examination of the emotional intelligence (EI) of autistic individuals has gained popularity. These efforts have included the BarOn Emotional Quotient Inventory, Youth Version (BarOn EQ-i YV); however, this measure was not standardized with this population, so its utility and the accuracy of its factor structure for autistic individuals are questionable. This study examined how well the factor structure estimated from a sample of autistic children and youth aligns with the structure described in the measure’s technical manual, to clarify considerations for use. Results indicate poor model fit. Some factors were significantly correlated, though this was somewhat attenuated after correction for multiple analyses. Two items from the interpersonal factor loaded negatively, suggesting they should be subtracted from the other items in that factor (in contrast to the standardized model). Implications for EI construct validity, the understanding of EI in autism, and the use of EI measures with this population are discussed.
Title: Factor Structure of the Bar-On Emotional Quotient Inventory in Youth on the Autism Spectrum
Pub Date: 2023-04-25 | DOI: 10.1177/07342829231169845
Sylwia Opozda-Suder, P. Grygiel, Kinga Karteczka-Świętek
The present article describes the development and validation of the Informational and Normative Conformity Scale (SKI-N), a brief self-report tool capturing adolescents' general propensity to adopt a conformist attitude, and the underlying motives for doing so. The presentation includes a description of scale construction and an assessment of the psychometric properties. In two independent samples of adolescents (total N = 1,953), the SKI-N factorial structure was investigated, and the reliability and dimensionality, the multi-group measurement invariance, and the construct validity were each verified. The findings showed that the scale structure is bi-factorial, and the tool is reliable, valid, and invariant across gender. Therefore, the SKI-N can be applied in research and/or in psychological and educational practice to provide important information in a broader assessment of students’ psychosocial functioning in the school environment. Moreover, compared to currently available measures, it fills a gap in the tools for measuring conformity in the adolescent population.
Title: Conformity in High School Adolescents: Development and Validation of the Informational and Normative Conformity Scale
Pub Date: 2023-04-19 | DOI: 10.1177/07342829231169171
Teresa M. Ober, Yikai Lu, Chessley B. Blacklock, Cheng Liu, Ying Cheng
We develop and validate a self-report measure of intrinsic and extrinsic cognitive load suitable for measuring these constructs in a variety of learning contexts. Data were collected from three independent samples of college students in the U.S. (total N = 513; mean age = 21.13 years). Kane’s (2013) framework was used to validate the measure, and three types of validity evidence were presented: scoring, generalization, and extrapolation. After establishing evidence of validity, especially measurement invariance, we compared group mean differences based on students’ demographic characteristics. These findings support the psychometric integrity of this measure of cognitive load, which may be used to investigate cognitive load in various learning contexts, particularly in examining factors that perpetuate or mitigate differences in cognitive load between students. Such a measure could be useful in educational and clinical settings as a mechanism for early identification of potential learning challenges.
Title: Development and Validation of a Cognitive Load Measure for General Educational Settings
Pub Date: 2023-04-11 | DOI: 10.1177/07342829231167726
Ning Jiang, Ruiqin Gao, C. Distefano, Jin Liu, M. Weist, J. Splett, Colleen A. Halliday-Boykins
Growing interest has been given to examining the heterogeneity of children’s health in order to provide for their particular needs. This study examined subgroups of elementary school children’s social, emotional, and behavioral (SEB) functioning using teacher ratings of children on the Behavioral and Emotional Screening System. A latent profile analysis (LPA) was conducted with 5,150 students from kindergarten to fifth grade. Subscale scores for externalizing risk, internalizing risk, and adaptive skills risk were used to identify profiles. Students’ grade level, sex, race, and special education status were added to investigate the impact of covariates on the classification of latent profiles. Four risk profiles were identified: Well Development (60.5%), Normal Development (25.8%), Externalizing and Adaptive Skills Risk (9.1%), and Elevated Risk (4.6%). Significant covariate effects were identified for sex, race, and special education status. Specifically, males and African American students were more likely than others to be classified into the Elevated Risk profile, and students who did not receive special education services were more likely to fall into the Well Development profile. The results may help stakeholders allocate intervention and treatment resources effectively and accurately.
Title: Social-Emotional and Behavioral Functioning Profiles and Demographic Factors: A Latent Profile Analysis in Elementary Students
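Latent profile analyses like the one summarized above are typically fit with the EM algorithm for a finite mixture model. As a hedged, illustrative sketch only (the study's actual analysis used multivariate subscale scores and covariates; this toy uses a single indicator), a one-dimensional, two-profile Gaussian mixture can be fit in pure Python:

```python
import math

def fit_two_profile_mixture(x, iters=200):
    """EM for a 1-D, two-component Gaussian mixture: a toy stand-in for
    the multivariate latent profile analysis described in the abstract."""
    xs = sorted(x)
    n = len(xs)
    # Crude initialization: means at the lower and upper quartiles
    mu = [xs[n // 4], xs[3 * n // 4]]
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior probability (responsibility) of each profile
        resp = []
        for xi in x:
            dens = [pi[k] / math.sqrt(2 * math.pi * var[k])
                    * math.exp(-(xi - mu[k]) ** 2 / (2 * var[k]))
                    for k in range(2)]
            s = sum(dens)
            resp.append([d / s for d in dens])
        # M-step: update mixing weights, means, and variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(x)
            mu[k] = sum(r[k] * xi for r, xi in zip(resp, x)) / nk
            var[k] = max(sum(r[k] * (xi - mu[k]) ** 2
                             for r, xi in zip(resp, x)) / nk, 1e-6)
    return pi, mu, var
```

Each child would then be assigned to the profile with the highest responsibility; dedicated LPA software additionally compares solutions with different numbers of profiles via information criteria before settling on, say, a four-profile model.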
Pub Date: 2023-04-06 | DOI: 10.1177/07342829231167892
Ulrich Ludewig, Jakob Schwerter, Nele McElvany
A better understanding of how distractor features influence the plausibility of distractors is essential for efficient multiple-choice (MC) item construction in educational assessment. The plausibility of distractors has a major influence on the psychometric characteristics of MC items. Our analysis uses the nominal categories model to investigate German fourth graders’ (N = 924) selection of response options on a German MC vocabulary test. We used principles from cognitive psychology to identify option features that capture an option’s potential to distract students from the correct answer. The results show that only a few option characteristics explain option choice behavior to a large extent. Options with distracting features (i.e., semantic relatedness and orthographic similarity) increase item difficulty and discrimination, whereas distractors that are less synonymous than the attractor decrease item discrimination. Implications for test score interpretation and item construction guidelines are highlighted.
Title: The Features of Plausible but Incorrect Options: Distractor Plausibility in Synonym-Based Vocabulary Tests
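The nominal categories model referenced above expresses the probability of choosing each response option as a softmax over option-specific slopes and intercepts in the latent ability θ. A minimal sketch, with purely illustrative parameter values (not the study's estimates):

```python
import math

def nominal_option_probs(theta, slopes, intercepts):
    """Bock's nominal categories model: P(option k | theta) is a softmax
    over z_k = a_k * theta + c_k across all response options."""
    z = [a * theta + c for a, c in zip(slopes, intercepts)]
    m = max(z)  # subtract the max before exponentiating, for stability
    expz = [math.exp(v - m) for v in z]
    total = sum(expz)
    return [v / total for v in expz]
```

Giving the attractor the largest slope makes its choice probability rise with ability while plausible distractors dominate at low ability, which is the kind of option-level behavior the model lets the authors relate to distractor features.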
Pub Date: 2023-03-29 | DOI: 10.1177/07342829231166251
Kenneth Stensen, S. Lydersen, Ingunn Ranøyen, C. Klöckner, E. S. Buøen, R. Lekhal, M. Drugli
The Student-Teacher Relationship Scale-Short Form (STRS-SF) is one of the most frequently used instruments globally for measuring professional caregivers’ perceptions of the quality of their relationship with a specific child. However, its psychometric properties for children younger than 3 years of age enrolled in early childhood education and care (ECEC) centers are largely unknown. This study therefore investigated the factorial validity of the STRS-SF and its measurement invariance across children’s gender and age by combining two large Norwegian community samples (N = 2,900) covering the full age range of children enrolled in ECEC (1–6 years old). Our findings indicate promising psychometric properties for the STRS-SF; its applicability is thus supported for both younger and older children regardless of gender. However, some caution is advised when comparing latent means between older and younger ECEC children, because professional caregivers interpret the STRS-SF differently depending on children’s age.
Title: Psychometric Properties of the Student-Teacher Relationship Scale-Short Form in a Norwegian Early Childhood Education and Care Context
Pub Date: 2023-03-22 | DOI: 10.1177/07342829231165812
Yixiao Dong, Denis G. Dumas, D. Clements, Crystal Day-Hess, Julie Sarama
Consequential validity (often referred to as “test fairness” in practice) is an essential aspect of educational measurement. This study evaluated the consequential validity of the Research-Based Early Mathematics Assessment (REMA). Data were collected from a sample of 627 children from PreK to second grade using the short form of the REMA. We conducted two sets of analyses with different foci (item- and scale-level) for validation: differential item functioning (DIF) and consequential validity ratio (CVR) analyses. The analyses focused on demographic subgroups defined by gender, English Language Learner status, and race/ethnicity. We found a low percentage of DIF items (less than 3%) and high CVRs (ranging from 96% to 98%). Both findings support the consequential validity, and thus the “fairness,” of the REMA.
Title: Evaluating the Consequential Validity of the Research-Based Early Mathematics Assessment
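DIF screens of the kind reported above are commonly run with the Mantel–Haenszel procedure: examinees are matched on total score, and a common odds ratio across score strata is converted to the ETS delta scale. The abstract does not name the study's specific DIF method, so the following is a hedged sketch of one standard approach, not the authors' exact analysis:

```python
import math

def mh_delta(strata):
    """Mantel-Haenszel DIF statistic on the ETS delta scale.
    Each stratum is a tuple (A, B, C, D) for one matched score level:
    A/B = reference group correct/incorrect, C/D = focal correct/incorrect.
    Values near 0 indicate negligible DIF; negative values indicate the
    item is relatively harder for the focal group."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    alpha = num / den  # common odds ratio across strata
    return -2.35 * math.log(alpha)
```

In operational use, items are then binned (e.g., ETS categories A/B/C by the magnitude of delta and its significance), and the proportion of flagged items, such as the "less than 3%" reported here, summarizes subgroup fairness at the item level.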
Pub Date: 2023-03-20 | DOI: 10.1177/07342829231162216
Yu-Yu Hsiao, C. H. Qi, P. Dale, Rebecca J. Bulotsky-Shearer, Qing Wang
The Child Behavior Checklist for Ages 1.5–5 (CBCL/1½–5) has been widely used by researchers and clinicians in special education and psychology. The purpose of this study was to examine the psychometric properties of the CBCL/1½–5 with a sample of preschool children from low-income families using the Rasch model. Participants included 244 children enrolled in a Head Start program. Findings suggested that both the Internalizing and Externalizing subscales are unidimensional and demonstrate local independence once misfitting items were removed to fit the Rasch model. Both subscales operated well, with high item reliability and low-to-medium person reliability, indicating stable item difficulty orders from sample to sample but a weaker capacity to distinguish children with mild problem behaviors from those with more severe problems. Differential item functioning was found for a few items across child gender. Considering the length of the item sets, it is appropriate to use the subscale scores to compare differences in problem behaviors between boys and girls. Overall, the CBCL/1½–5 has adequate psychometric properties for detecting problem behaviors in preschool children from low-income families. Implications are discussed.
Title: Measuring Behavior Problems in Children from Low-Income Families: A Rasch Analysis of the Child Behavior Checklist for Ages 1½–5
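In the Rasch model underlying the analysis above, the probability of endorsing an item depends only on the difference between a child's latent trait level θ and the item's difficulty b. A minimal sketch of the item response function and the resulting expected subscale score:

```python
import math

def rasch_probability(theta, b):
    """Rasch model item response function:
    P(endorse | theta, b) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def expected_raw_score(theta, difficulties):
    """Expected subscale score at trait level theta: the sum of the
    endorsement probabilities over that subscale's items."""
    return sum(rasch_probability(theta, b) for b in difficulties)
```

Roughly speaking, the pattern reported in the abstract (high item reliability but low-to-medium person reliability) means the estimated difficulties b are well separated and stable across samples, while the children's θ estimates are not spread widely enough to separate mild from severe problem behaviors; this is an interpretation of the reliability indices, not a claim from the abstract itself.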