{"title":"学术英语中的问题预习听力评估:词干预习对难度、项目类型和辨别能力的影响","authors":"Rebecca Yeager, Zachary Meyer","doi":"10.1080/10904018.2022.2029705","DOIUrl":null,"url":null,"abstract":"ABSTRACT This study investigates the effects of adding stem preview to an English for Academic Purposes (EAP) multiple-choice listening assessment. In stem preview, listeners may view the item stems, but not response options, before listening. Previous research indicates that adding preview to an exam typically decreases difficulty, but raises concerns about score interpretation. Concerningly, no previous studies have explored the impact of preview on item discrimination, a key assumption of a validity argument. Our study utilized a Latin square design controlling for group, lecture, and preview condition to explore the impact of stem preview on difficulty, item type, and discrimination. Analysis indicated no significant effects of preview condition on difficulty or item type overall at our chosen alpha level. However, comparisons of total scores revealed a bimodal distribution in the no-preview condition, but not in the preview condition, indicating lower-scoring students received a boost from stem preview. Additionally, preview significantly increased facility on one of the trivial items. Results for discrimination were more complicated. On two of the five discrimination indices in the study, preview significantly decreased item discrimination, but for the other three indices, effects were not significant. Implications for assessment developers and researchers are discussed.","PeriodicalId":35114,"journal":{"name":"International Journal of Listening","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2022-02-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"QUESTION PREVIEW IN ENGLISH FOR ACADEMIC PURPOSES LISTENING ASSESSMENT: THE EFFECT OF STEM PREVIEW ON DIFFICULTY, ITEM TYPE, AND DISCRIMINATION\",\"authors\":\"Rebecca Yeager, Zachary Meyer\",\"doi\":\"10.1080/10904018.2022.2029705\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"ABSTRACT This study investigates the effects of adding stem preview to an English for Academic Purposes (EAP) multiple-choice listening assessment. In stem preview, listeners may view the item stems, but not response options, before listening. Previous research indicates that adding preview to an exam typically decreases difficulty, but raises concerns about score interpretation. Concerningly, no previous studies have explored the impact of preview on item discrimination, a key assumption of a validity argument. Our study utilized a Latin square design controlling for group, lecture, and preview condition to explore the impact of stem preview on difficulty, item type, and discrimination. Analysis indicated no significant effects of preview condition on difficulty or item type overall at our chosen alpha level. However, comparisons of total scores revealed a bimodal distribution in the no-preview condition, but not in the preview condition, indicating lower-scoring students received a boost from stem preview. Additionally, preview significantly increased facility on one of the trivial items. Results for discrimination were more complicated. On two of the five discrimination indices in the study, preview significantly decreased item discrimination, but for the other three indices, effects were not significant. 
Implications for assessment developers and researchers are discussed.\",\"PeriodicalId\":35114,\"journal\":{\"name\":\"International Journal of Listening\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-02-03\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Journal of Listening\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1080/10904018.2022.2029705\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"Arts and Humanities\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Listening","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/10904018.2022.2029705","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"Arts and Humanities","Score":null,"Total":0}
QUESTION PREVIEW IN ENGLISH FOR ACADEMIC PURPOSES LISTENING ASSESSMENT: THE EFFECT OF STEM PREVIEW ON DIFFICULTY, ITEM TYPE, AND DISCRIMINATION
ABSTRACT This study investigates the effects of adding stem preview to an English for Academic Purposes (EAP) multiple-choice listening assessment. In stem preview, listeners may view the item stems, but not the response options, before listening. Previous research indicates that adding preview to an exam typically decreases difficulty, but it raises concerns about score interpretation. Concerningly, no previous studies have explored the impact of preview on item discrimination, which underlies a key assumption of a validity argument. Our study used a Latin square design controlling for group, lecture, and preview condition to explore the impact of stem preview on difficulty, item type, and discrimination. Analysis indicated no significant effects of preview condition on difficulty or item type overall at our chosen alpha level. However, comparisons of total scores revealed a bimodal distribution in the no-preview condition, but not in the preview condition, indicating that lower-scoring students received a boost from stem preview. Additionally, preview significantly increased facility on one of the trivial items. Results for discrimination were more complicated: on two of the five discrimination indices in the study, preview significantly decreased item discrimination, but for the other three indices the effects were not significant. Implications for assessment developers and researchers are discussed.
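The abstract's key psychometric terms can be made concrete with a small, self-contained sketch. This is not the authors' analysis: the abstract does not specify which five discrimination indices were used, so the example below assumes dichotomously scored items and illustrates item facility (proportion correct) alongside one common classical index, the corrected item-total point-biserial correlation. The function names and the toy response matrix are purely hypothetical.

```python
# Illustrative item analysis for a dichotomously scored multiple-choice test.
# "Facility" = proportion of test-takers answering an item correctly.
# The discrimination index shown is the corrected item-total (point-biserial)
# correlation, one common classical index; it is an assumption here, not
# necessarily one of the five indices used in the study.
import numpy as np

def item_facility(responses: np.ndarray) -> np.ndarray:
    """Proportion correct per item; responses is a (persons x items) 0/1 matrix."""
    return responses.mean(axis=0)

def item_discrimination(responses: np.ndarray) -> np.ndarray:
    """Correlate each item with the total score of the remaining items,
    so an item does not inflate its own discrimination estimate."""
    n_items = responses.shape[1]
    totals = responses.sum(axis=1)
    disc = np.empty(n_items)
    for j in range(n_items):
        rest_total = totals - responses[:, j]  # total score excluding item j
        disc[j] = np.corrcoef(responses[:, j], rest_total)[0, 1]
    return disc

# Toy data: 8 simulated test-takers (rows) by 4 items (columns).
scores = np.array([
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
])

print("facility:      ", item_facility(scores).round(2))
print("discrimination:", item_discrimination(scores).round(2))
```

In these terms, the reported pattern would correspond to stem preview raising facility on certain items while lowering item-total discrimination on some, but not all, of the indices examined.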