Effects of Writing-Related Contingencies on Both Quality of Writing and Multiple-Choice Exam Performance in Large College Courses

K. Krohn, Megan R. Parker, L. N. Foster, K. Aspiranti, D. McCleary, Robert L. Williams
{"title":"大学大型课程中写作相关随变对写作质量和多项选择考试成绩的影响","authors":"K. Krohn, Megan R. Parker, L. N. Foster, K. Aspiranti, D. McCleary, Robert L. Williams","doi":"10.1037/H0100658","DOIUrl":null,"url":null,"abstract":"Most undergraduates likely take some courses that use multiple-choice exams as a major source of course credit. Instructors may be especially inclined to use multiple-choice exams in large courses because of ease and efficiency of grading (Hautau et al., 2006b). Nonetheless, many students report difficulty in taking multiple-choice exams, claiming that they could do better on essay exams. Consequently, discovering how to heighten student performance on multiple-choice exams is an important issue at the college level (Wallace & Williams, 2003). One possibility for improving multiple-choice exam scores may be the use of daily writing activities related to concepts included on the multiple-choice exams. Past research demonstrates that brief essay quizzes may improve performance on a variety of exam formats (e.g., short answer, essay, fill-in-the-blank, and multiple-choice). Padilla-Walker (2006) found that brief, extra-credit daily quizzes on assigned reading material predicted performance on major exams (short-answer and essay) better than did gender, self-reported college GPA, and self-reported ACT scores. Daniel and Broida (2004) also reported that completing in-class quizzes over course concepts boosted performance on course exams (multiple-choice and short answer). Narloch, Garbin, and Turnage (2006) showed that prelecture quizzes, compared to no quizzes, produced better performance on both multiple-choice and essay exam items. Additionally, Leeming (2002) found that participating in daily 1015 min writing activities on course concepts significantly improved performance on a comprehensive final exam that included short essay, fill-in-the-blank, and multiple-choice questions. Similarly, Turner et al. (2006) demonstrated that students required to complete a daily in-class writing activity performed better on the course's multiple-choice exams than students without daily writing. Although some research supports using daily essay quizzes to boost major exam performance, maximizing the impact of these quizzes is not without logistical challenges. For example, grading the quizzes could be labor intensive for the instructor. The writing activities may only consume a small percentage of class time, but the time required to grade and record the scores may detract from instructor time needed to organize and prepare for class. Leeming's (2002) daily quizzes required about 10 to 20% of class time and an hour of instructor grading time each day. Hautau et al. (2006a) and Turner et al. (2006) reduced instructor time for grading quizzes by grading quizzes only on randomly selected days rather than on all days. The quizzes took 6 to 7% of total class time and required about 1 min per student for instructor grading. Given that students in these studies did not know what day's quizzes would be randomly selected for grading, the researchers expected the quizzes to have much the same impact on student performance as would daily grading and crediting of quizzes. However, the results proved mixed regarding this expectation, creating additional questions as to how best to maximize student performance on writing quizzes and multiple-choice exams without requiring an inordinate amount of instructor time. 
Recent attempts to clarify the conditions under which daily writing could efficiently promote writing and exam scores have been reported by Hautau et al. (2006a; 2006b). Hautau et al. (2006a) required students to analyze in writing pairs of concepts that could be found within instructor notes available to students. The class website identified two to five pairs of concepts each day that students could be asked to address on the next quiz. At the following class period, the instructor randomly selected one of those pairs and instructed students to identify the concepts' commonalities, differences, and the effect of one on the other. When the final writing activity had been completed for each unit, a student randomly chose one day's writing activities to count for course credit. …","PeriodicalId":88717,"journal":{"name":"The behavior analyst today","volume":"9 1","pages":"184-195"},"PeriodicalIF":0.0000,"publicationDate":"2008-06-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":"{\"title\":\"Effects of Writing-Related Contingencies on Both Quality of Writing and Multiple-Choice Exam Performance in Large College Courses.\",\"authors\":\"K. Krohn, Megan R. Parker, L. N. Foster, K. Aspiranti, D. McCleary, Robert L. Williams\",\"doi\":\"10.1037/H0100658\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Most undergraduates likely take some courses that use multiple-choice exams as a major source of course credit. Instructors may be especially inclined to use multiple-choice exams in large courses because of ease and efficiency of grading (Hautau et al., 2006b). Nonetheless, many students report difficulty in taking multiple-choice exams, claiming that they could do better on essay exams. Consequently, discovering how to heighten student performance on multiple-choice exams is an important issue at the college level (Wallace & Williams, 2003). One possibility for improving multiple-choice exam scores may be the use of daily writing activities related to concepts included on the multiple-choice exams. Past research demonstrates that brief essay quizzes may improve performance on a variety of exam formats (e.g., short answer, essay, fill-in-the-blank, and multiple-choice). Padilla-Walker (2006) found that brief, extra-credit daily quizzes on assigned reading material predicted performance on major exams (short-answer and essay) better than did gender, self-reported college GPA, and self-reported ACT scores. Daniel and Broida (2004) also reported that completing in-class quizzes over course concepts boosted performance on course exams (multiple-choice and short answer). Narloch, Garbin, and Turnage (2006) showed that prelecture quizzes, compared to no quizzes, produced better performance on both multiple-choice and essay exam items. Additionally, Leeming (2002) found that participating in daily 1015 min writing activities on course concepts significantly improved performance on a comprehensive final exam that included short essay, fill-in-the-blank, and multiple-choice questions. Similarly, Turner et al. (2006) demonstrated that students required to complete a daily in-class writing activity performed better on the course's multiple-choice exams than students without daily writing. Although some research supports using daily essay quizzes to boost major exam performance, maximizing the impact of these quizzes is not without logistical challenges. 
For example, grading the quizzes could be labor intensive for the instructor. The writing activities may only consume a small percentage of class time, but the time required to grade and record the scores may detract from instructor time needed to organize and prepare for class. Leeming's (2002) daily quizzes required about 10 to 20% of class time and an hour of instructor grading time each day. Hautau et al. (2006a) and Turner et al. (2006) reduced instructor time for grading quizzes by grading quizzes only on randomly selected days rather than on all days. The quizzes took 6 to 7% of total class time and required about 1 min per student for instructor grading. Given that students in these studies did not know what day's quizzes would be randomly selected for grading, the researchers expected the quizzes to have much the same impact on student performance as would daily grading and crediting of quizzes. However, the results proved mixed regarding this expectation, creating additional questions as to how best to maximize student performance on writing quizzes and multiple-choice exams without requiring an inordinate amount of instructor time. Recent attempts to clarify the conditions under which daily writing could efficiently promote writing and exam scores have been reported by Hautau et al. (2006a; 2006b). Hautau et al. (2006a) required students to analyze in writing pairs of concepts that could be found within instructor notes available to students. The class website identified two to five pairs of concepts each day that students could be asked to address on the next quiz. At the following class period, the instructor randomly selected one of those pairs and instructed students to identify the concepts' commonalities, differences, and the effect of one on the other. When the final writing activity had been completed for each unit, a student randomly chose one day's writing activities to count for course credit. …\",\"PeriodicalId\":88717,\"journal\":{\"name\":\"The behavior analyst today\",\"volume\":\"9 1\",\"pages\":\"184-195\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2008-06-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"7\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"The behavior analyst today\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1037/H0100658\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"The behavior analyst today","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1037/H0100658","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 7

Abstract

Most undergraduates likely take some courses that use multiple-choice exams as a major source of course credit. Instructors may be especially inclined to use multiple-choice exams in large courses because of the ease and efficiency of grading (Hautau et al., 2006b). Nonetheless, many students report difficulty in taking multiple-choice exams, claiming that they could do better on essay exams. Consequently, discovering how to heighten student performance on multiple-choice exams is an important issue at the college level (Wallace & Williams, 2003). One possibility for improving multiple-choice exam scores may be the use of daily writing activities related to concepts included on the multiple-choice exams.

Past research demonstrates that brief essay quizzes may improve performance on a variety of exam formats (e.g., short answer, essay, fill-in-the-blank, and multiple-choice). Padilla-Walker (2006) found that brief, extra-credit daily quizzes on assigned reading material predicted performance on major exams (short-answer and essay) better than did gender, self-reported college GPA, and self-reported ACT scores. Daniel and Broida (2004) also reported that completing in-class quizzes on course concepts boosted performance on course exams (multiple-choice and short answer). Narloch, Garbin, and Turnage (2006) showed that prelecture quizzes, compared to no quizzes, produced better performance on both multiple-choice and essay exam items. Additionally, Leeming (2002) found that participating in daily 10-15 min writing activities on course concepts significantly improved performance on a comprehensive final exam that included short essay, fill-in-the-blank, and multiple-choice questions. Similarly, Turner et al. (2006) demonstrated that students required to complete a daily in-class writing activity performed better on the course's multiple-choice exams than students without daily writing.

Although some research supports using daily essay quizzes to boost major exam performance, maximizing the impact of these quizzes is not without logistical challenges. For example, grading the quizzes could be labor intensive for the instructor. The writing activities may consume only a small percentage of class time, but the time required to grade and record the scores may detract from the instructor time needed to organize and prepare for class. Leeming's (2002) daily quizzes required about 10 to 20% of class time and an hour of instructor grading time each day. Hautau et al. (2006a) and Turner et al. (2006) reduced instructor grading time by grading quizzes only on randomly selected days rather than on all days. Their quizzes took 6 to 7% of total class time and required about 1 min per student for instructor grading. Given that students in these studies did not know which days' quizzes would be randomly selected for grading, the researchers expected the quizzes to have much the same impact on student performance as would daily grading and crediting of quizzes. However, the results proved mixed regarding this expectation, creating additional questions as to how best to maximize student performance on writing quizzes and multiple-choice exams without requiring an inordinate amount of instructor time.

Recent attempts to clarify the conditions under which daily writing could efficiently promote writing quality and exam scores have been reported by Hautau et al. (2006a, 2006b). Hautau et al. (2006a) required students to analyze in writing pairs of concepts found in the instructor notes available to students. The class website identified two to five pairs of concepts each day that students could be asked to address on the next quiz. In the following class period, the instructor randomly selected one of those pairs and instructed students to identify the concepts' commonalities, differences, and the effect of one on the other. When the final writing activity had been completed for each unit, a student randomly chose one day's writing activities to count for course credit. …