Problems Before Solutions: Automated Problem Clarification at Scale

S. Basu, A. Wu, Brian Hou, John DeNero
DOI: 10.1145/2724660.2724679
Published: 2015-03-14, Proceedings of the Second (2015) ACM Conference on Learning @ Scale
Citations: 21

Abstract

Automatic assessment reduces the need for individual feedback in massive courses, but often focuses only on scoring solutions, rather than assessing whether students correctly understand problems. We present an enriched approach to automatic assessment that explicitly assists students in understanding the detailed specification of technical problems that they are asked to solve, in addition to evaluating their solutions. Students are given a suite of solution test cases, but they must first unlock each test case by validating its behavior before they are allowed to apply it to their proposed solution. When provided with this automated feedback early in the problem-solving process, students ask fewer clarificatory questions and express less confusion about assessments. As a result, instructors spend less time explaining problems to students. In a 1300-person university course, we observed that the vast majority of students chose to validate their understanding of test cases before attempting to solve problems. These students reported that the validation process improved their understanding.
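The unlocking mechanism described above can be sketched in a few lines: each test case stays locked until the student correctly predicts its expected result, and only unlocked tests are applied to the proposed solution. This is a minimal illustrative sketch, not the authors' implementation; the names (`TestCase`, `unlock`, `run_suite`) and the `eval`-based test runner are assumptions for demonstration.

```python
class TestCase:
    """One solution test: an expression and its correct result."""

    def __init__(self, expression, expected):
        self.expression = expression  # e.g. "double(3)"
        self.expected = expected      # correct result, e.g. 6
        self.unlocked = False

    def unlock(self, prediction):
        """Student predicts the result; a correct prediction unlocks the test."""
        self.unlocked = (prediction == self.expected)
        return self.unlocked


def run_suite(tests, solution_env):
    """Apply only unlocked tests to the student's proposed solution."""
    results = []
    for t in tests:
        if not t.unlocked:
            # Locked tests give no feedback on the solution yet.
            results.append((t.expression, "locked"))
            continue
        actual = eval(t.expression, solution_env)
        results.append((t.expression, "pass" if actual == t.expected else "fail"))
    return results


# Validate understanding first, then test the proposed solution.
tests = [TestCase("double(3)", 6), TestCase("double(-2)", -4)]
tests[0].unlock(6)                      # correct prediction unlocks this test
env = {"double": lambda x: 2 * x}       # student's proposed solution
results = run_suite(tests, env)
# → [('double(3)', 'pass'), ('double(-2)', 'locked')]
```

The key design point from the paper is the ordering: the prediction step happens before any solution code is graded, so misunderstandings of the problem surface early rather than being mistaken for bugs.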