Title: I know what you coded last summer
Authors: Lucas Mendonça de Souza, I. M. Félix, B. M. Ferreira, A. Brandão, L. O. Brandão
Venue: Anais do XXXII Simpósio Brasileiro de Informática na Educação (SBIE 2021)
DOI: 10.5753/sbie.2021.218673 (https://doi.org/10.5753/sbie.2021.218673)
Published: 2021-11-22
Abstract: The outbreak of the COVID-19 pandemic caused a surge in enrollments in online courses. This boost in student numbers affected teachers' ability to evaluate exercises and resolve doubts. In this context, tools that automatically evaluate and provide feedback on code solutions can be used in programming courses to reduce teachers' workload. Nonetheless, even with such tools, the literature shows that learning how to program is a challenging task. Programming is complex, and the programming language employed can also affect students' outcomes. Thus, well-designed exercises can reduce students' difficulty in identifying the problem and help mitigate syntax challenges. This research applies learning analytics processes to the interaction logs and code solutions of automatic evaluation tools in order to find metrics capable of identifying problematic exercises and their difficulty. In this context, an exercise is considered problematic if students have trouble interpreting its description or if its solution requires complex programming structures such as loops, conditionals, and recursion. The data comes from online introductory programming courses. Results show that the computed metrics can identify problematic exercises, as well as those that students find challenging.
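To make the idea concrete, below is a minimal Python sketch of the kind of metrics such a pipeline might compute. It is an illustration under stated assumptions, not the paper's actual method: the log schema (one (exercise_id, student_id) pair per submission) and both metric definitions (an AST-based count of loops, conditionals, and recursion in a solution, and mean submissions per student per exercise) are invented for this example.

```python
import ast
from collections import defaultdict

def structural_complexity(source: str) -> int:
    """Score a solution by counting loops, conditionals, and direct recursion,
    the structures the abstract names as markers of a problematic exercise."""
    tree = ast.parse(source)
    score = 0
    for node in ast.walk(tree):
        if isinstance(node, (ast.For, ast.While, ast.If)):
            score += 1
        elif isinstance(node, ast.FunctionDef):
            # Direct recursion: the function body calls the function by name.
            if any(isinstance(c, ast.Call) and isinstance(c.func, ast.Name)
                   and c.func.id == node.name for c in ast.walk(node)):
                score += 1
    return score

def mean_attempts(log):
    """Average submissions per student for each exercise, computed from
    interaction logs. The log format is an assumption: an iterable of
    (exercise_id, student_id) pairs, one per submission. A high mean is one
    plausible signal that an exercise is challenging."""
    submissions = defaultdict(int)
    students = defaultdict(set)
    for exercise_id, student_id in log:
        submissions[exercise_id] += 1
        students[exercise_id].add(student_id)
    return {ex: submissions[ex] / len(students[ex]) for ex in submissions}

if __name__ == "__main__":
    solution = (
        "def fact(n):\n"
        "    if n <= 1:\n"
        "        return 1\n"
        "    return n * fact(n - 1)\n"
    )
    print(structural_complexity(solution))  # 2: one conditional + recursion
    print(mean_attempts([("e1", "s1"), ("e1", "s1"), ("e1", "s2")]))  # {'e1': 1.5}
```

Under these assumptions, an exercise scoring high on the structural metric matches the abstract's notion of a solution requiring complex structures, while a high attempts-per-student ratio from the logs would flag an exercise that students find difficult; the paper's real metrics may differ.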