Improving Student Peer Code Review Using Gamification
Theresia Devi Indriasari, Andrew Luxton-Reilly, Paul Denny
Proceedings of the 23rd Australasian Computing Education Conference, 2021. doi: 10.1145/3441636.3442308

Abstract: Peer code review has been shown to have several benefits for students, including the development of both technical skills and soft skills. However, a lack of motivation has been identified as one of the barriers to successful peer code review in programming courses. Low motivation may result in students avoiding or delaying their peer review tasks, reducing the potential benefits. In this study, gamification is used to overcome this barrier. We focus on motivating two behaviors: increasing the number of reviews submitted by students, and encouraging students to submit those reviews early. We conduct a randomized controlled study (N = 178) that compares the behavior of a control group engaged in peer code review using an online tool, with a gamification group that uses a modified version of the tool that includes targeted game elements. The results show a statistically significant difference in the number of submitted reviews between the control and gamification groups. Furthermore, the majority of students in the gamification group report that the game elements motivate them. Based on our findings, the game elements and game mechanics seem to be a promising method to motivate students in online peer code review activities.
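The core statistical claim above is a between-groups comparison of review counts. As an illustration only (the paper does not state which test it used, and the data below are invented), a two-sample permutation test for a difference in mean review counts can be sketched in Python:

```python
import random

def permutation_test(a, b, trials=10_000, seed=0):
    """Two-sided permutation test for a difference in means.

    Returns the estimated p-value: the fraction of random relabellings
    whose absolute mean difference is at least as large as the observed one.
    """
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = list(a) + list(b)
    n = len(a)
    hits = 0
    for _ in range(trials):
        rng.shuffle(pooled)
        diff = abs(sum(pooled[:n]) / n - sum(pooled[n:]) / len(b))
        if diff >= observed:
            hits += 1
    return hits / trials

# Invented example data: reviews submitted per student in each condition.
control = [2, 3, 1, 2, 2, 3, 1, 2]
gamified = [4, 5, 3, 4, 6, 4, 5, 3]
p = permutation_test(control, gamified)
```

A permutation test makes no distributional assumptions, which suits small, skewed count data like per-student review tallies.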
Progress Networks as a Tool for Analysing Student Programming Difficulties
Jessica McBroom, Benjamin Paassen, Bryn Jeffries, I. Koprinska, K. Yacef
Proceedings of the 23rd Australasian Computing Education Conference, 2021. doi: 10.1145/3441636.3442366

Abstract: The behavior of students during completion of a learning task can give crucial insights into typical misconceptions as well as issues with the task design. However, analysing the detailed trace of every individual student is time-consuming and infeasible for large-scale classes. In this paper, we propose progress networks as an analytical tool to make sense of student data and demonstrate the technique in large-scale online learning environments for computer programming. These networks, which are easily interpreted by teachers, summarise the progression of a student population through a learning task in a single diagram and, importantly, highlight locations where students fail to make progress. Using data from three different programming courses (N > 4000), we provide instructive examples of how to apply progress networks, including how to zoom in on areas of interest to identify reasons for student difficulty. In addition, we propose a simple technique for comparing progress networks across different cohorts of interest, for instance to analyse learning differences between older and younger students, and to investigate learning retention across tasks on the same programming concept. Finally, we discuss options to improve instructional design based on the insights from progress networks, and show that progress networks can also apply to smaller cohorts.
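The construction the authors describe can be illustrated with a toy version: nodes are task states, edges count observed transitions between states, and states that students enter but never leave flag likely difficulty spots. A minimal sketch (the state labels and log format here are invented, not the paper's):

```python
from collections import Counter

def build_progress_network(traces):
    """Aggregate per-student state sequences into edge counts.

    traces: list of state sequences, one per student.
    Returns a Counter mapping (from_state, to_state) -> transition count.
    """
    edges = Counter()
    for trace in traces:
        for src, dst in zip(trace, trace[1:]):
            edges[(src, dst)] += 1
    return edges

def stall_states(edges, terminal=frozenset()):
    """States reached by some students but never left, excluding terminal
    success states: candidate locations where students fail to progress."""
    sources = {src for src, _ in edges}
    targets = {dst for _, dst in edges}
    return targets - sources - terminal

# Invented traces: each student moves from 'start' through attempt states.
traces = [
    ["start", "loop_written", "solved"],
    ["start", "loop_written", "off_by_one"],
    ["start", "off_by_one"],
]
edges = build_progress_network(traces)
stuck = stall_states(edges, terminal={"solved"})  # states students never escape
```

In a real deployment the edge counts would be rendered as a weighted diagram for teachers, but the aggregation step is as simple as this.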
Rethinking CS0 to Improve Performance and Retention
Noura Albarakati, L. DiPippo, V. Wolfe
Proceedings of the 23rd Australasian Computing Education Conference, 2021. doi: 10.1145/3441636.3442314

Abstract: High failure and attrition rates in first-year, college-level computing courses are a major concern for institutions and instructors. For many years, computing instructors have devoted substantial time and energy to increasing retention in those courses. Despite that, computer science still faces problems with student recruitment and retention, especially for women and underrepresented minorities. Since students' backgrounds cannot be changed, instructors can instead influence many aspects of the student experience inside the class. This paper presents the results of strategic changes adopted in an introductory computer science course, including increasing in-class collaboration, diversifying the teaching assistant team, and changing lab placement. We compared performance, retention rate, sense of belonging, and pre-assessment quiz grades with students taking the same course the previous year. Our results show that when first-year students take the course with the new changes in place, retention rates and sense of belonging significantly increase, and students perform better. Women, in particular, show an increase in performance and retention rates; they in fact outperformed men in the updated class. However, the changes had unexpected effects on underrepresented minorities.
Overnight Feedback Reduces Late Submissions on Programming Projects in CS1
D. Bouvier, Ellie Lovellette, John Matta
Proceedings of the 23rd Australasian Computing Education Conference, 2021. doi: 10.1145/3441636.3442319

Abstract: Some novice programming students procrastinate starting programming assignments. These students also seem to struggle to know whether or not their program meets the requirements of the assignment. Both of these factors play a role in a student's success in a programming project, negatively affecting their grades and, more importantly, their learning. Finding ways to motivate students to start earlier could improve learning outcomes and course performance with little, if any, risk. This paper reports on the experience of giving students 'Overnight Feedback' with the goals of 1) motivating earlier project work, and 2) helping students know when they have, or have not, met project requirements. This report is more about the concept of providing feedback on a once-daily basis, overnight, and less about the specific tool used. In our experience, the Overnight Feedback technique seems to have achieved the goal of getting students to start earlier. Late submissions dropped from approximately 34% for a project without a feedback system to approximately 4% with Overnight Feedback. Also included in this report is a summary of students' perspectives on the Overnight Feedback tool we used, collected as responses to a survey.
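The headline numbers (roughly 34% late without feedback, 4% with it) are late-submission rates: the fraction of submissions timestamped after the deadline. The arithmetic is trivial to compute from a submission log; a sketch with invented timestamps:

```python
from datetime import datetime

def late_rate(submissions, deadline):
    """Fraction of submission timestamps strictly after the deadline."""
    late = sum(1 for t in submissions if t > deadline)
    return late / len(submissions)

deadline = datetime(2021, 2, 1, 23, 59)
# Invented data: three on-time submissions and one late one.
submissions = [
    datetime(2021, 1, 30, 14, 0),
    datetime(2021, 2, 1, 22, 30),
    datetime(2021, 1, 31, 9, 15),
    datetime(2021, 2, 2, 8, 0),
]
rate = late_rate(submissions, deadline)  # 0.25
```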
Lecture Recordings, Viewing Habits, and Performance in an Introductory Programming Course
Valerie Picardo, Paul Denny, Andrew Luxton-Reilly
Proceedings of the 23rd Australasian Computing Education Conference, 2021. doi: 10.1145/3441636.3442307

Abstract: Lecture recordings are a common and useful resource for students due to the flexibility they provide. Recent rapid shifts to online modes of teaching have made recorded lectures especially valuable as many students rely on them to access course content. As a result, there is a pressing need to understand how students consume lecture content via recordings and what patterns of access are effective. In this research, we investigate the common lecture recording viewing behaviors of students and the relationship between lecture recording viewing and academic performance in a first-year programming course. Several trends were observed. A significant positive correlation between lecture recording views and final grades was identified. Students who repeated the course after failing it once achieved, on average, higher grades if they had more lecture recording views in their second attempt. A small number of students engaged in habitual “binge-watching” behavior, and those students had lower grades, on average, than students who watched more regularly. This was a particular problem for repeating students attempting the course a second time, who failed the course at twice the rate if they engaged in habitual binge-watching. Although lecture recordings appear to be a helpful learning resource for students in introductory programming courses, this work suggests that binge-watching is an unproductive study strategy that should be discouraged.
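The reported association between recording views and final grades is a correlation. As a shape-of-the-computation illustration only (the data below are invented, and the paper does not specify which correlation coefficient it used), a pure-Python Pearson correlation:

```python
import math

def pearson(xs, ys):
    """Sample Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented data: lecture-recording views and final grades per student.
views = [3, 10, 25, 8, 40, 15]
grades = [52, 61, 78, 58, 85, 70]
r = pearson(views, grades)
```

Note that such a correlation is observational: it cannot distinguish whether watching drives grades or stronger students simply watch more.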
Helping students get IT: Investigating the longitudinal impacts of IT school outreach in Australia
A. Fletcher, Raina Mason, Graham Cooper
Proceedings of the 23rd Australasian Computing Education Conference, 2021. doi: 10.1145/3441636.3442312

Abstract: Australia is facing a shortfall in the number of appropriately skilled ICT workers needed for its future workforce. One avenue to address this issue is for universities and stakeholders to conduct initiatives that attract prospective students to ICT courses by building self-efficacy and positive attitudes towards ICT careers. Previous gender-focused interventions have reported immediate positive effects on female students' attitudes and confidence towards IT. There have been, however, no longer-term measures of the longitudinal changes in both male and female students' attitudes and self-efficacy towards IT from such interventions. This research investigated and compared the longitudinal effects of three different levels of IT interventions on primary-school-aged students' attitudes and self-efficacy towards computer programming and their interest in pursuing programming. Each intervention generated immediate positive effects, consistent with previous research. A major finding of this research was that these immediate positive effects fade out over the following two school terms; the effect is significant with respect to students' interest in pursuing programming. This is a novel finding revealed by the longitudinal nature of this research project. Interest and self-efficacy behaved differently over time: while self-efficacy also faded somewhat over the two terms, it remained above the initial baseline measures. Findings also indicate that the interventions may have different effects based on gender.
Is this Degree for Me? Exploring computing students' study decisions
Claudia Ott, S. Mills, N. Stanger, Senorita John, S. Zwanenburg
Proceedings of the 23rd Australasian Computing Education Conference, 2021. doi: 10.1145/3441636.3442310

Abstract: Academic course advice for computing students has changed and presents new challenges. It has become more complex due to an increasing number of specialised study pathways, and less personal as centralised enrolment systems have taken the place of academic advisors. To improve course advice practices for students in computing disciplines at our university, a series of interviews and an online questionnaire were conducted. We asked students where they seek information before and during their studies, what kind of information they value, and how confident they are with their study choices. We found that half of our students changed study subjects along the way and that students would like a higher-level view of career pathways to plan their studies. Students also highlighted that they value information provided by other students and that they prefer talking to academic staff members in person over online information. As a result of this study, we share recommendations on how to better support students when they make decisions about their study subjects.
Teachers' understanding of technical terms in a Computational Thinking curriculum
Bhagya Munasinghe, T. Bell, A. Robins
Proceedings of the 23rd Australasian Computing Education Conference, 2021. doi: 10.1145/3441636.3442311

Abstract: Teachers new to computing who are not familiar with technical “jargon” can feel like they have landed in a foreign world, making them reluctant to take on the subject and potentially leading to misconceptions and misunderstandings in the classroom. The diversity of technical words, metaphors, and phrases across contexts can make their meanings confusing, ambiguous, or misunderstood for the diverse user groups in computing education. Understanding the nature of the commonly claimed difficulties and confusion caused by computer jargon among teachers is therefore important for finding ways to address this issue. This paper presents the findings of an empirical study conducted to understand the nature of teachers' understanding of computational terms (jargon) related to Computational Thinking concepts, and how a relevant professional development intervention can help resolve issues related to them. The results indicate that difficulties with computational terms take one of three forms: the computational meaning is not known, the computational context is unclear, or the term's applicability is unclear. Teachers may find computer jargon difficult because the computational context in which terms are applied is hard to grasp, rather than because they do not know the terms' meanings in the first place. Moreover, appropriate support can enable teachers to learn the techniques and skills that the terminology refers to.
Leveling the playing field for international students in IT courses
Raina Mason, Carolyn Seton
Proceedings of the 23rd Australasian Computing Education Conference, 2021. doi: 10.1145/3441636.3442316

Abstract: Australia is a destination of choice for international students studying IT, who in 2019 comprised 62% of IT enrolments in universities [9]. Studying in English is often problematic for students from a Non-English Speaking Background (NESB): these students face barriers in reading instructional materials and in reading and writing code, while learning English at the same time as technical skills. In assessment, NESB students perform at lower levels than domestic students and, anecdotally, struggle with exams and tests, where cognitive resources are reduced as a result of stress. This paper builds on an earlier study in which the exam format was modified to remove extraneous cognitive load and keyword glossaries in Mandarin were given to Chinese students as part of a database course exam. The earlier study showed significant improvement in student performance, leading to draft guidelines for developing exams for international students. The current study used these guidelines to redesign an undergraduate database exam, provided students with a choice of keyword glossaries in several languages, and surveyed students immediately after the exam. The results of this intervention were statistically compared with another technical course for students at the same level in the same semester. There was a significant interaction effect, with international students in the database course performing better than would be expected without the intervention, and there was no longer a significant difference between international and domestic student performance in the exam. Results are discussed with respect to cognitive load and mental effort measures.
Programming assignments are a common form of assessment in introductory courses and often require substantial work to complete. Students must therefore plan and manage their time carefully, especially leading up to published deadlines. Although time management is an important metacognitive skill that students must develop, it is rarely taught explicitly. Prior research has explored various approaches for reducing procrastination and other unproductive behaviours in students, but these are often ineffective or impractical in large courses. In this work, we investigate a scalable intervention that incentivizes students to begin work early. We provide automatically generated feedback to students who submit their work-in-progress prior to two fixed deadlines scheduled earlier than the final deadline for the assignment. Although participation is voluntary, we find that many students welcome this early feedback and improve the quality of their work with each iteration. Especially for at-risk students, who have failed an earlier module in the course, engaging with the early feedback opportunities results in significantly better work at the time of final submission.
{"title":"Promoting Early Engagement with Programming Assignments Using Scheduled Automated Feedback","authors":"Paul Denny, Jacqueline L. Whalley, Juho Leinonen","doi":"10.1145/3441636.3442309","DOIUrl":"https://doi.org/10.1145/3441636.3442309","url":null,"abstract":"Programming assignments are a common form of assessment in introductory courses and often require substantial work to complete. Students must therefore plan and manage their time carefully, especially leading up to published deadlines. Although time management is an important metacognitive skill that students must develop, it is rarely taught explicitly. Prior research has explored various approaches for reducing procrastination and other unproductive behaviours in students, but these are often ineffective or impractical in large courses. In this work, we investigate a scalable intervention that incentivizes students to begin work early. We provide automatically generated feedback to students who submit their work-in-progress prior to two fixed deadlines scheduled earlier than the final deadline for the assignment. Although voluntary, we find that many students welcome this early feedback and improve the quality of their work across each iteration. 
Especially for at-risk students, who have failed an earlier module in the course, engaging with the early feedback opportunities results in significantly better work at the time of final submission.","PeriodicalId":334899,"journal":{"name":"Proceedings of the 23rd Australasian Computing Education Conference","volume":"66 3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-02-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116317272","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}