Joseph B. Herzog, P. Herzog, Paul G. Talaga, Christopher M. Stanley, G. Ricco
DOI: 10.1109/FIE43999.2019.9028548
Published in: 2019 IEEE Frontiers in Education Conference (FIE), October 2019, pp. 1-5
Providing Insight into the Relationship Between Constructed Response Questions and Multiple Choice Questions in Introduction to Computer Programming Courses
This Research-to-Practice Work in Progress (WIP) investigates the format of student assessment questions. In particular, the focus is on the relationship between student performance on open-ended, constructed-response questions (CRQs) and closed-ended, multiple-choice questions (MCQs) in first-year introductory programming courses. We introduce a study to evaluate whether these different response formats return distinct or comparable results. To assess this, we compare and correlate student scores on each question type. Our focus is on assessments (exams and tests) in first-year classes. The paper investigates two first-year programming courses with a total of seven sections and approximately 180 combined students. The subject of the sequential set of courses is the procedural C programming language. Building on extant studies comparing student performance on MCQs with their performance on open-ended questions, we investigate whether MCQ scores predict CRQ scores. Preliminary results comparing student performance on the two question formats are presented to assess whether MCQs produce results similar to CRQs, or whether MCQs yield unique contributions. Possible avenues for future work are also discussed.
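The core analysis the abstract describes, correlating per-student MCQ and CRQ scores and checking whether one format predicts the other, can be sketched as below. This is a minimal illustration, not the authors' actual code; the score lists are hypothetical stand-ins for the study's real data.

```python
# Sketch of the analysis approach described in the abstract: compute the
# Pearson correlation between MCQ and CRQ scores, then fit a least-squares
# line predicting CRQ from MCQ. All scores are hypothetical examples.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def fit_line(xs, ys):
    """Least-squares slope and intercept for predicting ys from xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical per-student percentage scores on each question format.
mcq = [55, 62, 70, 74, 80, 85, 90]
crq = [48, 58, 65, 72, 75, 84, 88]

r = pearson_r(mcq, crq)
slope, intercept = fit_line(mcq, crq)
print(f"r = {r:.3f}; predicted CRQ = {slope:.2f} * MCQ + {intercept:.2f}")
```

A high correlation with a near-unit slope would suggest the two formats measure similar constructs; a weak fit would indicate CRQs capture something MCQs do not, which is the distinction the study sets out to test.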