Every college student registers for courses from a catalog of numerous offerings each term. Selecting the courses in which to enroll, and in what combinations, can dramatically impact each student's chances for academic success. Taking inspiration from the STEM Academy, we wanted to identify the characteristics of engineering students who graduate with a grade point average of 3.0 or above. The overall goal of the Customized Course Advising project is to determine the optimal term-by-term course selections for all engineering students based on their incoming characteristics and previous course history and performance, paying particular attention to concurrent enrollment. We found that ACT Math, SAT Math, and Advanced Placement exam scores can be effective measures of students' academic preparation. We also found that some concurrent course-enrollment patterns are highly predictive of first-term and overall academic success.
Sungjin Nam, Steven Lonn, Thomas Brown, Cinda-Sue Davis, and Darryl Koch. "Customized course advising: investigating engineering student success with incoming profiles and patterns of concurrent course enrollment." Proceedings of the Fourth International Conference on Learning Analytics And Knowledge, 2014. DOI: 10.1145/2567574.2567589
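The concurrent-enrollment analysis described above amounts to counting how often course pairs are taken in the same term. A minimal sketch of that pair counting, using hypothetical enrollment records and course codes (not data from the study):

```python
from collections import defaultdict
from itertools import combinations

def concurrent_pairs(enrollments):
    """Count how often each pair of courses is taken in the same term.

    `enrollments` maps (student, term) -> list of course codes.
    """
    counts = defaultdict(int)
    for courses in enrollments.values():
        for a, b in combinations(sorted(set(courses)), 2):
            counts[(a, b)] += 1
    return dict(counts)

# Hypothetical enrollment records; codes are illustrative only.
enrollments = {
    ("s1", "F13"): ["MATH115", "CHEM130", "ENGR101"],
    ("s2", "F13"): ["MATH115", "CHEM130"],
    ("s3", "F13"): ["MATH115", "ENGR101"],
}
pairs = concurrent_pairs(enrollments)
print(pairs[("CHEM130", "MATH115")])  # 2
```

Pair counts like these could then be correlated with outcomes such as first-term GPA to find the predictive patterns the abstract mentions.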
One of the most crucial distinctions between Chinese and Western languages is that the former is based on ideograms, whereas the latter are based on phonograms. Due to this distinction, Western learners of Chinese often experience more difficulty than native Chinese speakers in grasping correct character stroke sequence and/or stroke direction. In this paper, we designed a HanZi writing environment with automatic feedback to address this issue. Before collecting HanZi characters on a massive scale, we conducted a pilot study to collect handwritten Chinese samples from 160 college students in the U.S. The findings from this study enabled us to further refine the learning environment and design optimal learning and teaching strategies for learners and teachers.
Chin-Hwa Kuo, Jian-Wen Peng, and Wen-Chen Chang. "Hanzi handwriting acquisition with automatic feedback." Proceedings of the Fourth International Conference on Learning Analytics And Knowledge, 2014. DOI: 10.1145/2567574.2567575
This paper presents an analysis of resource access patterns in a recently conducted master's-level university course. The course was distinctive in that it followed a new teaching approach, providing additional learning resources such as wikis, self-tests, and videos. To gain deeper insight into the usage of the provided learning material, we built dynamic bipartite student-resource networks based on event logs of resource access. These networks are analysed using methods adapted from social network analysis. In particular, we uncover bipartite clusters of students and resources in those networks and propose a method to identify patterns and traces of their evolution over time.
Tobias Hecking, Sabrina Ziebarth, and H. Hoppe. "Analysis of dynamic resource access patterns in a blended learning course." Proceedings of the Fourth International Conference on Learning Analytics And Knowledge, 2014. DOI: 10.1145/2567574.2567584
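A bipartite student-resource network of the kind described above can be built directly from access event logs as two adjacency maps. A minimal sketch with invented event data (the names are illustrative, not from the course studied):

```python
from collections import defaultdict

def bipartite_network(event_log):
    """Build the two sides of a bipartite access network.

    `event_log` is an iterable of (student, resource) access events.
    Returns student->resources and resource->students adjacency maps.
    """
    by_student = defaultdict(set)
    by_resource = defaultdict(set)
    for student, resource in event_log:
        by_student[student].add(resource)
        by_resource[resource].add(student)
    return by_student, by_resource

# Hypothetical access events.
log = [("alice", "wiki"), ("alice", "video1"),
       ("bob", "wiki"), ("carol", "selftest")]
students, resources = bipartite_network(log)
print(sorted(resources["wiki"]))  # ['alice', 'bob']
```

Clustering methods from social network analysis can then be applied to these adjacency maps, and the networks rebuilt per time window to trace their evolution.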
While the landscape of learning analytics is relatively well defined, the extent to which institutions are ready to embark on an analytics implementation is less known. Further, while work has been done on measuring the maturity of an institution's implementation, this work fails to investigate how an institution that has not yet implemented analytics might become mature over time. To that end, the authors developed and piloted a survey, the Learning Analytics Readiness Instrument (LARI), in an attempt to help institutions prepare themselves for a successful analytics implementation. The LARI comprises 90 items encompassing five factors related to a learning analytics implementation: (1) Ability, (2) Data, (3) Culture and Process, (4) Governance and Infrastructure, and (5) Overall Readiness Perception. Each of the five factors has high internal consistency, as does the overall tool. This paper discusses the need for a survey such as the LARI, the tool's psychometric properties, the authors' broad interpretations of the findings, and next steps for the LARI and the research in this field.
Kimberly E. Arnold, Steven Lonn, and M. Pistilli. "An exercise in institutional reflection: the learning analytics readiness instrument (LARI)." Proceedings of the Fourth International Conference on Learning Analytics And Knowledge, 2014. DOI: 10.1145/2567574.2567621
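The internal consistency reported for the LARI's factors is conventionally quantified with Cronbach's alpha. A minimal sketch of the computation, using invented item responses (not LARI data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns (one list per item)."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # Each respondent's total score across the k items.
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Illustrative responses: 3 items, 4 respondents, Likert-style scores.
items = [[3, 4, 5, 2], [3, 5, 4, 2], [4, 4, 5, 3]]
print(round(cronbach_alpha(items), 3))  # 0.9
```

Values above roughly 0.7 are usually read as acceptable internal consistency, which is the kind of threshold a claim like "high internal consistency" implies.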
One of the key promises of Learning Analytics research is to create tools that help educational institutions gain better insight into the inner workings of their programs in order to tune or correct them. This work presents a set of simple techniques that, applied to readily available historical academic data, can provide such insights. The techniques described are real course difficulty estimation, dependence estimation, curriculum coherence, dropout paths, and load/performance graphs. The description of these techniques is accompanied by their application to real academic data from a Computer Science program. The results of the analysis are used to derive recommendations for curriculum re-design.
G. Méndez, X. Ochoa, and K. Chiluiza. "Techniques for data-driven curriculum analysis." Proceedings of the Fourth International Conference on Learning Analytics And Knowledge, 2014. DOI: 10.1145/2567574.2567591
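One common way to estimate "real" course difficulty from historical records is to compare each student's grade in a course against that student's overall average, so that grading-scale differences between students cancel out. A sketch of that idea under this assumption (the records and course codes are invented, and this is not necessarily the exact estimator used in the paper):

```python
from collections import defaultdict

def course_difficulty(records):
    """Estimate difficulty as the mean gap between a student's overall GPA
    and the grade they earned in the course (positive = harder than average).

    `records` is a list of (student, course, grade) tuples on a 4.0 scale.
    """
    by_student = defaultdict(list)
    for s, _, g in records:
        by_student[s].append(g)
    gpa = {s: sum(gs) / len(gs) for s, gs in by_student.items()}

    gaps = defaultdict(list)
    for s, c, g in records:
        gaps[c].append(gpa[s] - g)
    return {c: sum(v) / len(v) for c, v in gaps.items()}

# Hypothetical transcript records.
records = [("s1", "CS101", 3.0), ("s1", "CS201", 2.0),
           ("s2", "CS101", 4.0), ("s2", "CS201", 3.0)]
print(course_difficulty(records))  # {'CS101': -0.5, 'CS201': 0.5}
```

Here CS201 comes out 0.5 grade points harder than each student's own average, while CS101 comes out 0.5 points easier.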
In this paper, we explain a portion of the design research process that we used to develop the learning analytics for a manipulative-based fractions intervention program. In particular, we highlight a set of qualitative interviews that we conducted with individual students after a short study in which students in three classes at the same school learned to use virtual manipulatives to compare pairs of proper fractions and order groups of three proper fractions. These qualitative interviews provided considerable information that helped us improve the interactions students have with the virtual manipulatives and produce more sophisticated and informative analytics. We emphasize the importance of using mixed methods during the iterative cycles of development that define design research.
M. Mendiburo, Brian Sulcer, and T. S. Hasselbring. "Interaction design for improved analytics." Proceedings of the Fourth International Conference on Learning Analytics And Knowledge, 2014. DOI: 10.1145/2567574.2567628
This article addresses a relatively unexplored area in the emerging field of learning analytics: the design of learning analytics interventions. A learning analytics intervention is defined as the surrounding frame of activity through which analytic tools, data, and reports are taken up and used. It is a soft technology that involves the orchestration of the human process of engaging with the analytics as part of the larger teaching and learning activity. This paper first makes the case for the overall importance of intervention design, situating it within the larger landscape of the learning analytics field, and then considers the specific issues of intervention design for student use of learning analytics. Four principles of pedagogical learning analytics intervention design that can be used by teachers and course developers to support the productive use of learning analytics by students are introduced: Integration, Agency, Reference Frame, and Dialogue. In addition, three core processes in which to engage students are described: Grounding, Goal-Setting, and Reflection. These principles and processes are united in a preliminary model of pedagogical learning analytics intervention design for students, presented as a starting point for further inquiry.
A. Wise. "Designing pedagogical interventions to support student use of learning analytics." Proceedings of the Fourth International Conference on Learning Analytics And Knowledge, 2014. DOI: 10.1145/2567574.2567588
This case study describes how course features and individual and social learning analytics were scaled up to support "participatory" learning. An existing online course was turned into a "big open online course" (BOOC) offered to hundreds of students. Compared to typical open courses, relatively high levels of persistence, individual and social engagement, and achievement were obtained. These results suggest that innovative learning analytics might best be scaled (a) incrementally, (b) using design-based research methods, (c) focusing on engagement with consequential and contextual knowledge, and (d) using emerging situative assessment theories.
D. Hickey, Tara Alana Kelley, and Xinyi Shen. "Small to big before massive: scaling up participatory learning analytics." Proceedings of the Fourth International Conference on Learning Analytics And Knowledge, 2014. DOI: 10.1145/2567574.2567626
This paper synthesizes some of the technical decisions, design strategies, and concepts developed during the execution of the Open Academic Analytics Initiative (OAAI), a research program aimed at improving student retention rates in colleges by deploying an open-source academic early-alert system to identify students at academic risk. The paper presents a prototype demonstration of the system, detailing several dimensions of data mining and analysis: data integration, predictive modelling, and scoring with reporting. The paper should be relevant to practitioners and academics who want to better understand the implementation of the OAAI academic early-alert system.
Sandeep M. Jayaprakash and E. Lauría. "Open academic early alert system: technical demonstration." Proceedings of the Fourth International Conference on Learning Analytics And Knowledge, 2014. DOI: 10.1145/2567574.2567578
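The scoring step of an early-alert pipeline turns integrated student data into a risk value that triggers an alert. The toy rule below is only a hedged illustration of that idea; OAAI's actual system used trained predictive models, and the weights, thresholds, and feature names here are invented:

```python
def risk_score(current_grade, logins_per_week,
               cohort_grade=3.0, cohort_logins=5.0):
    """Toy early-alert score: weighted shortfall versus cohort averages.

    Returns 0.0 for students at or above both cohort averages; larger
    values indicate greater risk. All parameters are illustrative.
    """
    grade_gap = max(0.0, cohort_grade - current_grade) / cohort_grade
    login_gap = max(0.0, cohort_logins - logins_per_week) / cohort_logins
    return 0.7 * grade_gap + 0.3 * login_gap

# A struggling, disengaged student scores higher than a thriving one.
print(risk_score(2.0, 1.0) > risk_score(3.5, 6.0))  # True
```

In a real deployment the score would feed a reporting layer that flags students above a calibrated threshold for advisor outreach.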
Peer assessment is seen as a powerful tool for achieving scalability in the evaluation of complex assignments in large courses, possibly virtual ones, as in the context of massive open online courses (MOOCs). However, the adoption of peer assessment is slow, due in part to the lack of ready-to-use systems. Furthermore, the validity of peer assessment is still under discussion. In this paper, to tackle some of these issues, we present a proof-of-concept extension of Graasp, a social media platform, for setting up a peer assessment activity. We then report a case study of peer assessment using Graasp in a social media course with 60 master's-level university students and analyze the level of agreement between students and instructors in the evaluation of short individual reports. Finally, to see whether instructor and student evaluations were based on the appearance of project reports rather than on their content, we conducted a study with 40 children who rated the reports solely on their look. Our results show that, unlike the children's ratings, which agreed poorly with the instructors', student assessment is reliable: the level of agreement between instructors and students was high.
A. Vozniuk, A. Holzer, and D. Gillet. "Peer assessment based on ratings in a social media course." Proceedings of the Fourth International Conference on Learning Analytics And Knowledge, 2014. DOI: 10.1145/2567574.2567608