We present a mixed-methods comparative assessment of three online discussion tools widely used in higher education. We combine different data types (quantitative, qualitative) and sources (usage data, survey responses) to assess Piazza, Padlet, and Blackboard discussion boards. We highlight and discuss the differences between the tools and their relative merits, and report the tool preferences of students on a large introductory Economics unit. We summarise lessons learnt from using the tools and make recommendations to engender greater student engagement with online discussion spaces.
A large literature has shown that novices and experts across the STEM disciplines differ markedly in how they approach and solve problems. Many STEM education scholars find that giving students scaffolded exercises in which they work with a visualization tool can be highly effective in teaching novices to think more like experts. Using model visualization software developed for EconGraphs, we have created three custom collections of interactive exercises for use in microeconomics courses at three institutions, two in the US and one in the UK. Surveys of the students (n = 71, 167, and 57, respectively, at the three institutions) indicate that students found the new exercises helpful. Additional analysis at two of the institutions suggests that the exercises were more likely to be valued by students with weaker math skills, students with weaker introductory microeconomics skills, and, in some cases, students who agreed that economics was interesting and applicable to their daily lives. The contributions of this paper are to illustrate how the free content from EconGraphs and the associated authoring tools may be used to create exercises that enhance the teaching and learning experience at a range of institutions; to provide the exercises themselves as a public good; and to invite further innovation and investigation in this area.
We investigate the impact of repeated contextual nudging on task completion and performance within the setting of traditional face-to-face education, adding to a literature which has focused on non-contextual nudges, provided contextual nudges in online learning settings, or used one-time nudges. We undertake a clustered randomised controlled trial on first-year Business School students at Heriot-Watt University, employing weekly nudges during tutorials to encourage students to establish learning intentions for the upcoming week. We rule out significant treatment effects from the nudging intervention on tutorial attendance, the most important predictor of final marks among the intermediate outcomes that we consider. We are unable to make causal statements on the effects of the intervention on final marks. Nonetheless, this experiment yields valuable insights into the dynamics of repeated contextual nudging within traditional educational settings as well as shedding light on aspects of student behaviour.
The economics literature provides mixed results on the effect of online classroom technology on student outcomes. An emerging body of behavioral studies suggests that videoconferencing is a cognitively exhausting activity and that the camera, a salient feature of virtual meetings, is a leading cause of this fatigue, with a more pronounced effect on women. Thus, economists and behavioral scientists debate whether online education and its technology provide an effective method of instruction. We use a field experimental design to examine how camera use in online synchronous economics classes affects learning outcomes. We find that changing the method of student engagement from turning the camera off to turning it on increases quiz grades by 0.18 standard deviations, controlling for students' gender and aptitude.
We investigate the impact of infographics on student learning in principles of economics classes. Infographics display facts, data, and information about a specific topic in a clear, easy-to-understand manner, using graphics and pictures to help summarize, explain, and display information. Creating an infographic can be a fun, engaging, and powerful educational exercise for learning new topics in both introductory and advanced courses. We use a randomized classroom experiment to evaluate the effect of creating infographics on student learning in principles of microeconomics courses, as measured by exam performance. The randomization across topics enables a student-fixed-effects estimation of the effect of infographics. We find students were 2.1–2.2 percentage points more likely to correctly answer multiple-choice questions on topics for which they completed an infographic relative to topics for which they did not. We provide suggestive evidence that infographics may help higher-ability students most, increasing their scores on both multiple-choice and short-answer questions.