Ben Motz

  • bmotz@indiana.edu
  • Psychology Building A200B
  • (812) 855-0318
  • Home Website
  • Research Scientist
    Psychological and Brain Sciences
  • Faculty Fellow for Academic Analytics
    UITS

Education

  • Ph.D., Cognitive Science, Indiana University Bloomington, 2018
  • M.S., Cognitive Science, University of California San Diego, 2005
  • B.S., Cognitive Science with minor in Psychology, Indiana University, 2002

Representative publications

ManyClasses 1: Assessing the generalizable effect of immediate versus delayed feedback across many college classes (2019)
Emily Fyfe, Joshua de Leeuw, Paulo Carvalho, Robert Goldstone and Benjamin Motz
PsyArXiv.

Psychology researchers have long attempted to identify educational practices that improve student learning. However, experimental research on these practices is often conducted in laboratory contexts or in a single class, threatening the external validity of the results. In this paper, we establish an experimental paradigm for evaluating the benefits of recommended practices across a variety of authentic educational contexts, a model we call ManyClasses. The core feature is that researchers examine the same research question and measure the same experimental effect across many classes spanning a range of topics, institutions, teacher implementations, and student populations. We report the first ManyClasses study, which examined how the timing of feedback on class assignments, either immediate or delayed by a few days, affected subsequent performance on class assessments. Across XX classes, [summarize effect of feedback timing, including key moderators]. More broadly, these findings provide evidence regarding the feasibility of conducting within-class randomized experiments across a range of naturally occurring learning environments.

Self‐regulated studying behavior, and the social norms that influence it (2018)
Julie R Eyink, Benjamin A Motz, Gordon Heltzel and Torrin M Liddell
Journal of Applied Social Psychology.

Teachers commonly use injunctive norms when telling students what they should be doing. But researchers find that sometimes descriptive norms, information about what others are actually doing, are more powerful influencers of behavior. In the present work, we examine which norm is more effective at increasing self‐regulated studying and performance in an online college course across two semesters. To do this, we randomly assigned 751 undergraduate Introductory Psychology students to receive email messages at the start of every content unit that either contained descriptive norms, injunctive norms, information about the course, or a no message control. We found that injunctive norms increased study behaviors aimed at fulfilling course requirements (completion of assigned activities), but did not improve learning outcomes. Descriptive norms increased behaviors aimed at improving knowledge (ungraded …

Embedding Experiments: Staking Causal Inference in Authentic Educational Contexts (2018)
Benjamin A Motz, Paulo F Carvalho, Joshua R de Leeuw and Robert L Goldstone
Journal of Learning Analytics, 5 (2), 47-59

To identify the ways teachers and educational systems can improve learning, researchers need to make causal inferences. Analyses of existing datasets play an important role in detecting causal patterns, but conducting experiments also plays an indispensable role in this research. In this article, we advocate for experiments to be embedded in real educational contexts, allowing researchers to test whether interventions such as a learning activity, new technology, or advising strategy elicit reliable improvements in authentic student behaviours and educational outcomes. Embedded experiments, wherein theoretically relevant variables are systematically manipulated in real learning contexts, carry strong benefits for making causal inferences, particularly when allied with the data-rich resources of contemporary e-learning environments. Toward this goal, we offer a field guide to embedded experimentation, reviewing experimental design choices, addressing ethical concerns, discussing the importance of involving teachers, and reviewing how interventions can be deployed in a variety of contexts, at a range of scales. Causal inference is a critical component of a field that aims to improve student learning; including experimentation alongside analyses of existing data in learning analytics is the most compelling way to test causal claims.

A dissociation between engagement and learning: Enthusiastic instructions fail to reliably improve performance on a memory task (2017)
Benjamin A Motz, Joshua R. de Leeuw, Paulo F. Carvalho, Kaley L. Liang and Robert L. Goldstone
PLoS ONE, 12 (7), e0181775

Despite widespread assertions that enthusiasm is an important quality of effective teaching, empirical research on the effect of enthusiasm on learning and memory is mixed and largely inconclusive. To help resolve these inconsistencies, we conducted a carefully-controlled laboratory experiment, investigating whether enthusiastic instructions for a memory task would improve recall accuracy. Scripted videos, either enthusiastic or neutral, were used to manipulate the delivery of task instructions. We also manipulated the sequence of learning items, replicating the spacing effect, a known cognitive technique for memory improvement. Although spaced study reliably improved test performance, we found no reliable effect of enthusiasm on memory performance across two experiments. We did, however, find that enthusiastic instructions caused participants to respond to more item prompts, leaving fewer test questions blank, an outcome typically associated with increased task motivation. We find no support for the popular claim that enthusiastic instruction will improve learning, although it may still improve engagement. This dissociation between motivation and learning is discussed, as well as its implications for education and future research on student learning.

An in vivo study of self-regulated study sequencing in introductory psychology courses (2016)
Paulo F Carvalho, David W Braithwaite, Joshua R de Leeuw, Benjamin A Motz and Robert L Goldstone
PLoS ONE, 11 (3), e0152115

Study sequence can have a profound influence on learning. In this study we investigated how students decide to sequence their study in a naturalistic context and whether their choices result in improved learning. In the study reported here, 2061 undergraduate students enrolled in an Introductory Psychology course completed an online homework tutorial on measures of central tendency, a topic relevant to an exam that counted towards their grades. One group of students was enabled to choose their own study sequence during the tutorial (Self-Regulated group), while the other group of students studied the same materials in sequences chosen by other students (Yoked group). Students who chose their sequence of study showed a clear tendency to block their study by concept, and this tendency was positively associated with subsequent exam performance. In the Yoked group, study sequence had no effect on exam performance. These results suggest that despite findings that blocked study is maladaptive when assigned by an experimenter, it may actually be adaptive when chosen by the learner in a naturalistic context.

The Cognitive Costs of Context: The Effects of Concreteness and Immersiveness in Instructional Examples (2015)
Samuel Day, Benjamin Motz and Robert Goldstone
Frontiers in Psychology, 6 (1876), 1-13

Prior research has established that while the use of concrete, familiar examples can provide many important benefits for learning, it is also associated with some serious disadvantages, particularly in learners’ ability to recognize and transfer their knowledge to new analogous situations. However, it is not immediately clear whether this pattern would hold in real world educational contexts, in which the role of such examples in student engagement and ease of processing might be of enough importance to overshadow any potential negative impact. We conducted two experiments in which curriculum-relevant material was presented in natural classroom environments, first with college undergraduates and then with middle-school students. All students in each study received the same relevant content, but the degree of contextualization in these materials was varied between students. In both studies, we found that greater contextualization was associated with poorer transfer performance. We interpret these results as reflecting a greater degree of embeddedness for the knowledge acquired from richer, more concrete materials, such that the underlying principles are represented in a less abstract and generalizable form.

Know thy students: Providing aggregate student data to instructors (2015)
Benjamin A Motz, Julie Anne Teague and Linda Shepard
EDUCAUSE Review Online.

Institutions of higher education aggregate and warehouse many terabytes of student data, aiding routine functions including degree conferral, financial aid eligibility, online course management, and more. In recent years, scholars of teaching and learning have begun exploring new ways to utilize these "big data" resources, extracting information that might directly contribute to the institution's educational mission. These nascent methodologies have come to be termed learning analytics. For the most part, efforts in learning analytics have focused primarily on the development of predictive safety nets, empirically derived early warning systems that deploy flags and interventions for students who are underperforming or otherwise "at risk." Nonetheless, some believe that learning analytics has largely untapped potential, with far more value to be gained from mining student data, beyond providing diagnostic tools. Accordingly, this article describes a novel use of institutional data resources in higher education. Rather than providing information about students at risk, we aimed to develop a system that would broadly inform instructors about all their enrolled students, providing summarized institutional data about the aggregate characteristics of the students enrolled in their respective classes. We call it the Student Profile Report (SPR), a short document that summarizes student records, intended to provide a useful snapshot of information about the population of students taking a course prior to the start of a semester. We …
