“Think like an expert”: Brain scans watch learning in action

Written by
Liz Fuller-Wright, Office of Communications
March 26, 2021

What does learning look like inside the brain?

Can a brain scan reveal if a student is learning a tough curriculum or falling behind?

These and other questions prompted a team of Princeton neuroscientists to launch an ambitious experiment, scanning 24 students’ brains six times during the 2018 spring semester to quite literally watch them learn.

“Our study constitutes — by far — the most thorough neural investigation of learning in a real-world college course,” said Meir Meshulam, a postdoctoral research associate at the Princeton Neuroscience Institute who is the lead author of a study appearing in the March 26 issue of Nature Communications.

Meshulam and his colleagues found that as the students progressed through an introductory computer science course, the curriculum material left detectable “neural fingerprints” in their brains that were shared — the researchers used the term “aligned” — with other students in the class as well as experts in the field. The more aligned the patterns were, the more likely the students were to do well.

This dovetailed with a series of joint studies from the labs of Uri Hasson, a professor of neuroscience and psychology at Princeton, and Ken Norman, Princeton’s Huo Professor in Computational and Theoretical Neuroscience and chair of the Department of Psychology. Hasson and Norman, the senior authors of the new study, had previously found that the neural patterns across different people are similar when people agree on the interpretation of a narrative (an episode of the BBC’s “Sherlock”). In the current study the researchers went one step further and asked, for the first time, how we develop shared understanding as we learn abstract academic material. With no plots to follow, would students’ brain patterns still show parallel structures as they learn new concepts?

Yes, it turns out.

“We found that students got ‘aligned’ with each other as they were watching the lecture videos,” said Norman. That tracked with what previous studies had found when people viewed “Sherlock” episodes.

“And then we found that the better a student is aligned with the rest of the class, the better they will do on the final exam,” Norman said.

“This just blows my mind, over and over,” said Hasson. “If we’re all in scanners and they show us a picture of a car, all our brains will show something similar because it’s a simple visual input. Okay, no big deal. But why would the way I’m representing an abstract mathematical concept in my brain be similar to the way you represent it in your brain? That’s something we did not expect to find. It still amazes me — still, today! — that it worked.”

Then the researchers went back and looked at their early data and realized that in the very first scan, taken after the second week of the course, alignment or lack thereof could predict how well students would do on the final exam.

“Teachers could use this to identify their struggling students early and try to help them,” said Meshulam. “You could say, ‘Let’s rethink this. Let me help you see how these ideas fit together.’”

Their study also reinforces the importance of group work and collaboration, Meshulam said. “I would definitely encourage students to work in groups, to talk to each other, to communicate as much as possible, because the class as a whole is a good thing to be aligned with.”

Jennifer Rexford, chair of Princeton’s Department of Computer Science and the Gordon Y.S. Wu Professor in Engineering, worked closely with the neuroscientists as they prepared their experiment.

“As educators, we are always searching for new and better ways to reach our students, and to evaluate how effective our teaching strategies might be,” she said. “This research project was an exciting opportunity to go right to the source — literally to the brains of students as they acquire new computer science knowledge — to gain fundamental insights into how they learn the material. My hope is that insights from these kinds of studies can give us much more rigorous ways to guide how to teach our students.”

A ‘conga line’ of research subjects

All of the subjects were undergraduates enrolled in Spring 2018’s “Introduction to Computer Science,” one of Princeton’s most popular courses. The research team scanned the students every few weeks through the semester. Each time they lay in the fMRI machines, the students watched video lectures from the flipped course. (In a “flipped course,” professors pre-record video lectures that students watch on their own, leaving class time free for discussions or working on assignments.)

“To be clear, we’re not saying that all the learning took place during the flipped lectures,” Norman cautioned. “During class, students were working intensively on programming problems and talking to other students and their instructors. We know that a lot of the conceptual shaping is happening in that interactive space. Our measurements are just pinging the students’ brains to see the effects of that work.”

The computer science students — all volunteers who received no academic bonus for participating, though they did get some cookies — agreed not to watch the lectures before their scan. That created a very tight schedule, because the scanners had to be available at just the right time to get all 24 students in and out at the right points in the semester.

“We shut down the entire scanning suite for the Princeton Neuroscience Institute,” recalled Meshulam. “We had a conga line of students marching in on weekends, one after the next — six times in a row.”

The logistics were incredibly daunting, the researchers agreed.

“You could look at this and say, ‘Oh, they scanned 20, 25 students? Lots of fMRI studies have scanned 20 students,’” said Norman. “But not six times on a very tight schedule as students are engaged in a highly demanding academic course. I think the sheer scope of it — when you look at the volume of scanning data and the specificity, making sure we were staying in line with their learning experience and not disrupting anything — it was an enormous undertaking.”

The research team also borrowed an assessment created by the course’s professors, a series of open-ended questions usually given to students hoping to test out of the introductory class. The students provided written answers before the beginning of the semester — all scored a zero — and then took it verbally in the scanner at the end of the semester, so the researchers could assess how much they learned during the semester.

Then the neuroscientists brought in five computer science “experts,” a mix of graduate students, postdoctoral researchers and assistants in instruction. These experts watched a 15-minute recap video of the flipped lectures and answered the same series of questions.

“We could then compare the experts’ brain activity during the exam with the students’ brain activity while answering the same questions, and we found what we had hoped we’d find, that there is a very tight link,” said Meshulam. “So we could use either the class average or the experts’ template to do the same kind of magic, to predict how well a given student will do in the final exam.”
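The “alignment” idea can be illustrated with a small sketch. Everything below is hypothetical and heavily simplified — the student count matches the article, but the data are simulated and the analysis is a toy stand-in, not the study’s actual fMRI pipeline. The gist: correlate each student’s neural response with the class-average response (leaving that student out), then check whether that alignment score tracks exam performance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 24 students x 200 fMRI time points for one lecture segment.
n_students, n_timepoints = 24, 200
shared_signal = rng.standard_normal(n_timepoints)

# Simulate students who track the shared class-level signal to varying degrees.
tracking = rng.uniform(0.1, 0.9, n_students)  # how strongly each student "aligns"
noise = rng.standard_normal((n_students, n_timepoints))
responses = tracking[:, None] * shared_signal + noise

# Alignment: correlate each student with the average of everyone else
# (leave-one-out, so a student's own data doesn't inflate the score).
alignment = np.empty(n_students)
for i in range(n_students):
    others_mean = np.delete(responses, i, axis=0).mean(axis=0)
    alignment[i] = np.corrcoef(responses[i], others_mean)[0, 1]

# Simulated exam scores that partly reflect the same tracking ability.
exam_scores = 60 + 40 * tracking + rng.normal(0, 5, n_students)

# In this toy setup, higher alignment should go with higher exam scores.
r = np.corrcoef(alignment, exam_scores)[0, 1]
print(f"alignment-to-exam correlation: r = {r:.2f}")
```

Swapping `others_mean` for an average built from expert scans gives the “experts’ template” variant the researchers describe; in either case the prediction rests on the same alignment score.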

Why does neural alignment predict exam performance? The key turned out to be whether the students were grouping concepts correctly or incorrectly lumping unrelated theories together. Our brains use these “knowledge structures” to parse the world around us; because we have linked together panthers and tigers but have a separate linkage for trees and grass, we are able to respond correctly — and differently — the first time we encounter mountain lions or dandelions.
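The “knowledge structure” idea can also be sketched as a similarity comparison. Again, everything here is invented for illustration — the concept names, vectors, and noise level are assumptions, not data from the study. The point is that what matters is the pattern of pairwise similarities (which concepts group together), and a student’s groupings can be compared against an expert’s.

```python
import numpy as np

rng = np.random.default_rng(1)

concepts = ["recursion", "loop", "array", "linked_list"]

# Hypothetical expert representation: iteration concepts cluster together,
# and data-structure concepts cluster together.
expert = {
    "recursion":   np.array([1.0, 0.8, 0.1, 0.1]),
    "loop":        np.array([0.8, 1.0, 0.1, 0.1]),
    "array":       np.array([0.1, 0.1, 1.0, 0.8]),
    "linked_list": np.array([0.1, 0.1, 0.8, 1.0]),
}

# A noisy student representation: with enough noise, unrelated concepts
# can get lumped together (the "dandelions are predators" failure mode).
student = {c: expert[c] + rng.normal(0, 0.6, 4) for c in concepts}

def similarity_matrix(rep):
    """Pairwise correlations between concept vectors: who goes with whom."""
    vecs = np.stack([rep[c] for c in concepts])
    return np.corrcoef(vecs)

# Compare only the off-diagonal structure (the groupings themselves).
iu = np.triu_indices(len(concepts), k=1)
structure_match = np.corrcoef(similarity_matrix(student)[iu],
                              similarity_matrix(expert)[iu])[0, 1]
print(f"student-expert structure match: {structure_match:.2f}")
```

A match near 1 means the student groups concepts the way an expert does; a low or negative match means unrelated ideas are being lumped together.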

The scans revealed, in essence, which students believed dandelions were predators.

It has taken years to analyze the data gathered during those sessions, and the research team and others will continue to mine it for years to come, said Norman. To facilitate this, the authors publicly shared their data online.

“We believe that this work could have potentially transformative applications to education,” he said. “The method is so powerful because it’s so general. This is not just a method for diagnosing how this particular computer science course changed people’s concepts, it’s a method for diagnosing any kind of learning.

“I’m very excited about the applications,” he added. “I think as more people in education come to understand this, they’ll find lots of ways of leveraging it for good.”

All of the paper’s authors are or were researchers in Princeton’s Department of Psychology and the Princeton Neuroscience Institute. In addition to Meshulam, Norman and Hasson, they are Mai Nguyen, a 2020 Ph.D. alumna; Liat Hasenfratz, an associate research scholar; and former research specialists Hanna Hillman and Yun-Fei Liu.

“Neural alignment predicts learning outcomes in students taking an introduction to computer science course,” by Meir Meshulam, Liat Hasenfratz, Hanna Hillman, Yun-Fei Liu, Mai Nguyen, Kenneth A. Norman and Uri Hasson, appears in the March 26 issue of the open access journal Nature Communications (DOI: 10.1038/s41467-021-22202-3). The authors wish to thank Princeton COS 126 staff and in particular Robert Sedgewick, Dan Leyzberg, Christopher Moretti, Kevin Wayne, Ibrahim Albluwi, Bridger Hahn, Thomas Schaffner, and Rachel Protacio; Mona Fixdal and the McGraw Center for Teaching and Learning; The Scully Center for the Neuroscience of Mind and Behavior; Peter J. Ramadge; and members of the Norman and Hasson labs for fruitful discussions. This study was supported by NIH Grant DP1-HD091948 to U.H. and by Intel Labs.

For this study, the researchers scanned neurotypical students, but future work will investigate neural divergences, such as autism, dyslexia, dyspraxia or attention-deficit/hyperactivity disorder (ADHD).
