Digital education platforms generate data on how millions of students learn, which makes them veritable information gold mines for researchers trying to improve education.
But an ethical and legal conundrum stands in the way: how to responsibly share that data without exposing students' personal information to outside parties.
Now a consortium of education researchers and learning platforms is developing what it hopes is a solution: researchers will never see the actual data.
The project, dubbed SafeInsights and helmed by OpenStax at Rice University, is supported by a five-year, $90 million grant from the National Science Foundation.
The idea is for SafeInsights to serve as a bridge between its learning platform partners and its research partners, with collaborators helping flesh out how the exchange will work to safeguard student privacy.
“In a normal situation, you end up taking data from learning websites and apps and giving it to researchers for them to study and for them to analyze it to learn from,” JP Slavinsky, SafeInsights executive director and OpenStax technical director, says. “Instead, we're taking the researchers’ questions to that data. This creates a safer environment for research that's easier for schools and platforms to participate in, because the data is staying where it is already.”
Deeper Insights on a Large Scale
Another way to think of SafeInsights is as a telescope, say Slavinsky and his colleague Richard Baraniuk, the founder and director of OpenStax, which publishes open-access course materials. It will allow researchers to peer into the vast amounts of data generated by learning platforms like the University of Pennsylvania's massive open online courses and Quill.org.
Researchers would develop questions, translate them into computer code that can sift through the data, and send that code to the learning platforms. The platforms would run the code and return only the results, so the underlying data never has to be directly shared.
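In rough terms, that exchange might look like the sketch below: the researcher's query code travels to the platform, runs where the data lives, and only aggregate results come back. This is a minimal illustration of the pattern, not SafeInsights' actual system; the names StudentRecord, researcher_query and run_in_enclave are hypothetical.

    # Hypothetical sketch of the "questions travel to the data" pattern.
    from dataclasses import dataclass
    from statistics import mean

    @dataclass
    class StudentRecord:
        minutes_on_task: float
        quiz_score: float

    def researcher_query(records: list[StudentRecord]) -> dict:
        """Code written by the researcher and shipped to the platform.
        It sees raw records only inside the platform's environment and
        returns aggregate statistics, never the records themselves."""
        high = [r.quiz_score for r in records if r.minutes_on_task >= 30]
        low = [r.quiz_score for r in records if r.minutes_on_task < 30]
        return {
            "n_high": len(high),
            "n_low": len(low),
            "mean_score_high": mean(high) if high else None,
            "mean_score_low": mean(low) if low else None,
        }

    def run_in_enclave(query, records):
        """Platform-side runner: executes the researcher's query locally
        and releases only its aggregate output."""
        return query(records)

    # The researcher receives summary numbers, not student records.
    data = [StudentRecord(45, 0.9), StudentRecord(20, 0.6),
            StudentRecord(35, 0.8)]
    print(run_in_enclave(researcher_query, data))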
“It is really a partnership where we have researchers coming together with schools and platforms, and we're jointly trying to solve some problems of interest,” Slavinsky says. “We are providing that telescope for others to bring their research agenda and the questions they want to answer. So we're less involved on what specifically is going to be asked and more on making as many questions as possible answerable.”
Part of why this model would be so powerful is how it would increase the scale at which education research is done, Baraniuk says. Plenty of studies, he explains, draw on small samples of around 50 college students who participate as part of a psychology class.
“A lot of the studies are about freshman college kids, right? Well, that's not representative of the huge breadth of different students,” Baraniuk says. “The only way you're gonna be able to see that breadth is by doing large studies, so really the first key behind SafeInsights is partnering with these digital education websites and apps who host literally millions of students every day.”
He also sees the project opening new doors for researchers through the diversity of the student populations represented by the learning platform partners, which include education apps for reading, writing and science, along with learning management systems.
“By putting together all of these puzzle pieces, the idea is that we can — at a very large scale — get to see a more complete picture of these students,” Baraniuk says. “The big goal of ours is to try to remove as much friction as possible so that more useful research can happen, and then more research-backed pedagogies and teaching techniques can actually get applied. But while removing that friction, how do we keep everything really safeguarded?”
Creating Trust, Protecting Privacy
Before any research takes place, SafeInsights partners at the Future of Privacy Forum are helping develop the policies that will shape how the program guards students’ data.
John Verdi, the Future of Privacy Forum’s senior vice president for policy, says the goal is to have privacy protections baked into how everything operates. Part of that is helping to develop what he calls the “data enclave,” or the process by which researchers can query a learning platform’s data without having direct access. Other aspects include helping develop the review process for how research projects are selected, training researchers on privacy and publishing lessons learned about operating with privacy at the forefront.
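The article doesn't spell out the enclave's technical rules, but one common safeguard in systems like this is a minimum cell size: an aggregate result leaves the enclave only if it summarizes enough students that no individual can be singled out. The sketch below assumes that kind of policy; MIN_COHORT_SIZE and release_result are illustrative names, not SafeInsights specifics.

    # Illustrative only: one possible enclave release policy.
    MIN_COHORT_SIZE = 10  # hypothetical threshold

    def release_result(result: dict, cohort_sizes: dict) -> dict:
        """Suppress any aggregate computed over fewer than
        MIN_COHORT_SIZE students before it leaves the enclave."""
        released = {}
        for key, value in result.items():
            n = cohort_sizes.get(key, 0)
            released[key] = value if n >= MIN_COHORT_SIZE else "suppressed"
        return released

    # Example: the mean for a 4-student cohort is withheld.
    print(release_result(
        {"mean_score_high": 0.85, "mean_score_low": 0.62},
        {"mean_score_high": 120, "mean_score_low": 4},
    ))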
“Even if you have great technical safeguards in place, even if you do great ethical vetting,” he says about the training aspect, “at the end of the day, researchers themselves have decisions to make about how to responsibly use the system. They need to understand how the system works.”
The protection of student data privacy in education is generally "woefully under-funded," he says, but safeguarding that information is what allows students to trust learning platforms, and ultimately what creates research opportunities like SafeInsights.
“Tasking students and parents to protect data is the wrong place to put that responsibility,” Verdi says. “Instead, what we need to do is build digital infrastructure that is privacy respectful by default, and [that] provides assurances that information will be kept confidential and used ethically.”