With anxiety over AI growing, the federal government published its blueprint for how to keep privacy from flatlining in the digital age.
Published last week, the Biden Administration’s “Blueprint for an AI Bill of Rights” is a non-binding set of principles meant to safeguard privacy. It includes a provision on data privacy and names education as one of the key areas it covers.
The blueprint was immediately characterized as broadly “toothless” in the fight to mend Big Tech and the private sector’s ways. The tech writer Khari Johnson, for instance, argued that it has less bite than similar European legislation and noted that it doesn’t mention the possibility of banning some uses of AI. Instead, Johnson wrote, the blueprint is most likely to course-correct the federal government’s own relationship to machine learning.
To privacy experts, it’s nonetheless a step forward, one that at least underlines the need for more public discussion of the issues.
Slow Progress Is Still Progress
What does an ‘AI Bill of Rights’ mean for education?
It’s unclear how the blueprint will be used by the Department of Education, says Jason Kelley, an associate director of digital strategy for the Electronic Frontier Foundation, a prominent digital privacy nonprofit.
Education is one of the areas specifically mentioned in the blueprint, but observers have noted that the timeline for the Department of Education is relatively sluggish: guidance on using AI for teaching and learning is slated for 2023, later than the deadlines set for other government agencies.
And whatever guidelines emerge won’t be a panacea for the education system. But the government’s recognition that students’ rights are being violated by machine learning tools is a “great step forward,” Kelley wrote in an email to EdSurge.
The release of the blueprint comes at a time when privacy seems elusive in schools, both K-12 and college. And there have been calls for federal intervention on those fronts for some time.
Of particular concern is the use of AI surveillance systems. One recent study from the Center for Democracy & Technology, for instance, found that schools more often use surveillance systems to punish students than to protect them. While intended to prevent school shootings or alert authorities to self-harm risks, the technology can do the most harm to vulnerable students, such as LGBTQ+ students, the study noted.
The blueprint signals to schools—and edtech developers—that humans should be reviewing the decisions made by AI tools, Kelley says. It also shows, he adds, that transparency is “essential” and that data privacy “must be paramount.”
Bring It Into the Classroom
A lot of what’s in the blueprint relies on basic principles of privacy, says Linette Attai, a data privacy expert and the president of the consulting firm PlayWell, LLC.
Even so, translating the rather broad blueprint into specific legislation could be tricky.
“There’s no one-size-fits-all technology,” Attai says. She suggests that school districts get more business-savvy about their tech and continuously assess how it affects their communities. And school leaders need to clearly spell out what they’re trying to accomplish rather than just bringing in flashy new gadgets, she adds.
While the attention to these issues may be new, the challenge isn’t.
In a study of how college students and professors think about the digital systems they use, Barbara Fister found that the educators and students she talked to had never thought seriously about the platforms in question. When she told students what those platforms were doing with their data, they were upset. But they felt powerless. “There was no informed consent involved, as far as we could tell,” says Fister, a professor emerita at Gustavus Adolphus College and the inaugural scholar-in-residence for Project Information Literacy.
Students were learning more from each other than from teachers, and lessons about information literacy seemed to rely on guidance that was already out of date, Fister says. Many college students didn’t seem to expect to learn how to manage digital tools from their professors, she adds.
That was before the pandemic, in 2019. These platforms are likely on people’s radars now, she says. But the issues they raise don’t have to stay outside the classroom.
Fister likes the blueprint’s approach, partly because its accompanying materials lay out specific examples of how algorithms are being used, examples she sees as useful for anyone looking to bring the issue into the classroom for discussion.
“It’s stuff that students can get really excited about,” Fister says. “Because it’s taking a thing that’s sort of in the ether, it’s something that affects them.”