The pandemic has largely changed public perceptions about the appropriate use of technology for young people, argues Katie Davis, associate professor in the information school at the University of Washington.
“The pandemic forced us to confront the fact that technology is absolutely essential in our lives, and especially during crises,” she says. Now, she says, discussion is shifting to questions of “When is technology good? When is it bad? What should its role be in young people's development at each stage of their progression, from toddlers all the way up to emerging adulthood and beyond?”
The EdSurge Podcast recently interviewed Davis, who has researched the intersection of child development and technology for nearly 20 years. In a new book, “Technology's Child: Digital Media’s Role in the Ages and Stages of Growing Up,” she lays out a framework for how best to match tech with each stage of growth. The book celebrates the ways technology can help kids thrive, and cautions about the ways it can get in the way.
Sometimes the problems posed by gadgets can emerge in unexpected ways, she says, such as when literacy apps aimed at young readers feature too many bells and whistles, like a word’s meaning popping up on screen as children tap it, or rich sounds playing as children read.
“You think, that must be really good when learning to read, to hear the word being sounded out. And in theory, these do seem like good ways to enhance the learning experience,” Davis says. “However, we have to remember that especially for young children, there's a limit to their information-processing bandwidth. If you think of a computer, an analogy to a computer, they have just smaller CPUs than we do as adults.”
And she says there is a growing awareness of how some tech companies design their systems to do things that aren’t in the users’ best interest, a phenomenon referred to as “dark patterns.” A common example of a dark pattern, Davis says, is the autoplay feature on YouTube that often keeps viewers watching and can make it more difficult for a parent to convince their young child to put down a device.
Davis calls for increased regulation of tech companies to rein in such design practices.
“Relying on the tech companies to regulate themselves doesn't work,” she argues, “because it's just not in their best interest financially to place user well-being front and center. Unfortunately, that's just not what makes them a lot of money.”
But she acknowledges that regulation can have harmful unintended consequences of its own. So she calls on academics to conduct more research to help inform best practices for tech tools, so that they foster well-being and are more effective for education.
Listen to the episode on Apple Podcasts, Overcast, Spotify or wherever you get your podcasts, or use the player on this page.