A few weeks ago, Ferris State University made a splashy announcement that it planned to enroll two chatbot “students” in its classes, calling it a novel way for colleges to test their curricula.
The unusual idea seems in some ways like a publicity stunt to call attention to the academic major the university offers in artificial intelligence, and local TV news stations pounced on the notion of nonhuman classmates participating side by side with T-shirt-clad young people in hybrid college classes. But the experiment points to interesting possibilities, and raises ethical questions, about how the latest AI tech might be used to improve teaching.
In fact, the experiment at the Michigan public college could be said to mark a new generation of “learning analytics,” an approach that has grown over the past decade or so in which colleges harness the digital breadcrumbs students leave as they move through online platforms and course materials, looking for patterns that can improve course design and even personalize material for individual students.
“AI could afford us a novel way of seeing into something we haven’t seen into before,” says Kyle Bowen, deputy chief information officer at Arizona State University. “Now we can have the notion of a data doppelganger … the notion that we have something that reflects a persona at a data level.”
In other words, rather than just watching how students click, generative AI tools like ChatGPT make it possible for educators to create simulations of students that embody different profiles — say, a first-generation student or a student struggling in a certain subject — and see what happens when they encounter material in college courses.
“How can we fine-tune responses from AI so they reflect the diversity of our student body or reflect the needs of a first-year student?” Bowen asks, suggesting that doing so could bring new insights to people who design learning experiences.
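Neither university has published technical details, but the basic mechanics of a “data doppelganger” are easy to sketch. Below is a minimal, hypothetical example using OpenAI’s Python library; the persona description, model choice and assignment text are all invented for illustration, not drawn from either school’s actual setup.

```python
# A hedged sketch of simulating a student persona with a system prompt.
# The persona, model and assignment below are invented for illustration.
from openai import OpenAI

client = OpenAI()  # assumes an OPENAI_API_KEY in the environment

persona = (
    "You are 'Ann,' a simulated first-generation college student. "
    "You are motivated but unfamiliar with academic jargon, and you "
    "ask clarifying questions when instructions are ambiguous."
)

assignment = (
    "Summarize this week's reading on supply and demand in one "
    "paragraph, then note anything you found confusing."
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": persona},
        {"role": "user", "content": assignment},
    ],
)

# Where the simulated student stumbles may hint at where real ones would.
print(response.choices[0].message.content)
```

Swapping in a different persona line is what would let course designers compare, say, how a first-year student and a transfer student experience the same assignment.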
While Arizona State hasn’t created virtual students, it recently announced a big commitment to experimenting with AI to improve its teaching. Last month the university became the first higher ed institution to partner with OpenAI, the organization behind ChatGPT, with the goal of “enhancing student success” and “streamlining organizational processes.”
And other universities are making pushes into the latest AI as well to better understand student data. When Paul LeBlanc stepped down as president of Southern New Hampshire University late last year, he announced that his next step would be to lead a project at the university to use ChatGPT and other AI tools to reshape college teaching.
So what could generative AI do to improve learning?
Creating AI ‘Students’
So far few details of Ferris State’s experiment have been released — and university spokesman Dave Murray told EdSurge that the chatbot students have not yet started taking classes.
Officials say they are still being built. The two chatbots are dubbed Ann and Fry, the former named after university librarian Ann Breitenwischer and the latter a nod to the fact that a leader of the effort, Kasey Thompson, once worked in the corporate office of McDonald’s. Actual students were interviewed to help develop the personas of the AI bots.
The bots will reportedly be equipped with voice recognition and speech capabilities that will allow them to participate in class discussions with actual students and ask questions of professors. The AI agents will also be fed information from the course syllabi and turn in assignments.
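Ferris State hasn’t described how that pipeline will be wired together. One plausible architecture, sketched below with OpenAI’s off-the-shelf transcription, chat and text-to-speech endpoints, would chain the three steps; the file names and course material are invented for illustration.

```python
# A hypothetical sketch of the listen-think-speak loop Ferris State
# describes. The university hasn't published its architecture; this
# simply chains transcription, chat and text-to-speech endpoints.
from openai import OpenAI

client = OpenAI()

# 1. Transcribe a recorded snippet of class discussion (file name invented).
transcript = client.audio.transcriptions.create(
    model="whisper-1",
    file=open("class_discussion.wav", "rb"),
)

# 2. Generate the bot's contribution, grounded in the syllabus.
syllabus = open("syllabus.txt").read()  # hypothetical course material
reply = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system",
         "content": "You are a student in this course:\n" + syllabus},
        {"role": "user", "content": transcript.text},
    ],
)

# 3. Convert the reply to audio so the bot can "speak" in class.
speech = client.audio.speech.create(
    model="tts-1",
    voice="alloy",
    input=reply.choices[0].message.content,
)
open("bot_reply.mp3", "wb").write(speech.content)
```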
“The whole role of a university and college is evolving to meet the needs of how society is evolving,” Thompson, special assistant to the president for innovation and entrepreneurship at Ferris State, told a local television station. “And what we’re hoping to learn from Ann and Fry is: What does that look like? How can we make that experience better for students?”
Murray says “the goal is to have them in classes this semester.”
Seth Brott, a sophomore at Ferris State University majoring in information security, plans to give his robot classmates a warm welcome.
He says he was “excited” when one of his professors told him about the plan. “I’d love to be in a class with one of these bots and see how they perform,” he says.
Brott says he has experimented with ChatGPT on a few assignments for classes. He says the tech did help him come up with ideas for a public speaking class, but it was less useful when he was allowed to use it in an information security class to suggest ways to protect a data system.
So does he think the chatbots will be able to pass his courses?
“At the moment the chatbots probably can’t perform very well,” he guesses, “but they can learn. When they make a mistake, they receive feedback much like we do.” And he says over time he can imagine the college could refine a chatbot student to be able to thrive in the classroom.
He says he’s excited the university is attempting the experiment, and he hopes it might push the university to improve its teaching. One friend of his, for instance, recently told him about a course where the class average was only 60 percent at midterm. To him, that seemed like a chance to send in a chatbot to see how the instruction could be made clearer for students.
Not every student is enthusiastic, though. Johnny Chang, a Stanford University graduate student who organized a national online seminar last summer to encourage more educators to learn about and try AI, had some questions about the approach at Ferris State.
“If the goal is to get feedback about the student experience, they should build tools to help administrators better talk to real students,” Chang says.
He is currently pursuing a master’s degree in computer science and is focusing on artificial intelligence, and he says the danger of creating chatbot students is that they might bring in “inherent bias” based on how they are trained. For instance, if the chatbot students are trained based on only students of a certain type, Chang says, “the underrepresented student population might end up feeling unsupported.”
That’s not to say AI can’t play a role in helping a university improve, however. Chang suggests that leaders at Ferris State could build a tool that nudges students at various points in their learning and asks them to answer quick survey questions. AI could then sort, organize and synthesize all that data in ways that would have been too difficult with previous technologies.
“If the goal is to get insights from student behaviors, what these chatbots are good at is analyzing and summarizing — almost like a copilot for administrators,” Chang says.
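Chang’s alternative is technically simpler: gather short free-text answers from real students, then use a model to cluster and summarize them. Here is a rough sketch of that “copilot for administrators” idea, with the survey responses invented for illustration.

```python
# A rough sketch of Chang's suggestion: use an LLM to synthesize
# real student feedback rather than simulate students. The responses
# below are invented; a real deployment would pull from a survey tool.
from openai import OpenAI

client = OpenAI()

survey_responses = [
    "The weekly quizzes move faster than the lectures do.",
    "Office hours conflict with my work schedule.",
    "I didn't realize the lab reports had a required format.",
    "More worked examples before the problem sets would help.",
]

prompt = (
    "Group these student survey responses into themes and, for each "
    "theme, suggest one concrete change an instructor could make:\n\n"
    + "\n".join(f"- {r}" for r in survey_responses)
)

summary = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": prompt}],
)
print(summary.choices[0].message.content)
```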
Murray, the Ferris State spokesman, says the university is open to trying various approaches.
“We often talk to students about their experiences and make changes based on feedback. This is an additional approach,” he says. “We are interested in seeing what types of educational applications we can develop. We’ll learn what works, but also what needs to be refined and what might not work at all.”
Building a ‘Syllabot’
At Arizona State, Bowen says that after a call to the campus community for ideas on how to use ChatGPT, leaders have approved more than 100 projects involving hundreds of faculty and staff members. Later they plan to invite students to lead projects as well.
“We want to have a lot of experimentation that takes place,” he says.
One idea being explored is a project that he says they “jokingly call Syllabot.” The concept: What if a syllabus were something students could ask questions of, rather than a static document?
“If you have an assignment to work on, say a writing prompt, they might ask, ‘How might I approach it?’” he says.
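Bowen didn’t say how Syllabot is built, but a document that students can interrogate is a common pattern for language models. Here’s a minimal sketch, assuming the syllabus is short enough to fit in a single prompt (a longer document would need to be split into chunks and retrieved selectively); the file name and wording are hypothetical, not ASU’s implementation.

```python
# A minimal sketch of a "Syllabot"-style assistant: put the syllabus
# into the prompt and answer questions grounded in it. ASU has not
# published its implementation; the syllabus file here is invented.
from openai import OpenAI

client = OpenAI()

syllabus = open("syllabus.txt").read()  # hypothetical course syllabus

def ask_syllabot(question: str) -> str:
    messages = [
        {
            "role": "system",
            "content": (
                "Answer questions using only this course syllabus. "
                "If the syllabus doesn't say, say so.\n\n" + syllabus
            ),
        },
        {"role": "user", "content": question},
    ]
    reply = client.chat.completions.create(model="gpt-4o", messages=messages)
    return reply.choices[0].message.content

print(ask_syllabot("How might I approach the first writing prompt?"))
```

Grounding the answers in the syllabus text, rather than letting the model answer freely, is what would keep a tool like this from inventing course policies.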
Overall, he says, the university is working on a strategy around “an AI platform for ASU that blends our data here.”
And once large language models can blend with analytical data specific to the college, Bowen says the big question will be, “How can it help us take action on that insight?”