For video game players, the 2010s were a decade of extended reality. 2012 saw the founding of Oculus, the virtual reality headset company that Facebook later acquired for more than $2 billion. In 2013, Google began selling Glass, a computer built into a pair of glasses, to early adopters. 2016 brought us Pokémon Go, the addictive game that sent players running around their neighborhoods hunting virtual creatures projected onto the physical landscape through their smartphones.
2020 was predicted to be the first year of the “ambient computing” decade, when these technologies would infiltrate the lives of everyone else, not just those in the gaming world. Just as smartphones have become indispensable to us, extended reality and the Internet of Things were set to become our daily norm.
But shortly into 2020, COVID-19 hit, and we all went … two-dimensional. As the grand experiment in remote teaching and learning began, universities rushed to adapt. Zoom, Teams, Skype, and FaceTime all became daily fixtures, and many of us quickly became fatigued by seeing our colleagues, students and far-away loved ones almost exclusively in 2D. Most video conferencing solutions were not designed to be online classrooms. While we have been able to use them that way, most educators can readily list what current video platforms lack that would improve online teaching: tools to better facilitate student interaction, including enhanced polling and quizzing features, group work tools, and more.
While universities continue to increase in-person and HyFlex courses, hoping to soon see campuses return to normalcy, there is mounting evidence that the increased interest in digital tools for teaching and learning will persist even after the pandemic. Incoming first-year students today are digital natives, and their innate understanding of and ability to use computers and the internet are greater than those of any generation entering college before them. We have to ensure that in this decade of ambient computing, higher education does not miss opportunities to leverage innovative technologies that enhance learning. We should move beyond 2D solutions and take advantage of what extended reality (XR) and virtual reality (VR) have to offer us.
And it is not enough to take existing VR/XR applications and tailor them to educational scenarios. These tools can and should be built with pedagogy, student experience, and learning outcomes as the priority.
At Columbia University, we’ve been building the infrastructure to support this type of innovation for years. Professor Courtney Cogburn created the 1,000 Cut Journey, an immersive VR research project that allows participants to embody an avatar who experiences various forms of racism. Professor Shantanu Lal uses VR headsets with pediatric dentistry patients who become anxious during procedures. At Columbia Engineering, professor Steven Feiner’s Computer Graphics and User Interfaces Lab explores the design and development of 2D and 3D user interfaces for a broad range of applications and devices. Professor Letty Moss-Salentijn is working with Feiner’s lab to create training simulations that guide dental students through the process of nerve block injection. Faculty, students and staff at Columbia’s Media Center for Art History have created hundreds of virtual reality panoramas of archaeology projects and fieldwork that are available on the Art Atlas platform.
This technology proved useful to our faculty and students during the pandemic. For example, this past fall, professor Brent Stockwell shipped Oculus headsets to students so that they could take part in discussion sessions in VR. Instead of studying drawings or renderings of molecules, students could view the structures in 3D, walking around them, manipulating them, and interacting with them alongside their classmates to learn key biochemistry concepts and solve problems.
One student participant noted: “The 3D protein models and stereochemistry of reactions was very helpful in understanding selectivity [and] helped me retain concepts better. Being able to move and resize the models was also helpful for gaining a different perspective.”
But another student added: “The VR headset makes it difficult to look at the 3D content and take notes simultaneously.”
Measuring the effectiveness of these XR projects in education is not as simple as evaluating whether students learned more through this method than through alternative methods of instruction. The interactivity and connectivity that students and faculty experience in XR must be included in the analysis, particularly when comparing XR to other forms of remote education. Online education often does not easily allow the serendipitous interactions that can happen in person, but XR can encourage these types of interactions.
Stockwell’s VR experiment was supported by an Office of the Provost Teaching & Learning Grant, which provides money and in-kind assistance to faculty looking to innovate and integrate new educational methods and technologies in their teaching. To analyze the results, Stockwell is working with Columbia’s Science of Learning Research Initiative to see what could be improved for future iterations.
Students have embraced this type of technology beyond their coursework, too. In spring 2020, a group of Columbia students began to build “LionCraft,” a recreation of Columbia’s Morningside campus in Minecraft. Even though students were spread out around the world, they still found creative and fun ways to run into each other on campus, in an immersive online format.
LionCraft and the many similar projects created simultaneously at other universities make it quite clear that the current 2D remote-learning experiment cannot remain the only way we innovate in education and its modalities. As we enter the post-pandemic era, it is essential that we define our online, hybrid and in-person teaching and learning by this new wave of extended reality technologies rather than by the tools of earlier eras.