If you take data out of context, does it lose its meaning? This question constantly plagues Steven Gaudino, VP of Product Management at Reasoning Mind.
The Houston-based nonprofit organization develops online blended learning programs that can be used as core or supplemental curriculum in math classrooms, or to guide small group work in after-school programs. The nonprofit has been researching and iterating its math methodology since 2000, and its products have been used in classrooms since 2005. Currently, Reasoning Mind offers five programs that serve over 140,000 students in Pre-K through seventh grade and provides professional development to help teachers understand the organization’s approach to mathematics.
The programs collect a variety of data, from usage to performance, which Gaudino believes can guide instruction to target specific student needs. But he worries that data loses its meaning when it is condensed, exported and fed into other systems. He understands that some administrators want to see the data in a single dashboard, but questions whether combining it will actually help students learn. “Centralizing information in itself is not necessarily going to improve educational outcomes for students,” he says.
Losing Context When Exporting Data
Gaudino says that Reasoning Mind’s programs use artificial intelligence to decide the sequence of content each student sees and to help teachers guide instruction, which makes data collection a fundamental component of the system.
The reporting functionalities were quite limited at first. The programs could display high-level metrics, such as the percentage of problems that students solved correctly, but teachers had to manually copy that data into their own Excel spreadsheets. Seeing that workaround spurred the product development team to act. “Simultaneously impressed with their efforts and horrified that it was necessary, we made it a priority to get them reports that better aligned with their needs,” Gaudino explains. The result was a new set of reports designed to give teachers more specific information to guide instruction. The new reports let a teacher click on a cell to see the problems a student solved incorrectly and review the student’s work. They also provide notifications to follow up with specific students during the class period.
Today, the system collects typical data points including student roster information (name, grade and class) and usage data, such as how long a student has spent on a particular activity. But Gaudino is most excited about a more complex type of data. For example, Reasoning Mind’s programs involve open-response questions that the system can analyze to identify which part of a problem led to student error; the system then uses that information to generate problems tailored to specific misunderstandings a student has. Using student performance data, the system can also identify which students may be best able to mentor others, and connect online tutors with specific students who need additional support.
The level of complexity in the data collected by Reasoning Mind’s programs makes it impossible to export every data point to another system. “We can’t feed every solution to every problem a student has done to some other database,” says Gaudino. “It would be too detailed.” The challenge for Reasoning Mind is figuring out which data points are the most meaningful and actionable for teachers.
“We can share the amount of time spent online, the percentage of problems solved correctly or topics that a student struggled on. But you lose some context as soon as you leave the Reasoning Mind environment,” Gaudino explains. The richness of the data is lost when it is exported to a separate system, and the organization fears that teachers won’t be able to use it to inform instruction. That concern has made it hesitant to prioritize data export capabilities.
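To make the tradeoff concrete, here is a minimal sketch in Python of the gap between the summary fields Gaudino describes as exportable and the richer record that stays behind in the learning environment. Every field, identifier and value below is hypothetical, invented for illustration; it is not Reasoning Mind’s actual data model.

```python
from dataclasses import dataclass

# Hypothetical shape of a slimmed-down export record: roughly the summary
# fields Gaudino says could reasonably leave the system.
@dataclass
class ExportedSummary:
    student_id: str
    minutes_online: int           # total time spent in the program
    percent_correct: float        # share of problems solved correctly
    struggled_topics: list[str]   # e.g. ["fractions"]

# Inside the learning environment, the same student is backed by far richer
# context that does not survive the export: each attempted solution, the
# step where the error occurred, and the follow-up problems generated.
full_record = {
    "student_id": "s-123",
    "attempts": [
        {
            "problem_id": "frac-07",
            "student_work": "3/4 + 1/4 = 4/8",    # the open response itself
            "error_step": "added denominators",    # where the mistake happened
            "follow_up_problems": ["frac-07a", "frac-07b"],
        },
    ],
}

summary = ExportedSummary("s-123", minutes_online=42,
                          percent_correct=0.78,
                          struggled_topics=["fractions"])
```

Reducing `full_record` to `summary` is the kind of flattening Gaudino describes: the numbers travel, but the reasoning behind them does not.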
Resolving Interoperability Issues District-by-District
Reasoning Mind has worked with its customers to address data sharing and rostering requests, which have accelerated over the past five years, according to Gaudino. To date, this has been a district-by-district effort, and the organization has not created a consistent approach. Gaudino, who has been with the organization for over ten years, speculates that at some point the industry will consolidate, adopting a common set of standards or a shared reporting dashboard that will make data sharing more meaningful. Until that happens, Reasoning Mind will support districts on a case-by-case basis.
Many districts are interested in streamlining how they handle student roster data, the administrative information that allows teachers to create class accounts in the system. In particular, there’s been growing demand for single sign-on solutions that allow students to log in to different programs using the same credentials, observes William McGuinness, director of product management at Reasoning Mind. Several such solutions have emerged from companies including Clever, Microsoft and Google. (Reasoning Mind will be launching an integration with Clever in the fall of 2017.)
The organization has also experimented with building its own API to meet a district’s needs. During the 2014-2015 school year, as part of a blended learning initiative led by LEAP Innovations, Chicago Public Schools (CPS) piloted Reasoning Mind’s Foundations curriculum with grades two through five. During the pilot, CPS requested that Reasoning Mind build an API to support single sign-on with Engrade, the district’s learning management system.
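As a rough illustration of what such an integration involves, a single sign-on launch typically means accepting a signed assertion from the district system, verifying it, and mapping the district’s student identifier to an account in the vendor’s program. The sketch below shows that pattern in Python with Flask; it is not Reasoning Mind’s or Engrade’s actual API, and every endpoint, parameter and secret here is hypothetical.

```python
import hashlib
import hmac

from flask import Flask, abort, redirect, request

app = Flask(__name__)
SHARED_SECRET = b"configured-per-district"  # hypothetical key agreed with the district

def signature_valid(student_id: str, signature: str) -> bool:
    # Recompute the HMAC the district system is assumed to send and compare
    # it to the signature attached to the launch request.
    expected = hmac.new(SHARED_SECRET, student_id.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

@app.route("/sso/launch", methods=["POST"])
def sso_launch():
    student_id = request.form.get("district_student_id", "")
    signature = request.form.get("signature", "")
    if not student_id or not signature_valid(student_id, signature):
        abort(401)  # reject launches that cannot be verified
    # Look up (or provision) the matching account, then drop the student
    # into their current lesson without a second login.
    return redirect(f"/lesson/current?student={student_id}")
```

The plumbing itself is modest, which matches McGuinness’s point below: the hard part is not the endpoint, but deciding what data should flow through it and whether that flow actually helps teachers.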
McGuinness recalls that it wasn’t difficult for the team to build the API. The greater challenge was figuring out what data to send and how it would impact student learning. “We can do some exploratory work and come up with something that spits out data within a few weeks, but is that the best use of our time?” he asks. “Or do we invest in working with districts to build something more integrated with support for teachers in actually using the data?”
There was also a more fundamental logistical issue: every school system is different, and building a custom API for each one is not feasible. “What you need in order for an API to work is for all districts to use the same set of standards to connect to vendors, but that is technically complicated,” McGuinness says.
Watchfully Waiting for Consolidation
For now, the organization is taking a “wait and see” approach to interoperability. The product team is currently investigating the Learning Tools Interoperability (LTI) standard from the IMS Global Learning Consortium.
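For context, an LTI 1.1 launch (the version of the standard in wide use at the time) is an OAuth-signed form POST in which the district’s learning management system passes the tool a standard set of parameters describing who the student is and which class and activity they are launching from. The Python snippet below lists representative parameters from the IMS specification; the values are invented examples, and nothing here reflects what Reasoning Mind has actually built.

```python
# Representative parameters from an LTI 1.1 basic launch request. The keys
# are defined by the IMS specification; the values are invented examples.
lti_launch = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "user_id": "district-student-42",         # stable student ID from the LMS
    "roles": "Learner",
    "context_id": "grade4-math-period2",      # the course or class
    "resource_link_id": "foundations-unit-3", # the specific activity launched
    "lis_person_name_full": "Jane Student",
    # The whole form POST is signed with OAuth 1.0 (HMAC-SHA1) using a key and
    # secret shared between the district and the vendor, which is what lets the
    # tool trust the roster context it receives.
    "oauth_consumer_key": "district-issued-key",
}
```

The appeal of a standard like this is exactly what Gaudino describes: one agreed-upon contract for roster and launch context, rather than a custom API per district.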
“Currently, things are pretty disparate, and since our goal is to meet the most needs, we’re going to be watchful,” Gaudino explains. He believes that unification around a set of standards will happen naturally out of efficiency and convenience, but he cautions that it can’t stop there because above all else, interoperability must improve teaching and learning—and to do that, the data must be meaningful.