Deciding which tools to use in the classroom is no easy task. To make matters more complicated, there isn’t a perfect catalogue of every edtech product, complete with reviews, product details and exact specifications, where school leaders can quickly enter their needs and poof! Out pops an optimal program.
Instead, teachers and administrators must resort to a lesson many of us recall from Algebra I: trial and error. Notice a need, find a product, give it a shot, and if it doesn't work, replace it. It’s a simple idea that, while easy to sell, most school leaders with experience testing products would scoff at. That’s because edtech pilots are far more costly and time-consuming than factoring trinomials, and they leave many busy educators feeling overwhelmed by the process of evaluating product quality and effectiveness on their own.
“We’re in an age now where tech is moving so rapidly and we don’t want to just do something to do it,” Todd Keruskin, assistant superintendent at Elizabeth Forward School District in Pennsylvania, says in a post from DC-based nonprofit Digital Promise. “We want to be able to analyze results.”
Noticing that struggle, Digital Promise decided to carve out a step-by-step guide, the Ed-Tech Pilot Framework, for those interested in—or perhaps intimidated by—evaluating an edtech product. “Districts rely heavily on edtech pilots to make purchasing decisions, but the pilots are informal and might not generate the evidence they need for a smart purchasing decision,” says Aubrey Francisco, a research director at Digital Promise. “There was a need to define pilots and come up with a process to help them gather the necessary info to make those decisions.”
The free online tool brings clarity to the otherwise ambiguous and time-consuming process of edtech procurement through eight steps that school leaders, educators and administrators can follow, whether they’re starting from scratch or have already narrowed their focus.
- Identify Need: Using tools like a school technology needs assessment from the Friday Institute, the tool has users begin by specifying what problem they are trying to solve.
- Discover & Select: There’s a lot to choose from when it comes to edtech products. Step 2 points the user toward evaluation rubrics, studies and indices (including EdSurge’s own Index and Concierge).
- Plan: In order to reach goals, districts must be able to clearly articulate what those goals are by answering the question, “What does success look like?”
- Train & Implement: It’s necessary to set aside time to train educators and prepare students for new learning tools and programs. This step provides survey questions to check in with educators as well as sample job descriptions for districts looking to hire edtech specialists.
- Collect Data: Data can be powerful in judging a product’s success, but only if districts gather good quantitative and qualitative data. Digital Promise provides example surveys and questions to pull from when analyzing products.
- Analyze & Decide: Good data alone only goes so far. Careful analysis is crucial to turning what districts collect into information they can act on.
- Negotiate & Purchase: Armed with the right knowledge and information, districts can negotiate their contracts with product developers. The framework includes resources like a guide for negotiating with edtech vendors and a cost toolkit.
- Summarize & Share: Sharing the pilot experience—good or bad—helps other districts in their decision-making, and templates and example reports help make telling the story easier.
The extensive, step-specific resources are meant to serve as reference material, Francisco explains. “Each pilot is unique and has a unique need. What we think about is how much does it cost and how much time does it take.”
Producing the framework itself was something of a trial-and-error process. Over the past three years, Digital Promise reviewed findings from pilot studies of 15 products in 14 of the nonprofit’s League of Innovative Schools districts—a national network of education leaders working to improve student outcomes through technology and partnerships. “We wanted to learn with the League what the challenges with pilots were… After that we had a good idea of how we thought these pilots could be structured, so we conducted pilots ourselves,” Francisco says. “This is a collection of lessons learned.”
Those lessons are a good starting point. With few outlets available for districts and educators to relay their experiences with products and pilots, simply sharing what works and what doesn’t could help cut down on the error portion of trial and error, and make product pilots more effective, efficient and, most importantly, informative.