Last month, the Partnership for Assessment of Readiness for College and Careers (PARCC), one of two consortia developing Common Core assessments, released a series of example items that could be a game changer for both schools and assessment vendors. This was the first set of items released on its technology platform, and it gave the education world a first glimpse into exactly what the phrase “technology-enhanced items” means.
The release went by fairly quietly, attracting relatively little attention compared to most Common Core milestones. But the importance of this release is anything but insignificant.
For the past year or two, we’ve known that Common Core assessments from both PARCC and Smarter Balanced will be taken online and include “technology-enhanced items.” As a result, huge amounts of energy and attention have been paid to two key questions: 1) Are our teachers ready to make the instructional transition to prepare students for these tests, and 2) are our technology infrastructure and hardware ready to handle the administration of these tests?
However, while it’s hard to dispute that these questions highlight two important and pressing issues, these newly released items from PARCC shine a light on a question that’s been hiding in the corner until now--one that I argue we must answer soon: are the digital assessment platforms schools rely on today ready to match what PARCC’s items demand?
As someone who has dug deeply into these released items, and is familiar with the current state of the market for digital assessment platforms, I firmly believe that the answer to the question above should be a resounding no. Now let’s talk about why.
The transition to digital assessment
Prior to PARCC’s release of these technology-enhanced items, many educators oversimplified them by referring to them as “drag-and-drop,” without truly understanding what that means from an assessment standpoint. I believe there was an image in the back of the minds of school leaders, teachers, and vendors that this would essentially amount to a new way of inputting multiple choice questions.
However, PARCC’s sample items demonstrate that this is far from the truth.
These items showcase features and frameworks that are matched in very few other online assessments. Those features include graph-based answer input, successive interdependent answers, tabbed multi-part passages, to-scale digital rulers and protractors, and a built-in TI-84 calculator.
PARCC exams will use technology to fundamentally change how students show their mastery--in a way that will be hard (if not impossible) to replicate on paper. This has far-reaching impact on schools at all levels of digital assessment integration, as well as on vendors in all areas of the market.
So what do schools and vendors do to prepare?
A call to action for schools
At DSST (Denver School of Science and Technology) Public Schools, we made the switch to digital, standards-aligned assessment more than five years ago. Hundreds of teacher-created assessments are now given on our online assessment platform, ActivProgress, every day. And even as I write this, over 1,000 students are logged on, taking trimester final exams.
Yet even with that strong foothold and an innovative solution in place, PARCC’s released items identify 20+ additional features that we need to incorporate into our assessment platform in order to replicate the PARCC experience.
How should educators respond? Schools must be continuously rethinking and reshaping their assessment patterns as more and more information comes out about what these questions will look like--consider it “a call to action” that requires continuous iteration and modification.
A call to action for vendors
That call doesn’t just stop at schools. These released items should have a major impact on the mindsets of digital assessment vendors. The bar for online assessment has been raised, and vendors can benefit from an enormous bump in demand for platforms like ActivProgress, Mastery Connect, and Illuminate T&A.
Keep in mind, vendors: these demands will come with greatly raised expectations, especially as it relates to new question types (drag-and-drop selections, successive dependent multiple choice questions, graphing input answers, and equation editor input answers, to name a few).
But it won’t stop there. The bar will also be raised for tools and features of the testing environment. The PARCC items boast a test-taking environment that includes things like the ability to flag questions, tabbed multi-part passages, and type-restricted input boxes--as well as essential test-taking tools like protractors. These features have instantly become the baseline for quality and feature sets for online assessment tools.
In basic terms? Gone are the days of your grandmother’s 4-function digital calculator (like the one found on the NWEA MAP exams).
When a school director or district official speaks with vendors on the topic of digital assessment platforms, the first and most important question they will ask vendors is, “Does this assessment platform have everything that PARCC/Smarter Balanced does?” If the answer is no, schools will pass you by.
Building compatibility capabilities
Additionally, I see a second call to action for edtech companies--a call for technology-enhanced question compatibility amongst assessment vendors.
In the good ol’ days of pre-PARCC digital assessment, it was fairly easy to transfer question content from one tool/software to another. Item banks from publishers could be easily imported into just about any external assessment tools.
Because of this content integration, school and district leaders could evaluate the assessment author independently from the assessment administration tool. We appreciated this at DSST--we could shop for the highest quality, most well-written assessment items, and then import them (after exporting into WebCT3 or Angel formats) into the test administration tool that best fits our needs.
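For readers less familiar with how that transfer worked: interoperable item banks typically relied on shared export formats (IMS QTI is the most widely used standard; formats like those used by WebCT and Angel played a similar role). The element names below are a simplified, illustrative sketch rather than any vendor’s actual schema, but a traditional multiple choice item could be expressed in roughly this shape:

```xml
<!-- Simplified, QTI-inspired sketch of a portable multiple-choice item.
     Element and attribute names are illustrative, not an exact schema. -->
<assessmentItem identifier="item-001" title="Sample fraction question">
  <prompt>Which fraction is equivalent to 1/2?</prompt>
  <choiceInteraction maxChoices="1">
    <simpleChoice identifier="A">2/3</simpleChoice>
    <simpleChoice identifier="B" correct="true">3/6</simpleChoice>
    <simpleChoice identifier="C">1/3</simpleChoice>
  </choiceInteraction>
</assessmentItem>
```

Because an item like this reduces to text, answer choices, and a keyed response, nearly any platform could parse and re-render it. PARCC’s interactive item types--graph input, embedded rulers, successive dependent parts--have no comparably simple serialization yet, which is exactly the problem described below.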
But now, there’s a problem.
PARCC has taken a different direction with question types, so we no longer have a way for a vendor’s question content to be pulled into another vendor’s platform.
What this means is that for the time being, when schools, districts, and charter school organizations search for high quality, CCSS-aligned item banks from assessment authors, they will have to evaluate question content and assessment administration tools as a package deal. Unfortunately, this problem is accentuated by the fact that we often see companies commit their energy and resources to one or the other. The vendors with the highest quality, most rigorous items often don’t have the strongest tools for creating assessments, administering them to students, and/or breaking down the resulting data.
So vendors, it’s up to you. One of the original goals behind the shift to Common Core was to enable national collaboration. Now, vendors have a unique and powerful opportunity to support this type of collaboration through content sharing and assessment interoperability. Who will be the first to accept this call to action? Only time will tell.
Further steps
Without question, the most important area that school and edtech industry leaders should be focusing on throughout the Common Core transition is meeting the intense instructional challenges of these new standards. But in my opinion, a strong part of that transition should relate to how we’re meeting the challenges of more rigorous and robust digital assessment on a daily basis.
Ultimately, my hope is that PARCC’s released items won’t go overlooked. They will raise the bar of expectations for digital assessment platforms across the country, and my plea to you is this: schools and vendors, prepare now to meet that bar.