In fall 2022, Alaina Walton, Director of Academic Assessment at Stockton University, was given what sounded like a straightforward task: find and implement an assessment management platform.
But like most things in assessment, it didn’t stay simple for long.
Over the next two and a half years, the work stalled, restarted, and reshaped itself more than once. Leadership changed. Coordination across departments never fully clicked. A six-month rollout plan quickly proved unrealistic for an institution that had historically operated without a centralized platform.
By early 2024, things hit a breaking point.
Alaina and her colleague Nicole Soos, Assessment Management Coordinator, were running large Zoom training sessions with 40 to 60 attendees, and watching them unravel in real time. Program-specific questions piled up faster than they could answer them. What was supposed to create clarity was creating more confusion.
“We were on the verge of falling apart,” Alaina shared in a recent webinar. “We needed to pause and really think about what it would take to make this work.”
That pause changed everything.
What they rebuilt from that moment offers a practical—and honest—look at what it actually takes to make platform adoption work. Not just technically, but institutionally.
Here are five lessons from Stockton’s experience.
1. Build your buying committee before you select an assessment management platform
It’s easy to think of platform selection as a small group decision. But adoption doesn’t happen in a small group—it happens across an institution.
Alaina recognized that early and brought together a cross-functional group: IT leadership, deans, program chairs, and accreditation managers. Not just to evaluate vendors, but to define what success needed to look like.
That group shaped the criteria in a meaningful way:
- Integration with the LMS wasn’t a “nice to have”; it was essential for scalable data collection
- Flexibility mattered because not every program operates the same way
- Ease of use wasn’t negotiable because they knew complexity would slow adoption before it even started
When institutions skip this step, they often end up with a platform that works well for the people who selected it but not for the people expected to use it.
Building your village early doesn’t just improve your decision. It builds the alignment you will rely on later.
2. Verbal buy-in won’t survive your next org chart change
Momentum in higher education is fragile. A project that feels fully supported can lose traction quickly when leadership changes. Not because the work isn’t valuable, but because that support wasn’t structurally embedded.
That’s what Stockton experienced mid-rollout. By the time they signed with HelioCampus in May 2023, reporting lines had shifted. The original momentum didn’t disappear, but it had to be rebuilt.
The lesson here isn’t just “get buy-in.” It’s: make it durable. Embed the work into departmental goals, align it with institutional priorities, and establish clear governance with shared accountability.
When adoption lives in structure, not just in relationships, it’s much more likely to survive change.
3. Audit assessment readiness before you launch
One of the biggest assumptions Stockton made was that programs would move at roughly the same pace. When Alaina and Nicole took a step back to assess where programs actually were, the variation was significant:
- Some had strong, well-functioning assessment processes
- Others had outdated or overly broad outcomes
- Some had never meaningfully connected data to improvement
- And in many cases, faculty weren’t consistently using the LMS, making scalable data collection difficult from the start
“Some programs needed foundational support before they needed platform training,” Alaina said. “Others needed relationship-building before either.”
That’s the reality of assessment work. Readiness isn’t just about skill—it’s about culture, confidence, capacity, and trust.
Taking time to understand that upfront allows you to tier your approach, set more realistic timelines, and avoid the moment where a training session turns into confusion for everyone. You can use our Assessment Readiness Evaluation tool to get started.
4. Beware of a “train the masses” approach that can slow assessment platform adoption
The large Zoom trainings didn’t fail because people weren’t engaged. They failed because they weren’t designed for how people actually learn. When every program has different needs, centralized training can only go so far.
Stockton’s shift was simple, but powerful.
Instead of trying to train everyone at once, they identified one or two “platform specialists” within each school. These individuals learned the system more deeply and became the go-to resource for their peers.
In one semester, a team of two became a network of 15, which changed the dynamic completely. Now, questions were answered in context, faculty were learning from peers who understood their programs, and support felt more accessible and relevant. It shifted from centralized control to distributed support and made adoption feel more natural.
5. Lead with grace because deadlines alone won’t get you there
This is the part that’s hardest to plan for and the easiest to overlook. Adopting an assessment management platform isn’t just a technical change. It’s a behavioral one. It asks people to rethink existing processes, learn new systems, and engage with assessment in a different way.
All on top of everything else they’re already doing. Not every program will move at the same pace. Not every step will go as planned.
For Stockton, progress started to accelerate when they stopped asking, “Why aren’t they adopting?” and started asking, “What support do they need?” “Grace became just as important as any technical decision we made,” Alaina said. That mindset didn’t lower expectations; it made them achievable.
The village you build determines the assessment management platform adoption you get
Stockton’s progress—from a stalled rollout to institution-wide momentum—didn’t come from the platform alone. It came from:
- the people they brought in,
- the structure they built,
- and the patience they allowed for the process to evolve.
As of early 2026, their goal is to have all 160+ programs linking assignments in the platform by spring 2027. That kind of progress doesn’t happen all at once. It builds cycle by cycle.
If you’re at the beginning of an assessment management platform search, or in the middle of an implementation that isn’t going as planned, this is the part worth remembering:
Adoption isn’t something you enforce. It’s something you build.