
Using data to identify at-risk students early


By the time a student fails an assessment, there’s usually not much room left to help. A low grade or a missed submission tends to be the point where a problem becomes visible, not where it starts.

Most institutions are already trying to support students well. The difficulty is timing. If the first clear signal only shows up at submission, any response is happening after the fact.

Disengagement rarely happens overnight

Students don’t usually disengage all at once. It tends to show up in smaller ways first. A delayed start. Logging in but not really getting going. Starting strong, then dropping off halfway through.

On their own, these don’t necessarily mean much. But over time, they can point to a student who is starting to struggle.

The issue is that these patterns are often hard to see. In many cases, educators only encounter the final submission. The stops and starts, the gaps, and the points where a student got stuck are not visible.

Outcome-based data reflects the same limitation. Grades, pass rates, and progression data all tell you where students ended up. They are useful for seeing trends across a cohort, but they don’t show what was happening while the work was being done. By the time a poor outcome appears, the earlier signals have already passed.

Looking at the learning process itself

A more useful place to look is the process of learning itself. Instead of focusing only on what gets submitted, it helps to look at how that work comes together.

That might include:

  • When a student first engages with the task
  • How often they return to it
  • How much time they spend working on it
  • How their drafts change over time
  • Whether they use the guidance or feedback available
  • Where progress seems to slow down or stop

None of these signals are perfect on their own. But taken together, they start to give a clearer picture.

They are also easier to work with when they are captured as the work is happening. When students are completing assessments in an environment that records drafting and interaction, you can see the development of the work rather than trying to infer it afterwards.

That makes a practical difference. Instead of guessing how a student approached a task, you can see it.
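As a rough sketch of what computing those signals might look like, the snippet below assumes a simple per-student event log with a timestamp, an event kind, and a word-count change per edit. The Event schema, the field names, and the 30-minute session gap are illustrative assumptions for this article, not Cadmus's actual data model.

    from dataclasses import dataclass
    from datetime import datetime, timedelta

    @dataclass
    class Event:
        """One recorded interaction. A hypothetical schema for
        illustration, not an actual platform data model."""
        student_id: str
        timestamp: datetime
        kind: str             # e.g. "open_task", "edit", "view_guidance"
        words_delta: int = 0  # change in draft word count, for "edit" events

    def session_count(events: list[Event], gap: timedelta = timedelta(minutes=30)) -> int:
        """Count distinct working sessions, splitting wherever the pause
        between consecutive events exceeds `gap`."""
        times = sorted(e.timestamp for e in events)
        sessions = 1 if times else 0
        for prev, curr in zip(times, times[1:]):
            if curr - prev > gap:
                sessions += 1
        return sessions

    def summarise(events: list[Event], due: datetime) -> dict:
        """Roll one student's events into the process signals listed above."""
        first = min((e.timestamp for e in events), default=None)
        edits = [e for e in events if e.kind == "edit"]
        return {
            "days_before_due_started": (due - first).days if first else None,
            "sessions": session_count(events),
            "words_added": sum(e.words_delta for e in edits),
            "used_guidance": any(e.kind == "view_guidance" for e in events),
        }

In practice, both the session-gap heuristic and the choice of which events to record would need tuning against a real cohort; the point is only that these signals are straightforward to derive once the activity is captured.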

There is a growing body of research backing this up. Work using behavioural trace data has shown that these kinds of patterns can be used to identify at-risk students during a course, not just at the end.

Turning visibility into support

Seeing more is only useful if it changes what you do next.

When there is visibility into student progress, it becomes easier to step in earlier. Not in a heavy-handed way, but in small, timely ways.

A student who hasn’t started might need a nudge. Someone dipping in and out might benefit from a quick check-in. Someone putting in time but not moving forward might need more specific guidance.

This is where timing matters. The same support, offered earlier, can have a very different effect than it would later.

Research in learning analytics reflects this. Predictive models can flag students who are likely to struggle, but that alone does not change outcomes. What seems to matter is what happens next. In one study (Dai et al., 2025), over 30% of students identified as at risk re-engaged with learning activities within a couple of weeks of receiving targeted feedback.

That kind of shift is not universal, but it is enough to show that early intervention can change behaviour.

Having a view across a whole class also helps. Patterns become easier to spot, and it is clearer where attention is needed most.

A different lens on academic integrity

Looking at the process also changes how academic integrity is approached.

A final submission is a snapshot. If something looks off, it can be hard to tell why. It might be misconduct, or it might just be a student having an unusually good or bad attempt.

Process data adds context. You can see whether a piece of work developed over time, how it changed, and how much effort went into it.

That does not remove the need for judgment, but it does make those judgments easier to ground. In some cases, it also allows issues to be picked up earlier, rather than only at the point of submission.

Designing for early visibility

If early visibility matters, it has to be built into the assessment itself.

Tasks that are broken into stages naturally create more points of interaction. Guidance that sits alongside the work encourages students to engage as they go, rather than all at once at the end.

When this all happens in one place, those interactions can be seen without adding extra work for educators.

It also helps to be selective about what you pay attention to. Not every data point is useful. What matters is whether the signal is clear enough to act on.

From seeing to acting

More data does not automatically lead to better decisions. What matters is whether it helps you act at the right time.

In practice, a few simple patterns often go a long way:

  • No real engagement early on
  • Activity that drops off after an initial start
  • Very little drafting or revision close to the deadline
  • Repeated activity that does not lead to progress

These are not perfect indicators, but they are early ones.
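As a minimal sketch of how these patterns might become flags, assuming the hypothetical summarise() output from the earlier snippet plus an added recent_sessions count, something like the rules below could work. Every threshold here is an illustrative assumption, not a recommendation.

    def early_flags(summary: dict, fraction_elapsed: float) -> list[str]:
        """Map one student's process summary to the patterns above.

        `summary` extends the earlier summarise() dict with
        `recent_sessions` (sessions in, say, the last week);
        `fraction_elapsed` is how much of the task window has
        passed, from 0.0 to 1.0. Thresholds are illustrative.
        """
        flags = []
        if summary["sessions"] == 0 and fraction_elapsed > 0.25:
            flags.append("no real engagement early on")
        if summary["sessions"] > 0 and summary.get("recent_sessions", 0) == 0:
            flags.append("activity dropped off after an initial start")
        if fraction_elapsed > 0.8 and summary["words_added"] < 100:
            flags.append("very little drafting or revision close to the deadline")
        if summary["sessions"] >= 5 and summary["words_added"] < 100:
            flags.append("repeated activity that is not leading to progress")
        return flags

    # e.g. early_flags({"sessions": 6, "words_added": 40, "recent_sessions": 2}, 0.6)
    # -> ["repeated activity that is not leading to progress"]

A rule set this simple is deliberately conservative: it surfaces a handful of clear, early patterns rather than trying to predict outcomes.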

Some students will still struggle. That is part of teaching. The difference is whether those moments are visible early enough to respond, and whether there is enough context to make that response useful.
