How to Measure Compliance in Clinical Trials

A practical KPI playbook

Compliance is not one number. On-time matters, not just “done.” Recovery should take hours, not weeks.

Compliance is not a single metric. It’s a system of measures: task completion, on-time behavior, wearable wear-time, device health, and the speed at which a study detects and recovers drift.

ePRO completion · Wear-time adherence · Device health · Response windows

Compliance measurement: tasks + wear-time + device health + recovery

“I’m wearing it… but it stopped syncing.”
That’s a device health event. Compliance can look “fine” until you measure sync + recovery time.
The best studies track early signals—and intervene before the endpoint gap appears.
Make compliance measurable. Then make it recoverable.

Definition: Compliance in Clinical Trials

Clinical trial compliance is the degree to which participants (and the study team) complete protocol-required tasks within defined time windows—producing usable, longitudinal data with minimal gaps.

Key point: Measuring compliance as “% completed” is incomplete. The most useful definitions also include on-time behavior, device health, and time-to-recovery.

Dimension | Question it answers | Example measures
Task completion | Did it happen? | ePRO / eCOA / diaries
On-time completion | When did it happen? | Completion window, late threshold
Wearable adherence | Was it worn and synced? | Wear-time + device health

Compliance is usually the first sign of risk—long before dropout. That’s why it’s your most actionable metric.

Compliance measurement across tasks, wear-time, and device health

The 4-Layer Compliance Framework

Think of compliance as a stack. If you measure only the top layer, you miss why the data degraded.

Layer 1: Completion

Was the protocol-required task done at all? (ePRO, diary, assessment, visit, device wear event.)

Layer 2: Timeliness

Was it done within the defined window? “Late but completed” often behaves like missing data.

Layer 3: Device Health

Was the device able to produce usable data? Sync success, battery, pairing, permissions.

Layer 4: Recovery

When drift happens, how fast does the study detect and recover it? Time-to-recovery is the real differentiator.

Why this matters: If you don’t measure device health and recovery time, you can report “high compliance” while the endpoint signal quietly disappears.
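To make the four layers concrete, here is a minimal sketch of a check that returns the first layer at which a task failed. The record shape and every field name here are illustrative assumptions, not the API of any real system:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class TaskRecord:
    # Hypothetical record shape; all field names are assumptions for this sketch.
    completed_at: Optional[datetime]       # None if the task was never done
    window_end: datetime                   # protocol-defined deadline
    synced: bool                           # did usable data actually arrive?
    detected_at: Optional[datetime] = None # when the drift/issue was detected
    resolved_at: Optional[datetime] = None # when compliance was restored

def failing_layer(rec: TaskRecord, ttr_limit: timedelta = timedelta(hours=48)) -> Optional[str]:
    """Return the first compliance layer that failed, or None if all four pass."""
    if rec.completed_at is None:
        return "completion"        # Layer 1: the task never happened
    if rec.completed_at > rec.window_end:
        return "timeliness"        # Layer 2: late often behaves like missing
    if not rec.synced:
        return "device_health"     # Layer 3: done/worn, but no usable data
    if rec.detected_at and rec.resolved_at and rec.resolved_at - rec.detected_at > ttr_limit:
        return "recovery"          # Layer 4: recovered, but too slowly
    return None
```

Checking the layers in this order mirrors the stack: a record that fails at a higher layer is never attributed to a lower one, so the counts stay interpretable.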

The KPI Table: What to Measure (with Simple Formulas)

These KPIs are intentionally practical—built to drive actions. Use them per patient, per site, and at the study level.

KPI | What it tells you | Simple formula | Best used for
ePRO Completion Rate | % of assigned ePROs completed | (Completed ePROs ÷ Assigned ePROs) × 100 | Overall adherence, risk trending
On-Time Completion Rate | % completed within the allowed window | (On-time completions ÷ Assigned) × 100 | Endpoint integrity, operational response needs
Late Completion Rate | How often tasks happen after the window | (Late completions ÷ Assigned) × 100 | Friction detection (UI, reminders, burden)
Consecutive Missed Days | Early drift signal (behavior breakdown) | Count of consecutive days missed | Triggering outreach + escalation rules
Wear-Time Adherence | Whether the device was worn enough | (Hours worn ÷ Expected hours) × 100 | Digital endpoint usability, wear-time decay
Valid Data Days | Days meeting minimum “valid” criteria | Count of days meeting endpoint rules | Endpoint readiness, statistical power protection
Sync Success Rate | Whether data actually arrived | (Successful syncs ÷ Expected syncs) × 100 | Silent failure detection (Bluetooth, permissions)
Device Health Score | Composite: battery + pairing + sync + app status | Weighted score (study-defined) | Proactive troubleshooting + triage
Time-to-Recovery (TTR) | How fast you restore compliance after drift | Time from issue detection → resolution | Operational quality, prevention of endpoint gaps
Escalation Rate | How often patient issues reach the site | (Site escalations ÷ Active participants) × 100 | Site burden control, workflow effectiveness
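As an illustration, the formulas above can be computed in a few lines. All counts, hours, and composite weights below are invented for the example; the device health weights in particular are study-defined, so treat these as placeholders:

```python
def pct(part: float, whole: float) -> float:
    """Simple percentage with a divide-by-zero guard, rounded to one decimal."""
    return round(100.0 * part / whole, 1) if whole else 0.0

# Illustrative per-participant counts for one study week (invented numbers).
assigned, completed, on_time, late = 30, 27, 24, 3
hours_worn, expected_hours = 115.5, 18 * 7     # e.g. an 18 hrs/day target over 7 days
syncs_ok, syncs_expected = 26, 28

epro_completion = pct(completed, assigned)          # 90.0
on_time_rate    = pct(on_time, assigned)            # 80.0
late_rate       = pct(late, assigned)               # 10.0
wear_adherence  = pct(hours_worn, expected_hours)   # 91.7
sync_success    = pct(syncs_ok, syncs_expected)     # 92.9

# Device Health Score as a weighted composite; weights are study-defined assumptions.
weights = {"battery": 0.2, "pairing": 0.3, "sync": 0.4, "app": 0.1}
health  = {"battery": 1.0, "pairing": 1.0, "sync": 0.5, "app": 1.0}  # 1.0 = fully healthy
device_health_score = round(100 * sum(weights[k] * health[k] for k in weights), 1)  # 80.0
```

Note how the same participant can show 90% completion yet only 80% on-time: that gap is exactly the “late but completed” behavior the timeliness layer is meant to surface.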

Want a faster way to explain this internally? Use a single sentence: Compliance = Completion + Timeliness + Device Health + Recovery speed.

Use the Compliance Calculator

KPI dashboard with ePRO completion, wear-time, and device sync health

Thresholds & Escalation Rules (Make Compliance Actionable)

Metrics only matter if they trigger action. Below is a practical set of starting thresholds that many studies adapt.

ePRO completion threshold

Target: ≥ 90–95% completion (study-dependent).
Trigger: 2 missed tasks in a row → outreach within 24 hours.

On-time completion threshold

Target: ≥ 85–90% on-time.
Trigger: late completion trend for 3+ tasks → friction investigation (UI, reminders, burden).

Wear-time threshold

Target: endpoint-defined (e.g., ≥ 18 hrs/day, or ≥ 5 valid days/week).
Trigger: drop below threshold for 2 days → proactive outreach + troubleshooting.

Sync / device health threshold

Target: expected data receipt daily/near-real-time (study-defined).
Trigger: no sync for 24 hours → check battery, pairing, app permissions, OS settings.

Time-to-Recovery (TTR) threshold

Target: recover most issues within 24–48 hours.
Trigger: unresolved beyond 48 hours → escalate with context and steps attempted.

Escalation quality

Target: fewer alerts, higher context.
Rule: don’t escalate until the troubleshooting playbook has been executed (unless a safety/clinical threshold is crossed).
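The starting thresholds above can be expressed as a single rule function. This is a sketch, not a production triage engine: the numeric cutoffs mirror the examples in this section and are study-dependent, and the action names are invented labels:

```python
from typing import Optional

def next_action(consecutive_missed: int,
                hours_since_sync: float,
                hours_unresolved: Optional[float]) -> str:
    """Map the starting thresholds to one next action, most severe first.

    Cutoffs (2 missed in a row, 24 h without a sync, 48 h unresolved) are
    study-dependent starting points, not fixed rules.
    """
    if hours_unresolved is not None and hours_unresolved > 48:
        return "escalate_to_site"      # TTR breach: escalate with context + steps attempted
    if hours_since_sync >= 24:
        return "troubleshoot_device"   # check battery, pairing, app permissions, OS settings
    if consecutive_missed >= 2:
        return "outreach_within_24h"   # early drift: proactive participant outreach
    return "monitor"
```

For example, `next_action(consecutive_missed=2, hours_since_sync=3, hours_unresolved=None)` returns `"outreach_within_24h"`. Ordering the checks most-severe-first enforces the escalation-quality rule: a participant only reaches the site after lighter interventions have had their window.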

Important: Escalation is not a substitute for monitoring. A good model resolves most issues before the site ever sees them.

What “Good” Compliance Dashboards Show

The goal isn’t pretty charts. It’s early detection + fast response. A compliance dashboard should answer: Who is drifting? Why? What was done? Did it work?

If your dashboard cannot show time-to-recovery and intervention outcomes, it’s reporting compliance—not operating it.

Explore Real-Time Analytics

Compliance dashboards showing completion, wear-time, device health, and intervention outcomes

The Operating Model That Prevents Missing Data

Software captures data. It rarely enforces behavior. The highest-performing studies treat compliance as an operating model: daily detection, rapid outreach, guided recovery, and escalation only when recovery fails.

Simple rule: If you can’t answer “who owns recovery” for each compliance failure mode, your study will experience avoidable missing data.

Learn about Concierge-as-a-Service™

Operating model for compliance: detect, outreach, recover, escalate

Related Knowledge

If you’re building a compliance plan, these topics usually come next.

Why trials lose data continuity

Compliance drift + wear-time decay + silent failures between visits.

Read that page

Wearables & digital endpoints

How to treat wear-time and validity as compliance metrics—not “device issues.”

Explore wearables

Unified eCOA / ePRO

Completion is easier when the UI is consistent and reminders are designed correctly.

Explore eCOA

Real-time analytics

Dashboards that show recovery windows, intervention outcomes, and risk—fast.

Explore analytics

FAQ

What’s the biggest mistake in measuring compliance?

Treating compliance as a single “% completed” number. You also need on-time behavior, device health, and time-to-recovery.

How do I measure wearable compliance correctly?

Use wear-time adherence (hours worn vs expected), valid data days (days meeting endpoint criteria), and sync success (whether data actually arrived). A device can be “worn” and still produce no usable data if sync fails.

How do I choose thresholds?

Define thresholds per endpoint: minimum valid days, minimum wear-time, completion windows, and escalation rules. Then test thresholds during pilot/feasibility to calibrate them.

Is compliance the same as retention?

No. Retention is staying enrolled. Compliance is reliably producing protocol-required data over time. Compliance often degrades before dropout occurs.

Compliance vs retention and practical measurement guidance

Want a Compliance Measurement Plan for Your Study?

If your endpoints depend on longitudinal data, measure compliance as a system: completion, timeliness, device health, and time-to-recovery—then operationalize response windows.

Book a Compliance Walkthrough

Use the Compliance Calculator →