Framework: V3 (DiMe) · verification · analytical · clinical

Reproducibility: Yes · every endpoint version re-runnable on stored inputs

Output: Submission-ready · evidence + change history + example data

Validation built for digital submissions, not screenshots

Digital Endpoint Validation

Validation is what separates a digital metric from a digital endpoint. It is the evidence — verification, analytical, clinical — that a measurement is correct, reproducible, and reflects the clinical question the protocol asks. Delve builds validation into the operating model so it can defend the endpoint years later.

V3-aligned · Reproducible · Submission-ready


Validation Workflow

Verify → Validate → Document → Reproduce

If we get audited, can someone reproduce our digital endpoint result?
Yes. Each endpoint version ships with the data used to validate it, the procedure to re-run it, and the algorithm version that produced it.
Reproducibility is not optional — it's how the endpoint survives review.
Reproducible, not just collected
Three layers + a paper trail
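The reproducibility guarantee described above can be sketched as a simple check: pin the algorithm version, hash the stored inputs, re-run, and compare against the originally recorded result. This is an illustrative sketch only, not Delve's implementation; `EndpointVersion`, `load_inputs`, and `run_algorithm` are hypothetical names.

```python
import hashlib
import json
from dataclasses import dataclass

@dataclass(frozen=True)
class EndpointVersion:
    algorithm_version: str   # the exact algorithm version that produced the result
    input_sha256: str        # hash of the stored validation inputs
    expected_sha256: str     # hash of the result recorded at validation time

def reproduce(version, load_inputs, run_algorithm):
    """Re-run a pinned endpoint version on its stored inputs and confirm
    the output matches the originally recorded result."""
    raw = load_inputs(version.input_sha256)
    # Guard against lost or altered source data: the stored bytes must
    # still match the hash recorded when the endpoint was validated.
    if hashlib.sha256(raw).hexdigest() != version.input_sha256:
        raise ValueError("stored inputs do not match recorded hash")
    result = run_algorithm(version.algorithm_version, raw)
    digest = hashlib.sha256(json.dumps(result, sort_keys=True).encode()).hexdigest()
    return digest == version.expected_sha256
```

If either the source data or the algorithm version has drifted, the check fails loudly instead of silently producing a different number, which is the property an auditor is actually asking about.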

What 'Validation' Actually Means for Digital Endpoints

In clinical research, validation of a digital endpoint is not one document — it is a stack of three: verification, analytical validation, and clinical validation. Together they establish that the sensor measures correctly, the algorithm computes correctly, and the resulting metric reflects the clinical outcome of interest.
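As a mental model, the three-layer stack can be treated as three evidence lists that must all be populated before the endpoint is defensible. A minimal sketch, assuming nothing about Delve's internals (the class and field names are illustrative):

```python
from dataclasses import dataclass, field

@dataclass
class ValidationStack:
    verification: list = field(default_factory=list)  # sensor-level evidence
    analytical: list = field(default_factory=list)    # algorithm-level evidence
    clinical: list = field(default_factory=list)      # outcome-level evidence

    def missing_layers(self):
        """Return the names of any layers with no documented evidence."""
        return [name for name in ("verification", "analytical", "clinical")
                if not getattr(self, name)]
```

A package carrying only sensor-level verification would report analytical and clinical as missing, which is the "half a stack" failure mode this page describes.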

All three layers, documented end-to-end. That's the bar for a defensible digital endpoint.

Related pages: Regulatory-Ready Endpoints · Digital Endpoints

Three-layer validation pipeline for digital endpoints with documented evidence

Why Validation Falls Apart in Practice

Validation rarely fails because the science is wrong. It fails because one of the three layers is partial, undocumented, or unrepeatable years later when a reviewer asks.

Verification only

Sensor verified, algorithm not analytically validated. Half a stack isn't a stack.

Black-box analytical work

Algorithm 'validated' but the procedure isn't documented in a way anyone else can reproduce.

No clinical anchor

Validation against synthetic or generic data — without showing the metric tracks the clinical question.

Lost source data

Reproducibility requires the inputs. If source data is gone, the endpoint can't be re-analyzed.

Drifting algorithms

An algorithm that changes mid-study without version control invalidates the prior validation.

Late documentation

Validation written after the trial is harder to defend than validation built alongside it.

Validation is something you build alongside the trial — not something you assemble at submission time.

Delve Validation vs Marketing-Style 'Validation'

Marketing-style 'validation'

  • Sensor-level verification only
  • Algorithm logic not published
  • No clinical validation evidence
  • Source data not retained
  • Algorithm versioning not enforced

Delve validation

  • Verification + analytical + clinical, all documented
  • Documented, reviewable algorithms
  • Clinical validation against target population
  • Source data retained with provenance
  • Versioned algorithms with change history

A 'validated' sensor isn't a validated digital endpoint. Three layers, or it's not the same thing.

What a Strong Validation Package Includes

Packages that hold up under regulatory and audit review are organized around reviewers' actual questions — not just internal QA.

Strong validation makes the endpoint defensible years after the trial ends — not just at filing.

See related pages: Regulatory-Ready Endpoints · Security & Privacy · Digital Endpoints

Three-layer validation evidence package for digital endpoints

Validation Aligned With Current Frameworks

Delve tracks the frameworks reviewers actually use — and updates validation patterns as those frameworks evolve.

DiMe V3

Digital Medicine Society V3 framework for verification, analytical, and clinical validation.

FDA guidance

Methodology aligned with current FDA guidance on digital health technologies.

EMA qualification

Endpoint validation takes EMA qualification practice into account.

21 CFR Part 11

Electronic records and signatures handled in 21 CFR Part 11-compliant infrastructure.

ICH GCP

Validation operations follow ICH GCP principles.

Auditable trail

Immutable audit trail and documented procedures for inspections.

Validation is most useful when it speaks the language of the reviewers who will read it.

FAQ

Do you validate the device or the algorithm?

Both, plus the clinical interpretation. Verification covers the device. Analytical validation covers the algorithm. Clinical validation covers what the resulting metric means for the protocol's population.

Can validation evidence be reused across studies?

Generally yes for verification and analytical validation. Clinical validation usually has to be evaluated against the specific target population, especially if it differs meaningfully from prior work.

How do you handle an algorithm update mid-study?

Updates ship as new versions. Existing patients can stay on their original version where appropriate; new cohorts can use the updated version. The change history is documented either way.
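The version-pinning behavior described in this answer can be sketched as follows. The registry, version tags, and metric functions are illustrative assumptions, not Delve internals:

```python
# Hypothetical append-only registry: version tag -> endpoint algorithm.
# A tag is never redefined once any patient is pinned to it.
REGISTRY = {
    "v1.0.0": lambda samples: sum(samples) / len(samples),         # original: mean
    "v1.1.0": lambda samples: sorted(samples)[len(samples) // 2],  # update: median
}

def enroll(patient_id, active_version, roster):
    """Pin the patient to whichever version is active at enrollment."""
    roster[patient_id] = active_version

def compute_endpoint(patient_id, samples, roster):
    """Always compute with the patient's pinned version, never 'latest'."""
    return REGISTRY[roster[patient_id]](samples)
```

Because each patient's version is recorded at enrollment and never updated in place, a mid-study algorithm change cannot silently alter results that were already validated under the prior version.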

Build Digital Endpoints With Defensible Validation

Delve develops validation packages alongside the trial — verification, analytical, clinical — so your regulatory and statistical teams have what they need at submission and at audit.

Book a Validation Discussion

Explore Delve wearables capabilities →