Framework: DiMe V3 (verification · analytical · clinical)
Reproducibility: Yes (every endpoint version re-runnable on stored inputs)
Output: Submission-ready (evidence + change history + example data)
Validation Workflow: Verify → Validate → Document → Reproduce
In clinical research, validating a digital endpoint is not a single document; it is a stack of three: verification, analytical validation, and clinical validation. Together they establish that the sensor measures correctly, the algorithm computes correctly, and the resulting metric reflects the clinical outcome of interest.
All three layers, documented end-to-end. That's the bar for a defensible digital endpoint.
Related pages: Regulatory-Ready Endpoints · Digital Endpoints
Validation rarely fails because the science is wrong. It fails because one of the three layers is partial, undocumented, or unrepeatable years later when a reviewer asks.
Sensor verified, algorithm not analytically validated. Half a stack isn't a stack.
Algorithm 'validated' but the procedure isn't documented in a way anyone else can reproduce.
Validation against synthetic or generic data — without showing the metric tracks the clinical question.
Reproducibility requires the inputs. If source data is gone, the endpoint can't be re-analyzed.
An algorithm that changes mid-study without version control invalidates the prior validation.
Validation written after the trial is harder to defend than validation built alongside it.
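The reproducibility and version-control points above can be sketched in code. This is a minimal illustration, not Delve's implementation: the endpoint function, registry, and record format are hypothetical. The idea is that a re-analysis refuses to run unless the archived inputs still match the fingerprint recorded at validation time, and only a registered, frozen algorithm version can be invoked.

```python
import hashlib
import json

# Hypothetical endpoint algorithm, frozen as version 1.0.0.
# The metric itself is a placeholder (mean of per-window values).
def endpoint_v1(samples):
    return sum(samples) / len(samples)

# Registry of validated versions; an unregistered version cannot run.
ALGORITHM_VERSIONS = {"1.0.0": endpoint_v1}

def fingerprint(data) -> str:
    """Hash the stored inputs so a re-run can prove it used them."""
    return hashlib.sha256(json.dumps(data, sort_keys=True).encode()).hexdigest()

def rerun_endpoint(version: str, stored_inputs, expected_hash: str):
    """Re-run a frozen endpoint version on archived inputs.

    Raises if the archive no longer matches the hash recorded at
    validation time (lost or altered source data), or if the version
    was never registered (no prior validation to rely on).
    """
    if fingerprint(stored_inputs) != expected_hash:
        raise ValueError("stored inputs do not match validated hash")
    algo = ALGORITHM_VERSIONS[version]  # KeyError = unvalidated version
    return algo(stored_inputs)

# Archived at validation time...
inputs = [1.1, 1.3, 1.2]
record = {"version": "1.0.0", "input_hash": fingerprint(inputs)}

# ...and re-analyzed years later, against the same record.
result = rerun_endpoint(record["version"], inputs, record["input_hash"])
```

Any real system would persist the record alongside the raw data; the point is that the endpoint, its inputs, and their linkage are all checkable artifacts, not institutional memory.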
Validation is something you build alongside the trial — not something you assemble at submission time.
A 'validated' sensor isn't a validated digital endpoint. Three layers, or it's not the same thing.
Packages that hold up under regulatory and audit review are organized around reviewers' actual questions — not just internal QA.
Strong validation makes the endpoint defensible years after the trial ends — not just at filing.
See related pages: Regulatory-Ready Endpoints · Security & Privacy · Digital Endpoints
Delve tracks the frameworks reviewers actually use — and updates validation patterns as those frameworks evolve.
Digital Medicine Society V3 framework for verification, analytical, and clinical validation.
Methodology aligned with current FDA guidance on digital health technologies.
Endpoint validation takes EMA qualification practice into account.
Electronic records and signatures handled in 21 CFR Part 11-compliant infrastructure.
Validation operations follow ICH GCP principles.
Immutable audit trail and documented procedures for inspections.
Validation is most useful when it speaks the language of the reviewers who will read it.
Both, plus the clinical interpretation. Verification covers the device. Analytical validation covers the algorithm. Clinical validation covers what the resulting metric means for the protocol's population.
Generally yes for verification and analytical validation. Clinical validation usually has to be evaluated against the specific target population, especially if it differs meaningfully from prior work.
Updates ship as new versions. Existing patients can stay on their original version where appropriate; new cohorts can use the updated version. The change history is documented either way.
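That versioning behavior can be made concrete with a small sketch. The class and method names here are illustrative assumptions, not a real API: existing cohorts stay pinned to the version they enrolled under, new cohorts pick up the latest release, and every release lands in a documented change history.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class EndpointVersion:
    version: str
    change_note: str  # documented rationale for the release

class Study:
    """Hypothetical per-cohort version pinning with a change log."""

    def __init__(self, initial: EndpointVersion):
        self.history = [initial]       # full, ordered change history
        self.cohort_pins = {}          # cohort id -> pinned version

    def enroll_cohort(self, cohort_id: str) -> None:
        # New cohorts pick up the latest released version.
        self.cohort_pins[cohort_id] = self.history[-1].version

    def release(self, update: EndpointVersion) -> None:
        # Updates ship as new versions; existing pins are untouched.
        self.history.append(update)

    def version_for(self, cohort_id: str) -> str:
        return self.cohort_pins[cohort_id]

study = Study(EndpointVersion("1.0.0", "initial validated release"))
study.enroll_cohort("cohort-A")
study.release(EndpointVersion("1.1.0", "re-validated filtering change"))
study.enroll_cohort("cohort-B")

study.version_for("cohort-A")  # stays on "1.0.0"
study.version_for("cohort-B")  # enrolled under "1.1.0"
```

Keeping the pin on the cohort rather than the study is what lets a mid-study update ship without invalidating the validation already on file for earlier participants.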
Delve develops validation packages alongside the trial — verification, analytical, clinical — so your regulatory and statistical teams have what they need at submission and at audit.
Book a Validation Discussion