Gap type

Silent

Looks compliant, data unusable

Failure modes

4

Contact · Sync · Device · User

Detection

Real-time

Before data lock, not after

Wearable data quality

Signal Quality Control for Wearable Data in Clinical Trials

Wearable data looks complete in a dashboard. It often isn't. Patients can wear devices consistently and still generate unusable data if signal quality is not monitored. Here are the four most common quality failure modes — and what real-time QC catches before they become regulatory problems.

Real-time QC · Device health monitoring · Endpoint protection


Signal QC Model

Monitor · Detect · Recover · Document

"I wore the device every day — I thought the data was all being recorded."
Wear-time and data quality are not the same thing. Real-time signal QC catches the difference before data lock makes it irreversible.

Compliance and Data Quality Are Not the Same Thing

Compliance monitoring answers the question: is the patient wearing the device? Signal quality monitoring answers a different question: is the device generating usable data?

The gap between these two questions is where endpoint integrity is lost. A patient can have excellent wear-time compliance — 22 hours per day, every day — while generating data that fails quality thresholds due to sensor placement, sync failure, or device malfunction. The compliance dashboard looks green. The data is not.

Signal quality failures are particularly dangerous because they are often invisible until data lock. Unlike a missed diary entry or a non-wear day, a signal quality failure produces what appears to be complete data. Only analysis reveals that the data is not usable.

See also: Signal Quality Control · Wearables & Digital Endpoints


Four Signal Quality Failure Modes — and What Real-Time QC Catches

1. Poor sensor contact

Optical sensors measuring PPG-based metrics — HRV, SpO2, heart rate — are sensitive to skin contact quality. Loose wear, hair interference, and incorrect device positioning all introduce artifacts that degrade signal quality. Real-time waveform monitoring can flag sessions with high artifact rates and trigger patient retraining before the pattern becomes habitual.
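As a minimal sketch of what per-session artifact flagging might look like, the following checks each session's artifact rate against a study-defined threshold and surfaces patients for retraining. The 20% cutoff, the `Session` fields, and the function names are illustrative assumptions, not a specific vendor's implementation.

```python
from dataclasses import dataclass

# Assumed threshold: sessions with >20% artifact samples trigger a
# retraining flag. The exact cutoff is study-specific.
ARTIFACT_RATE_THRESHOLD = 0.20

@dataclass
class Session:
    patient_id: str
    total_samples: int
    artifact_samples: int  # samples failing the signal-quality check

    @property
    def artifact_rate(self) -> float:
        # Treat an empty session as fully artifactual rather than dividing by zero.
        return self.artifact_samples / self.total_samples if self.total_samples else 1.0

def flag_for_retraining(sessions: list[Session]) -> list[str]:
    """Return patient IDs with at least one session above the artifact threshold."""
    return sorted({s.patient_id for s in sessions
                   if s.artifact_rate > ARTIFACT_RATE_THRESHOLD})

sessions = [
    Session("P-001", 86_400, 4_300),   # ~5% artifacts: acceptable
    Session("P-002", 86_400, 25_900),  # ~30% artifacts: flag for retraining
]
print(flag_for_retraining(sessions))  # ['P-002']
```

The key design point is that the flag fires per session, in near real time, so retraining happens before poor sensor placement becomes habitual.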

2. Sync failures

Bluetooth sync failures create data gaps that are completely invisible in wear-time metrics. A patient wearing a device with broken sync is counted as fully compliant while generating no transmitted data. Monitoring sync health — not just wear-time — is essential for studies where data transmission is part of the compliance model.
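The sync-health check above can be sketched as a simple rule comparing wear status against the last successful transmission. The 48-hour staleness limit and the record shape are assumptions for illustration; the point is that "wearing" and "transmitting" are evaluated independently.

```python
from datetime import datetime, timedelta

# Assumed rule: a patient who is wearing the device but has not synced
# in 48 hours is treated as a sync failure, not as compliant.
SYNC_GAP_LIMIT = timedelta(hours=48)

def sync_alerts(records, now):
    """records: list of (patient_id, wearing_today, last_sync) tuples.

    Returns patient IDs that look compliant on wear-time but whose data
    has stopped arriving.
    """
    return [patient_id
            for patient_id, wearing_today, last_sync in records
            if wearing_today and now - last_sync > SYNC_GAP_LIMIT]

now = datetime(2024, 6, 10, 9, 0)
records = [
    ("P-001", True, datetime(2024, 6, 10, 7, 0)),  # synced this morning
    ("P-002", True, datetime(2024, 6, 5, 12, 0)),  # wearing, but sync is 5 days stale
]
print(sync_alerts(records, now))  # ['P-002']
```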

3. Device hardware and firmware issues

Battery degradation, sensor drift, and firmware updates can systematically alter data quality without any visible change in patient behavior. Device health monitoring that tracks battery level, firmware version, and sensor calibration status catches these issues before they affect endpoint data.
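A device-health check of this kind might look like the sketch below: telemetry fields compared against study-defined bounds, with each violation reported. The field names, firmware whitelist, and battery floor are hypothetical placeholders.

```python
# Assumed bounds: an approved-firmware whitelist and a minimum battery
# level below which data quality is considered at risk.
APPROVED_FIRMWARE = {"2.4.1", "2.4.2"}
MIN_BATTERY_PCT = 20

def device_health_issues(telemetry: dict) -> list[str]:
    """Return a list of human-readable issues for one device telemetry report."""
    issues = []
    if telemetry["battery_pct"] < MIN_BATTERY_PCT:
        issues.append("low battery")
    if telemetry["firmware"] not in APPROVED_FIRMWARE:
        issues.append(f"unapproved firmware {telemetry['firmware']}")
    if not telemetry["calibration_ok"]:
        issues.append("sensor calibration out of range")
    return issues

print(device_health_issues(
    {"battery_pct": 12, "firmware": "2.5.0", "calibration_ok": True}
))  # ['low battery', 'unapproved firmware 2.5.0']
```

Because these checks run on telemetry rather than on the endpoint data itself, they can catch a silently updated firmware version or a failing battery before either leaves a mark on the dataset.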

4. Motion artifacts in ECG and activity data

Motion artifacts are expected in wearable data, but they must be quantified and managed. Studies that accept raw activity data without artifact analysis overstate the usability of the dataset. Real-time artifact-rate monitoring helps identify patients whose activity patterns consistently produce high artifact rates and who may require protocol-specific accommodation.
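Distinguishing a one-off noisy session from a consistently noisy patient could be done with a rolling average, as in this sketch. The 25% mean-rate threshold and the input shape are assumptions for illustration.

```python
# Assumed rule: a patient whose mean artifact rate across recent sessions
# stays above the threshold is a candidate for protocol-specific handling,
# rather than per-session retraining.
CONSISTENT_THRESHOLD = 0.25

def consistently_noisy(daily_rates: dict[str, list[float]]) -> list[str]:
    """daily_rates maps patient_id -> per-day artifact rates (0..1)."""
    flagged = []
    for patient_id, rates in daily_rates.items():
        if rates and sum(rates) / len(rates) > CONSISTENT_THRESHOLD:
            flagged.append(patient_id)
    return sorted(flagged)

print(consistently_noisy({
    "P-001": [0.05, 0.08, 0.04],  # occasional noise: no action
    "P-003": [0.31, 0.40, 0.28],  # persistently high: flag
}))  # ['P-003']
```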

FAQ

At what frequency should signal QC be reviewed in a wearable study?

Daily or near-daily QC review is the appropriate standard for studies where wearable data supports primary or key secondary endpoints. Weekly QC is insufficient — problems identified a week after they begin have already produced a significant data gap that is harder to recover.

Can signal quality issues be corrected after data collection?

Some artifact-affected data can be algorithmically cleaned during analysis, but the acceptable thresholds for cleaning must be defined in the statistical analysis plan before data lock. Data with artifact rates above the pre-specified threshold cannot be salvaged post hoc.

How should signal QC be documented for regulatory submissions?

Regulatory submissions should include documentation of the QC process, thresholds used to identify quality failures, and the recovery actions taken. A documented, prospectively defined QC protocol is much stronger than post-hoc data cleaning explanations.

Using Wearables for a Primary or Key Secondary Endpoint?

Delve's signal QC layer monitors data quality in real time — catching sensor contact issues, sync failures, and device health problems before they become endpoint integrity problems.

Talk About Signal QC

See Delve Signal QC →