Gap type: Silent (looks compliant, data unusable)
Failure modes: 4 (Contact · Sync · Device · User)
Detection: Real-time (before data lock, not after)
Signal QC Model: Monitor · Detect · Recover · Document
Compliance monitoring answers the question: is the patient wearing the device? Signal quality monitoring answers a different question: is the device generating usable data?
The gap between these two questions is where endpoint integrity is lost. A patient can have excellent wear-time compliance — 22 hours per day, every day — while generating data that fails quality thresholds due to sensor placement, sync failure, or device malfunction. The compliance dashboard looks green. The data is not.
Signal quality failures are particularly dangerous because they are often invisible until data lock. Unlike a missed diary entry or a non-wear day, a signal quality failure produces what appears to be complete data. Only analysis reveals that the data is not usable.
See also: Signal Quality Control · Wearables & Digital Endpoints
Optical sensors measuring PPG-based metrics — HRV, SpO2, heart rate — are sensitive to skin contact quality. Loose wear, hair interference, and incorrect device positioning all introduce artifacts that degrade signal quality. Real-time waveform monitoring can flag sessions with high artifact rates and trigger patient retraining before the pattern becomes habitual.
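A minimal sketch of this kind of session-level flagging, assuming the device exposes a per-sample quality index so each session can be summarized as an artifact rate. The names (`PpgSession`, `ARTIFACT_THRESHOLD`, the 20% cutoff) are illustrative assumptions, not a real device API:

```python
from dataclasses import dataclass

ARTIFACT_THRESHOLD = 0.20  # assumed QC limit: >20% artifacted samples fails


@dataclass
class PpgSession:
    patient_id: str
    total_samples: int
    artifact_samples: int  # samples flagged by the device's quality index

    @property
    def artifact_rate(self) -> float:
        return self.artifact_samples / self.total_samples


def sessions_needing_retraining(sessions):
    """Return sessions whose artifact rate exceeds the QC threshold."""
    return [s for s in sessions if s.artifact_rate > ARTIFACT_THRESHOLD]


sessions = [
    PpgSession("P-001", total_samples=10_000, artifact_samples=500),    # 5%
    PpgSession("P-002", total_samples=10_000, artifact_samples=3_000),  # 30%
]
print([s.patient_id for s in sessions_needing_retraining(sessions)])
# → ['P-002']
```

Flagging at the session level, rather than waiting for analysis, is what makes patient retraining possible before a noisy wear habit becomes entrenched.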
Bluetooth sync failures create data gaps that are completely invisible in wear-time metrics. A patient wearing a device with broken sync is counted as fully compliant while generating no transmitted data. Monitoring sync health — not just wear-time — is essential for studies where data transmission is part of the compliance model.
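One way to surface these "compliant but silent" devices is to cross-check wear-time records against last-transmission timestamps. A sketch under assumed field names and an assumed 48-hour alert window:

```python
from datetime import datetime, timedelta

SYNC_GAP_LIMIT = timedelta(hours=48)  # assumed alert window, not a standard


def silent_sync_failures(patients, now):
    """Patients wearing the device but not syncing within the window."""
    return [
        p["patient_id"]
        for p in patients
        if p["worn_today"] and now - p["last_sync"] > SYNC_GAP_LIMIT
    ]


now = datetime(2024, 6, 10, 12, 0)
patients = [
    {"patient_id": "P-001", "worn_today": True,
     "last_sync": datetime(2024, 6, 10, 9, 0)},  # syncing normally
    {"patient_id": "P-002", "worn_today": True,
     "last_sync": datetime(2024, 6, 5, 9, 0)},   # wearing, not syncing
]
print(silent_sync_failures(patients, now))  # → ['P-002']
```

The key design point is the conjunction: a sync gap only matters as a silent failure when wear-time data says the patient is compliant, which is exactly the case wear-time dashboards miss.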
Battery degradation, sensor drift, and firmware updates can systematically alter data quality without any visible change in patient behavior. Device health monitoring that tracks battery level, firmware version, and sensor calibration status catches these issues before they affect endpoint data.
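A device-health check of this kind can be sketched as a per-device rule set over battery, firmware, and calibration status. The thresholds and the expected firmware version below are illustrative assumptions, not vendor specifications:

```python
MIN_BATTERY_PCT = 20          # assumed floor before data quality degrades
EXPECTED_FIRMWARE = "2.4.1"   # assumed study-approved firmware version


def device_health_issues(device):
    """Return a list of health problems that could degrade endpoint data."""
    issues = []
    if device["battery_pct"] < MIN_BATTERY_PCT:
        issues.append("low battery")
    if device["firmware"] != EXPECTED_FIRMWARE:
        issues.append(f"unexpected firmware {device['firmware']}")
    if not device["calibration_ok"]:
        issues.append("sensor calibration out of range")
    return issues


device = {"battery_pct": 12, "firmware": "2.3.0", "calibration_ok": True}
print(device_health_issues(device))
# → ['low battery', 'unexpected firmware 2.3.0']
```

Because these checks read device telemetry rather than patient data, they can run on every sync and catch systematic drift with no change in patient behavior.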
Motion artifacts are expected in wearable data but must be quantified and managed. Studies that accept raw activity data without artifact analysis overstate the usability of the dataset. Real-time artifact rate monitoring helps identify patients whose activity patterns consistently produce high artifact rates and who may require protocol-specific accommodation.
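Distinguishing a transient noisy session from a systematically noisy patient can be sketched by looking at the fraction of sessions that fail, not single sessions in isolation. Both thresholds below are assumptions for illustration:

```python
SESSION_THRESHOLD = 0.25    # assumed per-session artifact-rate limit
CONSISTENCY_FRACTION = 0.5  # assumed: >50% of sessions failing = systematic


def consistently_noisy(patient_sessions):
    """Patient IDs whose failing-session fraction exceeds the cutoff."""
    flagged = []
    for patient_id, rates in patient_sessions.items():
        failing = sum(1 for r in rates if r > SESSION_THRESHOLD)
        if failing / len(rates) > CONSISTENCY_FRACTION:
            flagged.append(patient_id)
    return flagged


patient_sessions = {
    "P-001": [0.05, 0.10, 0.30, 0.08],  # one noisy session: transient
    "P-002": [0.40, 0.35, 0.28, 0.31],  # systematically noisy
}
print(consistently_noisy(patient_sessions))  # → ['P-002']
```

A transient spike suggests a one-off cause; a consistent pattern is the signal that the patient's circumstances need a protocol-level accommodation.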
Daily or near-daily QC review is the appropriate standard for studies where wearable data supports primary or key secondary endpoints. Weekly QC is insufficient — problems identified a week after they begin have already produced a significant data gap that is harder to recover.
Some artifact-affected data can be algorithmically cleaned during analysis, but the acceptable thresholds for cleaning must be defined in the statistical analysis plan before data lock. Data with artifact rates above the pre-specified threshold cannot be salvaged post hoc.
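The pre-specified rule reduces to a simple classification at analysis time: sessions at or below the SAP threshold enter the cleaning pipeline, sessions above it are excluded. A minimal sketch, with the 15% threshold as an illustrative assumption:

```python
SAP_ARTIFACT_THRESHOLD = 0.15  # assumed value pre-specified in the SAP


def classify_sessions(artifact_rates):
    """Split sessions into cleanable and excluded per the SAP threshold."""
    cleanable = [r for r in artifact_rates if r <= SAP_ARTIFACT_THRESHOLD]
    excluded = [r for r in artifact_rates if r > SAP_ARTIFACT_THRESHOLD]
    return cleanable, excluded


cleanable, excluded = classify_sessions([0.02, 0.10, 0.22, 0.40])
print(len(cleanable), len(excluded))  # → 2 2
```

The point of pre-specification is that this threshold is fixed before data lock; the code merely applies a decision that was already made, which is what keeps the exclusions defensible.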
Regulatory submissions should include documentation of the QC process, thresholds used to identify quality failures, and the recovery actions taken. A documented, prospectively defined QC protocol is much stronger than post-hoc data cleaning explanations.
Delve's signal QC layer monitors data quality in real time — catching sensor contact issues, sync failures, and device health problems before they become endpoint integrity problems.
Talk About Signal QC