A flexible deep learning architecture for temporal sleep stage classification using accelerometry and photoplethysmography

Publication: Contribution to journal › Journal article › Research › peer-reviewed

Standard

A flexible deep learning architecture for temporal sleep stage classification using accelerometry and photoplethysmography. / Olsen, Mads; Zeitzer, Jamie M.; Richardson, Risa N.; Davidenko, Polina; Jennum, Poul J.; Sorensen, Helge B.D.; Mignot, Emmanuel.

In: IEEE Transactions on Biomedical Engineering, Vol. 70, No. 1, 2023, pp. 228-237.

Harvard

Olsen, M, Zeitzer, JM, Richardson, RN, Davidenko, P, Jennum, PJ, Sorensen, HBD & Mignot, E 2023, 'A flexible deep learning architecture for temporal sleep stage classification using accelerometry and photoplethysmography', IEEE Transactions on Biomedical Engineering, vol. 70, no. 1, pp. 228-237. https://doi.org/10.1109/TBME.2022.3187945

APA

Olsen, M., Zeitzer, J. M., Richardson, R. N., Davidenko, P., Jennum, P. J., Sorensen, H. B. D., & Mignot, E. (2023). A flexible deep learning architecture for temporal sleep stage classification using accelerometry and photoplethysmography. IEEE Transactions on Biomedical Engineering, 70(1), 228-237. https://doi.org/10.1109/TBME.2022.3187945

Vancouver

Olsen M, Zeitzer JM, Richardson RN, Davidenko P, Jennum PJ, Sorensen HBD et al. A flexible deep learning architecture for temporal sleep stage classification using accelerometry and photoplethysmography. IEEE Transactions on Biomedical Engineering. 2023;70(1):228-237. https://doi.org/10.1109/TBME.2022.3187945

Author

Olsen, Mads ; Zeitzer, Jamie M. ; Richardson, Risa N. ; Davidenko, Polina ; Jennum, Poul J. ; Sorensen, Helge B.D. ; Mignot, Emmanuel. / A flexible deep learning architecture for temporal sleep stage classification using accelerometry and photoplethysmography. In: IEEE Transactions on Biomedical Engineering. 2023 ; Vol. 70, No. 1. pp. 228-237.

Bibtex

@article{ef9704b9c1174065bcebe01cd2e5447a,
title = "A flexible deep learning architecture for temporal sleep stage classification using accelerometry and photoplethysmography",
abstract = "Wrist-worn consumer sleep technologies (CST) that contain accelerometers (ACC) and photoplethysmography (PPG) are increasingly common and hold great potential to function as out-of-clinic (OOC) sleep monitoring systems. However, very few validation studies exist because raw data from CSTs are rarely made accessible for external use. We present a deep neural network (DNN) with a strong temporal core, inspired by U-Net, that can process multivariate time series inputs with different dimensionality to predict sleep stages (wake, light-, deep-, and REM sleep) using ACC and PPG signals from nocturnal recordings. The DNN was trained and tested on 3 internal datasets, comprising raw data both from clinical and wrist-worn devices from 301 recordings (PSG-PPG: 266, Wrist-worn PPG: 35). External validation was performed on a hold-out test dataset containing 35 recordings comprising only raw data from a wrist-worn CST. An accuracy = 0.71±0.09, 0.76±0.07, 0.73±0.06, and κ = 0.58±0.13, 0.64±0.09, 0.59±0.09 was achieved on the internal test sets. Our experiments show that spectral preprocessing yields superior performance when compared to surrogate-, feature-, raw data-based preparation. Combining both modalities produce the overall best performance, although PPG proved to be the most impactful and was the only modality capable of detecting REM sleep well. Including ACC improved model precision to wake and sleep metric estimation. Increasing input segment size improved performance consistently; the best performance was achieved using 1024 epochs (∼8.5 hrs.). An accuracy = 0.69±0.13 and κ = 0.58±0.18 was achieved on the hold-out test dataset, proving the generalizability and robustness of our approach to raw data collected with a wrist-worn CST.",
keywords = "Classification algorithms, consumer sleep technologies, Decoding, Deep learning, deep learning, Feature extraction, mHealth, Rapid eye movement sleep, Recording, Sleep apnea, sleep stage classification, wrist actigraphy",
author = "Mads Olsen and Zeitzer, {Jamie M.} and Richardson, {Risa N.} and Polina Davidenko and Jennum, {Poul J.} and Sorensen, {Helge B.D.} and Emmanuel Mignot",
note = "Publisher Copyright: IEEE",
year = "2023",
doi = "10.1109/TBME.2022.3187945",
language = "English",
volume = "70",
pages = "228--237",
journal = "IEEE Transactions on Biomedical Engineering",
issn = "0018-9294",
publisher = "Institute of Electrical and Electronics Engineers",
number = "1",

}

RIS

TY - JOUR

T1 - A flexible deep learning architecture for temporal sleep stage classification using accelerometry and photoplethysmography

AU - Olsen, Mads

AU - Zeitzer, Jamie M.

AU - Richardson, Risa N.

AU - Davidenko, Polina

AU - Jennum, Poul J.

AU - Sorensen, Helge B.D.

AU - Mignot, Emmanuel

N1 - Publisher Copyright: IEEE

PY - 2023

Y1 - 2023

N2 - Wrist-worn consumer sleep technologies (CST) that contain accelerometers (ACC) and photoplethysmography (PPG) are increasingly common and hold great potential to function as out-of-clinic (OOC) sleep monitoring systems. However, very few validation studies exist because raw data from CSTs are rarely made accessible for external use. We present a deep neural network (DNN) with a strong temporal core, inspired by U-Net, that can process multivariate time series inputs with different dimensionality to predict sleep stages (wake, light-, deep-, and REM sleep) using ACC and PPG signals from nocturnal recordings. The DNN was trained and tested on 3 internal datasets, comprising raw data both from clinical and wrist-worn devices from 301 recordings (PSG-PPG: 266, Wrist-worn PPG: 35). External validation was performed on a hold-out test dataset containing 35 recordings comprising only raw data from a wrist-worn CST. An accuracy = 0.71±0.09, 0.76±0.07, 0.73±0.06, and κ = 0.58±0.13, 0.64±0.09, 0.59±0.09 was achieved on the internal test sets. Our experiments show that spectral preprocessing yields superior performance compared to surrogate-, feature-, and raw data-based preparation. Combining both modalities produces the overall best performance, although PPG proved to be the most impactful and was the only modality capable of detecting REM sleep well. Including ACC improved model precision for wake detection and sleep metric estimation. Increasing input segment size improved performance consistently; the best performance was achieved using 1024 epochs (∼8.5 hrs.). An accuracy = 0.69±0.13 and κ = 0.58±0.18 was achieved on the hold-out test dataset, proving the generalizability and robustness of our approach to raw data collected with a wrist-worn CST.

AB - Wrist-worn consumer sleep technologies (CST) that contain accelerometers (ACC) and photoplethysmography (PPG) are increasingly common and hold great potential to function as out-of-clinic (OOC) sleep monitoring systems. However, very few validation studies exist because raw data from CSTs are rarely made accessible for external use. We present a deep neural network (DNN) with a strong temporal core, inspired by U-Net, that can process multivariate time series inputs with different dimensionality to predict sleep stages (wake, light-, deep-, and REM sleep) using ACC and PPG signals from nocturnal recordings. The DNN was trained and tested on 3 internal datasets, comprising raw data both from clinical and wrist-worn devices from 301 recordings (PSG-PPG: 266, Wrist-worn PPG: 35). External validation was performed on a hold-out test dataset containing 35 recordings comprising only raw data from a wrist-worn CST. An accuracy = 0.71±0.09, 0.76±0.07, 0.73±0.06, and κ = 0.58±0.13, 0.64±0.09, 0.59±0.09 was achieved on the internal test sets. Our experiments show that spectral preprocessing yields superior performance compared to surrogate-, feature-, and raw data-based preparation. Combining both modalities produces the overall best performance, although PPG proved to be the most impactful and was the only modality capable of detecting REM sleep well. Including ACC improved model precision for wake detection and sleep metric estimation. Increasing input segment size improved performance consistently; the best performance was achieved using 1024 epochs (∼8.5 hrs.). An accuracy = 0.69±0.13 and κ = 0.58±0.18 was achieved on the hold-out test dataset, proving the generalizability and robustness of our approach to raw data collected with a wrist-worn CST.

KW - Classification algorithms

KW - consumer sleep technologies

KW - Decoding

KW - Deep learning

KW - deep learning

KW - Feature extraction

KW - mHealth

KW - Rapid eye movement sleep

KW - Recording

KW - Sleep apnea

KW - sleep stage classification

KW - wrist actigraphy

U2 - 10.1109/TBME.2022.3187945

DO - 10.1109/TBME.2022.3187945

M3 - Journal article

C2 - 35786544

AN - SCOPUS:85134212549

VL - 70

SP - 228

EP - 237

JO - IEEE Transactions on Biomedical Engineering

JF - IEEE Transactions on Biomedical Engineering

SN - 0018-9294

IS - 1

ER -
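The RIS record above follows a fixed tag-value layout (a two-character tag, a dash separator, then the value, terminated by `ER`). As an illustration, here is a minimal Python sketch (standard library only) that parses such a record into a dict; whitespace handling is simplified and multi-line field continuations are ignored:

```python
import re

def parse_ris(text):
    """Parse a single RIS record into a dict mapping tags to lists of values."""
    record = {}
    for line in text.splitlines():
        m = re.match(r"^([A-Z][A-Z0-9])\s+-\s*(.*)$", line.strip())
        if m:
            tag, value = m.group(1), m.group(2).strip()
            if tag == "ER":  # end-of-record marker carries no value
                break
            record.setdefault(tag, []).append(value)
    return record

# Abbreviated version of the record above
ris = """TY  - JOUR
AU  - Olsen, Mads
AU  - Mignot, Emmanuel
PY  - 2023
DO  - 10.1109/TBME.2022.3187945
SN  - 0018-9294
ER  -
"""

rec = parse_ris(ris)
print(rec["AU"])     # ['Olsen, Mads', 'Mignot, Emmanuel']
print(rec["DO"][0])  # 10.1109/TBME.2022.3187945
```

Repeatable tags such as `AU` and `KW` are collected into lists, which is why the dict maps each tag to a list rather than a single string.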
