Database Open Access

Motion and heart rate from a wrist-worn wearable and labeled sleep from polysomnography

Olivia Walch

Published: Oct. 8, 2019. Version: 1.0.0


When using this resource, please cite:
Walch, O. (2019). Motion and heart rate from a wrist-worn wearable and labeled sleep from polysomnography (version 1.0.0). PhysioNet. https://doi.org/10.13026/hmhs-py35.

Additionally, please cite the original publication:

Olivia Walch, Yitong Huang, Daniel Forger, Cathy Goldstein, Sleep stage prediction with raw acceleration and photoplethysmography heart rate data derived from a consumer wearable device, Sleep, zsz180. https://doi.org/10.1093/sleep/zsz180

Please include the standard citation for PhysioNet:
Goldberger, A., Amaral, L., Glass, L., Hausdorff, J., Ivanov, P. C., Mark, R., ... & Stanley, H. E. (2000). PhysioBank, PhysioToolkit, and PhysioNet: Components of a new research resource for complex physiologic signals. Circulation [Online]. 101 (23), pp. e215–e220.

Abstract

This project contains acceleration (in units of g) and heart rate (bpm, measured from photoplethysmography) recorded from the Apple Watch, as well as labeled sleep scored from gold-standard polysomnography. Data were collected at the University of Michigan from June 2017 to March 2019, and there are 31 subjects in total. Code to read and process these files is available on GitHub. The paper corresponding to the work is Walch et al., "Sleep stage prediction with raw acceleration and photoplethysmography heart rate data derived from a consumer wearable device", SLEEP (2019).


Background

There are a number of consumer wearable devices purporting to track sleep on the market; however, the algorithms used to score sleep in these devices are rarely disclosed. On top of that, the raw sensor data from the devices are not usually available for use outside the manufacturer. This limits the usefulness of these devices in research and the clinic. We wrote an app to extract heart rate and accelerometer data from the Apple Watch. We collected data via the Apple Watch using this app from people undergoing polysomnography, as well as in the week leading up to the sleep lab recording. We then looked at the contributions of motion, heart rate, and a proxy for the circadian clock to the ability of classifiers to score sleep.


Methods

Subjects in this trial wore an Apple Watch to collect their ambulatory activity patterns for a week before spending one night in a sleep lab. During that night, acceleration (in g) and heart rate (in beats per minute, bpm) were collected from the Apple Watch while they underwent polysomnography. Each type of data recorded from the Apple Watch and the labeled sleep from polysomnography is saved in a separate file, tagged with a random subject identifier. 

For a full description of the methods, see [1]. To summarize, we recruited 39 subjects from the University of Michigan after approval by the University of Michigan Institutional Review Board. Subjects wore an Apple Watch for 7–14 days to collect their ambulatory steps. On the last day, they spent the night in the lab for an eight-hour sleep opportunity, and we recorded acceleration and heart rate from their Apple Watch while they slept. Sample code to access these sensors on the Apple Watch is available here: https://github.com/ojwalch/sleep_accel. We excluded four people for errors with data transmission, three people for sleep apnea, and one person for REM sleep behavior disorder. In cases where the Watch ran out of battery in the middle of the night, data were cropped to the window where valid collection occurred.
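The cropping step described above amounts to a timestamp filter. As a minimal sketch (the function name and the assumption that timestamps sit in the first column of a NumPy array are ours, not from the paper's code):

```python
import numpy as np

def crop_to_window(records, t_start, t_end):
    """Keep only rows whose timestamp (assumed to be the first column)
    lies within [t_start, t_end], e.g. the interval during which the
    Watch was still recording valid data."""
    t = records[:, 0]
    return records[(t >= t_start) & (t <= t_end)]
```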


Data Description

The following types of data are provided: 

  • motion (acceleration): Recorded from the Apple Watch and saved as txt files with the naming convention '[subject-id-number]_acceleration.txt'

    Each line in this file has the format: date (in seconds since PSG start), x acceleration (in g), y acceleration, z acceleration
     
  • heart rate (bpm): Recorded from the Apple Watch and saved as txt files with the naming convention '[subject-id-number]_heartrate.txt'

    Each line in this file has the format: date (in seconds since PSG start), heart rate (bpm)
     
  • steps (count): Recorded from the Apple Watch and saved in the format '[subject-id-number]_steps.txt'

    Each line in this file has the format: date (in seconds since PSG start), steps (total in bin from this timestamp to next timestamp)
     
  • labeled sleep: Recorded from polysomnography and saved in the format '[subject-id-number]_labeled_sleep.txt'

    Each line in this file has the format: date (in seconds since PSG start), stage (0–5; wake = 0, N1 = 1, N2 = 2, N3 = 3, REM = 5)
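The four file layouts above can be read with a small loader. The sketch below is ours, not part of the released code; it assumes the column layouts described in this section and normalizes commas to whitespace so either delimiter parses:

```python
import numpy as np

def parse_records(lines):
    """Parse one per-subject text file into an (N, k) float array.

    Each line is 'timestamp value [value ...]'; commas are replaced
    with spaces so comma- or whitespace-separated files both work.
    """
    return np.loadtxt([ln.replace(",", " ") for ln in lines], ndmin=2)

def load_subject(data_dir, subject_id):
    """Return (acceleration, heart_rate, steps, labels) for one subject.

    Path layout is an assumption based on the naming conventions above.
    """
    out = []
    for suffix in ("acceleration", "heartrate", "steps", "labeled_sleep"):
        with open(f"{data_dir}/{subject_id}_{suffix}.txt") as f:
            out.append(parse_records(f))
    return tuple(out)
```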

Usage Notes

Python code to process the data is available at https://github.com/ojwalch/sleep_classifiers/. The paper describing this work [1] is available open access.

We encourage others to validate our work and build on it by applying novel analytical methods. In particular, our methods treat each epoch of sleep in isolation, which yields "blips" of sleep, wake, REM, and NREM. The global structure of sleep is not well captured by this approach. Alternative approaches that cluster nearby epochs or incorporate the probability of sleep stage transitions would be great directions for new work. 
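One simple way to suppress such "blips" and incorporate local structure is majority-vote smoothing over a sliding window of per-epoch predictions. This is a sketch of one possible approach, not a method from the paper:

```python
from collections import Counter

def smooth_predictions(stages, half_window=2):
    """Majority-vote smoothing of per-epoch sleep stage predictions.

    Each epoch's label is replaced by the most common label within
    +/- half_window epochs, removing isolated single-epoch 'blips'.
    """
    smoothed = []
    for i in range(len(stages)):
        lo = max(0, i - half_window)
        hi = min(len(stages), i + half_window + 1)
        smoothed.append(Counter(stages[lo:hi]).most_common(1)[0][0])
    return smoothed
```

More principled alternatives include hidden Markov models or conditional random fields, which model stage-transition probabilities explicitly.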

Additionally, our work looked only at Wake vs. Sleep and Wake/NREM/REM classification. We encourage others to explore four- or five-class classification, e.g., Wake/N1/N2/N3/REM or Wake/N1+N2/N3/REM.
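Coarser labelings like these can be derived from the raw stage codes in the labeled-sleep files with a simple mapping; the dictionary names below are ours, and the handling of codes outside the mapping (e.g. unscored epochs) is an assumption:

```python
# Stage codes follow the Data Description section:
# wake = 0, N1 = 1, N2 = 2, N3 = 3, REM = 5.
STAGE_TO_WAKE_NREM_REM = {0: "Wake", 1: "NREM", 2: "NREM", 3: "NREM", 5: "REM"}
STAGE_TO_FOUR_CLASS = {0: "Wake", 1: "N1+N2", 2: "N1+N2", 3: "N3", 5: "REM"}

def collapse(labels, mapping):
    """Collapse raw stage codes to a coarser scheme, dropping any
    epoch whose code is not in the mapping."""
    return [mapping[s] for s in labels if s in mapping]
```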


Acknowledgements

This work was supported by the Exercise & Sport Science Initiative, University of Michigan: Mobile sleep and circadian rhythm assessment for enhanced athletic performance (U056400); M-Cubed: Analyzing light, human sleep and circadian rhythms through smartphones (U049702); and NSF DMS 1714094. Thanks to Mallory Newsted, Jennifer Zollars, and the University of Michigan Sleep and Circadian Research Laboratory for their assistance.


Conflicts of Interest

Dr. Walch has given talks at Unilever events and received honorariums/travel expenses. She is the CEO of Arcascope, a company that makes circadian rhythms software. Arcascope did not sponsor this research. Dr. Goldstein receives royalties from UpToDate.


References

  1. Olivia Walch, Yitong Huang, Daniel Forger, Cathy Goldstein, Sleep stage prediction with raw acceleration and photoplethysmography heart rate data derived from a consumer wearable device, Sleep, zsz180. https://doi.org/10.1093/sleep/zsz180

Access

Access Policy:
Anyone can access the files, as long as they conform to the terms of the specified license.

License (for files):
Open Data Commons Attribution License v1.0


Files

Total uncompressed size: 2.2 GB.

heart_rate/
labels/
motion/
steps/
LICENSE.txt (19.9 KB, 2019-10-08)
SHA256SUMS.txt (11.6 KB, 2019-10-08)