U-Time: A Fully Convolutional Network for Time Series Segmentation Applied to Sleep Staging

Publication: Contribution to book/anthology/report › Conference contribution in proceedings › Research › peer-reviewed

Neural networks are becoming more and more popular for the analysis of physiological time series. The most successful deep learning systems in this domain combine convolutional and recurrent layers to extract useful features and model temporal relations. Unfortunately, these recurrent models are difficult to tune and optimize. In our experience, they often require task-specific modifications, which makes them challenging to use for non-experts. We propose U-Time, a fully feed-forward deep learning approach to physiological time series segmentation developed for the analysis of sleep data. U-Time is a temporal fully convolutional network based on the U-Net architecture that was originally proposed for image segmentation. It maps sequential inputs of arbitrary length to sequences of class labels on a freely chosen temporal scale. This is done by implicitly classifying every individual time-point of the input signal and aggregating these classifications over fixed intervals to form the final predictions. We evaluated U-Time for sleep stage classification on a large collection of sleep electroencephalography (EEG) datasets. In all cases, we found that U-Time reaches or outperforms current state-of-the-art deep learning models, while being much more robust in the training process and without requiring architecture or hyperparameter adaptation across tasks.
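To illustrate the core idea described in the abstract (a temporal U-Net that scores every time-point and then aggregates the dense scores over fixed intervals, e.g. 30-second sleep epochs), the sketch below shows a minimal 1D encoder-decoder in Python/PyTorch. It is an assumption-laden toy model, not the published U-Time architecture or the authors' released implementation: the framework choice, layer widths, kernel sizes, network depth, and the plain average-pooling aggregation step are all illustrative simplifications.

    import torch
    import torch.nn as nn

    class TinyUTimeSketch(nn.Module):
        """Toy 1D encoder-decoder with per-time-point classification and
        fixed-interval aggregation. Illustrative only; not the published U-Time."""

        def __init__(self, in_channels=1, n_classes=5, epoch_len=3000):
            super().__init__()
            self.epoch_len = epoch_len  # samples per scoring epoch (e.g. 30 s at 100 Hz)
            # Encoder: two convolutional blocks with temporal downsampling
            self.enc1 = nn.Sequential(nn.Conv1d(in_channels, 16, 9, padding=4), nn.ReLU())
            self.enc2 = nn.Sequential(nn.Conv1d(16, 32, 9, padding=4), nn.ReLU())
            self.pool = nn.MaxPool1d(2)
            # Decoder: upsample and fuse with the encoder skip connection
            self.up = nn.Upsample(scale_factor=2, mode="nearest")
            self.dec1 = nn.Sequential(nn.Conv1d(32 + 16, 16, 9, padding=4), nn.ReLU())
            # Dense (per-time-point) classification head
            self.head = nn.Conv1d(16, n_classes, kernel_size=1)

        def forward(self, x):                 # x: (batch, channels, time)
            e1 = self.enc1(x)                 # (B, 16, T)
            e2 = self.enc2(self.pool(e1))     # (B, 32, T/2)
            d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))  # (B, 16, T)
            dense = self.head(d1)             # per-time-point class scores: (B, C, T)
            # Aggregate dense scores over fixed intervals to get one prediction
            # per scoring epoch (a simplification of the paper's aggregation step)
            return nn.functional.avg_pool1d(dense, self.epoch_len)  # (B, C, T/epoch_len)

    # Example: 10 scoring epochs of 30 s single-channel EEG sampled at 100 Hz
    model = TinyUTimeSketch()
    scores = model(torch.randn(2, 1, 10 * 3000))
    print(scores.shape)  # torch.Size([2, 5, 10]) -> one score vector per 30 s epoch

Because the network is fully convolutional, the same model accepts inputs of arbitrary length, and changing epoch_len changes the temporal scale of the output labels without retraining the feature extractor, which is the property the abstract highlights.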
Original language: English
Title: Advances in Neural Information Processing Systems 32 (NIPS 2019)
Volume: 32
Publisher: NIPS Proceedings
Publication date: 1 Oct 2019
Pages: 4415-4426
Status: Published - 1 Oct 2019
Event: 33rd Conference on Neural Information Processing Systems (NeurIPS 2019) - Vancouver, Canada
Duration: 8 Dec 2019 - 14 Dec 2019

Conference

Conference: 33rd Conference on Neural Information Processing Systems (NeurIPS 2019)
Country: Canada
City: Vancouver
Period: 08/12/2019 - 14/12/2019
