Multi-Time Attention Networks for Irregularly Sampled Time Series

Satya Narayan Shukla, Benjamin M. Marlin

Abstract

Irregular sampling occurs in many time series modeling applications where it presents a significant challenge to standard deep learning models. This work is motivated by the analysis of physiological time series data in electronic health records, which are sparse, irregularly sampled, and multivariate. In this paper, we propose a new deep learning framework for this setting that we call Multi-Time Attention Networks. Multi-Time Attention Networks learn an embedding of continuous-time values and use an attention mechanism to produce a fixed-length representation of a time series containing a variable number of observations. We investigate the performance of this framework on interpolation and classification tasks using multiple datasets. Our results show that the proposed approach performs as well or better than a range of baseline and recently proposed models while offering significantly faster training times than current state-of-the-art methods.
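The abstract describes two core components: a learned embedding of continuous-time values and an attention mechanism that maps a variable number of irregular observations to a fixed-length representation. The PyTorch sketch below illustrates one way these pieces can fit together. It is a simplified approximation (a single attention head and a shared observation mask across input dimensions), not the authors' implementation, and the class names TimeEmbedding and MultiTimeAttention are illustrative; the official code is in reml-lab/mTAN.

```python
# Minimal sketch of a multi-time attention layer (simplified; see reml-lab/mTAN
# for the official implementation).
import math
import torch
import torch.nn as nn


class TimeEmbedding(nn.Module):
    """Maps scalar time values to d_model-dimensional embeddings using one
    learned linear term and (d_model - 1) learned sinusoidal terms."""

    def __init__(self, d_model: int):
        super().__init__()
        self.linear = nn.Linear(1, 1)              # linear trend component
        self.periodic = nn.Linear(1, d_model - 1)  # learned frequencies/phases

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        # t: (batch, num_times) -> (batch, num_times, d_model)
        t = t.unsqueeze(-1)
        return torch.cat([self.linear(t), torch.sin(self.periodic(t))], dim=-1)


class MultiTimeAttention(nn.Module):
    """Attends from a set of fixed reference time points to the irregularly
    observed time points, producing a fixed-length representation."""

    def __init__(self, d_model: int, d_input: int, d_out: int):
        super().__init__()
        self.embed = TimeEmbedding(d_model)
        self.scale = math.sqrt(d_model)
        self.out = nn.Linear(d_input, d_out)

    def forward(self, ref_times, obs_times, values, mask):
        # ref_times: (batch, num_ref)          fixed reference time points
        # obs_times: (batch, num_obs)          irregular observation times
        # values:    (batch, num_obs, d_input) observed values
        # mask:      (batch, num_obs)          1 where an observation exists
        q = self.embed(ref_times)                  # (batch, num_ref, d_model)
        k = self.embed(obs_times)                  # (batch, num_obs, d_model)
        scores = torch.bmm(q, k.transpose(1, 2)) / self.scale
        scores = scores.masked_fill(mask.unsqueeze(1) == 0, float("-inf"))
        attn = torch.softmax(scores, dim=-1)       # weights over observations
        interp = torch.bmm(attn, values)           # (batch, num_ref, d_input)
        return self.out(interp)                    # fixed-length representation


# Illustrative usage with random data (shapes only, not real EHR inputs):
layer = MultiTimeAttention(d_model=32, d_input=4, d_out=64)
ref = torch.linspace(0, 1, 16).expand(8, -1)       # 16 reference points
obs = torch.rand(8, 50)                            # 50 irregular observations
vals = torch.randn(8, 50, 4)
mask = (torch.rand(8, 50) > 0.2).float()
rep = layer(ref, obs, vals, mask)                  # (8, 16, 64)
```

Because the output is indexed by a fixed set of reference time points rather than by the raw observation times, downstream classifiers or interpolation decoders can consume a fixed-length representation regardless of how many observations each series contains.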

Code Repositories

reml-lab/mTAN (Official, PyTorch)

Benchmarks

Benchmark                                  Methodology   Metrics
time-series-classification-on-physionet   mTAND-Full    AUC: 85.8%
time-series-classification-on-physionet   mTAND-Enc     AUC: 85.4%
