SOM-VAE: Interpretable Discrete Representation Learning on Time Series

Vincent Fortuin; Matthias Hüser; Francesco Locatello; Heiko Strathmann; Gunnar Rätsch

Abstract

High-dimensional time series are common in many domains. Since human cognition is not optimized to work well in high-dimensional spaces, these areas could benefit from interpretable low-dimensional representations. However, most representation learning algorithms for time series data are difficult to interpret. This is due to non-intuitive mappings from data features to salient properties of the representation and non-smoothness over time. To address this problem, we propose a new representation learning framework building on ideas from interpretable discrete dimensionality reduction and deep generative modeling. This framework allows us to learn discrete representations of time series, which give rise to smooth and interpretable embeddings with superior clustering performance. We introduce a new way to overcome the non-differentiability in discrete representation learning and present a gradient-based version of the traditional self-organizing map algorithm that is more performant than the original. Furthermore, to allow for a probabilistic interpretation of our method, we integrate a Markov model in the representation space. This model uncovers the temporal transition structure, improves clustering performance even further and provides additional explanatory insights as well as a natural representation of uncertainty. We evaluate our model in terms of clustering performance and interpretability on static (Fashion-)MNIST data, a time series of linearly interpolated (Fashion-)MNIST images, a chaotic Lorenz attractor system with two macro states, as well as on a challenging real world medical time series application on the eICU data set. Our learned representations compare favorably with competitor methods and facilitate downstream tasks on the real world data.
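The central trick mentioned in the abstract — training through the non-differentiable discrete assignment while keeping the self-organizing map's topology — can be illustrated with a short sketch. The following is a minimal, illustrative PyTorch sketch under our own assumptions, not the authors' implementation (their official code, linked below, is TensorFlow); all names, shapes, and loss weights are hypothetical.

```python
# Minimal illustrative sketch (PyTorch) of a SOM-VAE-style quantization step.
# Hypothetical names and hyperparameters; not the official implementation.
import torch
import torch.nn.functional as F

H, W, D = 8, 8, 64                                    # SOM grid size, latent dim
codebook = torch.randn(H * W, D, requires_grad=True)  # SOM node embeddings

def quantize(z_e):
    """Assign each encoding z_e (B, D) to its nearest SOM node."""
    dists = torch.cdist(z_e, codebook)          # (B, H*W) pairwise distances
    k = dists.argmin(dim=1)                     # winner node per sample
    z_q = codebook[k]                           # quantized encodings (B, D)
    # Straight-through-style gradient copy: argmin is non-differentiable,
    # so the decoder's gradient w.r.t. z_q is routed back onto z_e.
    z_q_st = z_e + (z_q - z_e).detach()
    return z_q, z_q_st, k

def som_loss(z_e, k):
    """Gradient-based analogue of the classic SOM update: pull the grid
    neighbors of each winner toward the (detached) encoding."""
    rows, cols = k // W, k % W
    loss = z_e.new_zeros(())
    for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
        r = (rows + dr).clamp(0, H - 1)
        c = (cols + dc).clamp(0, W - 1)
        loss = loss + F.mse_loss(codebook[r * W + c], z_e.detach())
    return loss

# Sketch of a combined objective for one batch of encodings z_e:
#   z_q, z_q_st, k = quantize(z_e)
#   loss = recon(decode(z_e)) + recon(decode(z_q_st))   # both paths
#          + F.mse_loss(z_e, z_q.detach())              # commitment
#          + som_loss(z_e, k)                           # grid topology
```

In this reading, the full objective reconstructs the input from both the continuous encoding and its quantization, with commitment and SOM terms tying the two together; the probabilistic variant (SOM-VAE-prob in the benchmarks below) additionally fits a Markov transition model over the H*W discrete states to capture temporal structure.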

Code Repositories

KurochkinAlexey/SOM-VAE (PyTorch)
ratschlab/SOM-VAE (official, TensorFlow)
alexwndm/state-detection-somvae (PyTorch)
ai-how/TIme-series-clustering (TensorFlow)
shrra/minisom

Benchmarks

Benchmark: time-series-clustering-on-eicu-collaborative

SOM-VAE
  NMI (physiology_6_hours):  0.0407
  NMI (physiology_12_hours): 0.0444
  NMI (physiology_24_hours): 0.0354

SOM-VAE-prob
  NMI (physiology_6_hours):  0.0474
  NMI (physiology_24_hours): 0.0421
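The scores above are normalized mutual information (NMI) between the learned discrete cluster assignments and physiology-based ground-truth labels at 6-, 12-, and 24-hour horizons. For reference, NMI can be computed as sketched below with scikit-learn; the labels here are dummy data for illustration, not the eICU evaluation pipeline.

```python
# Illustrative NMI computation with scikit-learn (dummy labels, not the
# benchmark's actual evaluation code).
from sklearn.metrics import normalized_mutual_info_score

labels_true = [0, 0, 1, 1, 2, 2]   # e.g. physiology-based ground truth
labels_pred = [5, 5, 3, 3, 3, 7]   # SOM node assigned to each sample
nmi = normalized_mutual_info_score(labels_true, labels_pred)
print(f"NMI = {nmi:.4f}")          # in [0, 1]; higher is better
```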
