iTransformer: Inverted Transformers Are Effective for Time Series Forecasting

Yong Liu, Tengge Hu, Haoran Zhang, Haixu Wu, Shiyu Wang, Lintao Ma, Mingsheng Long

Abstract

The recent boom of linear forecasting models calls into question the ongoing effort devoted to architectural modifications of Transformer-based forecasters. These forecasters use the Transformer to model global dependencies over temporal tokens of a time series, with each token formed from the multiple variates of the same timestamp. However, Transformers struggle to forecast series with larger lookback windows, suffering performance degradation and exploding computation. Moreover, the embedding for each temporal token fuses multiple variates that may represent delayed events and distinct physical measurements, which may fail to learn variate-centric representations and can result in meaningless attention maps. In this work, we reflect on the proper duties of Transformer components and repurpose the Transformer architecture without modifying any basic component. We propose iTransformer, which simply applies the attention and feed-forward network on the inverted dimensions. Specifically, the time points of each individual series are embedded into variate tokens, which the attention mechanism uses to capture multivariate correlations; meanwhile, the feed-forward network is applied to each variate token to learn nonlinear representations. The iTransformer model achieves state-of-the-art results on challenging real-world datasets, further empowering the Transformer family with improved performance, generalization across different variates, and better utilization of arbitrary lookback windows, making it a promising alternative as a fundamental backbone for time series forecasting. Code is available at this repository: https://github.com/thuml/iTransformer.
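The inversion described above is easy to sketch: each variate's entire lookback series becomes one token, so attention mixes information across variates while the feed-forward network acts on each variate token independently. Below is a minimal PyTorch sketch of that idea; the class names (TinyITransformer, InvertedBlock) and all shapes and hyperparameters are hypothetical illustrations, not the official thuml/iTransformer code.

```python
# Minimal sketch of the inverted-token idea, NOT the official implementation.
import torch
import torch.nn as nn

class InvertedBlock(nn.Module):
    """One iTransformer-style block: attention over variate tokens,
    then a per-token feed-forward network."""
    def __init__(self, d_model: int, n_heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ffn = nn.Sequential(
            nn.Linear(d_model, 4 * d_model), nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x):  # x: (batch, n_variates, d_model)
        # Attention mixes information across variates, not across time.
        a, _ = self.attn(x, x, x)
        x = self.norm1(x + a)
        # FFN is applied independently to each variate token.
        return self.norm2(x + self.ffn(x))

class TinyITransformer(nn.Module):
    def __init__(self, lookback: int, horizon: int, d_model: int = 128):
        super().__init__()
        # Inverted embedding: the whole lookback series of ONE variate
        # becomes one token, so the embedding maps time -> model dimension.
        self.embed = nn.Linear(lookback, d_model)
        self.block = InvertedBlock(d_model)
        self.head = nn.Linear(d_model, horizon)

    def forward(self, series):  # series: (batch, lookback, n_variates)
        tokens = self.embed(series.transpose(1, 2))  # (batch, n_variates, d_model)
        tokens = self.block(tokens)
        return self.head(tokens).transpose(1, 2)     # (batch, horizon, n_variates)

# Usage: a 96-step lookback over 7 variates, forecasting 336 steps ahead.
model = TinyITransformer(lookback=96, horizon=336)
out = model(torch.randn(4, 96, 7))  # -> (4, 336, 7)
```

Note how the lookback length only appears in the embedding layer: growing the window enlarges one linear map rather than the attention cost, which is quadratic in the number of tokens (here, variates rather than timestamps).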

Code Repositories

hughxx/tsf-new-paper-taste (PyTorch)
sanjaylopa22/QCAAPatchTF (PyTorch)
Hannibal046/GridTST (PyTorch)
taohan10200/weather-5k (PyTorch)
thuml/Time-Series-Library (PyTorch)
thuml/iTransformer (Official, PyTorch)
lss-1138/SegRNN (PyTorch)
kwuking/TimeMixer (PyTorch)
master-plc/fredf (PyTorch)
lucidrains/iTransformer (PyTorch)

Benchmarks

Benchmark: time-series-forecasting-on-etth1-336-1
Methodology: iTransformer
Metrics: MAE 0.458, MSE 0.487
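For reference, the two reported metrics are plain elementwise errors averaged over all forecast steps and variates. A minimal sketch, assuming PyTorch tensors of shape (batch, horizon, variates); the tensors below are random placeholders, not benchmark data.

```python
import torch

def mae(pred, target):
    # Mean absolute error over all timesteps and variates.
    return (pred - target).abs().mean()

def mse(pred, target):
    # Mean squared error over all timesteps and variates.
    return ((pred - target) ** 2).mean()

# Hypothetical forecasts and ground truth: batch 32, horizon 336, 7 variates.
pred = torch.randn(32, 336, 7)
target = torch.randn(32, 336, 7)
print(f"MAE: {mae(pred, target):.3f}, MSE: {mse(pred, target):.3f}")
```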
