How Features Benefit: Parallel Series Embedding for Multivariate Time Series Forecasting with Transformer

Zonglin Lyu, Xuande Feng

Abstract

Forecasting time series is an engaging and vital mathematical topic. Theories and applications in related fields have been studied for decades, and deep learning has provided reliable tools in recent years. The Transformer, capable of capturing longer sequence dependencies, has been exploited as a powerful architecture in time series forecasting. While existing work has mainly contributed to breaking the memory bottleneck of the Transformer, how to effectively leverage multivariate time series remains barely explored. In this work, a novel architecture utilizing a primary Transformer is proposed to conduct multivariate time series prediction. Our proposed architecture has two main advantages. Firstly, it accurately predicts multivariate time series with shorter or longer sequence lengths and steps. We benchmark our proposed model against various baseline architectures on real-world datasets, and our model improves their performance significantly. Secondly, it can easily be leveraged in Transformer-based variants.
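The paper's exact parallel series embedding is not reproduced on this page, so the sketch below is only an assumption of the general idea: each variate of the multivariate input is embedded independently ("in parallel") and the resulting token sequence is fed to a standard Transformer encoder. All class names, parameter names, and the sum-based fusion step are illustrative, not taken from the paper, and positional encoding is omitted for brevity.

```python
import torch
import torch.nn as nn

class ParallelSeriesEmbedding(nn.Module):
    """Embed each variate of a multivariate series with its own projection (hypothetical)."""
    def __init__(self, num_series: int, d_model: int):
        super().__init__()
        # One linear projection per variate, applied pointwise along the time axis.
        self.projections = nn.ModuleList(
            nn.Linear(1, d_model) for _ in range(num_series)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, num_series)
        embedded = [proj(x[..., i:i + 1]) for i, proj in enumerate(self.projections)]
        # -> (batch, seq_len, num_series, d_model)
        return torch.stack(embedded, dim=2)

class ParallelSeriesTransformer(nn.Module):
    """A primary Transformer encoder over parallel per-series embeddings (sketch)."""
    def __init__(self, num_series: int, pred_len: int,
                 d_model: int = 64, nhead: int = 4, num_layers: int = 2):
        super().__init__()
        self.num_series, self.pred_len = num_series, pred_len
        self.embedding = ParallelSeriesEmbedding(num_series, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        # Map the final hidden state onto the full forecast horizon.
        self.head = nn.Linear(d_model, pred_len * num_series)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        tokens = self.embedding(x).sum(dim=2)  # fuse variates: (batch, seq_len, d_model)
        hidden = self.encoder(tokens)          # (batch, seq_len, d_model)
        out = self.head(hidden[:, -1])         # decode from the last time step
        return out.view(-1, self.pred_len, self.num_series)

# Usage: ETTh1 has 7 variates; forecast 720 steps from a 96-step input window.
model = ParallelSeriesTransformer(num_series=7, pred_len=720)
y_hat = model(torch.randn(8, 96, 7))           # -> (8, 720, 7)
```

Keeping one projection per variate preserves series-specific scale and dynamics before the shared encoder mixes information across time; how the paper actually fuses the parallel embeddings may differ from the sum used here.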

Benchmarks

Benchmark: time-series-forecasting-on-etth1-720-2
Methodology: Parallel Series Transformer
Metrics: MAE 0.286, MSE 0.129
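MAE and MSE here are the usual pointwise forecast errors, averaged over all horizon steps and variates (benchmarks on the ETT datasets typically report them on standardized series). A minimal sketch of the standard definitions:

```python
import numpy as np

def mae(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    # Mean absolute error over all forecast steps and variates.
    return float(np.mean(np.abs(y_true - y_pred)))

def mse(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    # Mean squared error over all forecast steps and variates.
    return float(np.mean((y_true - y_pred) ** 2))
```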
