HyperAI

A New Deep Learning Architecture with Inductive Bias Balance for Transformer Oil Temperature Forecasting

Francisco Martínez-Álvarez, Gualberto Asencio-Cortés, María Martínez-Ballesteros, Manuel Jesús Jiménez-Navarro

Abstract

Ensuring optimal performance of power transformers is a laborious task, in which the insulation system is essential to slow their deterioration. The insulation system relies on insulating oil to control temperature. High temperatures may reduce the lifetime of transformers, leading to expensive maintenance. Deep learning architectures have been shown to obtain remarkable results in a wide range of fields. However, this improvement usually comes with an increase in computing resources, which increases the carbon footprint and hinders the optimization of the architectures. In this work, we develop a new deep learning architecture whose efficacy competes with the best current architectures in transformer oil temperature forecasting while improving efficiency. Effective forecasting can help prevent high temperatures and monitor the future condition of power transformers, avoiding unnecessary waste. We attempt to balance the inductive bias included in our architecture through the proposed Smooth Residual Block. This mechanism divides the original problem into multiple subproblems, obtaining different representations of the time series, which collaboratively produce the final forecast. Our architecture is applied to the Electricity Transformer datasets, which contain insulating oil temperature measurements from two transformers in China. The results achieve a 13% improvement in MSE and a 57% improvement in performance compared to, to the best of our knowledge, the best current architectures. Additionally, we analyze the behavior learned by this architecture to obtain an intuitive interpretation of the achieved solution.
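
To make the "divide into subproblems, forecast collaboratively" idea concrete, below is a minimal PyTorch sketch of a residual-style stack in which each block smooths its input, emits a partial forecast, and passes the unexplained residual to the next block. The names (SmoothResidualBlock, SmoothResidualNet), the moving-average smoother, and all sizes are illustrative assumptions, not the authors' exact design as described in the paper.

```python
# Hypothetical sketch: each block forms a smoothed view of its input window,
# contributes a partial forecast, and forwards the residual so that later
# blocks model what remains. All design choices here are assumptions.
import torch
import torch.nn as nn


class SmoothResidualBlock(nn.Module):
    def __init__(self, input_len: int, horizon: int, kernel_size: int = 5):
        super().__init__()
        # Moving-average smoothing of the input window (illustrative choice).
        self.smooth = nn.AvgPool1d(kernel_size, stride=1,
                                   padding=kernel_size // 2,
                                   count_include_pad=False)
        # Map the smoothed window to this block's partial forecast.
        self.forecast = nn.Linear(input_len, horizon)
        # Reconstruct the part of the input this block explains.
        self.backcast = nn.Linear(input_len, input_len)

    def forward(self, x):
        # x: (batch, input_len) univariate oil-temperature window
        smoothed = self.smooth(x.unsqueeze(1)).squeeze(1)
        partial = self.forecast(smoothed)        # this block's forecast
        residual = x - self.backcast(smoothed)   # left for later blocks
        return partial, residual


class SmoothResidualNet(nn.Module):
    def __init__(self, input_len: int, horizon: int, n_blocks: int = 3):
        super().__init__()
        self.blocks = nn.ModuleList(
            [SmoothResidualBlock(input_len, horizon) for _ in range(n_blocks)]
        )

    def forward(self, x):
        total = 0.0
        for block in self.blocks:
            partial, x = block(x)
            total = total + partial   # blocks collaborate on the final forecast
        return total


# Example: forecast a 24-step horizon from a 96-step oil-temperature window.
model = SmoothResidualNet(input_len=96, horizon=24)
window = torch.randn(8, 96)
print(model(window).shape)  # torch.Size([8, 24])
```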

Benchmarks

Benchmark                              | Methodology | Metrics
time-series-forecasting-on-etth1-24-2  | SRCNet      | MAE: 0.129
time-series-forecasting-on-etth2-24-2  | SRCNet      | MAE: 0.193
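
The MAE values above are mean absolute errors between predicted and observed oil temperature over the forecast horizon on the ETTh1/ETTh2 benchmarks. As a reference, the metric itself can be computed as in the short sketch below (the toy arrays stand in for real model predictions and test-set targets, which are not shown here).

```python
import numpy as np

def mae(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Mean absolute error over all forecast points."""
    return float(np.mean(np.abs(y_true - y_pred)))

# Toy example with a 24-step horizon; a real evaluation would use the
# ETTh1/ETTh2 test splits and the model's actual forecasts.
y_true = np.random.rand(100, 24)
y_pred = y_true + 0.1 * np.random.randn(100, 24)
print(round(mae(y_true, y_pred), 3))
```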

