
Residual Network


A residual network (ResNet) starts from a plain network and converts it into a corresponding residual version by inserting shortcut connections. Rather than fitting the target mapping directly, each block fits the residual.

The basic idea of ResNet is to introduce shortcut connections, which make the network easier to optimize. A stack of layers wrapped by a shortcut connection is called a residual block (in the original figure, the shortcut connection is the arrow from the input x to the addition node ⨁).


If the plain model searches for parameters of a direct mapping F(x), the residual network instead searches in the space of x + f(x): the shortcut carries the identity x, so the layers only need to learn the residual f(x).
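The idea above can be sketched in a few lines of plain Python. This is a hypothetical toy example (an element-wise "layer" standing in for real convolutions), not the original paper's code:

```python
# Toy residual block: the branch computes the residual f(x) = w*x + b
# element-wise, and the shortcut adds the input x back to the output.

def residual_block(x, weights, bias):
    """Return f(x) + x for a toy residual branch f(x) = w*x + b."""
    fx = [w * xi + bias for w, xi in zip(weights, x)]   # residual branch f(x)
    return [f + xi for f, xi in zip(fx, x)]             # shortcut addition

# With all-zero weights the residual branch outputs f(x) = 0, so the
# block reduces to the identity mapping -- which is exactly why a
# residual of zero is easy to learn.
print(residual_block([1.0, 2.0], [0.0, 0.0], 0.0))  # -> [1.0, 2.0]
```

Note that making the block an identity only requires driving the weights toward zero, whereas a plain layer would have to learn the identity mapping explicitly.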

Design rules of the residual network:

For the network body, there are two main rules:

  • Layers that produce feature maps of the same size use the same number of convolution kernels;
  • When the feature-map size is halved, the number of feature maps is doubled.

When the shortcut and the residual branch have different dimensions, there are two options for the shortcut connection:

  • Zero-pad the extra dimensions to match;
  • Use a projection shortcut (a 1×1 convolution).

Advantages of residual networks:

  • Easy to train and optimize
  • Accuracy improves nearly linearly as the network is made deeper
  • Learned features transfer well to other tasks
  • Networks with over 1,000 layers can still be trained
  • The identity shortcut lets gradients flow backward unimpeded, mitigating the vanishing-gradient problem
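The last point can be made concrete with scalar derivatives. For y = x + f(x) we have dy/dx = 1 + f′(x), so the identity path always contributes 1 to the product of layer gradients. A minimal illustrative example, assuming each layer's local gradient is a small constant:

```python
# Why shortcuts fight vanishing gradients: compare the end-to-end
# gradient of stacked plain layers vs. stacked residual blocks.

def grad_plain(fprimes):
    """Chain rule through plain layers: product of f'(x) terms."""
    g = 1.0
    for fp in fprimes:
        g *= fp
    return g

def grad_residual(fprimes):
    """Chain rule through residual blocks: product of (1 + f'(x))."""
    g = 1.0
    for fp in fprimes:
        g *= 1.0 + fp
    return g

small = [0.1] * 10           # ten layers, each with a tiny local gradient
print(grad_plain(small))     # 0.1**10 = 1e-10: the gradient vanishes
print(grad_residual(small))  # 1.1**10 ~ 2.59: the gradient survives
```

The plain product collapses toward zero as depth grows, while the residual product stays bounded away from zero because every factor is at least 1 + f′(x).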

Applications of residual networks:

Visual recognition, image generation, natural language processing, speech recognition, advertising, and user behavior prediction.
