Rectified Linear Unit

The Rectified Linear Unit (ReLU), also called the linear rectification function, is a commonly used activation function in artificial neural networks. It usually refers to the nonlinear ramp function and its variants.

Features of the Rectified Linear Unit

Commonly used rectified linear units include the ramp function f(x) = max(0, x) and the leaky rectifier function (Leaky ReLU), which replaces the zero output for negative inputs with a small linear slope; here x denotes the input to the neuron.
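As a concrete illustration, the sketch below implements both functions with NumPy; the slope alpha = 0.01 used for Leaky ReLU is an assumed, commonly used default rather than a value given in this article.

```python
import numpy as np

def relu(x):
    # Ramp function: passes positive inputs through, clamps negative inputs to 0.
    return np.maximum(0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky variant: negative inputs are scaled by a small slope alpha (assumed default 0.01).
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))        # [0.  0.  0.  1.5 3. ]
print(leaky_relu(x))  # [-0.02  -0.005  0.     1.5    3.   ]
```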

The rectified linear unit is believed to have some biological plausibility. Because it usually works better in practice than other commonly used activation functions, it is widely used in deep neural networks, including in computer vision tasks such as image recognition.

As a commonly used activation function in neural networks, ReLU retains the biological inspiration of the step function. When the input is positive, the derivative is nonzero, which allows gradient-based learning. When the input is negative, however, learning may slow down or the neuron may even die outright: because the gradient is zero whenever the input is below zero, the weights stop being updated and the neuron stays silent for the rest of training.
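The zero gradient for negative inputs can be read directly from ReLU's derivative. The following minimal sketch uses a hypothetical single-weight neuron with a made-up weight, learning rate, and upstream gradient of 1, purely to illustrate why a negative pre-activation produces no weight update.

```python
def relu_grad(z):
    # Derivative of ReLU: 1 for positive pre-activations, 0 otherwise
    # (undefined at exactly 0; taking 0 there is the usual convention).
    return 1.0 if z > 0 else 0.0

# Hypothetical single-weight neuron with pre-activation z = w * x.
w, lr = 0.5, 0.1                    # made-up weight and learning rate
for x in (2.0, -3.0):
    z = w * x
    grad_w = relu_grad(z) * x       # upstream gradient assumed to be 1
    print(f"x={x:+.1f}  z={z:+.2f}  weight update={-lr * grad_w:+.3f}")

# x=+2.0  z=+1.00  weight update=-0.200
# x=-3.0  z=-1.50  weight update=+0.000  (zero gradient: the weight never changes)
```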

Related term: activation function
Sub-terms: ramp function, leaky rectifier function (Leaky ReLU)
