
Rectified Linear Unit


The Rectified Linear Unit (ReLU), also known as the rectifier function, is an activation function commonly used in artificial neural networks. The term usually refers to the ramp function and its variants.

Features of the Rectified Linear Unit

The most commonly used ReLU is the ramp function f(x) = max(0, x), where x is the input to the neuron. A popular variant is the leaky rectifier function (Leaky ReLU), which replaces the zero slope on the negative side with a small constant slope.
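The two functions above can be sketched in a few lines of NumPy. This is a minimal illustration, not a reference implementation; the slope value 0.01 for Leaky ReLU is a common default, not a value prescribed by this article.

```python
import numpy as np

def relu(x):
    # Ramp function: f(x) = max(0, x), applied element-wise
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: a small slope alpha for negative inputs instead of zero
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))        # [0.  0.  0.  1.5]
print(leaky_relu(x))  # [-0.02  -0.005  0.  1.5]
```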

Linear rectification is believed to have some biological plausibility, and because it usually performs better in practice than other common activation functions (such as the logistic function), it is widely used in today's deep neural networks, for example in computer-vision tasks such as image recognition.

ReLU is the most commonly used activation function in neural networks. It retains the biological inspiration of the step function (a neuron fires only when its input exceeds a threshold), but its derivative is nonzero for positive inputs, which permits gradient-based learning (although the derivative is undefined at x = 0).
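The derivative described above is simply an indicator on the sign of the input. A small sketch, using the common convention of assigning the value 0 at the undefined point x = 0:

```python
import numpy as np

def relu_grad(x):
    # Derivative of ReLU: 1 where x > 0, 0 where x < 0.
    # At x = 0 the derivative is undefined; by convention we return 0 here.
    return (x > 0).astype(float)

x = np.array([-2.0, 0.0, 3.0])
print(relu_grad(x))  # [0. 0. 1.]
```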

This function is very fast to compute, since neither the function nor its derivative involves complex mathematical operations. However, when the input is negative, learning can become very slow, or the neuron can "die" entirely: because the gradient is zero for inputs less than zero, the neuron's weights receive no updates and it may remain silent for the rest of training.

Related words: activation function
Sub-words: ramp function, leaky rectifier function

