
Threshold Logic Unit


The threshold logic unit (TLU) is the basic building block of a neural network; its structure is shown in the schematic diagram below.

[Figure: schematic diagram of a TLU]

Each input is multiplied by its corresponding weight and the products are summed. If the sum exceeds the TLU's threshold, the output is 1; otherwise the output is 0. A single TLU can perform only simple computations; to form a neural network, multiple components such as TLUs must be combined, as shown in the sketch below.
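In symbols, the TLU outputs 1 if the weighted sum of its inputs exceeds the threshold θ, and 0 otherwise. The following is a minimal Python sketch of this rule; the function name `tlu` and the AND-gate weights and threshold in the example are illustrative assumptions, not taken from the original article.

```python
# Minimal sketch of a threshold logic unit.
# The weights, threshold, and AND-gate example are illustrative
# assumptions, not from the original article.

def tlu(inputs, weights, threshold):
    """Return 1 if the weighted sum of inputs exceeds the threshold, else 0."""
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))
    return 1 if weighted_sum > threshold else 0

# Example: a TLU acting as a two-input AND gate
# (weights 1, 1 and threshold 1.5 are one possible choice).
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", tlu([a, b], [1, 1], 1.5))
# Outputs 1 only for the input pair (1, 1).
```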

Structurally, the threshold logic unit forms a two-layer network of neurons: the input layer receives external signals and passes them to the output layer, where the activation (threshold) function transforms them into the output. This thresholding behavior is the origin of the name "threshold logic unit." A sketch of a small two-layer arrangement follows.
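To illustrate how TLUs combine into a network, the sketch below wires two first-layer TLUs into a second-layer TLU to compute XOR, a function no single TLU can represent. The XOR example and all weights and thresholds here are illustrative assumptions, not from the original article.

```python
# Minimal sketch of combining TLUs into a two-layer network.
# Illustrative assumption: computing XOR by feeding an OR-like TLU
# and a NAND-like TLU into an AND-like output TLU.

def tlu(inputs, weights, threshold):
    """Return 1 if the weighted sum of inputs exceeds the threshold, else 0."""
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))
    return 1 if weighted_sum > threshold else 0

def xor_network(a, b):
    # Layer 1: an OR-like TLU and a NAND-like TLU.
    h1 = tlu([a, b], [1, 1], 0.5)     # fires if a OR b
    h2 = tlu([a, b], [-1, -1], -1.5)  # fires unless both a AND b
    # Layer 2: an AND-like TLU combines the two hidden outputs.
    return tlu([h1, h2], [1, 1], 1.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_network(a, b))
# Outputs: 0 0 -> 0, 0 1 -> 1, 1 0 -> 1, 1 1 -> 0
```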
