
Stochastic Gradient Descent


Stochastic Gradient Descent (SGD) is a variant of the gradient descent algorithm that addresses some of its drawbacks. In stochastic gradient descent, only one training example is used to update the parameters in each iteration.
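Written as update rules, the contrast with batch gradient descent is that the gradient is taken on a single randomly chosen sample rather than averaged over the whole training set. This is a standard formulation; the symbols θ (parameters), η (learning rate), and Lᵢ (loss on sample i) are not notation from the original entry:

```latex
% Batch gradient descent: step along the average gradient over all n samples
\theta \leftarrow \theta - \eta \, \frac{1}{n} \sum_{i=1}^{n} \nabla_\theta L_i(\theta)

% Stochastic gradient descent: step along the gradient of one random sample
\theta \leftarrow \theta - \eta \, \nabla_\theta L_i(\theta),
\qquad i \sim \mathrm{Uniform}\{1, \dots, n\}
```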

Stochastic Gradient Descent Features

  • Advantages: fast training speed
  • Disadvantages: reduced accuracy, no guarantee of reaching the global optimum, difficult to parallelize

Batch gradient descent minimizes the loss function over all training samples, so the final solution is the global optimum; that is, the solved parameters minimize the risk function. Stochastic gradient descent instead minimizes the loss function of each individual sample. Although a single iteration does not move exactly toward the global optimum, the overall direction does, and the final result usually lands near the global optimum.
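A minimal NumPy sketch of this behavior for least-squares linear regression. The model, learning rate, and synthetic data below are illustrative assumptions, not part of the original entry:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 3x + 1 plus noise (illustrative only)
X = rng.uniform(-1, 1, size=(200, 1))
y = 3.0 * X[:, 0] + 1.0 + rng.normal(scale=0.1, size=200)

w, b = 0.0, 0.0  # parameters to learn
eta = 0.05       # learning rate

for epoch in range(20):
    for i in rng.permutation(len(X)):  # one random sample per update
        pred = w * X[i, 0] + b
        err = pred - y[i]              # gradient factor of 0.5 * err**2
        w -= eta * err * X[i, 0]
        b -= eta * err

print(f"w ~ {w:.2f}, b ~ {b:.2f}")  # noisy, but close to (3, 1)
```

Each update is cheap (one sample instead of the full dataset), which is the source of the speed advantage; the noise in the per-sample gradient is what makes the trajectory wander around, rather than converge exactly to, the optimum.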

Parent term: Gradient Descent
Related terms: batch gradient descent, mini-batch gradient descent
