Source: wikibot/stochastic-gradient-descent

= Stochastic gradient descent
{wiki=Stochastic_gradient_descent}

Stochastic Gradient Descent (SGD) is an optimization algorithm commonly used for training machine learning models, particularly neural networks. The main goal of SGD is to minimize a loss function, which measures how well a model predicts the desired output.

== Key concepts of stochastic gradient descent

1. **Gradient descent**: At a high level, gradient descent is an optimization technique that iteratively adjusts the parameters of a model to minimize the loss function, stepping each parameter in the direction opposite its gradient; a sketch of the update appears below.
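
The following is a minimal sketch of the SGD update for a linear model with squared-error loss. The synthetic data, the learning rate of 0.05, and the variable names are illustrative choices, not prescribed by the algorithm itself:

```python
import numpy as np

# Minimal SGD for linear regression with squared-error loss.
# Model: y_hat = w @ x + b; per-example loss: 0.5 * (y_hat - y)**2

rng = np.random.default_rng(0)

# Synthetic data: y = 3*x0 - 2*x1 + 1 + noise (illustrative assumption)
X = rng.normal(size=(200, 2))
y = X @ np.array([3.0, -2.0]) + 1.0 + 0.1 * rng.normal(size=200)

w = np.zeros(2)   # model weights
b = 0.0           # model bias
learning_rate = 0.05

for epoch in range(20):
    # "Stochastic": update on one example at a time, in a fresh random order.
    for i in rng.permutation(len(X)):
        x_i, y_i = X[i], y[i]
        error = (w @ x_i + b) - y_i        # d(loss)/d(y_hat)
        w -= learning_rate * error * x_i   # gradient of loss w.r.t. w
        b -= learning_rate * error         # gradient of loss w.r.t. b

print(w, b)  # converges near [3, -2] and 1
```

Each inner-loop iteration uses the gradient from a single example rather than the full dataset, which is what distinguishes SGD from (batch) gradient descent: the updates are noisier but far cheaper per step.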