Source: wikibot/hamilton-jacobi-bellman-equation
= Hamilton–Jacobi–Bellman equation
{wiki=Hamilton–Jacobi–Bellman_equation}
The Hamilton–Jacobi–Bellman (HJB) equation is a fundamental partial differential equation in optimal control theory and dynamic programming. Its solution is the value function of the control problem, and when it holds over the whole state space with a continuously differentiable value function, it provides a necessary and sufficient condition for an optimal control policy; its discrete-time counterpart is the Bellman equation.

== Context

In many control problems, the goal is to find a control strategy that minimizes (or maximizes) a cumulative cost over time. A standard continuous-time formulation is sketched below.
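As a minimal sketch, using one common textbook convention (the symbols \(F\) for the dynamics, \(C\) for the running cost, \(D\) for the terminal cost, and \(V\) for the value function are this sketch's choice, not fixed by the article): the state \(x(t)\) evolves as \(\dot{x}(t) = F(x(t), u(t))\) from \(x(0) = x_0\) under a control \(u(t)\), over a finite horizon \([0, T]\). The value function is the least achievable cost-to-go from state \(x\) at time \(t\),

\[
V(x, t) \;=\; \min_{u(\cdot)} \left\{ \int_t^T C\bigl(x(s), u(s)\bigr)\,ds \;+\; D\bigl(x(T)\bigr) \right\},
\]

and the HJB equation says that \(V\) satisfies, backward in time from the terminal condition \(V(x, T) = D(x)\),

\[
\frac{\partial V}{\partial t}(x, t) \;+\; \min_{u} \left\{ \nabla_x V(x, t) \cdot F(x, u) \;+\; C(x, u) \right\} \;=\; 0 .
\]

The minimizing \(u\) at each \((x, t)\) recovers the optimal feedback control, which is what makes solving this PDE equivalent to solving the control problem.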