Hamilton–Jacobi–Bellman equation

ID: hamilton-jacobi-bellman-equation

The Hamilton–Jacobi–Bellman (HJB) equation is a fundamental partial differential equation in optimal control theory and dynamic programming. It provides a necessary condition for an optimal control policy in a dynamic optimization problem, and under suitable regularity assumptions it is also sufficient.

### Context

In many control problems, we aim to find a control strategy that minimizes (or maximizes) a cost functional over time.
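
To make this concrete, consider a standard finite-horizon formulation (the notation below is introduced here for illustration): state dynamics $\dot{x} = f(x, u)$, running cost $C(x, u)$, and terminal cost $D(x)$. The value function $V(x, t)$, the minimal cost-to-go from state $x$ at time $t$, then satisfies the HJB equation

$$
\frac{\partial V}{\partial t}(x, t) + \min_{u} \left\{ C(x, u) + \nabla_x V(x, t) \cdot f(x, u) \right\} = 0,
\qquad V(x, T) = D(x).
$$

As a minimal sanity check, here is a sketch for a scalar infinite-horizon linear-quadratic problem, where the quadratic ansatz $V(x) = p\,x^2$ reduces the HJB equation to a scalar algebraic Riccati equation. The parameters `a`, `b`, `q`, `r` are illustrative assumptions, not values from any particular source:

```python
import numpy as np

# Hypothetical scalar LQR problem, used only for illustration:
#   dynamics: dx/dt = a*x + b*u
#   cost:     J = integral of (q*x^2 + r*u^2) dt   (infinite horizon)
# With the ansatz V(x) = p*x^2, the HJB equation reduces to the
# scalar algebraic Riccati equation: (b^2 / r) * p^2 - 2*a*p - q = 0.

a, b, q, r = 1.0, 1.0, 1.0, 1.0  # illustrative parameters (assumptions)

# Positive root of the Riccati equation gives the valid value function.
p = r * (a + np.sqrt(a**2 + b**2 * q / r)) / b**2

# The minimizer of the HJB right-hand side gives the optimal feedback:
#   u*(x) = -(b * p / r) * x
k = b * p / r
print(f"value coefficient p = {p:.4f}, optimal gain k = {k:.4f}")

# Sanity check: the Riccati (i.e. reduced HJB) residual should be ~0.
residual = (b**2 / r) * p**2 - 2 * a * p - q
print(f"Riccati residual = {residual:.2e}")
```

For the parameters above this yields $p = 1 + \sqrt{2} \approx 2.4142$ and a residual of zero up to floating-point error, confirming that the quadratic ansatz solves the reduced HJB equation.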
