5 Jul 2023
Applied Optimal Control (01) - Unconstrained Optimization Basics (Nec/Suf Conditions, Gradient Descent)
Based on the lecture notes by Dr. Marin Kobilarov on “Applied Optimal Control (2021 Fall)” at Johns Hopkins University
Key points:
Optimality Conditions
Necessary conditions
If $x^*$ is a local minimum of $f(x)$, then $\nabla f(x^*) = 0$ and $\nabla^2 f(x^*)$ is p.s.d.
To tell whether $\nabla^2 f(x^*)$ is p.s.d., check that all the eigenvalues of $\nabla^2 f(x^*)$ are non-negative.
Sufficient conditions
$x^*$ is a local minimum of $f(x)$ if $\nabla f(x^*) = 0$ and $\nabla^2 f(x^*)$ is p.d.
To tell whether $\nabla^2 f(x^*)$ is p.d., check that all the eigenvalues of $\nabla^2 f(x^*)$ are positive.
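As a quick illustration of these eigenvalue checks, here is a minimal sketch (assuming NumPy; the toy quadratic $f(x) = x_0^2 + 3x_1^2$ is an illustrative example, not from the notes) that tests the stationarity and Hessian conditions at a candidate point:

```python
import numpy as np

# Toy example (assumed, not from the notes): classify a stationary point of
# f(x) = x0^2 + 3*x1^2 by checking the gradient and the Hessian eigenvalues.

def grad_f(x):
    return np.array([2.0 * x[0], 6.0 * x[1]])

def hess_f(x):
    return np.array([[2.0, 0.0],
                     [0.0, 6.0]])

x_star = np.zeros(2)

g = grad_f(x_star)
H = hess_f(x_star)
eigvals = np.linalg.eigvalsh(H)   # Hessian is symmetric, so eigvalsh is appropriate

stationary = np.allclose(g, 0.0)
psd = np.all(eigvals >= 0)        # necessary condition: eigenvalues non-negative
pd = np.all(eigvals > 0)          # sufficient condition: eigenvalues strictly positive

print(f"gradient zero: {stationary}, eigenvalues: {eigvals}")
print(f"p.s.d. (necessary): {psd}, p.d. (sufficient): {pd}")
```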
Numerical Solutions: gradient-based methods
Gradient descent
$x_{k+1} = x_k + \alpha_k d_k$
$d_k$ is the descent direction, s.t. $\nabla f(x_k)^T d_k < 0$
Steepest descent: $d_k = -\nabla f(x_k)$
Newton’s method: $d_k = -(\nabla^2 f(x_k))^{-1} \nabla f(x_k)$
Regularized Newton’s method: $d_k = -(\nabla^2 f(x_k) + \lambda I)^{-1} \nabla f(x_k)$
$\alpha_k$ is the step size, chosen s.t. $f(x_k + \alpha_k d_k) < f(x_k)$; common choices:
Exact line search: minimize $f(x_k + \alpha_k d_k)$ w.r.t. $\alpha_k$
Successive stepsize reduction (Armijo rule): start from an initial stepsize and repeatedly shrink it by a fixed factor until a sufficient-decrease condition holds (see the sketch after this list)
Constant stepsize
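To make the update rule concrete, here is a minimal sketch of steepest descent with Armijo backtracking (assuming NumPy; the toy objective and the constants sigma and beta are illustrative choices, not values from the lecture notes):

```python
import numpy as np

# Minimal sketch (assumed example): steepest descent with Armijo backtracking
# on f(x) = x0^2 + 3*x1^2. sigma (sufficient decrease) and beta (shrink factor)
# are typical illustrative values.

def f(x):
    return x[0]**2 + 3.0 * x[1]**2

def grad_f(x):
    return np.array([2.0 * x[0], 6.0 * x[1]])

def armijo_step(x, d, alpha0=1.0, sigma=1e-4, beta=0.5):
    """Shrink alpha until f(x + alpha*d) <= f(x) + sigma*alpha*grad_f(x)^T d."""
    alpha = alpha0
    fx = f(x)
    slope = grad_f(x) @ d            # negative for a descent direction
    while f(x + alpha * d) > fx + sigma * alpha * slope:
        alpha *= beta
    return alpha

x = np.array([2.0, -1.5])
for k in range(50):
    g = grad_f(x)
    if np.linalg.norm(g) < 1e-8:     # stop near a stationary point
        break
    d = -g                            # steepest-descent direction
    alpha = armijo_step(x, d)
    x = x + alpha * d

print(f"x ~ {x}, f(x) ~ {f(x):.3e}")
```

Swapping the steepest-descent direction $d_k = -\nabla f(x_k)$ for the Newton or regularized-Newton direction listed above gives the other two variants, at the cost of forming and solving with the Hessian.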
Tags:
Control & dynamics