Commit 6363b29c authored by Erik Strand

Minor edits to optimization notes

parent 69bf539d
@@ -146,7 +146,8 @@ the floor of a flat valley. Better yet, if your line search is smart enough to q
 functions, then you just need to ensure you can't go downhill forever — i.e. $$H$$ is positive
 semidefinite.
-Moving along, define $$x_i = x_{i - 1} + \alpha_i u_i$$ as before. For any $$1 \leq i \leq n$$,
+Moving along, define $$x_i = x_{i - 1} + \alpha_i u_i$$ via line minimizations as before. For any
+$$1 \leq i \leq n$$,
 $$
 \begin{aligned}
@@ -191,8 +192,8 @@ u_i^T H u_j = 0
 $$
 for all $$1 \leq j < i \leq n$$. Such vectors are called *conjugate vectors*, from which this
-algorithm derives its name. (Though the careful reader will notice that it's not the gradients that
-are conjugate &mdash; and the vectors that are conjugate aren't gradients.)
+algorithm derives its name. (Though perhaps it's applied sloppily, since it's not the gradients
+themselves that are conjugate.)
 #### Base Case
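
The conjugacy condition $$u_i^T H u_j = 0$$ and the line minimizations $$x_i = x_{i - 1} + \alpha_i u_i$$ touched by this commit are easy to check numerically. The sketch below (Python/NumPy; the quadratic, the Gram-Schmidt construction, and all variable names are illustrative assumptions rather than code from the notes) builds a set of $$H$$-conjugate directions and verifies that exact line minimization along each of them reaches the minimizer of a quadratic in exactly $$n$$ steps:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
H = A @ A.T + n * np.eye(n)      # positive definite "Hessian" (illustrative)
b = rng.standard_normal(n)       # quadratic f(x) = 1/2 x^T H x - b^T x

# Build H-conjugate directions u_1, ..., u_n by Gram-Schmidt in the H-inner
# product, starting from the coordinate basis (an illustrative construction,
# not the method described in the notes).
us = []
for e in np.eye(n):
    u = e.copy()
    for v in us:
        u -= (v @ H @ e) / (v @ H @ v) * v
    us.append(u)
U = np.array(us)

# Conjugacy check: u_i^T H u_j = 0 for all i != j, so U H U^T is diagonal.
M = U @ H @ U.T
assert np.allclose(M, np.diag(np.diag(M)))

# Exact line minimization of f along u from x gives alpha = u^T (b - H x) / (u^T H u).
x = np.zeros(n)
for u in us:
    alpha = u @ (b - H @ x) / (u @ H @ u)
    x = x + alpha * u

# After n line minimizations along conjugate directions, x is the exact minimizer H^{-1} b.
assert np.allclose(x, np.linalg.solve(H, b))
print("minimizer recovered after", n, "line minimizations")
```

The termination after $$n$$ steps is the property the conjugate gradient algorithm exploits; it just generates the conjugate directions on the fly from gradients rather than by an explicit Gram-Schmidt pass.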