Erik Strand / nmm_2020_site · Commits

Commit 6363b29c authored Apr 30, 2020 by Erik Strand
Minor edits to optimization notes
Parent: 69bf539d
Showing 1 changed file with 4 additions and 3 deletions.

_notes/optimization.md (+4 −3)
...
@@ -146,7 +146,8 @@ the floor of a flat valley. Better yet, if your line search is smart enough to q
 functions, then you just need to ensure you can't go downhill forever — i.e. $$H$$ is positive
 semidefinite.
-Moving along, define $$x_i = x_{i - 1} + \alpha_i u_i$$ as before. For any $$1 \leq i \leq n$$,
+Moving along, define $$x_i = x_{i - 1} + \alpha_i u_i$$ via line minimizations as before. For any
+$$1 \leq i \leq n$$,
 $$
 \begin{aligned}
...
@@ -191,8 +192,8 @@ u_i^T H u_j = 0
 $$
 for all $$1 \leq j < i \leq n$$. Such vectors are called *conjugate vectors*, from which this
-algorithm derives its name. (Though the careful reader will notice that it's not the gradients that
-are conjugate — and the vectors that are conjugate aren't gradients.)
+algorithm derives its name. (Though perhaps it's applied sloppily, since it's not the gradients
+themselves that are conjugate.)
#### Base Case
...
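The conjugacy property $$u_i^T H u_j = 0$$ discussed in the second hunk is easy to check numerically. The sketch below (not part of the commit; the Fletcher–Reeves update and the quadratic $$f(x) = \tfrac{1}{2} x^T H x - b^T x$$ are assumptions for illustration) runs conjugate gradient with exact line minimizations and verifies that the resulting search directions are mutually $$H$$-conjugate:

```python
# Minimal numerical check of the conjugacy property u_i^T H u_j = 0.
# Assumes a quadratic objective f(x) = 1/2 x^T H x - b^T x with a
# positive definite H, so exact line minimization has a closed form.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
H = A @ A.T + 5 * np.eye(5)       # positive definite Hessian
b = rng.standard_normal(5)

x = np.zeros(5)
g = H @ x - b                     # gradient of the quadratic at x
u = -g                            # first direction: steepest descent
directions = []
for i in range(5):
    alpha = -(g @ u) / (u @ H @ u)    # exact line minimization along u
    x = x + alpha * u                 # x_i = x_{i-1} + alpha_i u_i
    directions.append(u)
    g_new = H @ x - b
    beta = (g_new @ g_new) / (g @ g)  # Fletcher–Reeves coefficient
    u = -g_new + beta * u             # next conjugate direction
    g = g_new

# u_i^T H u_j should vanish (up to floating point) for all j < i
for i in range(5):
    for j in range(i):
        assert abs(directions[i] @ H @ directions[j]) < 1e-8
```

After $$n = 5$$ such steps the residual $$\lVert H x - b \rVert$$ is at machine-precision level, which is the finite-termination behavior the conjugacy argument in the notes is building toward.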