# cpp_starter

Various optimization algorithms.
## Dependencies

- a modern C++ compiler (i.e. one that supports C++17 features, e.g. GCC 7 or newer)
- CMake 3.13 or newer
- Eigen3 (installable via apt, yum, or Homebrew)
## Building

From the project's root directory:

    mkdir build
    cd build
    cmake ..
    make
Tips:

- you can build using multiple threads at once, e.g. `make -j4`
- you can select a specific compiler using `cmake .. -DCMAKE_CXX_COMPILER=g++-8`
- for debugging symbols, use `cmake .. -DCMAKE_BUILD_TYPE=Debug`
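A minimal `CMakeLists.txt` of the kind these commands expect might look like the sketch below; the target name and source path are assumptions, not the project's actual layout.

```cmake
cmake_minimum_required(VERSION 3.13)
project(cpp_starter CXX)

set(CMAKE_CXX_STANDARD 17)
set(CMAKE_CXX_STANDARD_REQUIRED ON)

# Eigen3 exports an imported target when found via its CMake package config
find_package(Eigen3 REQUIRED)

add_executable(main src/main.cpp)          # hypothetical source layout
target_link_libraries(main Eigen3::Eigen)
```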
## TODO
save data to files and run Python via a make target (parameterized by a single size argument) to see what happens
For the first thing... an EVO experiment needs a population size, learning rate, number of iterations, and print stride (common to all), plus a starting distribution (which depends on the distribution type). Separate command-line tools for each experiment?
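The common settings listed above could live in one shared struct, with each distribution-specific command-line tool adding its own starting-parameter flags. A minimal sketch — all names and default values here are assumptions:

```cpp
#include <cstddef>

// Hypothetical shared settings for an EVO experiment run. The starting
// distribution is supplied separately, since its parameters depend on
// the distribution type.
struct ExperimentConfig {
    std::size_t pop_size = 100;      // samples drawn per iteration
    double learning_rate = 0.01;     // step size for the parameter update
    std::size_t n_iterations = 1000; // total optimization steps
    std::size_t print_stride = 10;   // log progress every this many steps
};
```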
an objective for optimization exposes an input type and a gradient type, and can evaluate both (separately or at once)
- should the objective expose the actual input type, or a const ref? For now it'll always be `const &`, so I'll add those elsewhere (I need to be able to construct the plain type in some contexts)
- the gradient type can be multiplied by a scalar; the input type supports `+=` with a gradient
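That contract might be sketched as below, using a 1-D quadratic so the types stay simple (an Eigen vector type would satisfy the same scalar-multiply and `+=` requirements); all names are illustrative assumptions:

```cpp
#include <utility>

// Hypothetical objective satisfying the contract above: it names its input
// and gradient types and can evaluate value and gradient apart or at once.
struct Quadratic {
    using Input = double;     // could be e.g. Eigen::VectorXd in general
    using Gradient = double;  // scalable by a scalar; Input supports += Gradient

    double value(const Input& x) const { return 0.5 * x * x; }
    Gradient gradient(const Input& x) const { return x; }
    std::pair<double, Gradient> value_and_gradient(const Input& x) const {
        return {value(x), gradient(x)};
    }
};
```

A plain gradient step then reads `x += -learning_rate * g;`, which is exactly the scalar-multiply and `+=` requirements noted above.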
expected value estimator?
- stores n_samples
- takes an objective and a distribution (the objective only needs to evaluate points)
- its input type is distribution::params
- its gradient type is distribution::score (hm, should I call this gradient?)
- can estimate the value at params
- can estimate the gradient with respect to params, at params
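One way the estimator could look, using the score-function (REINFORCE-style) identity `d/dtheta E[f(x)] = E[f(x) * d/dtheta log p(x | theta)]`. The Gaussian, the `Square` objective, and all names are illustrative assumptions, not the project's real API:

```cpp
#include <cstddef>
#include <random>

// Illustrative distribution: Gaussian with fixed stddev; Params is the mean.
struct Gaussian {
    using Params = double;  // the mean, mu
    using Score = double;   // d/dmu log p(x | mu)
    double sigma = 1.0;

    template <class Rng>
    double sample(const Params& mu, Rng& rng) const {
        std::normal_distribution<double> d(mu, sigma);
        return d(rng);
    }
    Score score(const Params& mu, double x) const {
        return (x - mu) / (sigma * sigma);
    }
};

// Illustrative objective: only needs to evaluate points.
struct Square {
    double value(double x) const { return x * x; }
};

// Expected-value estimator: its input type is the distribution's params and
// its gradient type is the distribution's score, per the notes above.
template <class Objective, class Distribution>
struct ExpectedValueEstimator {
    using Input = typename Distribution::Params;
    using Gradient = typename Distribution::Score;

    Objective objective;
    Distribution dist;
    std::size_t n_samples;

    // Monte Carlo estimate of E[f(x)] under dist(params).
    template <class Rng>
    double value(const Input& params, Rng& rng) const {
        double sum = 0.0;
        for (std::size_t i = 0; i < n_samples; ++i)
            sum += objective.value(dist.sample(params, rng));
        return sum / n_samples;
    }

    // Score-function estimate of d/dparams E[f(x)].
    template <class Rng>
    Gradient gradient(const Input& params, Rng& rng) const {
        Gradient sum = 0.0;
        for (std::size_t i = 0; i < n_samples; ++i) {
            const double x = dist.sample(params, rng);
            sum += objective.value(x) * dist.score(params, x);
        }
        return sum / n_samples;
    }
};
```

Swapping in a different distribution only requires a new `Params`/`Score`/`sample`/`score`, which matches the goal of changing just the distribution while keeping the estimator.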
then I'll be able to change just the distribution while keeping the expected-value estimator the same
how to handle data logging?