Add a sentence to say can use Hyperband/ASHA together with quasi-random search #70

Open — wants to merge 2 commits into `main`
README.md (4 additions, 0 deletions)
@@ -1735,6 +1735,10 @@ multi-host training can make it very easy to introduce bugs!*
trials to run in parallel and we can afford to run many trials in sequence,
Bayesian optimization becomes much more attractive despite making our tuning
results harder to interpret.
- Both quasi-random search and Bayesian optimization can be used together with
[Hyperband](https://jmlr.org/papers/volume18/16-558/16-558.pdf) /
[ASHA](https://arxiv.org/abs/1810.05934), which quickly discard unpromising
points from the search.
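The combination the added bullet describes can be sketched in a few lines: draw candidate points from a low-discrepancy (quasi-random) sequence, then run successive halving over them. This is a toy illustration, not the playbook's code — the Kronecker sequence, the `toy_loss` objective, and the rung schedule are all assumptions, and for simplicity it uses synchronous successive halving (the simpler ancestor of ASHA) rather than ASHA's asynchronous promotions.

```python
def quasi_random_points(n, dims=2):
    # Kronecker (additive-recurrence) low-discrepancy sequence in [0, 1)^dims,
    # a simple stand-in for a quasi-random search sampler.
    phi = 1.0
    for _ in range(32):  # fixed-point iteration for the generalized golden ratio
        phi = (1.0 + phi) ** (1.0 / (dims + 1))
    alphas = [((1.0 / phi) ** (d + 1)) % 1.0 for d in range(dims)]
    return [tuple((0.5 + (i + 1) * a) % 1.0 for a in alphas) for i in range(n)]

def toy_loss(point, budget):
    # Hypothetical objective (lower is better); a real trial would train for
    # `budget` steps, but this stand-in is deterministic and ignores it.
    x, y = point
    return (x - 0.3) ** 2 + (y - 0.7) ** 2

def successive_halving(points, rungs=(1, 4, 16), keep_fraction=1 / 3):
    # At each rung, evaluate every survivor at that rung's budget and promote
    # only the best-scoring fraction — unpromising points are discarded early.
    survivors = list(points)
    for budget in rungs:
        scored = sorted(survivors, key=lambda p: toy_loss(p, budget))
        survivors = scored[: max(1, int(len(scored) * keep_fraction))]
    return survivors[0]

best = successive_halving(quasi_random_points(27))
print(best)
```

With 27 starting points and a keep fraction of 1/3, the rungs shrink the pool 27 → 9 → 3 → 1, so most points only ever consume the cheapest budget.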

[^3]: Ben Recht and Kevin Jamieson
[pointed out](http://www.argmin.net/2016/06/20/hypertuning/) how strong