From 7322b9a775dfb6f217281b5906207797867aac36 Mon Sep 17 00:00:00 2001
From: Simon Blanke
Date: Tue, 11 Apr 2023 10:45:15 +0200
Subject: [PATCH] Update README.md

---
 README.md | 254 +-----------------------------------------------------
 1 file changed, 1 insertion(+), 253 deletions(-)

diff --git a/README.md b/README.md
index 9c518c8..0a23167 100644
--- a/README.md
+++ b/README.md
@@ -86,7 +86,7 @@ Gradient-Free-Optimizers is the optimization backend of
   Optimization algorithms •
   Installation •
   Examples •
-  API reference •
+  API reference •
   Roadmap

@@ -817,257 +817,6 @@ opt.search(model, n_iter=50)
## Basic API reference

The API reference can also be found in the [official documentation](https://simonblanke.github.io/gradient-free-optimizers-documentation).


### General optimization arguments

The following (general) arguments can be passed to any optimization class:

- search_space

  Pass the search_space to the optimizer class to define the space where the optimization algorithm can search for the best parameters for the given objective function.

  example:
  ```python
  ...

  search_space = {
      "x1": numpy.arange(-10, 31, 0.3),
      "x2": numpy.arange(-10, 31, 0.3),
  }

  opt = HillClimbingOptimizer(search_space)

  ...
  ```

- initialize={"grid": 8, "vertices": 8, "random": 4, "warm_start": []}

  (dict, None)

  The initialization dictionary determines the parameters that will be evaluated in the first n iterations, where n is the sum of the values in initialize. The initialize keywords are the following:

  - grid

    Initializes positions in a grid-like pattern. Positions that cannot be put into a grid are randomly positioned.

  - vertices

    Initializes positions at the vertices of the search space. Positions that cannot be placed at a vertex are randomly positioned.

  - random

    Number of randomly initialized positions.

  - warm_start

    List of parameter dictionaries that mark additional start points for the optimization run.

- random_state=None

  (int, None)

  Random state for random processes in the random, numpy and scipy modules.


### Optimizer Classes

Each optimization class needs the "search_space" as an input argument. Optionally, "initialize" and optimizer-specific parameters can be passed as well (see the sketch after this list). You can read more about each optimization algorithm and its parameters in the [Optimization Tutorial](https://github.com/SimonBlanke/optimization-tutorial).

- HillClimbingOptimizer
- StochasticHillClimbingOptimizer
- RepulsingHillClimbingOptimizer
- SimulatedAnnealingOptimizer
- DownhillSimplexOptimizer
- RandomSearchOptimizer
- GridSearchOptimizer
- RandomRestartHillClimbingOptimizer
- RandomAnnealingOptimizer
- PowellsMethod
- PatternSearch
- ParallelTemperingOptimizer
- ParticleSwarmOptimizer
- SpiralOptimization
- EvolutionStrategyOptimizer
- LipschitzOptimizer
- DirectAlgorithm
- BayesianOptimizer
- TreeStructuredParzenEstimators
- ForestOptimizer
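A minimal sketch of how these general arguments fit together; the concrete initialize counts, the warm-start point and the random_state value here are illustrative choices, not recommendations:

```python
import numpy as np
from gradient_free_optimizers import HillClimbingOptimizer

# the search space defines which parameter values the optimizer may evaluate
search_space = {
    "x1": np.arange(-10, 31, 0.3),
    "x2": np.arange(-10, 31, 0.3),
}

# 4 grid positions + 2 vertices + 2 random positions + 1 warm-start point
# are evaluated during the first 9 iterations
initialize = {
    "grid": 4,
    "vertices": 2,
    "random": 2,
    "warm_start": [{"x1": 0.0, "x2": 0.0}],
}

# random_state makes the random parts of the run reproducible
opt = HillClimbingOptimizer(search_space, initialize=initialize, random_state=42)
```

Any class from the list above should be constructable the same way, since these general arguments are shared by all optimizers.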
### .search(...)

The following arguments can be passed to the .search-method (a combined usage sketch follows after this list):

- objective_function

  (callable)

  The objective function defines the optimization problem. The optimization algorithm will try to maximize the numerical value that is returned by the objective function by trying out different parameters from the search space.

  example:
  ```python
  def objective_function(para):
      score = -(para["x1"] * para["x1"] + para["x2"] * para["x2"])
      return score
  ```

- n_iter

  (int)

  The number of iterations that will be performed during the optimization run. Each iteration consists of the optimization-step, which decides the next parameter to evaluate, and the evaluation-step, which runs the objective function with the chosen parameter and returns the score.

- max_time=None

  (float, None)

  Maximum number of seconds until the optimization stops. The time will be checked after each completed iteration.

- max_score=None

  (float, None)

  Maximum score until the optimization stops. The score will be checked after each completed iteration.

- early_stopping=None

  (dict, None)

  Stops the optimization run early if it did not achieve any score-improvement within the last iterations. The early_stopping-parameter accepts a dictionary with three keys:

  - `n_iter_no_change`: Non-optional int-parameter. This marks the last n iterations to look for an improvement over the iterations that came before n. If the best score of the entire run is within those last n iterations the run will continue (until other stopping criteria are met), otherwise the run will stop.
  - `tol_abs`: Optional float-parameter. The score must have improved by at least this absolute tolerance in the last n iterations over the best score in the iterations before n. This is an absolute value, so 0.1 means an improvement of 0.8 -> 0.9 is acceptable, but 0.81 -> 0.9 would stop the run.
  - `tol_rel`: Optional float-parameter. The score must have improved by at least this relative tolerance (in percent) in the last n iterations over the best score in the iterations before n. This is a relative value, so 10 means an improvement of 0.8 -> 0.88 is acceptable, but 0.8 -> 0.87 would stop the run.

- memory=True

  (bool)

  Whether or not to use the "memory"-feature. The memory is a dictionary that gets filled with parameters and scores during the optimization run. If the optimizer encounters a parameter that is already in the dictionary, it just extracts the score instead of re-evaluating the objective function (which can take a long time).

- memory_warm_start=None

  (pandas dataframe, None)

  Pandas dataframe that contains score and parameter information that will be automatically loaded into the memory-dictionary.

  example:
  | score | x1  | x2  | x... |
  |-------|-----|-----|------|
  | 0.756 | 0.1 | 0.2 | ...  |
  | 0.823 | 0.3 | 0.1 | ...  |
  | ...   | ... | ... | ...  |
  | ...   | ... | ... | ...  |
- verbosity=["progress_bar", "print_results", "print_times"]

  (list, False)

  The verbosity list determines which parts of the optimization information are printed to the command line.
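Below is a sketch of a complete run that combines the .search(...) arguments described above; the objective function and all stopping values are illustrative and would need tuning for a real problem:

```python
import numpy as np
from gradient_free_optimizers import HillClimbingOptimizer


def objective_function(para):
    # sphere function; the optimizer maximizes, so the score is negated
    return -(para["x1"] * para["x1"] + para["x2"] * para["x2"])


search_space = {
    "x1": np.arange(-10, 31, 0.3),
    "x2": np.arange(-10, 31, 0.3),
}

opt = HillClimbingOptimizer(search_space)
opt.search(
    objective_function,
    n_iter=1000,
    max_time=60,       # stop after about 60 seconds, checked after each iteration
    max_score=-0.001,  # stop once the score reaches this value
    early_stopping={
        "n_iter_no_change": 50,  # required: window of iterations without improvement
        "tol_abs": 0.01,         # optional: minimum absolute score improvement
        "tol_rel": 1,            # optional: minimum relative improvement in percent
    },
    memory=True,  # reuse scores of already evaluated parameters
    verbosity=["progress_bar", "print_results", "print_times"],
)
```

memory_warm_start could additionally be passed a dataframe with score and parameter columns, for example the search_data of a previous run.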
### Results from attributes

The following attributes are available after a completed optimization run (a short access sketch follows after this list):

- .search_data

  Dataframe containing information about the score and the value of each parameter. Each row shows the information of one optimization iteration.

  example:
  | score | x1  | x2  | x... |
  |-------|-----|-----|------|
  | 0.756 | 0.1 | 0.2 | ...  |
  | 0.823 | 0.3 | 0.1 | ...  |
  | ...   | ... | ... | ...  |
  | ...   | ... | ... | ...  |
- .best_score

  Numerical value of the best score that was found during the optimization run.

- .best_para

  Parameter dictionary of the best score that was found during the optimization run.

  example:
  ```python
  {
      'x1': 0.2,
      'x2': 0.3,
  }
  ```

- .eval_times

  List of evaluation times (time of objective function evaluation) collected during the optimization run.

- .iter_times

  List of iteration times (evaluation + optimization) collected during the optimization run.
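A short sketch of reading these attributes, continuing from the `opt`, `search_space` and `objective_function` objects of the previous sketches after `.search(...)` has finished; feeding the collected search_data into a second run via memory_warm_start is an assumption based on the matching dataframe layout:

```python
# results are available as attributes once .search(...) has finished
print(opt.best_score)  # numerical value of the best score
print(opt.best_para)   # parameter dictionary, e.g. {"x1": 0.2, "x2": 0.3}

search_data = opt.search_data  # one row per optimization iteration
print(search_data.head())

# average evaluation and iteration times of the run
print(sum(opt.eval_times) / len(opt.eval_times))
print(sum(opt.iter_times) / len(opt.iter_times))

# assumption: the collected search_data can seed the memory of a follow-up run
opt2 = HillClimbingOptimizer(search_space)
opt2.search(objective_function, n_iter=100, memory_warm_start=search_data)
```

Comparing .eval_times with .iter_times shows how much of the run was spent in the objective function versus the optimizer itself.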
## Roadmap

@@ -1165,7 +914,6 @@ Each optimization class needs the "search_space" as an input argument. Optionall
## Gradient Free Optimizers <=> Hyperactive