* Replace custom colour implementation, add docs for `logger.py`, `util.py` (#435)
* Replace custom colour implementation, add docs for `logger.py`, `util.py`
* minor typo/syntax fixes
* Use `or` to separate different possible types
* Update docs & linting for `constraints.py`, `target_space.py` (#440)
* Run tests on any PR
* Update docs, linting
* Update bayes_opt/constraint.py
Co-authored-by: Leandro Braga <[email protected]>
* Rename mislabelled parameters
---------
Co-authored-by: Leandro Braga <[email protected]>
* Update various docstrings, add workflow to check docstrings (#445)
* Fixes issue-436: Constrained optimization does not allow duplicate points (#437)
* Update docs of `bayesian_optimization.py` and `observer.py`.
* Fix minor style issue in module docstring
* Update docs of `__init__.py` and `events.py`.
* Fix minor style issue in class docstring
* Add workflow to check docstrings
* Update bayes_opt/bayesian_optimization.py
Co-authored-by: Leandro Braga <[email protected]>
---------
Co-authored-by: YoungJae Bae <[email protected]>
Co-authored-by: Leandro Braga <[email protected]>
* Pydocstyle (#453)
* Improve acq_max seeding of L-BFGS-B optimization (#297)
---------
Co-authored-by: ptapping <[email protected]>
* Domain reduction, Sphinx docs (#455)
* Fixes issue-436: Constrained optimization does not allow duplicate points (#437)
* Update docs of `bayesian_optimization.py` and `observer.py`.
* Fix minor style issue in module docstring
* Update docs of `__init__.py` and `events.py`.
* Fix minor style issue in class docstring
* Add workflow to check docstrings
* Update bayes_opt/bayesian_optimization.py
Co-authored-by: Leandro Braga <[email protected]>
* Improve acq_max seeding of L-BFGS-B optimization (#297)
* `bounds_transformer` could bypass `global_bounds` due to the test logic within the `_trim` function in `domain_reduction.py` (#441)
* Update trim bounds in domain_reduction.py
Previously, when the new upper limit was less than the original lower limit, the new_bounds could bypass the global_bounds.
* Update test_seq_domain_red.py
Added test cases to catch an error when both bounds of new_bounds exceeded the global_bounds
* Update domain_reduction.py
_trim function now avoids an error when both bounds for a given parameter in new_bounds exceed the global_bounds
* Update domain_reduction.py comments
* fixed English in domain_reduction.py
* use numpy to sort bounds; warn when bounds are exceeded
* simple sort test added
* domain_red windows target_space to global_bounds
Added windowing function to improve the convergence of optimizers that use domain_reduction. Improved comments and documentation.
* target_space.max respects bounds; SDRT warnings
* Remove unused function.
This function was used to prototype a solution. It should not have been pushed and can be removed.
* Updated target_space.py docstrings
* Update tests/test_target_space.py
Co-authored-by: till-m <[email protected]>
* Added pbound warnings, updated various tests.
* updated line spacing for consistency and style
* added pbound test condition
---------
Co-authored-by: till-m <[email protected]>
* DomainReduction docs, docstyle
* Add missing doc dependency
---------
Co-authored-by: YoungJae Bae <[email protected]>
Co-authored-by: Leandro Braga <[email protected]>
Co-authored-by: ptapping <[email protected]>
Co-authored-by: Edgar <[email protected]>
* Small fixes, minor cosmetic changes
* Add some more docs to target space and constraint, cosmetic changes
* Remove duplicate code snippet
* Remove numpydoc + adjust "*" formatting accordingly
* Explicitly add D417, adjust code accordingly
* Adjust `TargetSpace.probe()` behaviour to be in line with docstring.
* Update bayes_opt/target_space.py
Co-authored-by: Edgar <[email protected]>
* Update README.md
---------
Co-authored-by: Leandro Braga <[email protected]>
Co-authored-by: YoungJae Bae <[email protected]>
Co-authored-by: ptapping <[email protected]>
Co-authored-by: Edgar <[email protected]>
Pure Python implementation of bayesian global optimization with gaussian processes.

## Installation

* PyPI (pip):

```console
pip install bayesian-optimization
```

This technique is particularly suited for optimization of high cost functions, situations where the balance between exploration and exploitation is important.
## Quick Start
See below for a quick tour over the basics of the Bayesian Optimization package. More detailed information, other advanced features, and tips on usage/implementation can be found in the [examples](http://bayesian-optimization.github.io/BayesianOptimization/examples.html) folder. I suggest that you:
- Follow the [basic tour notebook](http://bayesian-optimization.github.io/BayesianOptimization/basic-tour.html) to learn how to use the package's most important features.
- Take a look at the [advanced tour notebook](http://bayesian-optimization.github.io/BayesianOptimization/advanced-tour.html) to learn how to make the package more flexible, how to deal with categorical parameters, how to use observers, and more.
- Check out this [notebook](http://bayesian-optimization.github.io/BayesianOptimization/visualization.html) with a step by step visualization of how this method works.
- To understand how to use bayesian optimization when additional constraints are present, see the [constrained optimization notebook](http://bayesian-optimization.github.io/BayesianOptimization/constraints.html).
- Explore this [notebook](http://bayesian-optimization.github.io/BayesianOptimization/exploitation_vs_exploration.html) exemplifying the balance between exploration and exploitation and how to control it.
- Go over this [script](https://github.com/bayesian-optimization/BayesianOptimization/blob/master/examples/sklearn_example.py) for examples of how to tune parameters of Machine Learning models using cross validation and bayesian optimization.
- Explore the [domain reduction notebook](http://bayesian-optimization.github.io/BayesianOptimization/domain_reduction.html) to learn more about how search can be sped up by dynamically changing parameters' bounds.
- Finally, take a look at this [script](https://github.com/bayesian-optimization/BayesianOptimization/blob/master/examples/async_optimization.py) for ideas on how to implement bayesian optimization in a distributed fashion using this package.
## How does it work?
Bayesian optimization works by constructing a posterior distribution of functions (gaussian process) that best describes the function you want to optimize. As the number of observations grows, the posterior distribution improves, and the algorithm becomes more certain of which regions in parameter space are worth exploring and which are not, as seen in the picture below.

![BayesianOptimization in action](static/bo_example.png)
As you iterate over and over, the algorithm balances its need for exploration and exploitation, taking into account what it knows about the target function. At each step a Gaussian Process is fitted to the known samples (points previously explored), and the posterior distribution, combined with an exploration strategy (such as UCB (Upper Confidence Bound) or EI (Expected Improvement)), is used to determine the next point that should be explored (see the gif below).

![BayesianOptimization in action](static/bayesian_optimization.gif)
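To make the exploration strategies concrete, UCB is commonly written as follows (a standard textbook formulation, not necessarily the package's exact definition), with $\mu(x)$ and $\sigma(x)$ the Gaussian Process posterior mean and standard deviation:

$$\mathrm{UCB}(x) = \mu(x) + \kappa\,\sigma(x)$$

where larger values of the parameter $\kappa$ favour exploration over exploitation.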
This process is designed to minimize the number of steps required to find a combination of parameters that is close to the optimal combination. To do so, this method uses a proxy optimization problem (finding the maximum of the acquisition function) that, albeit still a hard problem, is cheaper (in the computational sense), and for which common tools can be employed. Therefore Bayesian Optimization is most adequate for situations where sampling the function to be optimized is a very expensive endeavor. See the references for a proper discussion of this method.
This project is under active development; if you find a bug or anything that needs correction, please let me know.

## Basic tour of the Bayesian Optimization package
### 1. Specifying the function to be optimized
This is a function optimization package; therefore, the first and most important ingredient is, of course, the function to be optimized.
```python
def black_box_function(x, y):
    return -x ** 2 - (y - 1) ** 2 + 1
```
### 2. Getting Started
All we need to get started is to instantiate a `BayesianOptimization` object specifying a function to be optimized, `f`, and its parameters with their corresponding bounds, `pbounds`. This is a constrained optimization technique, so you must specify the minimum and maximum values that can be probed for each parameter in order for it to work.
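A minimal sketch of a run, using the `black_box_function` defined above (the `pbounds` values and `random_state` are illustrative):

```python
from bayes_opt import BayesianOptimization

# Bounded region of parameter space (illustrative values).
pbounds = {"x": (2, 4), "y": (-3, 3)}

optimizer = BayesianOptimization(
    f=black_box_function,
    pbounds=pbounds,
    random_state=1,
)

# Probe 2 random points, then run 3 steps of bayesian optimization.
optimizer.maximize(init_points=2, n_iter=3)

# Best parameters and target value found so far.
print(optimizer.max)

# All probed points and their targets.
for i, res in enumerate(optimizer.res):
    print(f"Iteration {i}: {res}")
```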
#### 2.1 Changing bounds
During the optimization process you may realize the bounds chosen for some parameters are not adequate. For these situations you can invoke the method `set_bounds` to alter them. You can pass any combination of **existing** parameters and their associated new bounds.
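A minimal sketch (the new bounds are illustrative):

```python
# Narrow the bounds of x; y keeps its current bounds.
optimizer.set_bounds(new_bounds={"x": (-2, 3)})

optimizer.maximize(init_points=0, n_iter=5)
```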
#### 2.2 Sequential Domain Reduction
Sometimes the initial boundaries specified for a problem are too wide, and adding points to improve the response surface in regions of the solution domain is extraneous. Other times the cost function is very expensive to compute, and minimizing the number of calls is extremely beneficial.
When it's worthwhile to converge on an optimal point quickly rather than try to find the optimal point, contracting the domain around the current optimal value as the search progresses can speed up the search progress considerably. Using the `SequentialDomainReductionTransformer` the bounds of the problem can be panned and zoomed dynamically in an attempt to improve convergence.
An example of using the `SequentialDomainReductionTransformer` is shown in the [domain reduction notebook](http://bayesian-optimization.github.io/BayesianOptimization/domain_reduction.html). More information about this method can be found in the paper ["On the robustness of a simple domain reduction scheme for simulation-based optimization"](http://www.truegrid.com/srsm_revised.pdf).
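A minimal sketch of wiring in the transformer (parameter values are illustrative):

```python
from bayes_opt import BayesianOptimization, SequentialDomainReductionTransformer

# Shrink (and pan) the search bounds as the optimization converges.
bounds_transformer = SequentialDomainReductionTransformer(minimum_window=0.5)

mutating_optimizer = BayesianOptimization(
    f=black_box_function,
    pbounds={"x": (-10, 10), "y": (-10, 10)},
    bounds_transformer=bounds_transformer,
)
mutating_optimizer.maximize(init_points=2, n_iter=10)
```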
### 3. Guiding the optimization
It is often the case that we have an idea of regions of the parameter space where the maximum of our function might lie. For these situations the `BayesianOptimization` object allows the user to specify points to be probed. By default these will be explored lazily (`lazy=True`), meaning these points will be evaluated only the next time you call `maximize`. This probing process happens before the gaussian process takes over.
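As a sketch (the probe point is illustrative):

```python
optimizer.probe(
    params={"x": 0.5, "y": 0.7},
    lazy=True,
)

# Lazily queued points are evaluated on the next call to maximize.
optimizer.maximize(init_points=0, n_iter=0)
```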
### 4. Saving, loading and restarting

By default you can follow the progress of your optimization by setting `verbose > 0` when instantiating the `BayesianOptimization` object. If you need more control over logging/alerting you will need to use an observer. For more information about observers, check out the advanced tour notebook. Here we will only see how to use the native `JSONLogger` object to save progress to and load it from files.
#### 4.1 Saving progress
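A minimal sketch of attaching the logger (the path is illustrative; module paths follow the 1.x API):

```python
from bayes_opt.logger import JSONLogger
from bayes_opt.event import Events

logger = JSONLogger(path="./logs.log")
optimizer.subscribe(Events.OPTIMIZATION_STEP, logger)

# Results are appended to ./logs.log as the optimization progresses.
optimizer.maximize(init_points=2, n_iter=3)
```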
By default the previous data in the json file is removed. If you want to keep working with the same logger, the `reset` parameter in `JSONLogger` should be set to False.
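For example (a sketch):

```python
# Keep appending to an existing log instead of overwriting it.
logger = JSONLogger(path="./logs.log", reset=False)
```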
#### 4.2 Loading progress
Naturally, if you stored progress you will be able to load that onto a new instance of `BayesianOptimization`. The easiest way to do it is by invoking the `load_logs` function from the `util` submodule.
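A minimal sketch (assuming logs were saved to `./logs.log` as above):

```python
from bayes_opt import BayesianOptimization
from bayes_opt.util import load_logs

new_optimizer = BayesianOptimization(
    f=black_box_function,
    pbounds={"x": (-2, 2), "y": (-2, 2)},
    verbose=2,
    random_state=7,
)

# Replay the previously saved evaluations into the new optimizer.
load_logs(new_optimizer, logs=["./logs.log"])
```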
This introduction covered the most basic functionality of the package. Check out the [basic-tour](http://bayesian-optimization.github.io/BayesianOptimization/basic-tour.html) and [advanced-tour](http://bayesian-optimization.github.io/BayesianOptimization/advanced-tour.html) notebooks, where you will find detailed explanations and other more advanced functionality. Also, browse the [examples](http://bayesian-optimization.github.io/BayesianOptimization/examples.html) for implementation tips and ideas.
If you prefer, you can clone the repository and run the `setup.py` file to get a copy from Github and install all dependencies.