
Commit 383cb29

Authored by till-m, leandrobbraga, YoungJaeBae, ptapping, and perezed00

Docstring overhaul (#457)
* Replace custom colour implementation, add docs for `logger.py`, `util.py` (#435)
  * Replace custom colour implementation, add docs for `logger.py`, `util.py`
  * Minor typo/syntax fixes
  * Use `or` to separate different possible types
* Update docs & linting for `constraints.py`, `target_space.py` (#440)
  * Run tests on any PR
  * Update docs, linting
  * Update bayes_opt/constraint.py (co-authored by Leandro Braga <[email protected]>)
  * Rename mislabelled parameters
* Update various docstrings, add workflow to check docstrings (#445)
  * Fixes issue-436: Constrained optimization does not allow duplicate points (#437)
  * Update docs of `bayesian_optimization.py` and `observer.py`
  * Fix minor style issue in module docstring
  * Update docs of `__init__.py` and `events.py`
  * Fix minor style issue in class docstring
  * Add workflow to check docstrings
  * Update bayes_opt/bayesian_optimization.py (co-authored by Leandro Braga <[email protected]>)
* Pydocstyle (#453)
* Improve acq_max seeding of L-BFGS-B optimization (#297)
* Domain reduction, Sphinx docs (#455)
  * Fixes issue-436: Constrained optimization does not allow duplicate points (#437)
  * Update docs of `bayesian_optimization.py` and `observer.py`; fix minor style issues in module and class docstrings
  * Update docs of `__init__.py` and `events.py`
  * Add workflow to check docstrings
  * Improve acq_max seeding of L-BFGS-B optimization (#297)
  * bounds_transformer could bypass global_bounds due to the test logic within the _trim function in domain_reduction.py (#441)
  * Update trim bounds in domain_reduction.py: previously, when the new upper limit was less than the original lower limit, new_bounds could bypass global_bounds
  * Update test_seq_domain_red.py: added test cases to catch an error when both bounds of new_bounds exceeded global_bounds
  * Update domain_reduction.py: the _trim function now avoids an error when both bounds for a given parameter in new_bounds exceed global_bounds
  * Update domain_reduction.py comments; fix English in domain_reduction.py
  * Use numpy to sort bounds; warn when a boundary is exceeded; add a simple sort test
  * domain_red windows target_space to global_bounds: added a windowing function to improve the convergence of optimizers that use domain reduction; improved comments and documentation
  * target_space.max respects bounds; SDRT warnings
  * Remove an unused function that was used to prototype a solution and should not have been pushed
  * Update target_space.py docstrings; update tests/test_target_space.py (co-authored by till-m <[email protected]>)
  * Add pbound warnings and a pbound test condition; update various tests; update line spacing for consistency and style
  * DomainReduction docs, docstyle; add missing doc dependency
* Small fixes, minor cosmetic changes
* Add some more docs to target space and constraint, cosmetic changes
* Remove duplicate code snippet
* Remove numpydoc + adjust "*" formatting accordingly
* Explicitly add D417, adjust code accordingly
* Adjust `TargetSpace.probe()` behaviour to be in line with its docstring
* Update bayes_opt/target_space.py (co-authored by Edgar <[email protected]>)
* Update README.md

Co-authored-by: Leandro Braga <[email protected]>
Co-authored-by: YoungJae Bae <[email protected]>
Co-authored-by: ptapping <[email protected]>
Co-authored-by: Edgar <[email protected]>
Parent: 129caac. Commit: 383cb29.

29 files changed: +1006 / -461 lines

.github/workflows/build_docs.yml (1 addition, 0 deletions)

```diff
@@ -29,6 +29,7 @@ jobs:
           pip install nbsphinx
           pip install sphinx_rtd_theme
           pip install jupyter
+          pip install myst-parser
       - name: Install package
         run: |
           pip install -e .
```
New workflow file (30 additions, 0 deletions):

```yaml
# This workflow will install Python dependencies and check docstrings with a single version of Python
# For more information see: https://help.github.com/actions/language-and-framework-guides/using-python-with-github-actions

name: Check docstrings

on:
  push:
    branches: [ "master" ]
  pull_request:

permissions:
  contents: read

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python 3.10
        uses: actions/setup-python@v3
        with:
          python-version: "3.10"
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install pydocstyle
      - name: Check docstyle
        run: pydocstyle --convention=numpy --add-select D417 bayes_opt/*
      #- name: Run linting
      #  run: pylint bayes_opt/* --disable=C0103  # ignore no snake_case conformity of arguments
```
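For context, pydocstyle's D417 rule (explicitly selected above) fails functions whose parameters are not individually described in the docstring. Below is a minimal sketch of a numpy-convention docstring that such a check is meant to enforce; the function name echoes `TargetSpace.probe` from the commit message, but the body is an illustrative stub, not the package's implementation.

```python
def probe(params, lazy=True):
    """Evaluate the target function at a single point.

    Illustrative stub only; the name mirrors ``TargetSpace.probe`` but
    the behaviour here is not the package's implementation.

    Parameters
    ----------
    params : dict
        Mapping of parameter names to the values to probe.
    lazy : bool, optional
        If True, defer evaluation until the next ``maximize`` call.

    Returns
    -------
    float
        Value of the target function at ``params``.
    """
    return 0.0
```

With `--convention=numpy --add-select D417`, a docstring like this passes because every argument has its own description; dropping the `lazy` entry would trigger D417.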

.github/workflows/run_tests.yml (0 additions, 1 deletion)

```diff
@@ -7,7 +7,6 @@ on:
   push:
     branches: [ "master" ]
   pull_request:
-    branches: [ "master" ]
 
 permissions:
   contents: read
```

.gitignore (3 additions, 1 deletion)

```diff
@@ -32,4 +32,6 @@ venv.bak/
 
 docs/*
 docsrc/.ipynb_checkpoints/*
-docsrc/*.ipynb
+docsrc/*.ipynb
+docsrc/static/*
+docsrc/README.md
```

README.md (60 additions, 69 deletions)

````diff
@@ -1,16 +1,18 @@
 <div align="center">
-  <img src="https://github.com/fmfn/BayesianOptimization/blob/master/examples/func.png"><br><br>
+  <img src="https://raw.githubusercontent.com/bayesian-optimization/BayesianOptimization/master/static/func.png"><br><br>
 </div>
 
 # Bayesian Optimization
 
-![tests](https://github.com/fmfn/BayesianOptimization/actions/workflows/run_tests.yml/badge.svg)
-[![Codecov](https://codecov.io/github/fmfn/BayesianOptimization/badge.svg?branch=master&service=github)](https://codecov.io/github/fmfn/BayesianOptimization?branch=master)
+![tests](https://github.com/bayesian-optimization/BayesianOptimization/actions/workflows/run_tests.yml/badge.svg)
+[![Codecov](https://codecov.io/github/bayesian-optimization/BayesianOptimization/badge.svg?branch=master&service=github)](https://codecov.io/github/bayesian-optimization/BayesianOptimization?branch=master)
 [![Pypi](https://img.shields.io/pypi/v/bayesian-optimization.svg)](https://pypi.python.org/pypi/bayesian-optimization)
 
 Pure Python implementation of bayesian global optimization with gaussian
 processes.
 
+## Installation
+
 * PyPI (pip):
 
 ```console
@@ -30,48 +32,40 @@ suited for optimization of high cost functions, situations where the balance
 between exploration and exploitation is important.
 
 ## Quick Start
-See below for a quick tour over the basics of the Bayesian Optimization package. More detailed information, other advanced features, and tips on usage/implementation can be found in the [examples](https://github.com/fmfn/BayesianOptimization/tree/master/examples) folder. I suggest that you:
-- Follow the
-  [basic tour notebook](https://github.com/fmfn/BayesianOptimization/blob/master/examples/basic-tour.ipynb)
-  to learn how to use the package's most important features.
-- Take a look at the
-  [advanced tour notebook](https://github.com/fmfn/BayesianOptimization/blob/master/examples/advanced-tour.ipynb)
-  to learn how to make the package more flexible, how to deal with categorical parameters, how to use observers, and more.
-- Check out this
-  [notebook](https://github.com/fmfn/BayesianOptimization/blob/master/examples/visualization.ipynb)
-  with a step by step visualization of how this method works.
-- To understand how to use bayesian optimization when additional constraints are present, see the
-  [constrained optimization notebook](https://github.com/fmfn/BayesianOptimization/blob/master/examples/constraints.ipynb).
-- Explore this [notebook](https://github.com/fmfn/BayesianOptimization/blob/master/examples/exploitation_vs_exploration.ipynb)
+See below for a quick tour over the basics of the Bayesian Optimization package. More detailed information, other advanced features, and tips on usage/implementation can be found in the [examples](http://bayesian-optimization.github.io/BayesianOptimization/examples.html) folder. I suggest that you:
+- Follow the [basic tour notebook](http://bayesian-optimization.github.io/BayesianOptimization/basic-tour.html) to learn how to use the package's most important features.
+- Take a look at the [advanced tour notebook](http://bayesian-optimization.github.io/BayesianOptimization/advanced-tour.html) to learn how to make the package more flexible, how to deal with categorical parameters, how to use observers, and more.
+- Check out this [notebook](http://bayesian-optimization.github.io/BayesianOptimization/visualization.html) with a step by step visualization of how this method works.
+- To understand how to use bayesian optimization when additional constraints are present, see the [constrained optimization notebook](http://bayesian-optimization.github.io/BayesianOptimization/constraints.html).
+- Explore this [notebook](http://bayesian-optimization.github.io/BayesianOptimization/exploitation_vs_exploration.html)
 exemplifying the balance between exploration and exploitation and how to
 control it.
-- Go over this [script](https://github.com/fmfn/BayesianOptimization/blob/master/examples/sklearn_example.py)
+- Go over this [script](https://github.com/bayesian-optimization/BayesianOptimization/blob/master/examples/sklearn_example.py)
 for examples of how to tune parameters of Machine Learning models using cross validation and bayesian optimization.
-- Explore the [domain reduction notebook](https://github.com/fmfn/BayesianOptimization/blob/master/examples/domain_reduction.ipynb) to learn more about how search can be sped up by dynamically changing parameters' bounds.
-- Finally, take a look at this [script](https://github.com/fmfn/BayesianOptimization/blob/master/examples/async_optimization.py)
+- Explore the [domain reduction notebook](http://bayesian-optimization.github.io/BayesianOptimization/domain_reduction.html) to learn more about how search can be sped up by dynamically changing parameters' bounds.
+- Finally, take a look at this [script](https://github.com/bayesian-optimization/BayesianOptimization/blob/master/examples/async_optimization.py)
 for ideas on how to implement bayesian optimization in a distributed fashion using this package.
 
 
 ## How does it work?
 
 Bayesian optimization works by constructing a posterior distribution of functions (gaussian process) that best describes the function you want to optimize. As the number of observations grows, the posterior distribution improves, and the algorithm becomes more certain of which regions in parameter space are worth exploring and which are not, as seen in the picture below.
 
-![BayesianOptimization in action](./examples/bo_example.png)
+![BayesianOptimization in action](./static/bo_example.png)
 
 As you iterate over and over, the algorithm balances its needs of exploration and exploitation taking into account what it knows about the target function. At each step a Gaussian Process is fitted to the known samples (points previously explored), and the posterior distribution, combined with a exploration strategy (such as UCB (Upper Confidence Bound), or EI (Expected Improvement)), are used to determine the next point that should be explored (see the gif below).
 
-![BayesianOptimization in action](./examples/bayesian_optimization.gif)
+![BayesianOptimization in action](./static/bayesian_optimization.gif)
 
 This process is designed to minimize the number of steps required to find a combination of parameters that are close to the optimal combination. To do so, this method uses a proxy optimization problem (finding the maximum of the acquisition function) that, albeit still a hard problem, is cheaper (in the computational sense) and common tools can be employed. Therefore Bayesian Optimization is most adequate for situations where sampling the function to be optimized is a very expensive endeavor. See the references for a proper discussion of this method.
 
 This project is under active development, if you find a bug, or anything that
 needs correction, please let me know.
 
 
-Basic tour of the Bayesian Optimization package
-===============================================
+## Basic tour of the Bayesian Optimization package
 
-## 1. Specifying the function to be optimized
+### 1. Specifying the function to be optimized
 
 This is a function optimization package, therefore the first and most important ingredient is, of course, the function to be optimized.
 
@@ -89,7 +83,7 @@ def black_box_function(x, y):
     return -x ** 2 - (y - 1) ** 2 + 1
 ```
 
-## 2. Getting Started
+### 2. Getting Started
 
 All we need to get started is to instantiate a `BayesianOptimization` object specifying a function to be optimized `f`, and its parameters with their corresponding bounds, `pbounds`. This is a constrained optimization technique, so you must specify the minimum and maximum values that can be probed for each parameter in order for it to work
 
@@ -160,7 +154,7 @@ for i, res in enumerate(optimizer.res):
 ```
 
 
-### 2.1 Changing bounds
+#### 2.1 Changing bounds
 
 During the optimization process you may realize the bounds chosen for some parameters are not adequate. For these situations you can invoke the method `set_bounds` to alter them. You can pass any combination of **existing** parameters and their associated new bounds.
 
@@ -183,17 +177,17 @@ optimizer.maximize(
 | 10 | -1.762 | 1.442 | 0.1735 |
 =================================================
 
-### 2.2 Sequential Domain Reduction
+#### 2.2 Sequential Domain Reduction
 
 Sometimes the initial boundaries specified for a problem are too wide, and adding points to improve the response surface in regions of the solution domain is extraneous. Other times the cost function is very expensive to compute, and minimizing the number of calls is extremely beneficial.
 
 When it's worthwhile to converge on an optimal point quickly rather than try to find the optimal point, contracting the domain around the current optimal value as the search progresses can speed up the search progress considerably. Using the `SequentialDomainReductionTransformer` the bounds of the problem can be panned and zoomed dynamically in an attempt to improve convergence.
 
-![sequential domain reduction](./examples/sdr.png)
+![sequential domain reduction](./static/sdr.png)
 
-An example of using the `SequentialDomainReductionTransformer` is shown in the [domain reduction notebook](https://github.com/fmfn/BayesianOptimization/blob/master/examples/domain_reduction.ipynb). More information about this method can be found in the paper ["On the robustness of a simple domain reduction scheme for simulation‐based optimization"](http://www.truegrid.com/srsm_revised.pdf).
+An example of using the `SequentialDomainReductionTransformer` is shown in the [domain reduction notebook](http://bayesian-optimization.github.io/BayesianOptimization/domain_reduction.html). More information about this method can be found in the paper ["On the robustness of a simple domain reduction scheme for simulation‐based optimization"](http://www.truegrid.com/srsm_revised.pdf).
 
-## 3. Guiding the optimization
+### 3. Guiding the optimization
 
 It is often the case that we have an idea of regions of the parameter space where the maximum of our function might lie. For these situations the `BayesianOptimization` object allows the user to specify points to be probed. By default these will be explored lazily (`lazy=True`), meaning these points will be evaluated only the next time you call `maximize`. This probing process happens before the gaussian process takes over.
 
@@ -221,11 +215,11 @@ optimizer.maximize(init_points=0, n_iter=0)
 =================================================
 
 
-## 4. Saving, loading and restarting
+### 4. Saving, loading and restarting
 
 By default you can follow the progress of your optimization by setting `verbose>0` when instantiating the `BayesianOptimization` object. If you need more control over logging/alerting you will need to use an observer. For more information about observers checkout the advanced tour notebook. Here we will only see how to use the native `JSONLogger` object to save to and load progress from files.
 
-### 4.1 Saving progress
+#### 4.1 Saving progress
 
 
 ```python
@@ -255,7 +249,7 @@ optimizer.maximize(
 
 By default the previous data in the json file is removed. If you want to keep working with the same logger, the `reset` parameter in `JSONLogger` should be set to False.
 
-### 4.2 Loading progress
+#### 4.2 Loading progress
 
 Naturally, if you stored progress you will be able to load that onto a new instance of `BayesianOptimization`. The easiest way to do it is by invoking the `load_logs` function, from the `util` submodule.
 
@@ -277,54 +271,51 @@ load_logs(new_optimizer, logs=["./logs.log"]);
 
 ## Next Steps
 
-This introduction covered the most basic functionality of the package. Checkout the [basic-tour](https://github.com/fmfn/BayesianOptimization/blob/master/examples/basic-tour.ipynb) and [advanced-tour](https://github.com/fmfn/BayesianOptimization/blob/master/examples/advanced-tour.ipynb) notebooks in the example folder, where you will find detailed explanations and other more advanced functionality. Also, browse the examples folder for implementation tips and ideas.
-
-Installation
-============
-
-### Installation
-
-The latest release can be obtained by two ways:
-
-* With PyPI (pip):
-
-    pip install bayesian-optimization
-
-* With conda (from conda-forge channel):
-
-    conda install -c conda-forge bayesian-optimization
-
-The bleeding edge version can be installed with:
-
-    pip install git+https://github.com/fmfn/BayesianOptimization.git
+This introduction covered the most basic functionality of the package. Checkout the [basic-tour](http://bayesian-optimization.github.io/BayesianOptimization/basic-tour.html) and [advanced-tour](http://bayesian-optimization.github.io/BayesianOptimization/advanced-tour.html), where you will find detailed explanations and other more advanced functionality. Also, browse the [examples](http://bayesian-optimization.github.io/BayesianOptimization/examples.html) for implementation tips and ideas.
 
-If you prefer, you can clone it and run the setup.py file. Use the following
-commands to get a copy from Github and install all dependencies:
+## Minutiae
 
-    git clone https://github.com/fmfn/BayesianOptimization.git
-    cd BayesianOptimization
-    python setup.py install
+### Citation
 
-Citation
-============
-
-If you used this package in your research and is interested in citing it here's how you do it:
+If you used this package in your research, please cite it:
 
 ```
 @Misc{,
     author = {Fernando Nogueira},
     title = {{Bayesian Optimization}: Open source constrained global optimization tool for {Python}},
     year = {2014--},
-    url = " https://github.com/fmfn/BayesianOptimization"
+    url = " https://github.com/bayesian-optimization/BayesianOptimization"
+}
+```
+If you used any of the advanced functionalities, please additionally cite the corresponding publication:
+
+For the `SequentialDomainTransformer`:
+```
+@article{
+  author = {Stander, Nielen and Craig, Kenneth},
+  year = {2002},
+  month = {06},
+  pages = {},
+  title = {On the robustness of a simple domain reduction scheme for simulation-based optimization},
+  volume = {19},
+  journal = {International Journal for Computer-Aided Engineering and Software (Eng. Comput.)},
+  doi = {10.1108/02644400210430190}
+}
+```
+
+For constrained optimization:
+```
+@inproceedings{gardner2014bayesian,
+  title={Bayesian optimization with inequality constraints.},
+  author={Gardner, Jacob R and Kusner, Matt J and Xu, Zhixiang Eddie and Weinberger, Kilian Q and Cunningham, John P},
+  booktitle={ICML},
+  volume={2014},
+  pages={937--945},
+  year={2014}
 }
 ```
 
-# Dependencies
-* Numpy
-* Scipy
-* Scikit-learn
-
-# References:
+### References:
 * http://papers.nips.cc/paper/4522-practical-bayesian-optimization-of-machine-learning-algorithms.pdf
 * http://arxiv.org/pdf/1012.2599v1.pdf
 * http://www.gaussianprocess.org/gpml/
````
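The README text in the diff above explains that an acquisition function such as UCB picks the next point by trading off the posterior mean against its uncertainty. The sketch below illustrates that idea on the README's own `black_box_function`; the `mean`/`std` arrays are made-up stand-ins for real Gaussian-process output (not produced by bayes_opt), and `kappa=2.5` is an assumed constant, not necessarily the package default.

```python
def black_box_function(x, y):
    """Toy target from the README: maximum value of 1.0 at (0, 1)."""
    return -x ** 2 - (y - 1) ** 2 + 1

def ucb(mean, std, kappa=2.5):
    """Upper Confidence Bound: predicted mean (exploitation) plus
    kappa times the predictive uncertainty (exploration)."""
    return [m + kappa * s for m, s in zip(mean, std)]

# Made-up posterior summaries for five candidate points:
mean = [0.10, 0.40, 0.35, 0.00, -0.20]
std = [0.05, 0.10, 0.50, 0.90, 0.02]

scores = ucb(mean, std)
next_point = max(range(len(scores)), key=scores.__getitem__)
# With these numbers the most uncertain candidate (index 3) wins,
# showing exploration dominating when kappa is large.
```

Lowering `kappa` toward 0 collapses the score to the posterior mean, so the same rule then exploits the best-known region instead.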

bayes_opt/__init__.py (2 additions, 0 deletions)

```diff
@@ -1,8 +1,10 @@
+"""Pure Python implementation of bayesian global optimization with gaussian processes."""
 from .bayesian_optimization import BayesianOptimization, Events
 from .domain_reduction import SequentialDomainReductionTransformer
 from .util import UtilityFunction
 from .logger import ScreenLogger, JSONLogger
 from .constraint import ConstraintModel
+from .util import UtilityFunction
 
 __all__ = [
     "BayesianOptimization",
```
