
Welcome to the Bayesian-Optimization-with-Gaussian-Processes wiki!

This is a constrained global optimization package built upon Bayesian inference and Gaussian processes that attempts to find the maximum value of an unknown function in as few iterations as possible. This technique is particularly suited to the optimization of high-cost functions, and to situations where the balance between exploration and exploitation is important.

This package was motivated by hyper-parameter optimization of machine learning algorithms when performing cross validation. Some of the design choices were clearly made with this setting in mind, and ultimately an out-of-the-box cross validation optimization object will be implemented (soon).

Disclaimer: This project is under active development; some of its functionality and syntax is bound to change, sometimes dramatically. If you find a bug, or anything that needs correction, please let me know.


Layout

  • Global optimization class:

    • Takes a multivariate function f(x1,...,xn), together with lower and upper bounds for each variable ( {'x1' : (x1_min, x1_max), ..., 'xn' : (xn_min, xn_max)} ), as parameters, and uses Gaussian processes together with acquisition functions to efficiently search the parameter space for the global maximum (see the usage sketch after this list).
    • The object initializes the process at random points; however, the user has the option to pass extra initialization points.
    • Log-scale search is available for functions sharply peaked near zero, or for upper_bound/lower_bound >> 1.
  • Gaussian process class:

    • Fits a Gaussian process to a data set.
    • Currently the prior mean is taken to be zero; marginalizing over it is on the 'to do' list.
    • ...
  • Kernels

    • Squared exponential and ARD Matérn kernels are supported, but the user is free to pass a custom kernel function (see the kernel sketch after this list).
    • The kernel parameters are ignored when using GP.best_fit.
  • Acquisition Functions

    • Probability of Improvement, Expected Improvement and Upper Confidence Bound are supported. Again, the user can pass a custom acquisition function (see the acquisition sketch after this list).
  • Magic Box class

    • Automatic cross validation optimizer... under development.
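To make the layout above concrete, here is a minimal usage sketch. The import path, constructor signature and maximize call are assumptions for illustration only; as noted in the disclaimer, the actual syntax is still changing.

```python
# Minimal usage sketch -- the names below (import path, constructor,
# maximize) are illustrative assumptions, not the guaranteed syntax.
from bayes_opt import bayes_opt  # assumed import path

# The unknown function to be maximized; in practice this would be an
# expensive call, e.g. a cross validation score.
def f(x1, x2):
    return -x1 ** 2 - (x2 - 1) ** 2 + 1

# Lower and upper bounds for each variable, in the dictionary format
# described in the layout above.
bounds = {'x1': (-4, 4), 'x2': (-3, 3)}

bo = bayes_opt(f, bounds)
best = bo.maximize()  # assumed method; searches for the global maximum
```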
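As a rough sketch of what a custom kernel function might look like (the exact signature the package expects from user kernels is an assumption here), a squared exponential kernel between two points could be written as:

```python
import numpy as np

def squared_exponential(x, y, theta=1.0):
    # Squared exponential (RBF) kernel between two points; theta is the
    # length scale. The (x, y) signature is an assumption -- check how
    # the package actually calls user-supplied kernels.
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return np.exp(-np.sum((x - y) ** 2) / (2 * theta ** 2))
```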
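Likewise, an acquisition function only has to map the GP posterior mean and standard deviation at a candidate point to a score that is maximized to pick the next point. A sketch of the Upper Confidence Bound, assuming a (mean, std) signature:

```python
def upper_confidence_bound(mean, std, kappa=2.0):
    # UCB trades off exploitation (high posterior mean) against
    # exploration (high posterior uncertainty); kappa sets the balance.
    return mean + kappa * std
```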