hbayes.bayesian_optimization

Queue

class hbayes.bayesian_optimization.Queue[source]
__init__()[source]

Initializes an empty queue.

add(obj)[source]

Add object to end of queue.
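
A minimal sketch of the queue as documented above; the queued object is an arbitrary parameter dictionary chosen for illustration.

>>> from hbayes.bayesian_optimization import Queue
>>>
>>> q = Queue()
>>> q.add({'x': 0.5, 'y': 0.7})  # append an arbitrary object to the end of the queue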

BayesianOptimization

class hbayes.bayesian_optimization.BayesianOptimization(f, pbounds, random_state=None, verbose=2, bounds_transformer=None)[source]
Overview:

This class takes the function to optimize as well as the parameter bounds, and uses Bayesian optimization to find the parameter values that yield the maximum value (a usage sketch follows the parameter list).

Parameters
  • f – Function to be maximized.

  • pbounds – Dictionary with parameter names as keys and a tuple with the minimum and maximum values as values.

  • random_state – If the value is an integer, it is used as the seed for creating a numpy.random.RandomState. Otherwise, the provided random state is used. When set to None, an unseeded random state is generated.

  • verbose – The level of verbosity.

  • bounds_transformer – If provided, the transformation is applied to the bounds.
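
A minimal construction sketch, using the import path shown in the class signature above; the objective function and bounds are purely illustrative.

>>> from hbayes.bayesian_optimization import BayesianOptimization
>>>
>>> def black_box(x, y):
...     # Illustrative objective: any function of the bounded parameters that returns a float.
...     return -x ** 2 - (y - 1) ** 2 + 1
>>>
>>> optimizer = BayesianOptimization(
...     f=black_box,
...     pbounds={'x': (-2.0, 2.0), 'y': (-3.0, 3.0)},  # (min, max) for each parameter
...     random_state=1,  # integer seed for reproducibility
...     verbose=2,
... )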

__init__(f, pbounds, random_state=None, verbose=2, bounds_transformer=None)[source]

Constructor of the Observable base class (BayesianOptimization inherits from hbutils.design.Observable).

Parameters

events – Set of events, can be a list, tuple or an enum class.

Note

When enum is used, its values will be used as events. For example:

>>> from enum import IntEnum
>>> from hbutils.design import Observable
>>>
>>> class MyIntEnum(IntEnum):
...     A = 1
...     B = 2
>>>
>>> # equivalent to `Observable([MyIntEnum.A, MyIntEnum.B])`
>>> o = Observable(MyIntEnum)
>>> o._events  # just for explanation, do not do this on actual use
{<MyIntEnum.A: 1>: {}, <MyIntEnum.B: 2>: {}}
maximize(init_points: int = 5, n_iter: int = 25, acq='ucb', kappa=2.576, kappa_decay=1, kappa_decay_delay=0, xi=0.0, **gp_params)[source]

Probes the target space to find the parameters that yield the maximum value for the given function (see the sketch after the parameter list).

Parameters
  • init_points – Number of random exploration steps performed before the optimization starts searching for the maximum.

  • n_iter – Number of iterations where the method attempts to find the maximum value.

  • acq – The acquisition method used. ucb stands for the Upper Confidence Bounds method. ei is the Expected Improvement method. poi is the Probability Of Improvement criterion.

  • kappa – Controls how the next parameters are sampled. A higher value favors regions that are least explored; a lower value favors regions where the surrogate (regression) function is highest.

  • kappa_decay – kappa is multiplied by this factor every iteration.

  • kappa_decay_delay – Number of iterations that must have passed before applying the decay to kappa.

  • xi – Currently unused.
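
Continuing with the optimizer constructed in the class example above, a sketch of a typical call; only keyword arguments shown in the signature above are used.

>>> optimizer.maximize(
...     init_points=2,  # random exploration steps first
...     n_iter=10,      # Bayesian optimization steps afterwards
...     acq='ucb',      # Upper Confidence Bound acquisition
...     kappa=2.576,
... )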

probe(params: Dict[str, float], lazy=True)[source]

Evaluates the function at the given point. Useful for guiding the optimizer (see the sketch after the parameter list).

Parameters
  • params – The parameters at which the optimizer will evaluate the function.

  • lazy – If True, the point will only be evaluated when maximize() is called. Otherwise, it is evaluated immediately.
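
Continuing with the same optimizer, a sketch of both the lazy and the immediate form; the probed points are arbitrary.

>>> # Queue a point to be evaluated on the next maximize() call.
>>> optimizer.probe(params={'x': 0.5, 'y': 0.7}, lazy=True)
>>> # Or evaluate it right away.
>>> optimizer.probe(params={'x': -1.0, 'y': 2.0}, lazy=False)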

register(x: Union[numpy.ndarray, Dict[str, float]], y: float)[source]

Registers an observation with a known target value.
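
A sketch of registering a previously evaluated point with the optimizer from the class example; the observation values are illustrative.

>>> # x may be a parameter dictionary (or a numpy array); y is the known target value.
>>> optimizer.register(x={'x': 0.0, 'y': 1.0}, y=1.0)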

set_bounds(new_bounds: Dict[str, Tuple[float, float]])[source]

Changes the lower and upper search bounds.

Parameters

new_bounds – A dictionary mapping parameter names to their new bounds.
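
A sketch of narrowing the bounds of a single parameter; that parameters omitted from the dictionary keep their previous bounds is an assumption here.

>>> # Tighten the search range for `x` only.
>>> optimizer.set_bounds(new_bounds={'x': (-1.0, 1.0)})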

set_gp_params(**params)[source]

Sets the parameters of the internal Gaussian process regressor.
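
A sketch that assumes the internal regressor is a scikit-learn-style GaussianProcessRegressor, so its keyword arguments can be forwarded; the specific parameter names are assumptions.

>>> # Parameter names assume a scikit-learn GaussianProcessRegressor backend.
>>> optimizer.set_gp_params(alpha=1e-3, n_restarts_optimizer=5)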

suggest(utility_function: hbayes.util.UtilityFunction) → Dict[str, float][source]

Returns the most promising point to probe next, according to the given utility function.
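
A sketch of asking for the next point manually; the UtilityFunction constructor arguments follow the bayes_opt convention and are assumptions here (check hbayes.util for the exact signature).

>>> from hbayes.util import UtilityFunction
>>>
>>> # Constructor arguments are assumed to follow the bayes_opt convention.
>>> utility = UtilityFunction(kind='ucb', kappa=2.576, xi=0.0)
>>> next_point = optimizer.suggest(utility)  # e.g. {'x': ..., 'y': ...}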