Optimization




Derivative Observations. Finally, we discuss optimization with derivatives. Observations of ∇f(x), optionally with normally distributed noise, may be incorporated directly into GP regression (Rasmussen and Williams, 2006, Sect. 9.4). Lizotte (2008) proposed using gradient information in this way in Bayesian optimization, together with the EI acquisition function, showing an improvement over BFGS (Liu and Nocedal, 1989). The value of EI at a point is the same whether we propose to observe ∇f(x) alongside f(x) or f(x) alone, since the improvement produced by an evaluation depends only on the observed value f(x). (If previous derivative observations have contributed to the time-n posterior, however, that posterior will differ from the one we would have obtained by observing function values alone.) Thus, EI does not take advantage of the availability of derivative information to, for example, evaluate at points far from previously evaluated ones, where derivative information would be particularly useful. A knowledge-gradient (KG) method alleviating this problem was proposed by Wu et al. (2017). In other related work in this area, Osborne et al. (2009) proposed using gradient information to improve the conditioning of the covariance matrix in GP regression, and Ahmed et al. (2016) proposed a method for choosing a single directional derivative to retain when observing gradients, improving the computational tractability of GP inference.
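To make the mechanics concrete, below is a minimal sketch of one-dimensional GP regression with derivative observations, in the spirit of Rasmussen and Williams (2006, Sect. 9.4). The squared-exponential kernel, the hyperparameter values, the noise level, and the helper names are illustrative assumptions rather than any package's API; a production implementation would factor the covariance matrix once and fit hyperparameters by maximum likelihood.

```python
import numpy as np

# Squared-exponential kernel k(x, x') = sf^2 * exp(-(x - x')^2 / (2 ell^2)).
# Hyperparameter values below are illustrative assumptions.

def k_ff(x1, x2, ell=1.0, sf=1.0):
    # Cov(f(x), f(x'))
    d = x1[:, None] - x2[None, :]
    return sf**2 * np.exp(-0.5 * d**2 / ell**2)

def k_fd(x1, x2, ell=1.0, sf=1.0):
    # Cov(f(x), f'(x')) = dk/dx'
    d = x1[:, None] - x2[None, :]
    return k_ff(x1, x2, ell, sf) * d / ell**2

def k_dd(x1, x2, ell=1.0, sf=1.0):
    # Cov(f'(x), f'(x')) = d^2 k / (dx dx')
    d = x1[:, None] - x2[None, :]
    return k_ff(x1, x2, ell, sf) * (1.0 / ell**2 - d**2 / ell**4)

def posterior_mean(X, f, df, Xs, noise=1e-6):
    """Posterior mean of f at test points Xs, given observations of
    f(X) and f'(X) stacked into one joint Gaussian vector."""
    K = np.block([[k_ff(X, X),    k_fd(X, X)],
                  [k_fd(X, X).T,  k_dd(X, X)]])
    K += noise * np.eye(K.shape[0])
    Ks = np.hstack([k_ff(Xs, X), k_fd(Xs, X)])
    return Ks @ np.linalg.solve(K, np.concatenate([f, df]))

# Usage: f(x) = sin(x) observed with exact gradients at three points.
X = np.array([0.0, 1.0, 2.5])
Xs = np.linspace(0.0, 3.0, 7)
mu = posterior_mean(X, np.sin(X), np.cos(X), Xs)
```

The same construction extends to d dimensions by appending one derivative block per coordinate, so the joint covariance matrix grows to size n(d + 1); this growth is the computational burden that motivates retaining only a single directional derivative, as in Ahmed et al. (2016).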


Software


There is a wide variety of software for Bayesian optimization and Gaussian process regression. Some Gaussian process regression and Bayesian optimization packages are developed together, with the Bayesian optimization package built on top of the companion Gaussian process regression package; others are standalone, providing only Gaussian process regression or only Bayesian optimization. We list here several of the most prominent packages, along with URLs that are current as of June 2018.

    • DiceKriging and DiceOptim are packages for Gaussian process regression and Bayesian optimization respectively, written in R. They are described in detail in Roustant et al. (2012) and are available from CRAN via https://cran.r-project.org/web/packages/DiceOptim/index.html.

    • GPyOpt (https://github.com/SheffieldML/GPyOpt) is a python Bayesian optimization library built on top of the Gaussian process regression library GPy (https://sheffieldml.github.io/GPy/), both written and maintained by the machine learning group at Sheffield University; a minimal usage sketch appears after this list.

    • Metric Optimization Engine (MOE, https://github.com/Yelp/MOE) is a Bayesian optimization library in C++ with a python wrapper that supports GPU-based computations for improved speed. It was developed at Yelp by the founders of the Bayesian optimization startup SigOpt (https://sigopt.com). Cornell MOE (https://github.com/wujian16/Cornell-MOE) is built on MOE, with changes that make it easier to install and with support for parallel and derivative-enabled knowledge-gradient algorithms.

    • Spearmint (https://github.com/HIPS/Spearmint), with an older version available under a different license at https://github.com/JasperSnoek/spearmint, is a python Bayesian optimization library. Spearmint was written by the founders of the Bayesian optimization startup Whetlab, which was acquired by Twitter in 2015 (Perez, 2015).

    • DACE (Design and Analysis of Computer Experiments) is a Gaussian process regression library written in MATLAB, available at http://www2.imm.dtu.dk/projects/dace/. Although it was last updated in 2002, it remains widely used.

    • GPflow (https://github.com/GPflow/GPflow) and GPyTorch (https://github.com/cornellius-gp/gpytorch) are python Gaussian process regression libraries built on top of TensorFlow (https://www.tensorflow.org/) and PyTorch (https://pytorch.org/), respectively.

    • laGP (https://cran.r-project.org/web/packages/laGP/index.html) is an R package for Gaussian process regression and Bayesian optimization with support for inequality constraints.
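As referenced in the GPyOpt entry above, a minimal usage sketch of that library's documented interface is given below. The toy objective, search domain, and iteration budget are illustrative assumptions, and argument names may vary across GPyOpt versions.

```python
import numpy as np
import GPyOpt

# Toy objective to minimize; GPyOpt passes a 2-D array with one row
# per candidate point and expects an array of objective values back.
def f(x):
    return np.sin(3.0 * x) + x**2 - 0.7 * x

# A single continuous decision variable on [0, 1] (illustrative).
domain = [{'name': 'x', 'type': 'continuous', 'domain': (0.0, 1.0)}]

# Expected improvement on a GP surrogate.
problem = GPyOpt.methods.BayesianOptimization(
    f=f, domain=domain, acquisition_type='EI')
problem.run_optimization(max_iter=15)

print(problem.x_opt, problem.fx_opt)  # best point and value found
```

Most of the packages listed here expose a similar loop: define an objective and a domain, choose an acquisition function, and iterate.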



