Gaussian processes in TensorFlow
## Release 2.6.1

This is a bug-fix release, due to problems with model saving in 2.6.0.

### Deprecations
- Deprecated `gpflow.utilities.ops.cast`. Use `tf.cast` instead.

### Bug fixes
- Fixed a bug related to `tf.saved_model` and methods wrapped in `@check_shapes`.

This release contains contributions from:

jesnie
## Release 2.6.0

The major theme of this release is heteroskedastic likelihoods. The changes have unfortunately caused some breakage, but they make it much easier to use heteroskedastic likelihoods, either by plugging together built-in GPflow classes or by writing your own. See our updated notebook for examples of how to use this.

### Breaking changes
- All likelihood methods now take an extra `X` argument. If you have written custom likelihoods, or you have custom code calling likelihoods directly, you will need to add this extra argument.
- In the `CGLB` model the `xnew` parameter has been renamed `Xnew`, to be consistent with the other models.
- In the `GPLVM` model the variance returned by `predict_f` with `full_cov=True` has changed shape from `[batch..., N, N, P]` to `[batch..., P, N, N]`, to be consistent with the other models.
- `gpflow.likelihoods.Gaussian.DEFAULT_VARIANCE_LOWER_BOUND` has been replaced with `gpflow.likelihoods.scalar_continuous.DEFAULT_LOWER_BOUND`.
- Change to the `InducingVariables` API: `InducingVariables` must now have a `shape` property.
- `gpflow.experimental.check_shapes.get_shape.register` has been replaced with `gpflow.experimental.check_shapes.register_get_shape`.
- `check_shapes` will no longer automatically wrap shape checking in `tf.compat.v1.flags.tf_decorator.make_decorator`. This is likely to affect you if you use `check_shapes` with custom Keras models. If you require the decorator, you can manually enable it with `check_shapes(..., tf_decorator=True)`.

### Known caveats
- Shape checking is disabled by default inside `tf.function`. Use `set_enable_check_shapes` to change this behaviour. See the API documentation for more details.
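If you have downstream code that still assumes the old `GPLVM` covariance layout, a single axis move converts between the two conventions. A NumPy sketch (the array names and sizes are illustrative, not part of the GPflow API):

```python
import numpy as np

# Old GPLVM layout: [batch..., N, N, P]; new layout: [batch..., P, N, N].
batch, N, P = 2, 4, 3
cov_old = np.random.default_rng(0).normal(size=(batch, N, N, P))

# Move the output dimension P from the last axis to just before the two N axes.
cov_new = np.moveaxis(cov_old, -1, -3)

assert cov_new.shape == (batch, P, N, N)
# Entry-wise, cov_new[..., p, i, j] == cov_old[..., i, j, p].
assert np.array_equal(cov_new[0, 1], cov_old[0, :, :, 1])
```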
### Major features and improvements
- All likelihoods now take an `X` argument, allowing you to easily implement heteroskedastic likelihoods.
- The `Gaussian` likelihood can now be parametrized by either a `variance` or a `scale`.
- Several likelihood parameters can now be functions of the model input: the `Gaussian` `variance`, the `Gaussian` `scale`, the `StudentT` `scale`, the `Gamma` `shape`, and the `Beta` `scale`.
- `GPR` and `SGPR` can now be configured with a custom Gaussian likelihood, allowing you to make them heteroskedastic.
- `gpflow.mean_functions` has been renamed `gpflow.functions`, but with an alias, to avoid breaking changes.
- Improvements to `gpflow.experimental.check_shapes`:
  - Support for variable-rank dimensions, e.g. `cov: [n..., n...]`.
  - Support for `is None` and `is not None` as checks for conditional shapes.
  - Use `register_get_shape` instead of `get_shape.register`, for better compatibility with TensorFlow.
  - Support for checking the shapes of `InducingVariable`s.
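As an illustration of the variable-rank spec: `cov: [n..., n...]` says the covariance's shape is some arbitrary-rank shape `n...` followed by that same shape again. A pure-Python sketch of that property (not the `check_shapes` implementation):

```python
def matches_square_spec(shape):
    # The spec `cov: [n..., n...]` means: `shape` splits into two
    # identical halves, each of arbitrary rank.
    if len(shape) % 2 != 0:
        return False
    half = len(shape) // 2
    return shape[:half] == shape[half:]

assert matches_square_spec((3, 3))          # n... = (3,)
assert matches_square_spec((2, 5, 2, 5))    # n... = (2, 5)
assert not matches_square_spec((2, 3))
assert not matches_square_spec((2, 5, 5, 2))
```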
- Support for Macs (`arm64`) via the `tensorflow-macos` dependency. (#1850)

### Bug fixes
- Fixed issues related to `full_cov` and `full_output_cov`.

This release contains contributions from:

jesnie, corwinpro, st--, vdutor
## Release 2.5.1

This release fixes a performance regression introduced in 2.5.0. 2.5.0 used features of Python that `tensorflow < 2.9.0` does not know how to compile, which negatively impacted performance.
This release contains contributions from:
jesnie
## Release 2.5.0

The focus of this release has mostly been bumping the minimally supported versions of Python and TensorFlow, and development of `gpflow.experimental.check_shapes`.

### Breaking changes
- Removed `gpflow.utilities.utilities`. It was scheduled for deletion in 2.3.0. Use `gpflow.utilities` instead. (#1804)
- Removed `Likelihood.predict_density`, which had been deprecated since March 24, 2020. (#1804)
- Removed `ScalarLikelihood.num_gauss_hermite_points`, which had been deprecated since September 30, 2020. (#1804)

### Major features and improvements
- Added type hints, checked with `mypy`. (#1795, #1799, #1802, #1812, #1814, #1816)
- Significant work on `gpflow.experimental.check_shapes`, including support for `Optional` values. (#1797)
- Significant speed-up of the GPR posterior objects. (#1809, #1811)
- Significant improvements to documentation. Note the new home page: https://gpflow.github.io/GPflow/index.html (#1828, #1829, #1830, #1831, #1833, #1841, #1842, #1856, #1857)
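For reference on the removed `num_gauss_hermite_points`: it controlled the number of Gauss-Hermite points used to approximate expectations under a Gaussian, which is now configured through the likelihood's quadrature object instead. The underlying computation, sketched in NumPy:

```python
import numpy as np

def gauss_hermite_expectation(g, mu, var, n_points=20):
    # E_{N(f | mu, var)}[g(f)] via Gauss-Hermite quadrature.
    x, w = np.polynomial.hermite.hermgauss(n_points)
    # Change of variables f = mu + sqrt(2 * var) * x; sum(w) == sqrt(pi).
    f = mu + np.sqrt(2.0 * var) * x
    return (w * g(f)).sum() / np.sqrt(np.pi)

# Sanity checks: the mean and second moment of N(0, 1).
assert abs(gauss_hermite_expectation(lambda f: f, 0.0, 1.0)) < 1e-9
assert abs(gauss_hermite_expectation(lambda f: f ** 2, 0.0, 1.0) - 1.0) < 1e-8
```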
This release contains contributions from:
ltiao, uri.granta, frgsimpson, st--, jesnie
## Release 2.4.0

This release mostly focuses on making posterior objects useful for Bayesian optimisation. It also adds a new `experimental` sub-package, with a tool for annotating tensor shapes.

### Breaking changes
- Slight change to the API of custom posterior objects. `gpflow.posteriors.AbstractPosterior._precompute` no longer must return an `alpha` and a `Qinv`; instead it returns an arbitrary tuple of `PrecomputedValue`s. Correspondingly, `gpflow.posteriors.AbstractPosterior._conditional_with_precompute` should no longer try to access `self.alpha` and `self.Qinv`; instead it is passed the tuple of tensors returned by `_precompute` as a parameter. (#1763, #1767)
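The shape of the new contract can be illustrated with a toy class; the method names mirror GPflow's, but the class and the numerics are invented for illustration:

```python
import numpy as np

class ToyPosterior:
    """Toy illustration of the new _precompute contract."""

    def __init__(self, err, Qinv):
        self.err = err
        self.Qinv = Qinv

    def _precompute(self):
        # Return an arbitrary tuple of precomputed tensors, instead of
        # storing fixed `alpha` / `Qinv` attributes on the object.
        alpha = self.Qinv @ self.err
        return (alpha, self.Qinv)

    def _conditional_with_precompute(self, cache, Knew):
        # The cache tuple from _precompute is passed in as a parameter.
        alpha, _Qinv = cache
        return Knew @ alpha

post = ToyPosterior(err=np.array([1.0, -1.0]), Qinv=np.eye(2))
cache = post._precompute()
mean = post._conditional_with_precompute(cache, np.eye(2))
assert np.allclose(mean, [1.0, -1.0])
```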
- Slight change to the API of inducing points. You should no longer override `gpflow.inducing_variables.InducingVariables.__len__`; override `gpflow.inducing_variables.InducingVariables.num_inducing` instead. `num_inducing` should return a `tf.Tensor`, which is consistent with previous behaviour, although the type was previously annotated as `int`. `__len__` has been deprecated. (#1766, #1792)
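The new convention can be sketched with a plain-Python stand-in. A real implementation would subclass `InducingVariables` and return a `tf.Tensor`; the class and numbers here are illustrative only:

```python
import warnings
import numpy as np

class MyInducingVariables:
    """Sketch of the new convention: override num_inducing, not __len__."""

    def __init__(self, Z):
        self.Z = np.asarray(Z)  # [M, D] inducing inputs

    @property
    def num_inducing(self):
        # GPflow annotates this as a tf.Tensor; a plain int stands in here.
        return self.Z.shape[0]

    def __len__(self):
        warnings.warn("Use num_inducing instead of len().", DeprecationWarning)
        return self.num_inducing

iv = MyInducingVariables(np.zeros((7, 2)))
assert iv.num_inducing == 7
```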
### Major features and improvements
- Added type hints, checked with `mypy`. (#1766, #1769, #1771, #1773, #1775, #1777, #1780, #1783, #1787, #1789)
- Added a new posterior class to enable faster predictions from the VGP model. (#1761)
- The VGP class has been bug-fixed to work with variable-sized data. Note that you can use `gpflow.models.vgp.update_vgp_data` to ensure variational parameters are updated sanely. (#1774)
- All posterior classes have been bug-fixed to work with variable data sizes, for Bayesian optimisation. (#1767)
- Added an `experimental` sub-package for features that are still under development, starting with `gpflow.experimental.check_shapes` for checking tensor shapes. (#1760, #1768, #1782, #1785, #1788)

### Bug fixes
- Made the `dataclasses` dependency conditional at install time. (#1759)
- Fixed a bug in `predict_f`. (#1755)

This release contains contributions from:
jesnie, tmct, joacorapela
## Release 2.3.1

This is a bug-fix release, primarily for the GPR posterior object.

### Bug fixes
- Fixed the GPR posterior object's `GPRPosterior._conditional_with_precompute()`. (#1747)
- Made `gpflow.optimizers.Scipy` able to handle unused / unconnected variables. (#1745)

### Build
- `make dev-install` now also installs the test requirements. (#1737)

### Documentation
- Fixed `README.md`. (#1736)
- Fixed `cglb.ipynb`. (#1742)

### Test suite
- Extended `test_gpr_posterior.py` so it covers uses with leading dimensions.

This release contains contributions from:
st--, jesnie, johnamcleod, Andrew878
## Release 2.3.0

This release contains contributions from:

johnamcleod, st--, Andrew878, tadejkrivec, awav, avullo
## Release 2.2.0

The main focus of this release is the new "Posterior" object introduced by PR #1636, which allows for a significant speed-up of post-training predictions with the `SVGP` model (partially resolving #1599).

- There are no changes to the user-facing API of `gpflow.conditionals.conditional`.
- With the `SVGP` model, you can call `model.posterior()` to obtain a Posterior object that precomputes all quantities not depending on the test inputs (e.g. the Cholesky of Kuu), and provides a `posterior.predict_f()` method that reuses these cached quantities. `model.predict_f()` computes exactly the same quantities as before and does not give any speed-up.
- `gpflow.conditionals.conditional()` forwards to the same "fused" code-path as before.
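The caching idea behind `model.posterior()` can be sketched with NumPy: factor the test-independent quantities once at construction, then reuse them on every call. A conceptual sketch under simplified maths, not GPflow's implementation:

```python
import numpy as np

def rbf(A, B):
    # Squared-exponential kernel between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2)

class CachedPosterior:
    """Precompute test-independent quantities once, reuse per prediction."""

    def __init__(self, Z, q_mu, jitter=1e-6):
        # Everything here depends only on the trained model, never on
        # the test inputs, so it is computed exactly once.
        Kuu = rbf(Z, Z) + jitter * np.eye(len(Z))
        self.Z = Z
        self.alpha = np.linalg.solve(Kuu, q_mu)

    def predict_f(self, Xnew):
        # Only the test-dependent kernel block is computed per call.
        return rbf(Xnew, self.Z) @ self.alpha

Z = np.linspace(0.0, 1.0, 5)[:, None]
posterior = CachedPosterior(Z, q_mu=np.sin(Z[:, 0]))
mean = posterior.predict_f(np.array([[0.5]]))
assert mean.shape == (1,)
```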
- `gpflow.conditionals.conditional.register` is deprecated and should not be called outside of the GPflow core code. If you have written your own implementations of `gpflow.conditionals.conditional()`, you have two options to use your code with GPflow 2.2:
  1. Instead of `gpflow.models.SVGP`, use the backwards-compatible `gpflow.models.svgp.SVGP_deprecated`.
  2. Re-write your conditional as a subclass of `gpflow.posteriors.AbstractPosterior`, and register `get_posterior_class()` instead (see the "Variational Fourier Features" notebook for an example).
- Currently, the Posterior precomputation is only available for the `SVGP` model. We would like to extend this to the other models such as `GPR`, `SGPR`, or `VGP`, but this effort is beyond what we can currently provide. If you would be willing to contribute to those efforts, please get in touch!
- The Posterior object does not currently provide the `GPModel` convenience functions such as `predict_f_samples`, `predict_y` and `predict_log_density`. Again, if you're willing to contribute, get in touch!

This release contains contributions from:
@stefanosele, @johnamcleod, @st--