Deep universal probabilistic programming with Python and PyTorch
This patch release contains only the following fixes:

- Fix `help(MyDistribution)`.
- Fix `TraceEnum_ELBO.compute_marginals()`.
- Fix `infer_discrete()` when it finds no discrete sites.
- Fix `pyroapi` / `potential_fn` issues in MCMC. #2591

New `AffineAutoregressive`, `Householder`, `NeuralAutoregressive`, `Spline`, and `GeneralizedChannelPermute` flows.

- `Permute` and `AffineCoupling` can operate on a specific dimension via the `dim` keyword argument. #2472
- `AffineAutoregressive` #2504
- `BatchNorm` as a `TransformModule`. #2459
- Make `PyroModule` compatible with `torch.nn.RNN`.
- Fixes to `.log_abs_det_jacobian` of `TransformModule`s.
- Fix a bug in `LocScaleReparam` whereby all loc-scale reparameterized sites shared a single centeredness parameter.
- Make the `jit_compile=True` flag in HMC/NUTS work for models with `pyro.param` statements.

Patches 1.2.0 with the following bug fixes:
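As an aside on the autoregressive flows listed above (`AffineAutoregressive` and friends): they all apply a per-dimension affine map whose loc and scale are computed from earlier dimensions only, which makes the Jacobian triangular and its log-determinant cheap. A minimal pure-Python sketch with a hypothetical toy conditioner (illustrative only, not Pyro's API):

```python
import math

def conditioner(prefix):
    # Toy conditioner: parameters for dimension i depend only on
    # dimensions < i. A real flow uses a masked neural network here.
    loc = 0.5 * sum(prefix)
    scale = math.exp(0.1 * sum(prefix))  # strictly positive
    return loc, scale

def forward(x):
    """Transform x -> y with y[i] = loc(x[:i]) + scale(x[:i]) * x[i]."""
    y, log_det = [], 0.0
    for i, xi in enumerate(x):
        loc, scale = conditioner(x[:i])
        y.append(loc + scale * xi)
        log_det += math.log(scale)  # log|det J| = sum_i log scale_i
    return y, log_det

def inverse(y):
    """Sequential inverse: recover x[i] using already-recovered x[:i]."""
    x = []
    for yi in y:
        loc, scale = conditioner(x)
        x.append((yi - loc) / scale)
    return x

# Round-trip: inverse(forward(x)) recovers x up to float error.
x = [0.3, -1.2, 0.7]
y, log_det = forward(x)
```

Because each scale depends only on earlier dimensions, the Jacobian is triangular and the log-determinant is just the running sum of log scales; Pyro's flows implement the same structure with masked networks over batched tensors.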
This release adds a new effect handler and a collection of strategies that reparameterize models to improve geometry. These tools are largely orthogonal to other inference tools in Pyro and can be used with SVI, MCMC, and other inference algorithms.

- Reparameterization support for `TransformedDistribution`s.
- An `.rsample()` method supporting non-Gaussian noise such as Levy Stable and StudentT; this requires reparameterization for inference.
- Fix `MultivariateNormal` conversion from `scale_tril` to `precision`.
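The idea behind these reparameterization strategies is easiest to see for loc-scale families. Instead of sampling x ~ Normal(loc, scale) directly (the "centered" parameterization), one samples a standardized latent and shifts/rescales it deterministically, so the inference algorithm works in a better-conditioned space. A plain-Python sketch (illustrative only, not Pyro's `LocScaleReparam`):

```python
import random

def sample_centered(loc, scale):
    # Centered parameterization: sample x directly.
    return random.gauss(loc, scale)

def sample_decentered(loc, scale):
    # Non-centered parameterization: sample z ~ Normal(0, 1), then
    # apply a deterministic loc-scale transform. Both functions draw
    # from the same Normal(loc, scale) law; only the latent geometry
    # seen by the inference algorithm differs.
    z = random.gauss(0.0, 1.0)
    return loc + scale * z
```

Pyro's `LocScaleReparam` automates this rewrite per sample site, with a centeredness parameter interpolating between the two extremes.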
- `pyro.infer.ReweightedWakeSleep` implements the Reweighted Wake-Sleep algorithm (Le et al. 2019). Contributed by Siddharth Narayanaswamy and Tuan Anh Le.
- `pyro.infer.TraceTMC_ELBO` implements the Tensor Monte Carlo marginal likelihood estimator (Aitchison 2019), a generalization of the importance-weighted autoencoder objective.
- `pyro.infer.EnergyDistance` implements a likelihood-free inference algorithm based on Szekely's energy statistics, a multidimensional generalization of CRPS (Gneiting & Raftery 2007).
- `pyro.contrib.cevae` implements the Causal Inference VAE of Louizos et al. (2017). See `examples/contrib/cevae/synthetic.py` for an end-to-end usage example.
- A `pyro.deterministic` primitive to record deterministic values in the trace.
- `pyro.nn.to_pyro_module_()` recursively converts a regular `nn.Module` to a `PyroModule` in-place.
- A default implementation of `Distribution.expand()`, available to all Pyro distributions that subclass `TorchDistribution`, making it easier to create custom distributions.
- A distribution with an `.rsample()` method but no `.log_prob()` method; such distributions can be fit using `EnergyDistance` inference.
- `pyro.util.save_visualization` has been deprecated, and the dependency on `graphviz` has been removed.

The objective of this release is to stabilize Pyro's interface and thereby make it safer to build high-level components on top of Pyro.
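For intuition about the `EnergyDistance` objective mentioned above: for samples X and Y, the (squared) energy distance estimates 2·E‖X−Y‖ − E‖X−X′‖ − E‖Y−Y′‖, which is zero exactly when the two distributions coincide, so it can be minimized without evaluating any likelihood. A toy one-dimensional estimator (a sketch, not Pyro's implementation):

```python
def energy_distance(xs, ys):
    """V-statistic estimate of the squared energy distance between two
    1-D sample lists, using |a - b| as the underlying metric."""
    def mean_abs_diff(a, b):
        # Average |u - v| over all pairs (u, v) in a x b.
        return sum(abs(u - v) for u in a for v in b) / (len(a) * len(b))
    return (2 * mean_abs_diff(xs, ys)
            - mean_abs_diff(xs, xs)
            - mean_abs_diff(ys, ys))

# Identical sample sets score 0; point masses at 0 and 1 score 2.
```

This is why a distribution with `.rsample()` but no `.log_prob()` can still be fit: the objective needs only samples from the model.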
Modules in `pyro.contrib` may change at any time (though we aim for stability). Deprecated features raise a `FutureWarning` and specify possible work-arounds; features marked as deprecated will not be maintained and are likely to be removed in a future release.

New features:

- `PyroModule`, a Pyro-compatible counterpart to `nn.Module`. `PyroModule` is already used internally by `AutoGuide`, `EasyGuide`, `pyro.contrib.gp`, `pyro.contrib.timeseries`, and elsewhere.
- A distribution equivalent to `TransformedDistribution(-, AbsTransform())` but providing a `.log_prob()` method.
- `AutoGuide` and `EasyGuide` are now `nn.Module`s and can be serialized separately from the param store. This enables serving via `torch.jit.trace_module`.
- The `Auto*Normal` family of autoguides now has `init_scale` arguments, and `init_loc_fn` has better support. Autoguides no longer support initialization by writing directly to the param store.

Breaking changes:

- `InverseAutoregressiveFlow` has been renamed to `AffineAutoregressive`.
- `pyro.generic` has been moved to a separate project, pyroapi.
- `pyro.contrib.glmm` has been moved to `pyro.contrib.oed.glmm` and will eventually be replaced by BRMP.
- `DeprecationWarning`s have been promoted to `FutureWarning`s.

Deprecations:

- `pyro.random_module`: the `pyro.random_module` primitive has been deprecated in favor of `PyroModule`, which can be used to create Bayesian modules from `torch.nn.Module` instances.
- `SVI.run`: the `SVI.run` method is deprecated; users are encouraged to use the `.step` method directly to run inference. To draw samples from the posterior distribution, we recommend the `Predictive` utility class, or using the `trace` and `replay` effect handlers directly.
- `TracePredictive`: the `TracePredictive` class is deprecated in favor of `Predictive`, which can be used to gather samples from the posterior and predictive distributions in SVI and MCMC.
- `mcmc.predictive`: this utility function has been absorbed into the more general `Predictive` class.
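The trace and replay handlers recommended above are easiest to understand from a toy implementation: trace records every sample site a model visits, and replay forces a model to reuse previously recorded values. A hypothetical pure-Python sketch of the pattern (not Pyro's actual handlers, which also track distributions and log-probabilities):

```python
import random

_HANDLERS = []  # active effect handlers, innermost last

def sample(name, fn):
    """Toy `pyro.sample`-like primitive: draw via fn() unless an active
    handler intercepts the site; let all handlers record the result."""
    value = None
    for h in reversed(_HANDLERS):      # innermost handler first
        value = h.intercept(name)
        if value is not None:
            break
    if value is None:
        value = fn()
    for h in _HANDLERS:
        h.record(name, value)
    return value

class Handler:
    def __enter__(self):
        _HANDLERS.append(self)
        return self
    def __exit__(self, *exc):
        _HANDLERS.remove(self)
    def intercept(self, name):
        return None                    # default: don't intercept
    def record(self, name, value):
        pass                           # default: don't record

class trace(Handler):
    """Record every sample site the wrapped computation visits."""
    def __init__(self):
        self.nodes = {}
    def record(self, name, value):
        self.nodes[name] = value

class replay(Handler):
    """Force sites to take the values stored in a previous trace."""
    def __init__(self, guide_trace):
        self.nodes = guide_trace.nodes
    def intercept(self, name):
        return self.nodes.get(name)

def model():
    z = sample("z", lambda: random.gauss(0.0, 1.0))
    return sample("x", lambda: random.gauss(z, 0.1))

with trace() as tr:
    model()            # tr.nodes now maps "z" and "x" to drawn values
with replay(tr):
    replayed = model() # reuses tr's recorded values, no fresh draws
```

This composition, running a model under replay of a guide's trace, is exactly the pattern `Predictive` packages up for posterior and posterior-predictive sampling.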