A probabilistic programming language in TensorFlow. Deep generative models, variational inference.
- Regularization terms collected in `tf.GraphKeys.REGULARIZATION_LOSSES` are now applied to variational inference (#813).
- `ed.MetropolisHastings` (#806).

We are also grateful to all who filed issues or helped resolve them, asked and answered questions, and were part of inspiring discussions.
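For context, `ed.MetropolisHastings` implements the classic accept/reject sampler. The mechanics can be sketched in plain Python; this is a toy sketch of the general algorithm, not Edward's API, and the target and proposal below are illustrative choices:

```python
import math
import random

def metropolis_hastings(log_p, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings targeting the density exp(log_p)."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)      # symmetric proposal
        log_accept = log_p(proposal) - log_p(x)  # log acceptance ratio
        # Accept with probability min(1, exp(log_accept)).
        if rng.random() < math.exp(min(0.0, log_accept)):
            x = proposal
        samples.append(x)
    return samples

# Target: a standard normal, up to an additive constant in log space.
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
mean = sum(samples) / len(samples)
```

Because the proposal is symmetric, the Hastings correction cancels and only the ratio of target densities matters.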
This release comes with several new features, alongside a significant push for better documentation, examples, and unit testing.
- `ed.KLqp`'s score function gradient now does more intelligent (automatic) Rao-Blackwellization for variance reduction.
- `ed.WakeSleep`.
- New examples: `examples/lstm.py`, `examples/deep_exponential_family.py`, `examples/sigmoid_belief_network.py`, `examples/stochastic_block_model.py`, `examples/cox_process.py`.

We are also grateful to all who filed issues or helped resolve them, asked and answered questions, and were part of inspiring discussions.
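As background on the score function gradient that `ed.KLqp` Rao-Blackwellizes, here is a minimal pure-Python sketch of the estimator; the Bernoulli objective is chosen here purely for exposition:

```python
import random

def score_gradient(theta, f, n=100000, seed=0):
    """Estimate d/dtheta E_{x ~ Bernoulli(theta)}[f(x)] via the
    score function (REINFORCE) identity:
        grad = E[f(x) * d/dtheta log p(x; theta)]."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = 1 if rng.random() < theta else 0
        # Score of a Bernoulli: d/dtheta log p(x; theta).
        score = (1.0 / theta) if x == 1 else (-1.0 / (1.0 - theta))
        total += f(x) * score
    return total / n

# With f(x) = x, E[f(x)] = theta, so the true gradient is 1.
grad = score_gradient(0.4, lambda x: x)
```

The raw estimator has high variance; Rao-Blackwellization reduces it by conditioning each term on only the part of the model it depends on.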
- `DirichletProcess` (#652).

We are also grateful to all who filed issues or helped resolve them, asked and answered questions, and were part of inspiring discussions.
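For readers unfamiliar with the `DirichletProcess` random variable, its stick-breaking construction can be sketched in plain Python; this is a conceptual illustration, not Edward's implementation, and the truncation level is an assumption made for exposition:

```python
import random

def stick_breaking_weights(alpha, n_sticks, seed=0):
    """Truncated stick-breaking weights for a Dirichlet process.
    Each weight takes a Beta(1, alpha) fraction of the remaining stick."""
    rng = random.Random(seed)
    remaining, weights = 1.0, []
    for _ in range(n_sticks):
        frac = rng.betavariate(1.0, alpha)
        weights.append(remaining * frac)
        remaining *= 1.0 - frac
    return weights

weights = stick_breaking_weights(alpha=1.0, n_sticks=100)
# Weights are positive and sum to at most 1; the leftover stick is truncated.
```

Smaller `alpha` concentrates mass on the first few sticks; larger `alpha` spreads it over many components.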
Edward requires a TensorFlow version of at least 1.1.0rc0. This includes several breaking API changes:
- `Normal(loc=0.0, scale=1.0)` replaces the older syntax of `Normal(mu=0.0, sigma=1.0)`.
- `MultivariateNormalCholesky` is renamed to `MultivariateNormalTriL`.
- `MultivariateNormalFull` is removed.
- `rv.get_batch_shape()` is renamed to `rv.batch_shape`.
- `rv.get_event_shape()` is renamed to `rv.event_shape`.

Other changes include:

- Random variables accept a `sample_shape` argument. This lets the random variable's associated tensor represent more than a single sample (#591).
- `ParamMixture` random variable: a mixture of random variables where each component has the same distribution (#592).
- `DirichletProcess` has persistent states across calls to `sample()` (#565, #575, #583).
- `ed.complete_conditional` function (#588, #605, #613). See a Beta-Bernoulli example.
- `BiGANInference` for adversarial feature learning (#597).
- `Inference`, `MonteCarlo`, and `VariationalInference` are abstract classes, preventing instantiation (#582).
- Random variables have a `shape` property, which is the same as `get_shape()`.
- Random variables accept a `collections` argument (#609).
- `ed.get_blanket` returns the Markov blanket of a random variable (#590).
- The `ed.get_dims` and `ed.multivariate_rbf` utility functions are removed.

We are also grateful to all who filed issues or helped resolve them, asked and answered questions, and were part of inspiring discussions.
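On `ed.get_blanket`: the Markov blanket of a node in a directed graphical model is its parents, its children, and its children's other parents. The idea can be shown on a toy graph in plain Python; the names and graph structure here are illustrative, not Edward's API:

```python
def markov_blanket(node, parents):
    """Markov blanket given a dict mapping each node to its set of parents."""
    children = {c for c, ps in parents.items() if node in ps}
    coparents = {p for c in children for p in parents[c]} - {node}
    return set(parents.get(node, ())) | children | coparents

# Toy graph: a -> c, b -> c, c -> d
parents = {"a": set(), "b": set(), "c": {"a", "b"}, "d": {"c"}}
blanket = markov_blanket("a", parents)
# Blanket of "a": its child "c" plus "c"'s other parent "b".
```

Conditioning on the blanket renders the node independent of the rest of the graph, which is why it is the natural scope for local inference updates.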
- Session evaluation (`sess.run()` and `eval()`) is enabled for `RandomVariable` (#503).
- `RandomVariable` (#515).

We are also grateful to all who filed issues or helped resolve them, asked and answered questions, and were part of inspiring discussions.
- Operators are overloaded for `RandomVariable`; for example, this enables `x + y` (#445).
- `RandomVariable` (#483).
- `GANInference`, with an accompanying tutorial (#310).
- `WGANInference` (#448).
- The scale factor in `VariationalInference` is generalized to be a tensor (#467).
- `Inference` can now work with `tf.Tensor` latent variables and observed variables (#488).
- Improvements to `ed.evaluate` and `ed.ppc`, including support for checking implicit models and proper Monte Carlo estimates for the posterior predictive density (#485).
- To support both `tensorflow` and `tensorflow-gpu`, TensorFlow is no longer an explicit dependency (#482).
- The `ed.tile` utility function is removed (#484).

We are also grateful to all who filed issues or helped resolve them, asked and answered questions, and were part of inspiring discussions.
- `int` and `float` data types (#421).

We are also grateful to all who filed issues or helped resolve them, asked and answered questions, and were part of inspiring discussions.