Bayesian inference with probabilistic programming.
Merged pull requests:

- `externalsampler` (#2204) (@torfjelde)

This release adds the `Turing.Experimental` module, where more experimental features will go, with the aim of eventually making their way into Turing proper. `Turing.Experimental.Gibbs` is a new implementation of the Gibbs sampler which provides much greater flexibility in specifying which variables should use which sampler. Note that it lives in the `Turing.Experimental` module and is therefore subject to change.

Merged pull requests:

- `condition` (#2099) (@torfjelde)

Merged pull requests:
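As a sketch of the per-variable sampler assignment the experimental Gibbs sampler enables (the pair-based constructor and `@varname` usage shown here are assumptions inferred from the release notes, not a verified API; the module is explicitly subject to change):

```julia
using Turing

# A toy model with two latent variables.
@model function demo(x)
    s ~ InverseGamma(2, 3)
    m ~ Normal(0, sqrt(s))
    x ~ Normal(m, sqrt(s))
end

# Hypothetical usage: assign a different sampler to each variable.
# The exact constructor signature may differ in your Turing version.
sampler = Turing.Experimental.Gibbs(
    @varname(s) => MH(),
    @varname(m) => HMC(0.1, 5),
)
chain = sample(demo(1.5), sampler, 1_000)
```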
Closed issues:

- Move `essential/ad.jl` to DynamicPPL (#2141)
- `MH` only works correctly with subtypes of `MvNormal` (#2180)
- "`settrans!` not defined" (#2183)
- `filldist` on distributions requiring `SimplexBijector` (#2190)

Merged pull requests:
Closed issues:

`essential/ad.jl` is removed; the `ForwardDiff` and `ReverseDiff` integrations via `LogDensityProblemsAD` are moved to `DynamicPPL` and live in the corresponding package extensions. `LogDensityProblemsAD.ADgradient(ℓ::DynamicPPL.LogDensityFunction)` (i.e. the single-argument method) is moved to the `Inference` module. It will create the `ADgradient` using the `adtype` information stored in the `context` field of `ℓ`. The `getADbackend` function is renamed to `getADType`; the interface is preserved, but packages that previously used `getADbackend` should be updated to use `getADType`. The `TuringTag` for ForwardDiff is also removed; `DynamicPPLTag` is now defined in the `DynamicPPL` package and should serve the same purpose.

Merged pull requests:
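A sketch of the single-argument `ADgradient` method described above; the toy model is illustrative, and which AD backend is selected depends on how `ℓ` was constructed:

```julia
using Turing
using DynamicPPL
using LogDensityProblems, LogDensityProblemsAD

@model function gdemo(x)
    m ~ Normal(0, 1)
    x ~ Normal(m, 1)
end

# Wrap the model as a log-density problem; the single-argument
# `ADgradient` picks the backend from the `adtype` information stored
# in the `context` field of `ℓ`.
ℓ = DynamicPPL.LogDensityFunction(gdemo(1.0))
∇ℓ = LogDensityProblemsAD.ADgradient(ℓ)
LogDensityProblems.logdensity_and_gradient(∇ℓ, [0.25])
```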
- Move `ad.jl` to `DynamicPPL` (#2158) (@sunxd3)
- Rename `autoad` to `adtype` (#2168) (@ElOceanografo)
- Add `accepted` arg to `AdvancedMH.Transition` calls (#2172) (@sunxd3)

Merged pull requests:
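The `autoad` → `adtype` rename surfaces at sampler construction. A hedged sketch, assuming `AutoForwardDiff` (from ADTypes.jl) is re-exported by Turing in this version:

```julia
using Turing

@model function gauss(y)
    μ ~ Normal(0, 10)
    y ~ Normal(μ, 1)
end

# Select the AD backend via the `adtype` keyword (formerly `autoad`).
chain = sample(gauss(0.5), NUTS(; adtype=AutoForwardDiff()), 500)
```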
- `Prior` should use `PriorContext` (#2170) (@torfjelde)

Closed issues:

- `ArgumentError: Union{} does not have elements` when trying to run a state space model (#2151)
- `Prior` sampler should use `PriorContext`, not `DefaultContext` (#2169)
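Since the `Prior` sampler now evaluates the model under `PriorContext` rather than `DefaultContext`, draws come from the prior alone; a minimal sketch (the toy model is illustrative):

```julia
using Turing

@model function coin(y)
    p ~ Beta(2, 2)
    for i in eachindex(y)
        y[i] ~ Bernoulli(p)
    end
end

# Prior() runs the model under PriorContext, so the observations `y`
# do not influence the draws of `p` — they come from Beta(2, 2) alone.
chain = sample(coin([1, 1, 0]), Prior(), 500)
```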