Bayesian inference with probabilistic programming.
`essential/ad.jl` is removed; the ForwardDiff and ReverseDiff integrations via LogDensityProblemsAD are moved to DynamicPPL and now live in the corresponding package extensions. `LogDensityProblemsAD.ADgradient(ℓ::DynamicPPL.LogDensityFunction)` (i.e. the single-argument method) is moved to the `Inference` module. It creates an `ADgradient` using the `adtype` information stored in the `context` field of `ℓ`.

The `getADbackend` function is renamed to `getADType`. The interface is preserved, but packages that previously used `getADbackend` should be updated to use `getADType`.

`TuringTag` for ForwardDiff is also removed; `DynamicPPLTag` is now defined in the DynamicPPL package and should serve the same purpose.

Merged pull requests:
- ad.jl to DynamicPPL (#2158) (@sunxd3)
- autoad to adtype (#2168) (@ElOceanografo)
- accepted arg to `AdvancedMH.Transition` calls (#2172) (@sunxd3)

Merged pull requests:
- `Prior` should use `PriorContext` (#2170) (@torfjelde)

Closed issues:

- `ArgumentError: Union{} does not have elements` when trying to run a state space model (#2151)
- `Prior` sampler should use `PriorContext`, not `DefaultContext` (#2169)

AD backend selection now goes through ADTypes.jl.
Users should now specify the desired `ADType` directly in sampler constructors, e.g. `HMC(0.1, 10; adtype=AutoForwardDiff(; chunksize))` or `HMC(0.1, 10; adtype=AutoReverseDiff(false))` (`false` indicates not to use a compiled tape). `ADBackend`, `setadbackend`, `setadsafe`, `setchunksize`, and `setrdcache` are deprecated and will be removed in a future release.

- `verifygrad` function.
- LogDensityProblemsAD (v1.7).

Full Changelog: https://github.com/TuringLang/Turing.jl/compare/v0.29.3...v0.30.0
Closed issues:

- `LKJCholesky` does not work with compiled ReverseDiff.jl (#2091)