DeeProb-kit: A Python Library for Deep Probabilistic Modeling
DeeProb-kit is a unified Python library that collects deep probabilistic models (DPMs) which are tractable and exact representations of the modelled probability distributions. Having a representative selection of DPMs in a single library makes it straightforward to combine them, a common practice in deep learning research. In addition, the library includes efficiently implemented learning techniques, inference routines, and statistical algorithms, and provides high-quality, fully documented APIs. DeeProb-kit aims to help the community accelerate research on DPMs, standardise their evaluation, and better understand how they relate in terms of expressivity.
The collection of implemented models is summarized in the following table.
Model | Description |
---|---|
Binary-CLT | Binary Chow-Liu Tree (CLT) |
Binary-CNet | Binary Cutset Network (CNet) |
SPN | Vanilla Sum-Product Network |
MSPN | Mixed Sum-Product Network |
XPC | Random Probabilistic Circuit |
RAT-SPN | Randomized and Tensorized Sum-Product Network |
DGC-SPN | Deep Generalized Convolutional Sum-Product Network |
MAF | Masked Autoregressive Flow |
NICE | Non-linear Independent Components Estimation Flow |
RealNVP | Real-valued Non-Volume-Preserving Flow |
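To make the notion of tractable and exact inference concrete, here is a minimal standalone sketch (plain NumPy, not DeeProb-kit's API) of a two-component sum-product network over two binary variables: a single bottom-up pass yields an exact joint probability, and exact marginals are obtained by evaluating the marginalised leaves to one.

```python
import numpy as np

# Leaf distributions: Bernoulli parameters for X1 and X2 under two mixture components.
p_x1 = np.array([0.8, 0.3])     # P(X1 = 1) in component 0 and 1
p_x2 = np.array([0.6, 0.1])     # P(X2 = 1) in component 0 and 1
weights = np.array([0.4, 0.6])  # sum-node weights (mixture over the two components)

def leaf(p, x):
    """Bernoulli leaf: returns P(X = x) per component, or 1.0 when x is None (marginalized out)."""
    if x is None:
        return np.ones_like(p)
    return p if x == 1 else 1.0 - p

def spn(x1, x2):
    """Evaluate the SPN: a sum node over two product nodes of Bernoulli leaves."""
    products = leaf(p_x1, x1) * leaf(p_x2, x2)  # product nodes: independence within a component
    return float(weights @ products)            # sum node: weighted mixture

# Exact joint probability and exact marginal, both computed in one bottom-up pass.
print("P(X1=1, X2=0) =", spn(1, 0))
print("P(X2=0)       =", spn(None, 0))  # marginalize X1 by evaluating its leaf to 1
```

The models implemented in the library generalise this idea to many variables, richer leaf distributions, and learned structures and parameters.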
The library can be installed either from the PyPI repository or from source.
```shell
# Install from the PyPI repository
pip install deeprob-kit

# Install the latest version from the `main` git branch
pip install -e git+https://github.com/deeprob-org/deeprob-kit.git@main#egg=deeprob-kit
```
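After installation, a quick sanity check confirms the package can be imported. This sketch assumes the top-level import name is `deeprob`, as used in the project's source tree, and reads the installed distribution version via `importlib.metadata`.

```python
from importlib.metadata import version

import deeprob  # assumed top-level package name; adjust if your installation differs

print("deeprob-kit version:", version("deeprob-kit"))
```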
The documentation is generated automatically with Sphinx from the sources stored in the `docs` directory.
Code examples and experiments can be found in the `examples` and `experiments` directories, respectively, while benchmark code is located in the `benchmark` directory.
If you use DeeProb-kit in your research, please cite it as follows.

```bibtex
@misc{loconte2022deeprob,
  doi       = {10.48550/ARXIV.2212.04403},
  url       = {https://arxiv.org/abs/2212.04403},
  author    = {Loconte, Lorenzo and Gala, Gennaro},
  title     = {{DeeProb-kit}: a Python Library for Deep Probabilistic Modelling},
  publisher = {arXiv},
  year      = {2022}
}
```