Code for the SIGGRAPH 2022 paper "DeltaConv: Anisotropic Operators for Geometric Deep Learning on Point Clouds" by Ruben Wiersma, Ahmad Nasikun, Elmar Eisemann, and Klaus Hildebrandt.
Anisotropic convolution is a central building block of CNNs but challenging to transfer to surfaces. DeltaConv learns combinations and compositions of operators from vector calculus, which are a natural fit for curved surfaces. The result is a simple and robust anisotropic convolution operator for point clouds with state-of-the-art results.
Top: unlike images, surfaces have no global coordinate system. Bottom: DeltaConv learns both scalar and vector features using geometric operators.
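To make the idea concrete, the block below is a rough, hypothetical sketch of a DeltaConv-style layer with a scalar and a vector stream coupled through gradient and divergence operators. It is not the implementation in this repository; the module and argument names are ours, and the sparse grad/div operators are assumed to be precomputed.

```python
# Conceptual sketch only; the actual layer lives in the deltaconv package and differs in detail.
# Assumes precomputed sparse operators: grad maps per-point scalars [N, C] to tangent vectors
# [2N, C] (two tangent coordinates per point), and div maps tangent vectors back to [N, C].
import torch
import torch.nn as nn

class DeltaConvSketch(nn.Module):
    def __init__(self, in_channels, out_channels):
        super().__init__()
        # Scalar stream: mixes scalar features with the divergence of the vector features.
        self.scalar_mlp = nn.Sequential(nn.Linear(2 * in_channels, out_channels), nn.ReLU())
        # Vector stream: linearly mixes vector features with gradients of the scalar features
        # (no pointwise ReLU here, since tangent vectors have no preferred sign).
        self.vector_mix = nn.Linear(2 * in_channels, out_channels, bias=False)

    def forward(self, x, v, grad, div):
        grad_x = torch.sparse.mm(grad, x)   # scalars -> tangent vectors
        div_v = torch.sparse.mm(div, v)     # tangent vectors -> scalars
        x_out = self.scalar_mlp(torch.cat([x, div_v], dim=-1))
        v_out = self.vector_mix(torch.cat([v, grad_x], dim=-1))
        return x_out, v_out
```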
Clone the repository:
git clone https://github.com/rubenwiersma/deltaconv.git
Create a conda environment from environment.yml:
conda env create -n deltaconv -f environment.yml
Done!
If you wish to install DeltaConv in your own environment, proceed as follows.
Make sure that you have installed:
- NumPy: pip install numpy
- PyG (PyTorch Geometric): conda install pyg -c pyg
Install DeltaConv:
pip install deltaconv
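After installation, a quick import check (a minimal sanity test, assuming PyG was installed as above) should run without errors:

```python
# Verify that deltaconv and its PyG dependency are importable.
import deltaconv
import torch_geometric

print("deltaconv imported from", deltaconv.__file__)
print("PyG version:", torch_geometric.__version__)
```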
Alternatively, to build DeltaConv from source (for example, for development), clone the repository including its submodules:
git clone --recurse-submodules https://github.com/rubenwiersma/deltaconv.git
If you have already cloned the repository without submodules, you can fix it with git submodule update --init --recursive.
cd [root_folder]
pip install .
See the README.md in replication_scripts for instructions on replicating the experiments and using the pre-trained weights (available in experiments/pretrained_weights).
In short, you can run bash scripts to replicate our experiments. For example, evaluating pre-trained weights on ShapeNet:
cd [root_folder]
conda activate deltaconv
bash replication_scripts/pretrained/shapenet.sh
You can also directly run the Python files in experiments:
python experiments/train_shapenet.py
Use the -h or --help flag to find out which arguments can be passed to the training script:
python experiments/train_shapenet.py -h
You can keep track of the training process with TensorBoard:
tensorboard --logdir=experiments/runs/shapenet_all
The code used to generate Figure 2 from the paper and Figures 2 and 3 from the supplement is provided as a notebook in the folder experiments/anisotropic_diffusion.
The training scripts assume that you have a data folder in experiments. The ModelNet40 and ShapeNet scripts download their datasets from a public repository. Instructions to download the data for human body shape segmentation, SHREC, and ScanObjectNN are given in the training scripts.
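For reference, PyG's dataset classes handle these downloads on first use; below is a minimal sketch (the root paths are illustrative and may not match the ones the training scripts expect):

```python
# Downloads ModelNet40 and ShapeNet via PyG on first use; the root paths below are
# illustrative examples, not necessarily the paths used by the training scripts.
from torch_geometric.datasets import ModelNet, ShapeNet

modelnet = ModelNet(root="experiments/data/ModelNet40", name="40")
shapenet = ShapeNet(root="experiments/data/ShapeNet")
print(len(modelnet), "ModelNet40 shapes;", len(shapenet), "ShapeNet shapes")
```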
In the paper, we make statements about a number of properties of DeltaConv that are either a result of prior work or due to the implementation. We created a test suite to ensure that these properties hold for the implementation, along with unit tests for each module. For example:
test/nn/test_mlp.py
test/nn/test_nonlin.py
test/nn/test_deltaconv.py
test/geometry/test_grad_div.py
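The test suite can be run from the repository root, for example via pytest (assuming pytest is installed in the deltaconv environment; it also picks up unittest-style tests):

```python
# Run the unit and property tests in the test/ folder (assumes pytest is installed).
import pytest
pytest.main(["test"])
```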
Please cite our paper if this code contributes to an academic publication:
@Article{Wiersma2022DeltaConv,
author = {Ruben Wiersma and Ahmad Nasikun and Elmar Eisemann and Klaus Hildebrandt},
journal = {ACM Transactions on Graphics},
title = {DeltaConv: Anisotropic Operators for Geometric Deep Learning on Point Clouds},
year = {2022},
month = jul,
number = {4},
volume = {41},
doi = {10.1145/3528223.3530166},
publisher = {ACM},
}
The farthest point sampling code relies on Geometry Central:
@misc{geometrycentral,
title = {geometry-central},
author = {Nicholas Sharp and Keenan Crane and others},
note = {www.geometry-central.net},
year = {2019}
}
And we make use of PyG (and underlying packages) to load point clouds, compute sparse matrix products, and compute nearest neighbors:
@inproceedings{Fey/Lenssen/2019,
title={Fast Graph Representation Learning with {PyTorch Geometric}},
author={Fey, Matthias and Lenssen, Jan E.},
booktitle={ICLR Workshop on Representation Learning on Graphs and Manifolds},
year={2019},
}