3D Shape Generation Baselines in PyTorch.
Use Anaconda to install all dependencies:
conda env create -f environment.yml
Train a model:
CUDA_VISIBLE_DEVICES=<gpus> python train.py --options <config>
Run inference:
CUDA_VISIBLE_DEVICES=<gpus> python predictor.py --options <config>
To extend the codebase:
- training/inference loop: add code in scheduler
- new models: inherit the base class and place the code in models/zoo
- configuration: utils/config
- datasets: datasets/data
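Adding a new baseline amounts to subclassing the base model class. A minimal sketch, assuming a PyTorch-style base class (BaseModel and MyVoxelGAN are hypothetical names used here for illustration; the real base class and its interface live in this repository's models package):

```python
import torch
import torch.nn as nn

# Stand-in for the repository's base model class (hypothetical;
# the actual base class is defined in the models package).
class BaseModel(nn.Module):
    def forward(self, batch):
        raise NotImplementedError

# A new baseline inherits the base class and implements forward().
# The file would go under models/zoo so a config can select it.
class MyVoxelGAN(BaseModel):
    def __init__(self, latent_dim=128):
        super().__init__()
        # Toy decoder: latent vector -> 32^3 occupancy grid
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32 * 32 * 32),
            nn.Sigmoid(),
        )

    def forward(self, batch):
        z = batch["latent"]                       # (B, latent_dim)
        voxels = self.decoder(z).view(-1, 32, 32, 32)
        return {"voxels": voxels}

model = MyVoxelGAN()
out = model({"latent": torch.randn(2, 128)})
print(out["voxels"].shape)  # torch.Size([2, 32, 32, 32])
```

The dict-in/dict-out convention is only one plausible interface; match whatever signature the repository's base class actually defines.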
Our work is based on the codebase of an unofficial Pixel2Mesh framework. The Chamfer loss code is based on ChamferDistancePytorch.
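The Chamfer loss compares two point clouds via nearest-neighbor distances in both directions. A naive reference sketch for intuition only (the repository uses the optimized CUDA kernel from ChamferDistancePytorch, not this O(N*M) version):

```python
import torch

def chamfer_distance(p1, p2):
    """Symmetric Chamfer distance between batched point clouds.

    p1: (B, N, 3), p2: (B, M, 3). Naive dense version; the
    ChamferDistancePytorch kernel computes the same quantity faster.
    """
    # Pairwise squared distances: (B, N, M)
    d = torch.cdist(p1, p2) ** 2
    # Mean distance from each cloud to its nearest neighbors in the other
    return d.min(dim=2).values.mean(dim=1) + d.min(dim=1).values.mean(dim=1)

a = torch.zeros(1, 4, 3)
b = torch.zeros(1, 5, 3)
print(chamfer_distance(a, b))  # tensor([0.])
```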
Official baseline implementations:
DISN: Deep Implicit Surface Network for High-quality Single-view 3D Reconstruction
Pixel2Mesh: Generating 3D Mesh Models from Single RGB Images
Learning a Probabilistic Latent Space of Object Shapes via 3D Generative-Adversarial Modeling
Please follow the license of the official implementation for each model.