InterDiff: Generating 3D Human-Object Interactions with Physics-Informed Diffusion
Official PyTorch implementation of the ICCV 2023 paper.
Sirui Xu
Zhengyuan Li
Yu-Xiong Wang*
Liang-Yan Gui*
University of Illinois Urbana-Champaign
ICCV 2023
To create the environment, check the requirements file requirements.txt, which is based on Python 3.7, and install the packages it lists.
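A minimal setup sketch for the Python 3.7 route (the environment name `interdiff` is our own choice, not mandated by the repository):

```shell
# Hypothetical setup commands; the environment name "interdiff" is an assumption.
conda create -n interdiff python=3.7 -y
conda activate interdiff
pip install -r requirements.txt
```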
> [!NOTE]
> For specific packages such as psbody-mesh and human-body-prior, you may need to build from their sources.
Alternatively, you may build from a more detailed environment file based on Python 3.8, which might contain redundancies:

```shell
conda env create -f environment.yml
```
For more information about the implementation, see interdiff/README.md.
We present HOI sequences (left), object motions (middle), and objects relative to the contacts after coordinate transformations (right). Our key insight is to inject coordinate transformations into a diffusion model, as the relative motion shows simpler patterns that are easier to predict, e.g., being almost stationary (top), or rotating around a fixed axis (bottom).
If you find our work helpful, please cite:
```bibtex
@inproceedings{xu2023interdiff,
  title={{InterDiff}: Generating 3D Human-Object Interactions with Physics-Informed Diffusion},
  author={Xu, Sirui and Li, Zhengyuan and Wang, Yu-Xiong and Gui, Liang-Yan},
  booktitle={ICCV},
  year={2023},
}
```
This code is distributed under the MIT License.
Note that our code depends on other libraries, including SMPL, SMPL-X, PyTorch3D, Hugging Face, and Hydra, and uses datasets, each of which has its own license that must also be followed.