Taichi NGP Renderer

An Instant-NGP renderer implemented using Taichi.

Update 2023-02-09: Real scenes are now supported! Try it with python taichi_ngp.py --gui --scene garden


This is an Instant-NGP renderer implemented using Taichi, written entirely in Python. No CUDA! This repository implements only the rendering part of NGP, but it is simpler and contains far less code than the originals (Instant-NGP and tiny-cuda-nn).


Installation

Clone this repository and install the required packages:

git clone https://github.com/Linyou/taichi-ngp-renderer.git
python -m pip install -r requirement.txt

Description

This repository implements only the forward (inference) part of Instant-NGP, which includes the following steps (a short NumPy sketch of two of them follows the list):

  • Ray intersection with the bounding box: ray_intersect()
  • Ray marching strategy: raymarching_test_kernel()
  • Spherical harmonics encoding for ray directions: dir_encode()
  • Hash table encoding for 3D coordinates: hash_encode()
  • Fully Fused MLP using shared memory: sigma_layer(), rgb_layer()
  • Volume rendering: composite_test()
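
For concreteness, here is a small NumPy sketch of two of these steps: the slab-method ray/box test behind ray_intersect() and the Instant-NGP-style spatial hash that hash_encode() builds on. The inputs are made up for illustration; this is not the repository's Taichi kernel code.

    import numpy as np

    def ray_aabb_intersect(ray_o, ray_d, box_min, box_max):
        # Slab method: intersect the ray with the three pairs of axis-aligned planes.
        # Returns (t_near, t_far); the ray misses the box when t_near > t_far.
        inv_d = 1.0 / ray_d                            # zero components would need special care
        t0 = (box_min - ray_o) * inv_d
        t1 = (box_max - ray_o) * inv_d
        t_near = max(np.minimum(t0, t1).max(), 0.0)    # latest entry across the slabs
        t_far = np.maximum(t0, t1).min()               # earliest exit across the slabs
        return t_near, t_far

    # Spatial hash from the Instant-NGP paper: XOR the integer grid coordinates,
    # each scaled by a per-axis prime, then wrap to the hash-table size.
    PRIMES = np.array([1, 2654435761, 805459861], dtype=np.uint64)

    def spatial_hash(ijk, table_size):
        ijk = np.asarray(ijk, dtype=np.uint64)
        return int(np.bitwise_xor.reduce(ijk * PRIMES)) % table_size

    # Unit box centered at the origin, ray starting outside and pointing at it.
    print(ray_aabb_intersect(np.array([-2.0, -2.0, -2.0]), np.array([1.0, 1.0, 1.0]),
                             np.array([-0.5, -0.5, -0.5]), np.array([0.5, 0.5, 0.5])))   # t_near = 1.5, t_far = 2.5
    print(spatial_hash([12, 34, 56], 2 ** 19))         # index into one hash-table level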

However, there are some differences compared to the original:

Missing function
  • Taichi is currently missing the frexp() function, so I use a hard-coded scale of 0.5 instead. I will update the code once Taichi supports this function.
Fully Fused MLP
  • Instead of a single fused kernel like tiny-cuda-nn, this repo uses separate sigma_layer() and rgb_layer() kernels, because the shared memory size that Taichi currently allows is 48 KB, as issue #6385 points out. This could be improved in the future.
  • tiny-cuda-nn uses Tensor Cores for float16 multiplication, which is not a feature Taichi exposes, so I directly convert all the data to ti.float16 to speed up the computation (see the snippet after this list).
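
As a rough illustration of both points (generic snippets under stated assumptions, not this repository's code): frexp() splits a float into a mantissa and an exponent, and storing data in half precision in Taichi looks like the cast below.

    import math
    import taichi as ti

    # What the missing frexp() computes: x == mantissa * 2**exponent.
    mantissa, exponent = math.frexp(3.5)   # mantissa == 0.875, exponent == 2

    # float16 needs a GPU backend such as CUDA or Vulkan.
    ti.init(arch=ti.gpu)

    N = 16
    weights_f32 = ti.field(dtype=ti.f32, shape=N)
    weights_f16 = ti.field(dtype=ti.f16, shape=N)

    @ti.kernel
    def cast_to_half():
        # Keep a half-precision copy of the weights so the MLP math runs in f16.
        for i in range(N):
            weights_f16[i] = ti.cast(weights_f32[i], ti.f16)

    cast_to_half()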

GUI

This code supports a real-time rendering GUI with interactive controls, using less than 1 GB of VRAM. Here is the functionality that the GUI offers:

  • Camera:
    • Keyboard and mouse control
    • Depth of field (DoF)
  • Rendering:
    • Number of samples per ray
    • Transparency threshold (stops ray marching early; see the sketch after this list)
    • Show depth
  • Export:
    • Snapshot
    • Video recording (requires ffmpeg)
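
The transparency threshold above corresponds to the early-stop test in front-to-back compositing, the step composite_test() performs. A hedged NumPy sketch with made-up densities, colors, and step sizes (not the repository's kernel):

    import numpy as np

    def composite(sigmas, rgbs, deltas, t_threshold=1e-2):
        # Front-to-back alpha compositing along one ray.
        color = np.zeros(3)
        transmittance = 1.0
        for sigma, rgb, delta in zip(sigmas, rgbs, deltas):
            alpha = 1.0 - np.exp(-sigma * delta)       # opacity of this sample
            color += transmittance * alpha * np.asarray(rgb)
            transmittance *= 1.0 - alpha               # light surviving past the sample
            if transmittance < t_threshold:            # ray is nearly opaque: stop marching
                break
        return color, 1.0 - transmittance              # radiance and accumulated opacity

    color, opacity = composite(
        sigmas=[0.5, 2.0, 8.0],
        rgbs=[(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)],
        deltas=[0.01, 0.01, 0.01])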

The GUI runs at up to 66 fps on an RTX 3090 GPU at 800×800 resolution (default pose).

Run python taichi_ngp.py --gui --scene lego to start the GUI. This repository provides eight pre-trained NeRF synthetic scenes: Lego, Ship, Mic, Materials, Hotdog, Ficus, Drums, and Chair.



Running python taichi_ngp.py --gui --scene <name> will automatically download the pre-trained model <name> into the ./npy_file folder. Please check the argument parser in taichi_ngp.py for more options.

Custom scene

You can train a new scene with ngp_pl and save the PyTorch model to NumPy using np.save(). After that, use the --model_path argument to specify the saved model file; a rough sketch of the conversion follows.
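
A minimal sketch of that conversion, assuming an ngp_pl checkpoint whose state_dict holds the network tensors; the file names are placeholders and the exact key layout taichi_ngp.py expects is not shown here:

    import numpy as np
    import torch

    # Load the ngp_pl checkpoint on the CPU (the path is a placeholder).
    ckpt = torch.load("ngp_pl_checkpoint.ckpt", map_location="cpu")
    state = ckpt.get("state_dict", ckpt)

    # Convert every tensor to a NumPy array and save the whole dict as one .npy file.
    np_weights = {k: v.detach().cpu().numpy() for k, v in state.items()}
    np.save("custom_scene.npy", np_weights)

    # Then point the renderer at it, e.g.: python taichi_ngp.py --gui --model_path custom_scene.npy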

Acknowledgments

Many thanks to the incredible projects open-sourced to the community that this work builds on, including Instant-NGP, tiny-cuda-nn, and ngp_pl.

Todo

  • Support Vulkan backend
  • Support real scenes
  • Refactor to separate modules

...
