tier4/VAD-GS

[CVPR 2026] VAD-GS: Visibility-Aware Densification for 3D Gaussian Splatting in Dynamic Urban Scenes

Project page | Paper | YouTube | Bilibili

(Figure: method overview flowchart)

Installation

Clone this repository and check out the dev branch:
git clone https://github.com/YikangZhang1641/VAD-GS.git
cd VAD-GS
git checkout -b dev origin/dev
Build tools
  1. Install COLMAP (tested with version 3.10-dev)
  2. Build SIBR_viewer by following the 3DGS tutorial
Set up the environment
# Create and sync the environment with uv
uv venv

# Install project dependencies
# On Linux, uv resolves torch/torchvision from the PyTorch cu124 index.
uv sync

Datasets

data/
   ├── nuscenes/
   │     ├── raw/
   │     └── processed_10Hz/
   │           └── mini/
   │                 ├── 000/
   │                 │     ├── images/
   │                 │     ├── ego_pose/
   │                 │     ├── lidar_depth/
   │                 │     └── ...
   │                 ├── 001/
   │                 └── ...
   └── waymo/
         └── ...
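Before training, it can help to confirm a scene folder matches the layout above. The helper below is not part of the repository; it is a minimal sketch that checks for the sub-directories shown in the tree (the required-name list is an assumption based on the example layout).

```python
from pathlib import Path

# Sub-directories each scene folder (e.g. data/nuscenes/processed_10Hz/mini/000/)
# is expected to contain, per the tree above. Hypothetical helper, not repo code.
REQUIRED = ["images", "ego_pose", "lidar_depth"]

def missing_dirs(scene_dir):
    """Return the names of required sub-directories absent from scene_dir."""
    scene = Path(scene_dir)
    return [name for name in REQUIRED if not (scene / name).is_dir()]
```

Running `missing_dirs("data/nuscenes/processed_10Hz/mini/000")` should return an empty list for a correctly extracted scene.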

Example

  • We provide a nuScenes example here. Download and extract it to the folder path above.

For training:

python train.py --config configs/example/nuscenes_train_000.yaml

To generate visual outputs:

python render.py --config configs/example/nuscenes_train_000.yaml --mode evaluate

For evaluation:

python metrics.py --config configs/example/nuscenes_train_000.yaml
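Metrics scripts in 3DGS-style pipelines typically report image-quality scores such as PSNR (often alongside SSIM and LPIPS). The snippet below is illustrative only, not the repository's implementation; it shows how PSNR between a rendered frame and its ground truth is commonly computed.

```python
import numpy as np

def psnr(rendered, gt, max_val=1.0):
    """Peak signal-to-noise ratio between two float images in [0, max_val].

    Illustrative sketch of a standard metric; the actual metrics.py may
    compute additional scores such as SSIM and LPIPS.
    """
    mse = np.mean((rendered.astype(np.float64) - gt.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)
```

For example, two images differing everywhere by 0.5 give an MSE of 0.25 and a PSNR of about 6.02 dB.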

About

This is the official code of "VAD-GS: Visibility-Aware Densification for 3D Gaussian Splatting in Dynamic Urban Scenes".
