Building Volumetric Beliefs for Dynamic Environments Exploiting Map-Based Moving Object Segmentation (RAL 2023)
Our approach identifies moving objects in the current scan (blue points) and the local map (black points) of the environment and maintains a volumetric belief map of the dynamic environment.
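The volumetric belief can be pictured as a per-voxel binary Bayes filter fused in log-odds form, the standard recursive update used in occupancy mapping. The sketch below is illustrative only, assuming each scan yields a per-voxel probability of being moving; the class and method names are our own, not the MapMOS API.

```python
import math

def logodds(p: float) -> float:
    """Convert a probability to log-odds."""
    return math.log(p / (1.0 - p))

class VoxelBelief:
    """Illustrative per-voxel belief about 'this voxel contains a moving object'."""

    def __init__(self, prior: float = 0.5):
        self.l0 = logodds(prior)  # prior in log-odds
        self.l = self.l0          # accumulated belief in log-odds

    def update(self, p_moving: float) -> None:
        # Standard log-odds update: add the measurement's log-odds and
        # subtract the prior so it is not counted twice.
        self.l += logodds(p_moving) - self.l0

    def probability(self) -> float:
        """Recover the belief as a probability."""
        return 1.0 - 1.0 / (1.0 + math.exp(self.l))

belief = VoxelBelief()
for p in (0.8, 0.7, 0.9):  # three scans voting "moving"
    belief.update(p)
assert belief.probability() > 0.9  # repeated evidence sharpens the belief
```

Repeated consistent observations drive the belief away from the prior, which is why fusing several scans is more robust than segmenting a single scan in isolation.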
First, make sure the MinkowskiEngine is installed on your system; see the MinkowskiEngine installation instructions for more details.
Next, clone our repository
git clone [email protected]:PRBonn/MapMOS && cd MapMOS
and install with
make install
or
make install-all
if you want to install the project with all optional dependencies (needed for the visualizer). In case you want to edit the Python code, install in editable mode:
make editable
Just type
mapmos_pipeline --help
to see how to run MapMOS.
Check the Download section for a pre-trained model. Like KISS-ICP, our pipeline runs on a variety of point cloud data formats such as bin, pcd, ply, xyz, rosbags, and more. To visualize these, just type
mapmos_pipeline --visualize /path/to/weights.ckpt /path/to/data
Because these labels come in many shapes and formats, you need to specify a dataloader. This is currently available for SemanticKITTI and NuScenes, as well as our post-processed KITTI Tracking sequence 19 and Apollo sequences (see Downloads).
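As a concrete example of what a dataloader has to handle, SemanticKITTI stores each scan as a flat binary file of float32 (x, y, z, intensity) tuples. A minimal reader sketch (the helper name and the synthetic round-trip are our own illustration, not the project's dataloader):

```python
import tempfile
import numpy as np

def load_kitti_bin(path: str) -> np.ndarray:
    """Return the Nx3 xyz points of a SemanticKITTI-style .bin scan."""
    scan = np.fromfile(path, dtype=np.float32).reshape(-1, 4)
    return scan[:, :3]  # drop the intensity channel, keep xyz

# Round-trip demo with a synthetic 5-point scan (20 floats = 5 x 4).
with tempfile.NamedTemporaryFile(suffix=".bin", delete=False) as f:
    np.arange(20, dtype=np.float32).tofile(f)
points = load_kitti_bin(f.name)
assert points.shape == (5, 3)
```

Other formats (pcd, ply, rosbags) need different parsing, which is exactly why the pipeline asks you to name a dataloader instead of guessing from the file extension alone.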
To train our approach, you first need to cache your data. To see how to do that, just cd into the MapMOS repository and type
python3 scripts/precache.py --help
After this, you can run the training script. Again, --help shows you how:
python3 scripts/train.py --help
You can inspect the cached training samples with the script scripts/cache_to_ply.py; run python3 scripts/cache_to_ply.py --help for usage.
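For reference, an ASCII PLY file is just a small header followed by one vertex per line, so exporting cached points for inspection is straightforward. The writer below is an illustrative sketch of that layout, not the actual script:

```python
import os
import tempfile

def write_ply(path: str, points) -> None:
    """Write an iterable of (x, y, z) tuples as a minimal ASCII PLY file."""
    with open(path, "w") as f:
        f.write("ply\n")
        f.write("format ascii 1.0\n")
        f.write(f"element vertex {len(points)}\n")
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write("end_header\n")
        for x, y, z in points:
            f.write(f"{x} {y} {z}\n")

# Demo: two points written to a temporary file.
ply_path = os.path.join(tempfile.gettempdir(), "sample.ply")
write_ply(ply_path, [(0.0, 0.0, 0.0), (1.0, 2.0, 3.0)])
```

The resulting file can be opened in any standard point cloud viewer such as MeshLab or CloudCompare.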
The training log and checkpoints will be saved to the current working directory by default. To change this, set the LOGS environment variable before running the training script, for example export LOGS=/your/path/to/logs.
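If you launch training from Python rather than the shell, the same redirection can be done by setting LOGS in the child process environment. A small sketch (the log path is illustrative; the launch line is commented out so the snippet runs standalone):

```python
import os
import subprocess

# Redirect training logs for the child process only; LOGS is the
# environment variable named above, the path is an example.
env = dict(os.environ, LOGS="/tmp/mapmos_logs")
# subprocess.run(["python3", "scripts/train.py"], env=env)  # uncomment to launch
assert env["LOGS"] == "/tmp/mapmos_logs"
```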
You can download the post-processed and labeled Apollo dataset and KITTI Tracking sequence 19 from our website.
The weights of our pre-trained model can be downloaded as well.
If you use our code in your academic work, please cite the corresponding paper:
@article{mersch2023ral,
author = {B. Mersch and T. Guadagnino and X. Chen and I. Vizzo and J. Behley and C. Stachniss},
title = {{Building Volumetric Beliefs for Dynamic Environments Exploiting Map-Based Moving Object Segmentation}},
journal = {IEEE Robotics and Automation Letters (RA-L)},
volume = {8},
number = {8},
pages = {5180--5187},
year = {2023},
issn = {2377-3766},
doi = {10.1109/LRA.2023.3292583},
codeurl = {https://github.com/PRBonn/MapMOS},
}
This implementation is heavily inspired by KISS-ICP.
This project is free software made available under the MIT License. For details see the LICENSE file.