Note that Far3D inherits from the StreamPETR repository, so many parts can be used in a similar style.
Please follow the Setup instructions to build the environment and compile mmdet3d. We also provide a detailed conda environment file here.
Instruction Steps
- Create a conda virtual environment and activate it.
- Install PyTorch and torchvision following the official instructions.
- Install flash-attn (optional).
- Clone Far3D.
- Install mmdet3d.
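The steps above can be sketched as follows. This is a minimal sketch, not the repo's official install script: the Python/package versions and the clone URL are assumptions, so check the Setup instructions and the provided conda environment file for the exact versions.

```shell
# Hypothetical versions; match them to the repo's Setup doc and environment file.
conda create -n far3d python=3.8 -y
conda activate far3d

# Pick the torch/torchvision build matching your CUDA version (see pytorch.org).
pip install torch torchvision

# Optional: flash-attn for faster attention kernels.
pip install flash-attn

# Clone the repo (substitute the actual Far3D URL) and install mmdet3d inside it.
git clone <Far3D-repo-url>
cd Far3D
```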
We use the Argoverse 2 and nuScenes datasets for our experiments.
- For nuScenes dataset preparation, refer to Preparation.
- Similarly, after downloading the Argoverse 2 sensor dataset, process it following the pipeline above and create_av2_infos.py.
# First modify args such as split and dataset_dir in both scripts.
python tools/create_infos_av2/gather_argo2_anno_feather.py
python tools/create_infos_av2/create_av2_infos.py
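After the two scripts finish, you can sanity-check the generated info file before training. The output path and filename below are assumptions (they depend on the split and dataset_dir you set in the scripts), so adjust them to your setup:

```python
import os
import pickle

def load_av2_infos(info_path):
    """Load the info pkl written by create_av2_infos.py.

    Returns None if the file does not exist yet. The default path below is
    a guess; match it to the split/dataset_dir args used in the scripts.
    """
    if not os.path.exists(info_path):
        return None
    with open(info_path, 'rb') as f:
        return pickle.load(f)

# Hypothetical output location.
infos = load_av2_infos('data/av2/av2_train_infos.pkl')
print('loaded' if infos is not None else 'info file not found yet')
```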
Notes:
- Due to the huge storage size of the Argoverse 2 dataset, we read data from an s3 path. If you need to load from a local disk or another path, please modify AV2LoadMultiViewImageFromFiles accordingly.
- For Argoverse 2, the 3D labels are in ego coordinates. For the ego-motion transformation, refer to Argoverse2DatasetT.
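Because the labels live in ego coordinates, aligning boxes across temporal frames requires transforming them between ego frames via the ego poses. The snippet below is a minimal sketch of that idea, not the repo's actual Argoverse2DatasetT code; the function name and the 4x4 ego-to-global pose format are assumptions:

```python
import numpy as np

def transform_points(points, pose_src, pose_dst):
    """Move 3D points from source-frame ego coordinates to destination-frame
    ego coordinates, given 4x4 ego-to-global poses for both frames.

    points: (N, 3) array; pose_src/pose_dst: 4x4 homogeneous matrices.
    """
    # Compose ego_src -> global -> ego_dst into one relative transform.
    rel = np.linalg.inv(pose_dst) @ pose_src
    homo = np.hstack([points, np.ones((points.shape[0], 1))])  # (N, 4)
    return (homo @ rel.T)[:, :3]

# Example: the destination ego frame sits 2 m ahead of the source frame,
# so a point 5 m ahead in the source frame is 3 m ahead in the destination.
pose_src = np.eye(4)
pose_dst = np.eye(4)
pose_dst[0, 3] = 2.0
print(transform_points(np.array([[5.0, 0.0, 1.0]]), pose_src, pose_dst))
# -> [[3. 0. 1.]]
```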
Train the model
tools/dist_train.sh projects/configs/far3d.py 8 --work-dir work_dirs/far3d/
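The launcher takes the config, the GPU count, and extra arguments in the usual dist_train.sh style, so you can adjust the GPU count by changing the second argument. The sketch below is illustrative; the --resume-from flag follows common mmdetection conventions and is an assumption for this repo, so verify it against the actual launcher:

```shell
# 4-GPU run with a separate work dir.
tools/dist_train.sh projects/configs/far3d.py 4 --work-dir work_dirs/far3d_4gpu/

# Resume from the latest checkpoint (mmdet-style flag; verify in this repo).
tools/dist_train.sh projects/configs/far3d.py 8 --work-dir work_dirs/far3d/ --resume-from work_dirs/far3d/latest.pth
```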
Evaluation
tools/dist_test.sh projects/configs/far3d.py work_dirs/far3d/iter_82548.pth 8 --eval bbox
- You can also refer to StreamPETR for more training recipes.