# Get Started

Note that Far3D inherits from the StreamPETR repository, so many parts can be used in a similar style.

## 1. Setup

Please follow the Setup instructions to build the environment and compile mmdet3d. We also provide a detailed conda environment file here.

Instruction Steps (a command sketch follows the list):

  1. Create a conda virtual environment and activate it.
  2. Install PyTorch and torchvision following the official instructions.
  3. Install flash-attn (optional).
  4. Clone Far3D.
  5. Install mmdet3d.
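
A minimal command sketch of the steps above, assuming a Python 3.8 conda environment and the megvii-research/Far3D repository URL; exact package versions and the mmdet3d build procedure should follow the linked Setup instructions and the provided conda environment file.

```bash
# Sketch only: version pins are placeholders, not the tested environment.
conda create -n far3d python=3.8 -y
conda activate far3d

# PyTorch/torchvision: pick the build matching your CUDA version
# from the official install matrix.
pip install torch torchvision

# Optional: flash-attn for faster attention kernels.
pip install flash-attn

# Clone Far3D (URL assumed here).
git clone https://github.com/megvii-research/Far3D.git
cd Far3D

# Install mmdet3d from source so its CUDA ops get compiled.
git clone https://github.com/open-mmlab/mmdetection3d.git
cd mmdetection3d && pip install -e . && cd ..
```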

## 2. Data preparation

We use the Argoverse 2 and nuScenes datasets for experiments.

```bash
# First modify args such as split and dataset_dir.
python tools/create_infos_av2/gather_argo2_anno_feather.py
python tools/create_infos_av2/create_av2_infos.py
```

Notes:

  • Due to the huge storage footprint of the Argoverse 2 dataset, we read data from an s3 path. If you need to load from a local disk or another path, please modify AV2LoadMultiViewImageFromFiles accordingly (see the download sketch after these notes).
  • Argoverse 2 provides its 3D labels in ego coordinates; for the ego-motion transformation, refer to Argoverse2DatasetT.
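
If you do switch to local storage, the Argoverse 2 sensor split can be mirrored to disk with s5cmd, as in the sketch below; the bucket path follows the official Argoverse 2 download guide, and the destination directory is an assumption that should match the dataset_dir used by the info scripts.

```bash
# Sketch: mirror the AV2 sensor split to local disk with s5cmd
# (install s5cmd via pip or the official release binaries first).
pip install s5cmd
s5cmd --no-sign-request cp "s3://argoverse/datasets/av2/sensor/*" ./data/av2/sensor/
```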

## 3. Training & Inference

### Train the model

```bash
tools/dist_train.sh projects/configs/far3d.py 8 --work-dir work_dirs/far3d/
```
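
For debugging on a single GPU, mmdet3d-style repositories usually also ship a plain training entry point; a sketch, assuming Far3D keeps StreamPETR's tools/train.py:

```bash
# Single-GPU run for debugging (assumes tools/train.py exists, as in StreamPETR).
python tools/train.py projects/configs/far3d.py --work-dir work_dirs/far3d/
```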

### Evaluation

```bash
tools/dist_test.sh projects/configs/far3d.py work_dirs/far3d/iter_82548.pth 8 --eval bbox
```

  • You can also refer to StreamPETR for more training recipes.