# CAFormer
## Introduction

The official implementation of the paper **Cross-modulated Attention Transformer for RGBT Tracking** (AAAI 2025).

## Device

We train and test our model on two NVIDIA RTX 2080 Ti GPUs.

## Train

Download the SOT-pretrained model from OSTrack, or download `CAFormer_SOTPretrained.pth.tar` directly.

```shell
cd your_proj_path
python utils/make_pretrained.py  # only needed if you downloaded the weights from OSTrack
sh experiments/caformer/train.sh
```
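The purpose of `utils/make_pretrained.py` is to adapt the single-modal OSTrack checkpoint for use with CAFormer. As a hedged illustration only (the actual key mapping is defined by the repo's script; the key names below are hypothetical), turning RGB backbone weights into an initialization for a second, thermal branch could look like:

```python
# Hypothetical sketch: duplicate single-modal (RGB) backbone weights so a
# second (thermal) branch starts from the same SOT-pretrained parameters.
# The "backbone." / "backbone_t." key names are illustrative assumptions,
# not the repo's actual naming scheme.

def make_dual_modal(state_dict, src_prefix="backbone.", dst_prefix="backbone_t."):
    """Return a new state dict with backbone weights copied to a second branch."""
    out = dict(state_dict)  # keep every original entry
    for key, value in state_dict.items():
        if key.startswith(src_prefix):
            # Mirror each backbone parameter under the second branch's prefix.
            out[dst_prefix + key[len(src_prefix):]] = value
    return out

# Toy "state dict" standing in for a real torch checkpoint (key -> tensor).
sot_weights = {"backbone.patch_embed.weight": [0.1], "head.score.weight": [0.5]}
dual = make_dual_modal(sot_weights)
# The thermal branch now shares the RGB branch's pretrained initialization,
# while head weights are left untouched.
```

In a real checkpoint the values would be `torch.Tensor`s loaded with `torch.load`, but the key-remapping logic is plain dictionary manipulation either way.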

## Test

```shell
cd your_proj_path
# For RGBT234
sh experiments/caformer/test234.sh
# For LasHeR
sh experiments/caformer/test245.sh
```

## Evaluation

You can use the scripts in `eval_tracker` to quickly evaluate the tracking results.
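The paired numbers reported for these benchmarks are the standard precision/success scores used in RGBT tracking. As a minimal self-contained sketch (not the repo's `eval_tracker` code, which also sweeps thresholds and averages over sequences), precision counts frames whose predicted box center is within a pixel threshold of the ground truth, and success counts frames whose overlap (IoU) exceeds a threshold:

```python
import math

def center_error(box_a, box_b):
    """Euclidean distance between box centers; boxes are (x, y, w, h)."""
    ax, ay = box_a[0] + box_a[2] / 2, box_a[1] + box_a[3] / 2
    bx, by = box_b[0] + box_b[2] / 2, box_b[1] + box_b[3] / 2
    return math.hypot(ax - bx, ay - by)

def iou(box_a, box_b):
    """Intersection-over-union of two (x, y, w, h) boxes."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[0] + box_a[2], box_b[0] + box_b[2])
    y2 = min(box_a[1] + box_a[3], box_b[1] + box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    union = box_a[2] * box_a[3] + box_b[2] * box_b[3] - inter
    return inter / union if union > 0 else 0.0

def precision_rate(preds, gts, threshold=20.0):
    """Fraction of frames whose center error is within `threshold` pixels."""
    hits = sum(center_error(p, g) <= threshold for p, g in zip(preds, gts))
    return hits / len(gts)

def success_rate(preds, gts, threshold=0.5):
    """Fraction of frames whose IoU exceeds `threshold`."""
    hits = sum(iou(p, g) > threshold for p, g in zip(preds, gts))
    return hits / len(gts)

# Toy two-frame sequence: the tracker is accurate on frame 1, lost on frame 2.
gts = [(10, 10, 40, 40), (50, 50, 40, 40)]
preds = [(12, 12, 40, 40), (90, 90, 40, 40)]
```

Benchmark scores are usually reported as the area under the curve over many thresholds rather than at a single one; this sketch shows only the per-threshold rate.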

## Download

| Training Dataset | RGBT234 | LasHeR | VTUAV-ST | Model & Result |
| --- | --- | --- | --- | --- |
| LasHeR | 88.3/66.4 | 70.0/55.6 | -/- | download |
| VTUAV | -/- | -/- | 88.6/76.2 | download |

## Citation

```bibtex
@inproceedings{CAFormer,
  title={Cross-modulated Attention Transformer for RGBT Tracking},
  author={Yun Xiao and Jiacong Zhao and Andong Lu and Chenglong Li and Bing Yin and Yin Lin and Cong Liu},
  booktitle={AAAI Conference on Artificial Intelligence},
  year={2025}
}
```

## Acknowledgments

Thanks to OSTrack, which helped us quickly implement our ideas.
