Commit dd81edf

rentainhe and ntianhe ren authored
Align DAB-DETR-R101 training configs (facebookresearch#9)
* add DAB-DETR R101 training configs
* refine cfg

Co-authored-by: ntianhe ren <[email protected]>
1 parent ad2de66 commit dd81edf

File tree

1 file changed (+25 −0 lines)

@@ -0,0 +1,25 @@
+from ideadet.config import get_config
+
+from .models.dab_detr_r50 import model
+from .common.coco_loader import dataloader
+from .common.schedule import lr_multiplier
+
+
+optimizer = get_config("common/optim.py").AdamW
+train = get_config("common/train.py").train
+
+# modify training config
+train.init_checkpoint = "./pretrained_weights/r101.pkl"
+train.output_dir = "./output/dab_r101_50epoch"
+train.max_iter = 375000
+
+
+# modify optimizer config
+optimizer.weight_decay = 1e-4
+optimizer.params.lr_factor_func = lambda module_name: 0.1 if "backbone" in module_name else 1
+
+# modify dataloader config
+dataloader.train.num_workers = 16
+
+# modify model config
+model.backbone.backbone.backbone.stages.depth = 101
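The `lr_factor_func` override above scales the learning rate per parameter group by module name, so the pretrained backbone trains at 0.1× the base rate. A minimal sketch of how such a factor could be consumed when building optimizer parameter groups (`build_param_groups` is a hypothetical helper for illustration, not the actual ideadet/detectron2 solver code):

```python
# Hypothetical builder: one param group per parameter, with lr scaled
# by the factor the config's lr_factor_func returns for its module name.
def build_param_groups(named_params, base_lr, lr_factor_func):
    return [
        {"name": name, "params": [param], "lr": base_lr * lr_factor_func(name)}
        for name, param in named_params
    ]

# Same factor function as the config: backbone params train at 0.1x lr.
lr_factor_func = lambda module_name: 0.1 if "backbone" in module_name else 1

# Stand-ins for model.named_parameters()
named = [
    ("backbone.stages.res2.conv1.weight", object()),
    ("transformer.decoder.layers.0.weight", object()),
]
groups = build_param_groups(named, base_lr=1e-4, lr_factor_func=lr_factor_func)
for g in groups:
    print(g["name"], g["lr"])
```

The real solver code would hand these groups to `torch.optim.AdamW`; the point is only that the factor is keyed on the parameter's qualified module name.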

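For context, `train.max_iter = 375000` implements the 50-epoch DAB-DETR schedule. Assuming the common total batch size of 16 (an assumption, not stated in this diff), the arithmetic roughly checks out:

```python
# Back-of-envelope check of the 50-epoch schedule.
# total_batch_size = 16 is an assumption, not stated in this diff.
images_per_epoch = 118_287          # COCO train2017 image count
total_batch_size = 16               # assumed
iters_per_epoch = images_per_epoch // total_batch_size  # 7392
print(iters_per_epoch * 50)         # 369600, in line with max_iter = 375000
```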