Multi card parallel #51
Code-of-Liujie: May I ask if there is a multi-card parallel issue here? Is there any solution? I used four V100 GPUs for my run.
|
Xingyi Yang: In the paper, we train all models with Distributed Data Parallel on 8×V100 GPUs, so parallel training should work. Can you check your own environment and PyTorch distributed config?
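To rule out environment problems, a minimal sanity check of the PyTorch distributed stack (plain PyTorch, nothing specific to this repo) is:

import torch
import torch.distributed as dist

# Report the basics that multi-GPU DDP training depends on.
print("torch:", torch.__version__, "cuda:", torch.version.cuda)
print("GPUs visible:", torch.cuda.device_count())
print("NCCL available:", dist.is_nccl_available())

Note that NCCL is Linux-only; on Windows, torch.distributed falls back to the gloo backend.
|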
Xingyi Yang: Please indicate which config you are using. For all configs in this repo, we have model.train_cfg. For example, see ConsistentTeacher/configs/consistent-teacher/consistent_teacher_r50_fpn_coco_180k_10p.py, lines 52 to 58 at commit b49a9ce:
https://github.com/Adamdad/ConsistentTeacher/blob/b49a9ce1c0ae78d13528be0a9a4eff65a627d581/configs/consistent-teacher/consistent_teacher_r50_fpn_coco_180k_10p.py#L52-L58
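For reference, an mmdet 2.x model.train_cfg block has roughly this shape (the values below are generic placeholders for illustration, not the actual settings at the linked lines):

model = dict(
    train_cfg=dict(
        # Generic mmdet 2.x single-stage settings for illustration only;
        # the linked lines 52-58 hold the real ConsistentTeacher values.
        assigner=dict(
            type="MaxIoUAssigner",
            pos_iou_thr=0.5,
            neg_iou_thr=0.4,
            min_pos_iou=0,
        ),
        allowed_border=-1,
        pos_weight=-1,
        debug=False,
    ),
)
|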
Code-of-Liujie: Do you want to try it remotely? |
Xingyi Yang: Sorry, but what do you mean by "try it remotely"? |
Code-of-Liujie: This is my config:
_base_ = './consistent_teacher_r50_fpn_coco_180k_10p.py'
fold = 1
percent = 10
data = dict(
train=dict(
sup=dict(
ann_file="data/coco_semi/semi_supervised/instances_train2017.${fold}@${percent}.json",
),
unsup=dict(
ann_file="data/coco_semi/semi_supervised/instances_train2017.${fold}@${percent}-unlabeled.json",
),
),
)
log_config = dict(
_delete_=True,
interval=50,
hooks=[
dict(type="TextLoggerHook", by_epoch=False),
dict(
type="WandbLoggerHook",
init_kwargs=dict(
project="consistent-teacher",
name="${cfg_name}",
config=dict(
fold="${fold}",
percent="${percent}",
work_dirs="${work_dir}",
total_step="${runner.max_iters}",
),
),
by_epoch=False,
)
],
)
Where is the train_cfg you are talking about?
|
Code-of-Liujie: By remote control I mean remotely operating my computer through the Sunflower remote-desktop software.
|
Xingyi Yang: In the first line, we import hyper-parameters from another base config by default: _base_ = './consistent_teacher_r50_fpn_coco_180k_10p.py'. The train_cfg is in that base config. Sorry, but I cannot assist you through remote control.
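You can verify this by loading the config and printing the merged result; a quick sketch with mmcv (replace the path with your own config file):

from mmcv import Config

# Config.fromfile resolves _base_ inheritance, so the train_cfg
# defined in the base file shows up in the merged config.
cfg = Config.fromfile("configs/consistent-teacher/consistent_teacher_r50_fpn_coco_180k_10p.py")
print(cfg.model.train_cfg)
|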
Code-of-Liujie: The mmdetection here is the latest version, but the mmdet in my environment is ==2.28.1. Could the problem be caused by that mismatch?
|
Xingyi Yang: I would suggest downgrading the mmdetection version of the base config as well. |
Code-of-Liujie: The lower version of mmdetection doesn't have that config. Is it possible to do this without the mmdetection in this directory?
|
Xingyi Yang: You can also copy the older-version config of mmdet into a local folder and change the base config path in your current config.
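As a sketch, with an illustrative local path:

# Point _base_ at the locally copied older-version base config
# instead of the one that tracks the newer mmdetection.
_base_ = './local_configs/consistent_teacher_r50_fpn_coco_180k_10p.py'
|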
Code-of-Liujie: Now I get this wandb error:
wandb_login._login(anonymous=anonymous, force=force, _disable_warning=True)
File "D:\Anaconda3\envs\teacher\lib\site-packages\wandb\sdk\wandb_login.py", line 238, in _login
wlogin.prompt_api_key()
File "D:\Anaconda3\envs\teacher\lib\site-packages\wandb\sdk\wandb_login.py", line 174, in prompt_api_key
raise UsageError("api_key not configured (no-tty). call " + directive)
wandb.errors.UsageError: api_key not configured (no-tty). call wandb.login(key=[your_api_key])
|
Xingyi Yang: Please register on wandb and put your username and API key in the code.
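For example (the key string is a placeholder for your own W&B API key):

import wandb

# Authenticate before the WandbLoggerHook starts a run; setting the
# WANDB_API_KEY environment variable works as an alternative.
wandb.login(key="YOUR_WANDB_API_KEY")
|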