Warnings with transformers >=4.46 #85
You can simply ignore this warning. The reason is that manga-ocr uses different settings from the original models; for example, the manga-ocr decoder has 2 hidden layers, whereas the original BERT has 12. If the warning messages from transformers bother you, they can be suppressed:

```python
import logging

# Lower the verbosity of every logger registered by the transformers library.
loggers = [logging.getLogger(name) for name in logging.root.manager.loggerDict]
for logger in loggers:
    if "transformers" in logger.name.lower():
        logger.setLevel(logging.ERROR)
```
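Alternatively, a shorter sketch that relies on transformers' own verbosity helper (this silences all of the library's log output below ERROR, not just this one warning):

```python
from transformers import logging as hf_logging

# Suppress INFO/WARNING messages emitted by the transformers library itself.
hf_logging.set_verbosity_error()
```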
I am training the model myself, and this warning disappeared on my trained model after I changed https://github.com/kha-white/manga-ocr/blob/master/manga_ocr_dev/training/get_model.py to https://github.com/jzhang533/manga-ocr/blob/master/manga_ocr_dev/training/my_get_model.py. I am still trying to train a good model; you can see some examples from my current trained model here: https://github.com/jzhang533/manga-ocr?tab=readme-ov-file#examples. If I manage to train a decent model, I will share it with the community.
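For context, the warning comes from the submodule configs disagreeing with the shared `VisionEncoderDecoderConfig`. A minimal sketch of one way to avoid it when training from scratch (this is an illustration of the idea, not the exact contents of either linked file, and the specific config values are assumptions):

```python
from transformers import (
    BertConfig,
    VisionEncoderDecoderConfig,
    VisionEncoderDecoderModel,
    ViTConfig,
)

# Configs matching what manga-ocr actually uses, e.g. a 2-layer BERT
# decoder instead of the original 12 layers (values here are illustrative).
encoder_config = ViTConfig()
decoder_config = BertConfig(
    num_hidden_layers=2,      # manga-ocr's smaller decoder
    is_decoder=True,
    add_cross_attention=True,
)

# Building the model from the combined config keeps the shared config and the
# submodule configs consistent, so there is nothing for transformers to overwrite.
config = VisionEncoderDecoderConfig.from_encoder_decoder_configs(
    encoder_config, decoder_config
)
model = VisionEncoderDecoderModel(config=config)
```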
When run on transformers 4.46 or greater, it shows these warnings:

```
Config of the encoder: <class 'transformers.models.vit.modeling_vit.ViTModel'> is overwritten by shared encoder config: ViTConfig
Config of the decoder: <class 'transformers.models.bert.modeling_bert.BertLMHeadModel'> is overwritten by shared decoder config: BertConfig
```

The model still seems to work fine, however.
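For reference, a minimal way to reproduce the warnings is to load the released checkpoint directly (assuming the published kha-white/manga-ocr-base model on the Hugging Face Hub, loaded without going through the manga_ocr package):

```python
from transformers import VisionEncoderDecoderModel

# With transformers >= 4.46 this prints the "Config of the encoder/decoder ...
# is overwritten" warnings; the loaded model still works as expected.
model = VisionEncoderDecoderModel.from_pretrained("kha-white/manga-ocr-base")
```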