Added RTDETR model to inference #558
base: main
Conversation
Hi @Bhavay-2001, thank you for providing this PR! I'd love to test it. Can you share Python code that performs inference using this model?
Hi @grzegorz-roboflow, I haven't tested it myself. I got the code from here. I have created a notebook that tests this code; I have just tried to convert that code into a file.
Hi @grzegorz-roboflow, please let me know what changes need to be made.
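For reference, a minimal standalone sketch of the kind of test script being asked for, using the Hugging Face transformers RT-DETR classes directly; the checkpoint name, image path, and threshold below are only examples:

```python
import torch
from PIL import Image
from transformers import RTDetrForObjectDetection, RTDetrImageProcessor

# Example checkpoint and image; swap in whatever you want to test against.
CHECKPOINT = "PekingU/rtdetr_r50vd"
image = Image.open("example.jpg")

processor = RTDetrImageProcessor.from_pretrained(CHECKPOINT)
model = RTDetrForObjectDetection.from_pretrained(CHECKPOINT).eval()

inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw outputs to per-image detections in pixel coordinates.
results = processor.post_process_object_detection(
    outputs,
    target_sizes=torch.tensor([image.size[::-1]]),  # (height, width)
    threshold=0.5,
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```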
inference/models/rtdetr/rtdetr.py
Outdated
DEVICE = "cuda:0" if torch.cuda.is_available() else "cpu"

class RTDETR(RoboflowCoreModel):
This should subclass RoboflowInferenceModel, or even more ideally TransformerModel after a light refactor.
Done
Hi @probicheaux, can you please provide a detailed review?
Hi @grzegorz-roboflow, the PR is ready for review. Can you please check it?
Thanks for your contribution, we will upload some weights and test this out. Some unit tests/integration tests will also need to be added
from inference.models.transformers.transformers import TransformerModel
from inference.core.utils.image_utils import load_image_rgb

DEVICE = "cuda:0" if torch.cuda.is_available() else "cpu"
Not needed; self.device should be set on the TransformerModel.
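For illustration, a sketch of what relying on the base class device could look like inside the model's inference path (assuming, per the comment above, that TransformerModel already sets `self.device`):

```python
# Sketch only: reuse the device configured by the base class instead of a
# module-level DEVICE constant computed at import time.
inputs = self.processor(images=image, return_tensors="pt").to(self.device)
with torch.no_grad():
    outputs = self.model(**inputs)
```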
self.api_key = API_KEY
self.dataset_id, self.version_id = model_id.split("/")
self.cache_dir = os.path.join(MODEL_CACHE_DIR, self.endpoint + "/")  # "PekingU/rtdetr_r50vd"
dtype = torch.bfloat16
bfloat16 shouldn't be hardcoded here; bfloat16 is only supported on GPUs with compute capability >= 8.0.
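A possible way to select the dtype at runtime instead of hardcoding it (a sketch, not necessarily how the final integration should handle it):

```python
import torch

def select_dtype() -> torch.dtype:
    # bfloat16 requires compute capability >= 8.0 (Ampere or newer);
    # fall back to float16 on older GPUs and float32 on CPU.
    if not torch.cuda.is_available():
        return torch.float32
    major, _ = torch.cuda.get_device_capability()
    return torch.bfloat16 if major >= 8 else torch.float16
```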
self.model_id = model_id
self.endpoint = model_id
self.api_key = API_KEY
I don't think these three lines are needed; this is set on RoboflowInferenceModel.
self.cache_dir,
torch_dtype=dtype,
device_map=DEVICE,
revision="bfloat16",
We can upload float16 weights to a Roboflow project and load from there
self.dataset_id, self.version_id = model_id.split("/")
self.cache_dir = os.path.join(MODEL_CACHE_DIR, self.endpoint + "/")  # "PekingU/rtdetr_r50vd"
dtype = torch.bfloat16
self.model = RTDetrForObjectDetection.from_pretrained(
RTDetrForObjectDetection should be a class property; see https://github.com/Bhavay-2001/roboflow-inference/blob/d3c88f74fdcaac5c29822a7444698b11b78067c8/inference/models/paligemma/paligemma.py#L10 for an example.
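Following the linked PaliGemma example, the refactor would presumably look something like the sketch below; the attribute names (`transformers_class`, `processor_class`) are assumed from that pattern and may differ in the actual TransformerModel base class:

```python
from transformers import RTDetrForObjectDetection, RTDetrImageProcessor

from inference.models.transformers.transformers import TransformerModel


class RTDETR(TransformerModel):
    # Expose the concrete transformers classes as class attributes so the
    # base class can handle loading (attribute names assumed here).
    transformers_class = RTDetrForObjectDetection
    processor_class = RTDetrImageProcessor
```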
revision="bfloat16", | ||
).eval() | ||
|
||
self.processor = RTDetrImageProcessor.from_pretrained( |
Same comment for class property
Hey @Bhavay-2001! We greatly appreciate your contribution and thank you so much for submitting this PR. I talked with the team, and we are planning on incorporating fine-tuning of RT-DETR more tightly into Roboflow shortly, so we need to adapt the inference integration to be compatible with the output of our training process. This PR is a great start, and we'll likely continue working from it towards a release, but we won't be able to do that for a little while until the backend is more concrete. The best path forward in the meantime would be to use it via a plugin or fork. (Apologies for the inconvenience; we'll update you when we have more to share on our end!)
Description
PR for #546
Type of change
Please delete options that are not relevant.
How has this change been tested, please provide a testcase or example of how you tested the change?
Any specific deployment considerations
Docs