
TensorRT & Flux Dev #4484

Answered by comfyanonymous
Woukim asked this question in Q&A · Aug 19, 2024 · 8 comments · 7 replies

TensorRT currently needs more than 24 GB of VRAM to convert a Flux model; even a 4090 isn't enough.
