```
No operator found for `memory_efficient_attention_forward` with inputs:
    query     : shape=(40, 256, 1, 64) (torch.bfloat16)
    key       : shape=(40, 256, 1, 64) (torch.bfloat16)
    value     : shape=(40, 256, 1, 64) (torch.bfloat16)
    attn_bias : <class 'NoneType'>
    p         : 0.0
decoderF is not supported because:
    xFormers wasn't build with CUDA support
    attn_bias type is <class 'NoneType'>
    operator wasn't built - see python -m xformers.info for more info
[email protected] is not supported because:
    xFormers wasn't build with CUDA support
    operator wasn't built - see python -m xformers.info for more info
cutlassF is not supported because:
    xFormers wasn't build with CUDA support
    operator wasn't built - see python -m xformers.info for more info
smallkF is not supported because:
    max(query.shape[-1] != value.shape[-1]) > 32
    xFormers wasn't build with CUDA support
    dtype=torch.bfloat16 (supported: {torch.float32})
    operator wasn't built - see python -m xformers.info for more info
    unsupported embed per head: 64
```
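For reference, the operation that all of these backends refuse to run is plain scaled dot-product attention over tensors in xFormers' `(batch, seq_len, num_heads, head_dim)` layout, which is exactly the `(40, 256, 1, 64)` shape in the log. Below is a minimal NumPy sketch of that math, useful as a sanity check or CPU fallback while the CUDA build is sorted out; it is not xFormers' kernel, the function name `plain_attention` is made up for this example, and dropout (`p`) is ignored (assume inference).

```python
import numpy as np

def plain_attention(q, k, v):
    """Reference attention: softmax(q @ k^T / sqrt(d)) @ v.

    q, k, v use the (batch, seq_len, num_heads, head_dim) layout
    from the error message. No dropout, no attn_bias (matching the
    NoneType attn_bias in the log).
    """
    b, m, h, d = q.shape
    # Move heads next to batch so matmuls broadcast: (b, h, m, d)
    q = q.transpose(0, 2, 1, 3)
    k = k.transpose(0, 2, 1, 3)
    v = v.transpose(0, 2, 1, 3)
    scores = q @ k.transpose(0, 1, 3, 2) / np.sqrt(d)  # (b, h, m, m)
    scores -= scores.max(axis=-1, keepdims=True)       # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)     # row-wise softmax
    out = weights @ v                                  # (b, h, m, d)
    return out.transpose(0, 2, 1, 3)                   # back to (b, m, h, d)

# Smaller stand-in for the (40, 256, 1, 64) tensors in the report
q = np.random.rand(2, 8, 1, 16).astype(np.float32)
out = plain_attention(q, q, q)
print(out.shape)  # (2, 8, 1, 16)
```

Note the `smallkF` line in the log: that CPU-capable backend only handles head dims up to 32 in float32, so a head dim of 64 in bfloat16 rules it out too; the remaining backends all require a CUDA-enabled xFormers build.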