Document causal mask alignment in scaled_dot_product_attention #2967
Conversation
Clarify that MLX uses lower-right alignment for causal masks when T_q != T_kv, which differs from PyTorch's default upper-left alignment. Relates to ml-explore#2835
zcbenz left a comment
I don't think PyTorch has a causal_lower_right option for SDPA and the description is not really right.
Hey @zcbenz, PyTorch does have causal_lower_right since 2.3, and it can be used with SDPA via the attn_mask parameter. I ran a script to verify; a rough sketch of that comparison is included below. Here is the tutorial that documents this explicitly: https://docs.pytorch.org/tutorials/intermediate/scaled_dot_product_attention_tutorial.html. I also verified the masks are mathematically identical. For example, with T_q=2, T_kv=4, the lower-right-aligned masks agree, while the upper-left (is_causal=True) mask differs. This is also consistent with MLX's CUDA backend, which uses cuDNN's set_causal_mask_bottom_right. Is there something specific about the description you think is incorrect? If your concern is that causal_lower_right isn't a direct SDPA parameter (like is_causal=True) but rather a separate utility class, I could clarify the wording to use the full module path torch.nn.attention.bias.causal_lower_right.
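A minimal sketch of such a comparison (not the original script, which isn't shown here), assuming PyTorch >= 2.3 and arbitrary random inputs:

```python
import torch
import torch.nn.functional as F
from torch.nn.attention.bias import causal_lower_right

T_q, T_kv, D = 2, 4, 8
q = torch.randn(1, 1, T_q, D)
k = torch.randn(1, 1, T_kv, D)
v = torch.randn(1, 1, T_kv, D)

# Explicit lower-right causal mask: query i may attend to key j iff j - i <= T_kv - T_q.
lower_right = torch.tril(torch.ones(T_q, T_kv, dtype=torch.bool), diagonal=T_kv - T_q)
# Explicit upper-left causal mask: query i may attend to key j iff j <= i.
upper_left = torch.tril(torch.ones(T_q, T_kv, dtype=torch.bool))

out_bias = F.scaled_dot_product_attention(q, k, v, attn_mask=causal_lower_right(T_q, T_kv))
out_lr = F.scaled_dot_product_attention(q, k, v, attn_mask=lower_right)
out_ul = F.scaled_dot_product_attention(q, k, v, attn_mask=upper_left)

print(torch.allclose(out_bias, out_lr))  # expected: True  (lower-right bias == explicit lower-right mask)
print(torch.allclose(out_bias, out_ul))  # expected: False (upper-left alignment differs when T_q != T_kv)
```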
Thanks for linking the docs, this is new to me. On the behavior, it actually depends on whether T_q is larger or smaller than T_kv; see mlx/mlx/backend/cuda/scaled_dot_product_attention.cpp, lines 204 to 208 at 9052f67.
The mask uses lower-right alignment when T_q <= T_kv and upper-left when T_q > T_kv.
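A hypothetical reference helper (not MLX's actual kernel code) expressing the conditional alignment described above for mask="causal":

```python
import numpy as np

def causal_mask_as_applied(T_q: int, T_kv: int) -> np.ndarray:
    # Lower-right alignment when T_q <= T_kv: query i attends to key j iff j - i <= T_kv - T_q.
    # Upper-left alignment when T_q > T_kv:   query i attends to key j iff j <= i.
    offset = T_kv - T_q if T_q <= T_kv else 0
    i = np.arange(T_q)[:, None]
    j = np.arange(T_kv)[None, :]
    return j <= i + offset

print(causal_mask_as_applied(2, 4).astype(int))  # lower-right aligned: last query row sees all 4 keys
print(causal_mask_as_applied(4, 2).astype(int))  # upper-left aligned: standard tril, truncated to 2 columns
```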
Thanks! Fixed to describe the conditional alignment behavior 🙏
zcbenz left a comment
Looks good to me. /cc @awni for a second look.
The comment definitely makes sense. But I also find it a bit strange that we switch from lower-right to upper-left depending on whether the query is longer or shorter than the keys. It's quite rare for the query to be longer than the keys, which is why we never really looked at it carefully. I'm wondering if we should change the behavior in that case rather than documenting something that is a bit unusual? Or maybe it's a good idea to keep it this way?
I agree the current behavior is unusual, and using lower-right alignment for all cases would be a better choice.
Summary
- MLX's mask="causal" uses lower-right alignment, while PyTorch's is_causal=True uses upper-left alignment.
- When T_q != T_kv, this distinction matters.
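As a concrete illustration of the summary (a minimal sketch, assuming both MLX and PyTorch >= 2.3 are installed; the shapes and inputs are arbitrary), PyTorch's causal_lower_right reproduces MLX's mask="causal" output when T_q <= T_kv, while is_causal=True does not:

```python
import numpy as np
import torch
import torch.nn.functional as F
from torch.nn.attention.bias import causal_lower_right
import mlx.core as mx

B, H, T_q, T_kv, D = 1, 1, 2, 4, 8
q = np.random.randn(B, H, T_q, D).astype(np.float32)
k = np.random.randn(B, H, T_kv, D).astype(np.float32)
v = np.random.randn(B, H, T_kv, D).astype(np.float32)
scale = 1.0 / np.sqrt(D)  # matches PyTorch's default SDPA scaling

# MLX: mask="causal" is lower-right aligned here (T_q <= T_kv).
out_mlx = mx.fast.scaled_dot_product_attention(
    mx.array(q), mx.array(k), mx.array(v), scale=scale, mask="causal"
)

# PyTorch: is_causal=True is upper-left aligned; causal_lower_right matches MLX.
tq, tk, tv = map(torch.from_numpy, (q, k, v))
out_ul = F.scaled_dot_product_attention(tq, tk, tv, is_causal=True)
out_lr = F.scaled_dot_product_attention(tq, tk, tv, attn_mask=causal_lower_right(T_q, T_kv))

print(np.allclose(np.array(out_mlx), out_lr.numpy(), atol=1e-4))  # expected: True
print(np.allclose(np.array(out_mlx), out_ul.numpy(), atol=1e-4))  # expected: False
```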
Relates to #2835