Describe the bug
A recent FA3 commit changed the FA3 API in a way similar to the change introduced in FA2 versions > 2.7: most notably, the single window_size argument was replaced with separate window_size_left and window_size_right arguments. This makes TE incompatible with the newest versions of FA3.
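For context, a minimal sketch of how a caller could stay compatible with both argument styles, assuming the public flash_attn_interface.flash_attn_func entry point and that the new keywords are simply the split halves of the old (left, right) tuple; this is not TE's actual code.

```python
# Hypothetical compatibility wrapper; names and detection logic are assumptions.
import inspect

from flash_attn_interface import flash_attn_func  # FA3 Python interface

_FA3_PARAMS = inspect.signature(flash_attn_func).parameters
_FA3_SPLITS_WINDOW = "window_size_left" in _FA3_PARAMS


def fa3_forward_compat(q, k, v, window_size=(-1, -1), **kwargs):
    """Call FA3 with whichever sliding-window argument style this build expects."""
    if _FA3_SPLITS_WINDOW:
        # Newer FA3: the (left, right) tuple was split into two keyword arguments.
        return flash_attn_func(
            q, k, v,
            window_size_left=window_size[0],
            window_size_right=window_size[1],
            **kwargs,
        )
    # Older FA3: a single (left, right) tuple is accepted.
    return flash_attn_func(q, k, v, window_size=window_size, **kwargs)
```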
Steps/Code to reproduce bug
Install FA3 from the top of tree (ToT), then run tests/pytorch/attention/test_attention_with_cp.py.
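A Python-level sketch that isolates the failing call, under the assumption that the incompatibility surfaces through flash_attn_interface.flash_attn_func on its way to the flash_attn_3::_flash_attn_forward operator; tensor shapes and dtypes are illustrative only, and the exact exception type may depend on which FA3 layer rejects the argument.

```python
# Illustrative reproducer, not the TE test itself.
import torch
from flash_attn_interface import flash_attn_func  # FA3 Python interface

q, k, v = (
    torch.randn(2, 128, 8, 64, dtype=torch.bfloat16, device="cuda")
    for _ in range(3)
)

# On FA3 builds newer than the referenced commit, passing the old-style
# window_size tuple is rejected, since the operator now expects
# window_size_left and window_size_right instead.
flash_attn_func(q, k, v, causal=True, window_size=(-1, -1))
```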
Expected behavior
The tests should pass with the newest FA3. Instead, they fail with: RuntimeError: Unknown keyword argument 'window_size' for operator 'flash_attn_3::_flash_attn_forward'.
Environment overview (please complete the following information)
- Environment location: AWS H100
- Method of Transformer Engine install: docker image nvcr.io/nvidian/pytorch:25.10
- FA3 version: ac9b5f1
Device details
- H100