TE fails with the newest version of FA3 #2527

@MaciejBalaNV

Description

Describe the bug

A recent FA3 commit updated the FA3 API, similarly to what was done for FA2 in versions >2.7. Most notably, window_size_left and window_size_right are now used instead of window_size. This makes TE incompatible with the newest versions of FA3.

Steps/Code to reproduce bug

Install FA3 from ToT. Run the tests/pytorch/attention/test_attention_with_cp.py tests.

Expected behavior

The tests should pass with the newest FA3. Instead, they fail with RuntimeError: Unknown keyword argument 'window_size' for operator 'flash_attn_3::_flash_attn_forward'.

Environment overview (please complete the following information)

  • Environment location: AWS H100
  • Method of Transformer Engine install: docker image nvcr.io/nvidian/pytorch:25.10
  • FA3 version: ac9b5f1

Device details

  • H100

Metadata

Labels

bug (Something isn't working)
