
Merge branch 'lmcafee/flash-attn-fix' into 'main'
Test NVIDIA#2: Memory, timing

See merge request ADLR/megatron-lm!677
jaredcasper committed Jul 19, 2023
2 parents 02c6229 + 1a03e5d commit b030472
Showing 1 changed file with 4 additions and 2 deletions.
6 changes: 4 additions & 2 deletions megatron/model/transformer.py
```diff
@@ -27,8 +27,10 @@
 try:
     from flash_attn.flash_attn_interface import flash_attn_unpadded_func
 except ImportError:
-    flash_attn_unpadded_func = None
-
+    try:
+        from flash_attn.flash_attn_interface import flash_attn_varlen_func as flash_attn_unpadded_func
+    except ImportError:
+        flash_attn_unpadded_func = None
 """ We use the following notation throughout this file:
     h: hidden size
```
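For readers without the repository handy, the pattern this patch lands on can be sketched as follows: try the flash-attn 1.x name first, fall back to aliasing the 2.x name `flash_attn_varlen_func` to the old symbol, and only then disable the feature. This is a minimal reconstruction of the import block, not the full Megatron-LM module; it runs whether or not flash-attn is installed.

```python
# Import-fallback sketch: flash-attn 2.x renamed flash_attn_unpadded_func
# to flash_attn_varlen_func, so the old name is tried first and the new
# name is aliased back to it. If neither import succeeds, the symbol is
# None and callers fall back to the default attention path.
try:
    from flash_attn.flash_attn_interface import flash_attn_unpadded_func
except ImportError:
    try:
        # flash-attn 2.x spelling, aliased to the 1.x name the rest of
        # the file expects.
        from flash_attn.flash_attn_interface import (
            flash_attn_varlen_func as flash_attn_unpadded_func,
        )
    except ImportError:
        flash_attn_unpadded_func = None

# Callers gate on availability rather than re-importing:
use_flash_attn = flash_attn_unpadded_func is not None
```

The advantage of aliasing (rather than importing the new name directly) is that the rest of the file keeps a single symbol to check and call, regardless of which flash-attn version is installed.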
