There are a couple of problems in the tothemoon:latest image:

1. The flash-attn version appears to be wrong: `flash_attn_varlen_func`, imported below, cannot be found.
```python
try:
    from flash_attn.flash_attn_interface import flash_attn_varlen_func
except ImportError:
    flash_attn_varlen_func = None
    print(
        "Warning: import flash_attn fail, please install FlashAttention "
        "https://github.com/Dao-AILab/flash-attention"
    )
```
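To tell a missing package apart from a broken compiled extension, it can help to check the installed distribution without importing the extension module itself (importing is what triggers the undefined-symbol error). A minimal sketch; the helper name `flash_attn_status` is my own, not part of flash-attn:

```python
import importlib.metadata
import importlib.util


def flash_attn_status():
    # Check whether the flash_attn package is present and report its version,
    # without importing the compiled extension (which may fail to load).
    if importlib.util.find_spec("flash_attn") is None:
        return (False, None)
    try:
        return (True, importlib.metadata.version("flash-attn"))
    except importlib.metadata.PackageNotFoundError:
        return (True, None)
```

If this reports the package as installed but the `from flash_attn.flash_attn_interface import ...` line still fails, the problem is the binary extension, not the package metadata.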
2. After upgrading flash_attn to the latest version, loading its .so fails with an undefined-symbol error.
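Undefined-symbol errors in flash_attn's .so usually mean the prebuilt wheel was compiled against a different torch/CUDA ABI than the one in the image. One common workaround (a sketch, assuming the image has a CUDA toolchain available) is to rebuild the package from source against the installed torch:

```shell
# Remove the mismatched prebuilt wheel first.
pip uninstall -y flash-attn

# Rebuild against the torch already in the environment; --no-build-isolation
# makes the build see the installed torch instead of a fresh one.
pip install flash-attn --no-build-isolation

# Check that the symbol now resolves.
python -c "from flash_attn.flash_attn_interface import flash_attn_varlen_func; print('ok')"
```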