Add: fsa/flash-llm.md #1567

Open

wants to merge 3 commits into main
Conversation


@AdamG012 commented Oct 5, 2023

Thank you very much

@AdamG012 changed the title from "Adamg012/flash llm" to "Add: fsa/flash-llm.md" on Oct 25, 2023
@AdamG012 (author) commented Oct 25, 2023

Hello! To give some detail on this blog post: it is work by @Summer-Summer at the FSA Lab and others at Alibaba Research. The source code can be found at https://github.com/AlibabaResearch/flash-llm and https://github.com/usyd-fsalab/flash-llm. Flash-LLM is a large-scale LLM inference library focused on GPU code optimisations for sparse matrices.

@osanseviero @sayakpaul
Let us know if there is anything you need.
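For reviewers unfamiliar with the topic, here is a minimal, purely illustrative sketch of the core idea (skipping zero weights in a sparse matrix-vector product, here via a CSR layout). This is plain Python for readability and is not Flash-LLM's API; the actual library implements this at scale in CUDA.

```python
# Illustrative only: Flash-LLM itself is a CUDA library; this pure-Python
# CSR (compressed sparse row) example just shows the idea of storing and
# multiplying only the non-zero weights of a pruned matrix.

def to_csr(dense):
    """Convert a dense matrix (list of rows) to CSR arrays."""
    values, col_indices, row_ptr = [], [], [0]
    for row in dense:
        for j, v in enumerate(row):
            if v != 0.0:
                values.append(v)       # store only non-zero entries
                col_indices.append(j)  # remember their column
        row_ptr.append(len(values))    # where the next row starts
    return values, col_indices, row_ptr

def csr_matvec(values, col_indices, row_ptr, x):
    """Compute y = A @ x, touching only the stored non-zeros."""
    y = []
    for i in range(len(row_ptr) - 1):
        acc = 0.0
        for k in range(row_ptr[i], row_ptr[i + 1]):
            acc += values[k] * x[col_indices[k]]
        y.append(acc)
    return y

# A pruned weight matrix: most entries are zero.
W = [[0.0, 2.0, 0.0],
     [0.0, 0.0, 0.0],
     [1.5, 0.0, -1.0]]
x = [1.0, 2.0, 3.0]
vals, cols, ptr = to_csr(W)
print(csr_matvec(vals, cols, ptr, x))  # [4.0, 0.0, -1.5], same as dense W @ x
```

The sparser the matrix, the fewer multiply-adds and memory reads this performs compared with a dense product, which is the kind of saving the GPU kernels in Flash-LLM exploit.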

@AdamG012 marked this pull request as draft October 25, 2023 23:45
@AdamG012 marked this pull request as ready for review October 25, 2023 23:46