-
Hey, we just integrated better support for this. Looking at your issue, however: did you perhaps reuse the llama2 tokenizer for llama3?
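For reference, here is a minimal sketch of how to check which chat template a tokenizer actually carries. It assumes the Hugging Face `transformers` library and the `meta-llama/Meta-Llama-3-8B-Instruct` checkpoint, neither of which is named in this thread; a reused llama2 tokenizer would render the old `[INST]` format instead of llama3's header tokens.

```python
# Minimal sketch, assuming the Hugging Face transformers library and the
# meta-llama/Meta-Llama-3-8B-Instruct checkpoint (not named in the thread).
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3-8B-Instruct")

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]

# Render the prompt with the tokenizer's own chat template. A llama3
# tokenizer emits <|start_header_id|>/<|eot_id|> markers; a reused llama2
# tokenizer would emit the old [INST] ... [/INST] format instead.
print(tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True))
```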
-
Hi guys, I have customized https://github.com/OpenAccess-AI-Collective/axolotl/blob/main/src/axolotl/prompt_strategies/llama2_chat.py to work with my own chat template. Now I want to adapt it to llama3; what should I change to make it work? Right now, if I use llama2_chat directly, I get the problem shown in the image below. Do you have any suggestions?
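For context, the core change is the turn format itself. Below is a minimal sketch of the two prompt layouts: the token strings are the published llama2 and llama3 chat formats, but the helper function names are hypothetical and are not part of axolotl's llama2_chat.py.

```python
# Hypothetical helpers illustrating the format difference; these names are
# not part of axolotl's llama2_chat.py.

def render_llama2_turn(system: str, user: str) -> str:
    # llama2 chat wraps each turn in [INST] tags with a <<SYS>> block.
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"

def render_llama3_turn(system: str, user: str) -> str:
    # llama3 instead uses per-role headers and <|eot_id|> terminators, so
    # both the strategy's turn formatting and the tokenizer's special
    # tokens need to change.
    return (
        "<|begin_of_text|>"
        f"<|start_header_id|>system<|end_header_id|>\n\n{system}<|eot_id|>"
        f"<|start_header_id|>user<|end_header_id|>\n\n{user}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

print(render_llama3_turn("You are a helpful assistant.", "Hello!"))
```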