- I trained the glm-4-9b-chat model with LLaMA-Factory.
- If transformer-architecture models organize their chat prompts with a Jinja template, what should I look at for GLM-4? I see a build_single_message function in tokenization_chatglm.py, but how should multi-turn conversations be handled?
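  A minimal sketch of one way to build a multi-turn prompt for glm-4-9b-chat, assuming the model repo ships a chat template usable through the standard Hugging Face `apply_chat_template` interface (the example messages below are illustrative, not taken from this thread):

  ```python
  # Sketch: render a multi-turn conversation for glm-4-9b-chat via the
  # tokenizer's chat template (assumes the repo provides one and that
  # trust_remote_code is acceptable in your setup).
  from transformers import AutoTokenizer

  tokenizer = AutoTokenizer.from_pretrained(
      "THUDM/glm-4-9b-chat", trust_remote_code=True
  )

  # Multi-turn history as a list of role/content dicts (hypothetical content).
  messages = [
      {"role": "user", "content": "What is the capital of France?"},
      {"role": "assistant", "content": "The capital of France is Paris."},
      {"role": "user", "content": "And what about Germany?"},
  ]

  # apply_chat_template renders the whole conversation, appends the prompt
  # for the next assistant turn, and can return token ids directly.
  inputs = tokenizer.apply_chat_template(
      messages,
      add_generation_prompt=True,
      tokenize=True,
      return_tensors="pt",
  )
  print(inputs.shape)
  ```

  If you want to mirror what the custom tokenizer code does by hand, one plausible approach is to call build_single_message once per turn and concatenate the results, but going through apply_chat_template avoids depending on that internal helper.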