In ch06, I noticed that generate_text_simple with text2 on the pretrained model, i.e. before finetuning, simply repeats the input text2 indefinitely, word for word.
Is there an explanation for why this happens?

Replies: 1 comment

Hi there, this is normal behavior for several pretrained LLMs that haven't undergone finetuning yet, especially smaller LLMs. I'm not sure exactly why this happens; it may be an artifact of repetitive structures in the training data, or the LLM may simply not handle longer contexts well.
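For anyone who wants to poke at this: generate_text_simple decodes greedily (it picks the argmax of the next-token logits at every step), and greedy decoding from a small pretrained base model is prone to locking into exact repetition loops. The sketch below is not the book's code; it loads the pretrained GPT-2 124M weights through Hugging Face transformers as a stand-in, uses a placeholder string instead of the actual text2, and contrasts a greedy loop (the same decoding rule generate_text_simple uses) with top-k temperature sampling (similar to the sampling options introduced in ch05), which usually breaks the verbatim repetition.

```python
# Minimal sketch, not the repository's code: pretrained GPT-2 (no finetuning)
# with greedy vs. sampled decoding. The prompt is a placeholder for text2.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")      # 124M base weights, not finetuned
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

prompt = "Is the following text 'spam'? Answer with 'yes' or 'no': ..."  # placeholder

def generate(prompt, max_new_tokens=40, greedy=True, temperature=1.0, top_k=50):
    idx = tokenizer(prompt, return_tensors="pt").input_ids
    for _ in range(max_new_tokens):
        with torch.no_grad():
            logits = model(idx).logits[:, -1, :]        # next-token logits only
        if greedy:
            # Deterministic argmax: the decoding rule used by generate_text_simple,
            # which is what tends to repeat the prompt word for word.
            next_id = torch.argmax(logits, dim=-1, keepdim=True)
        else:
            # Top-k sampling with temperature: randomness usually breaks the loop.
            scaled = logits / temperature
            topk_vals, topk_idx = torch.topk(scaled, top_k)
            probs = torch.softmax(topk_vals, dim=-1)
            next_id = topk_idx.gather(-1, torch.multinomial(probs, num_samples=1))
        idx = torch.cat([idx, next_id], dim=1)
    return tokenizer.decode(idx[0])

print(generate(prompt, greedy=True))    # often repeats the prompt or a phrase verbatim
print(generate(prompt, greedy=False))   # sampled output typically avoids the exact loop
```

Either way, as the reply notes, this is expected behavior for a small pretrained base model; following instructions like the text2 prompt only emerges after the finetuning covered in the later chapters.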