Replies: 4 comments 3 replies
-
Hey @rzsgrt, the first change that would be needed to improve on the example notebook results is to increase the number of examples in the dataset. In the notebook, we only selected 100 examples to make the demo run quickly for first-time users. But if you want to improve performance, I would start by using the full dataset.
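For reference, one place this can be controlled is Ludwig's global preprocessing section (a sketch only; the notebook may instead subsample the DataFrame directly before training, so treat this fragment as an assumption about where the sampling happens):

```yaml
# Hedged sketch of a Ludwig config fragment: train on the full dataset
# rather than a small sample. sample_ratio is a global preprocessing
# option; 1.0 means use every row of the input dataset.
preprocessing:
  sample_ratio: 1.0
```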
-
Hey @rzsgrt, we discovered an issue with the learning rate after changing some of the default params in Ludwig v0.8.2. Can you take another look at the now updated notebook and try again?
-
Hi @tgaddair, thanks for your answer. I noticed the learning rate param you changed. I also tried increasing the training data to 500 examples, and after that the results were quite good. But when I tried training on 1,000 examples, it started to OOM. Is that because we put all the data on the GPU?
-
I see, so we're reducing the matrix size (total training examples x sequence_length). I don't know the details: do we push all the training data onto the GPU up front?
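Not an answer to the internals question, but if the OOM turns out to be per-batch rather than from the whole dataset sitting on the GPU, a common workaround is to shrink the batch size and compensate with gradient accumulation. A hedged sketch of a Ludwig trainer config (the specific values are guesses, not tested against this notebook):

```yaml
# Sketch: reduce per-step GPU memory by using a tiny batch and
# accumulating gradients across steps before each optimizer update.
trainer:
  batch_size: 1
  gradient_accumulation_steps: 16  # effective batch size of 16
```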
-
Hi, I came here from the event you created with deeplearning.ai. Thank you for sharing Ludwig and Predibase with us.
But I have a question about the results. I just ran the notebook without changing any params.
On the test examples, 4 out of 5 outputs just repeat the instruction, and 1 repeats the answer. Is that normal?
My assumption is that I need to increase the number of epochs. Has anyone tried training with more epochs? How many epochs did you need to get reasonable answers?
Thank you!
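For anyone who wants to experiment, bumping the epoch count is a one-line change in the Ludwig config (a sketch; the value here is an arbitrary starting point to try, not a number I have validated on this notebook):

```yaml
# Sketch: train for more epochs than the notebook default.
# 10 is an assumption to experiment with, not a recommended value.
trainer:
  epochs: 10
```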