Mini-batch training on GMM #19
Comments
Issue #7 still referred to PyCave version 2. In PyCave v3, you don't need to call `reset_parameters` yourself. I believe that this is the line that causes your error.
So is there a similar way to implement batch training in PyCave version 3 using a DataLoader? My whole dataset is large, so I cannot load all the data into memory at once. Thank you so much! Best regards,
Ah, sorry! Yes, you can simply set the batch size when initializing the GMM. In your case, you might, for example, use `gmm = GM(..., batch_size=8192)`. This will automatically take care of loading the data in batches, both for initialization and GMM training. Note that you might be better off with
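PyCave handles the batching internally once `batch_size` is set, so no manual loop is needed. As an illustration of what batched GMM training means, here is a minimal NumPy sketch of EM for a diagonal-covariance GMM where the E-step sufficient statistics are accumulated one mini-batch at a time, so only a single batch ever needs to be in memory. This is a didactic sketch of the general technique, not PyCave's actual implementation; all names in it are made up for the example.

```python
import numpy as np

def fit_gmm_batched(data, k, n_iter=20, batch_size=1000):
    """EM for a diagonal-covariance GMM, with the E-step computed over
    mini-batches: per-batch sufficient statistics are accumulated, then a
    single M-step is done per iteration, so peak memory is one batch."""
    n, d = data.shape
    # Simple deterministic init: k points spread across the dataset.
    means = data[np.linspace(0, n - 1, k).astype(int)].copy()
    variances = np.tile(data.var(axis=0), (k, 1))
    weights = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # Accumulators for the sufficient statistics of this iteration.
        resp_sum = np.zeros(k)
        mean_sum = np.zeros((k, d))
        sq_sum = np.zeros((k, d))
        for start in range(0, n, batch_size):
            x = data[start:start + batch_size]              # (b, d)
            # log [ weight_j * N(x | mean_j, diag(var_j)) ] for each component
            log_prob = -0.5 * (
                ((x[:, None, :] - means[None]) ** 2 / variances[None]).sum(-1)
                + np.log(2 * np.pi * variances).sum(-1)
            ) + np.log(weights)
            log_prob -= log_prob.max(axis=1, keepdims=True)  # for stability
            resp = np.exp(log_prob)
            resp /= resp.sum(axis=1, keepdims=True)          # responsibilities
            resp_sum += resp.sum(0)
            mean_sum += resp.T @ x
            sq_sum += resp.T @ (x ** 2)
        # M-step from the accumulated statistics (small epsilon avoids
        # degenerate zero variances).
        weights = resp_sum / n
        means = mean_sum / resp_sum[:, None]
        variances = sq_sum / resp_sum[:, None] - means ** 2 + 1e-6
    return weights, means, variances
```

With a memory-mapped array (e.g. `np.load(..., mmap_mode="r")`) in place of `data`, each slice is read from disk on demand, which is the same idea PyCave's `batch_size` option exploits.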
Hi,
I want to implement mini-batch training on GMM as discussed in #7. However, I am a little bit confused by the code
gmm.reset_parameters(torch.Tensor(fvectors[:500].astype(np.float32)))
. I am not sure whether it is related to my version of PyCave, or maybe my understanding of the code in #7 is wrong. My code doesn't work. My code is as follows:
And the error is:
Thank you so much!
Best regards,
Daisy