while embedding_concat() runs, I got "Killed" #32
Comments
Try cropsize=224
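The crop-size suggestion above helps because per-image embedding memory scales with the number of patch locations, which grows quadratically with the crop side. A rough back-of-the-envelope sketch (assuming embeddings are taken from a feature map at stride 4, as in many ResNet-style backbones; the exact stride depends on the layer used here):

```python
def patch_locations(cropsize, stride=4):
    """Number of spatial embedding locations for a square crop,
    assuming one embedding per feature-map cell at the given stride."""
    side = cropsize // stride
    return side * side

# Dropping from 256 to 224 cuts locations (and memory) by ~23%.
print(patch_locations(256))  # 4096
print(patch_locations(224))  # 3136
```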
If it is still relevant: this can happen when you run out of RAM. It happened for me with large datasets.
If I use this model on a larger dataset, running out of RAM is a big issue. Can anybody solve it? I guess this is a natural shortcoming of the method.
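One common workaround for this kind of OOM (a sketch, not this repo's actual code) is to stream embeddings into a disk-backed array with `np.memmap` instead of concatenating everything in RAM; only one batch lives in memory at a time. Shapes and the file name below are hypothetical:

```python
import numpy as np

# Hypothetical sizes: N samples, D-dimensional embedding per sample.
N, D, batch = 1000, 256, 100
emb = np.memmap("embeddings.dat", dtype=np.float32, mode="w+", shape=(N, D))
for start in range(0, N, batch):
    # Stand-in for a batch of model outputs; replace with the real
    # feature extraction for your dataset.
    chunk = np.random.rand(min(batch, N - start), D).astype(np.float32)
    emb[start:start + chunk.shape[0]] = chunk
emb.flush()  # write pending data to disk
```

The memmapped array can then be sliced like a normal NumPy array in later steps without loading all of it at once.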
@RichardChangCA what I did was repeatedly running the
Thanks @michaelstuffer98. Do you know how to calculate the covariance matrix if the dataset is too large? I optimized the CPU usage in the other parts; only the covariance matrix calculation is left. I have to store the embeddings for all normal training data to calculate the covariance matrix, but my machine cannot handle it.
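The covariance matrix does not actually require holding all embeddings at once: since cov = E[xxᵀ] − μμᵀ, you can accumulate running sums over batches and discard each batch afterwards. A minimal sketch (the batching/generator interface is an assumption, not this repo's API):

```python
import numpy as np

def streaming_covariance(batches, dim):
    """Compute mean and (biased) covariance over an iterable of
    (batch_size, dim) arrays without storing them all, using
    cov = E[x x^T] - mu mu^T."""
    n = 0
    s = np.zeros(dim)            # running sum of x
    ss = np.zeros((dim, dim))    # running sum of outer products x x^T
    for x in batches:
        n += x.shape[0]
        s += x.sum(axis=0)
        ss += x.T @ x
    mu = s / n
    cov = ss / n - np.outer(mu, mu)
    return mu, cov
```

Peak memory is then one batch plus a `dim × dim` accumulator, regardless of dataset size. (If `dim` is large, numerical error can grow with this one-pass formula; a chunked two-pass or Welford-style update is more stable.)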
I think I simply have too much data? What do you think?