After carefully reading the article, I have a question about Section
3.4, "Temporal consistency using a Gaussian Mixture Model (GMM)".
You fit a Gaussian mixture distribution for each category, where each mixture contains one positive and one negative Gaussian component, and the per-category posterior probability is inferred with the EM algorithm. However, the training log shows only a single GMM threshold. Does the GMM ultimately produce a separate threshold for each category, or one unified threshold shared across all categories?
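To make my reading of the method concrete, here is a minimal numpy sketch of what I understand per-category thresholding to mean. This is my own assumption, not your implementation: the two-component structure, the median-split initialisation, the synthetic score distributions, and the 0.5 posterior cutoff are all illustrative choices.

```python
import numpy as np

def fit_two_component_gmm(scores, n_iters=100):
    """Fit a 1D two-component Gaussian mixture with plain EM.
    Components are ordered by mean, so index 1 is the 'positive'
    (higher-score) component and index 0 the 'negative' one."""
    scores = np.asarray(scores, dtype=float)
    # Initialise the two components from a median split of the scores.
    lo = scores[scores <= np.median(scores)]
    hi = scores[scores > np.median(scores)]
    means = np.array([lo.mean(), hi.mean()])
    stds = np.array([lo.std() + 1e-6, hi.std() + 1e-6])
    weights = np.array([0.5, 0.5])
    for _ in range(n_iters):
        # E-step: responsibility of each component for each score.
        dens = (weights / (stds * np.sqrt(2 * np.pi))
                * np.exp(-0.5 * ((scores[:, None] - means) / stds) ** 2))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, stds from responsibilities.
        nk = resp.sum(axis=0)
        weights = nk / len(scores)
        means = (resp * scores[:, None]).sum(axis=0) / nk
        stds = np.sqrt((resp * (scores[:, None] - means) ** 2).sum(axis=0) / nk) + 1e-6
    # Keep the positive (higher-mean) component at index 1.
    order = np.argsort(means)
    return weights[order], means[order], stds[order]

def positive_posterior(x, weights, means, stds):
    """Posterior probability that each score belongs to the positive component."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    dens = (weights / (stds * np.sqrt(2 * np.pi))
            * np.exp(-0.5 * ((x[:, None] - means) / stds) ** 2))
    return dens[:, 1] / dens.sum(axis=1)

# Per-category reading: fit one GMM per category's score distribution,
# so each category gets its own data-driven decision boundary.
rng = np.random.default_rng(0)
scores_by_category = {
    "cat_A": np.concatenate([rng.normal(0.2, 0.05, 500), rng.normal(0.8, 0.05, 500)]),
    "cat_B": np.concatenate([rng.normal(0.3, 0.05, 500), rng.normal(0.9, 0.05, 500)]),
}
for name, s in scores_by_category.items():
    w, m, sd = fit_two_component_gmm(s)
    # Accept where the positive-component posterior exceeds 0.5: an
    # implicit per-category threshold, not one shared scalar.
    accepted = positive_posterior(s, w, m, sd) > 0.5
```

Under this reading the threshold is implicit in each category's fitted posterior, which is why I am confused by the single GMM threshold in the training log.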
Looking forward to your answer
Thank you.