Thank you very much for your work. I have a problem: when nn.MSELoss() is used as the loss function and the heatmap output by the network is compared directly with the generated ground-truth heatmap, the loss value becomes abnormally large. How should this be handled?
I found that your code is:
criterion = nn.MSELoss(reduction='sum')
loss = criterion(pred, target)
return loss / (pred.shape[0] * 46.0 * 46.0)
Do you use this loss function to avoid an excessively large loss?
Hi, thank you for asking. The output size is (batch_size, S=3, C=21, 46, 46), because we have 3 stages for keypoints, 21 keypoints, and the heatmap size is 46 * 46. Since I use nn.MSELoss(reduction='sum'), I need to divide by batch_size * 46 * 46 to make the value smaller. You can also use nn.MSELoss(reduction='mean') to avoid the division.
I wrote the loss function this way simply because I want the loss value to reflect the stage number S and channel number C. In detail, the first 3 stages are mask or limb representations, with size (batch_size, S=3, C=1 or 7, 46, 46), and the last 3 stages are keypoint heatmaps, with size (batch_size, S=3, C=21, 46, 46).
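To make the scaling concrete: with reduction='sum', the loss adds squared errors over every tensor element, so it grows with batch size, stages, channels, and spatial size. Dividing by batch_size * 46 * 46 (as in the snippet above) leaves a value that is still proportional to S * C, which is exactly the behavior described. Below is a minimal NumPy sketch that mimics the two nn.MSELoss reductions; the shapes follow the answer, but this is illustrative code, not the repository's actual training loop:

```python
import numpy as np

# Shapes from the answer: (batch, S=3 stages, C=21 keypoints, 46, 46)
B, S, C, H, W = 4, 3, 21, 46, 46
rng = np.random.default_rng(0)
pred = rng.standard_normal((B, S, C, H, W))
target = rng.standard_normal((B, S, C, H, W))

sq_err = (pred - target) ** 2

# reduction='sum': squared error summed over ALL elements -> very large
loss_sum = sq_err.sum()

# Dividing by batch * H * W (as in the issue's snippet) keeps the loss
# proportional to S * C, so it still reflects stage and channel counts
loss_norm = loss_sum / (B * H * W)

# reduction='mean': averages over every element, including S and C
loss_mean = sq_err.mean()

# The normalized sum is exactly the mean scaled back up by S * C
assert np.isclose(loss_norm, loss_mean * S * C)
```

This shows why reduction='mean' "avoids the division": it already normalizes by all dimensions, at the cost of no longer scaling with S and C.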