In the current implementation of `binary_cross_entropy_with_logit` the loss will actually be NaN due to taking `log(0)`, which occurs when a high logit saturates the sigmoid to 1.0 and the affine transformation then maps it to 0.0:

```
inp.affine(-1., 1.)?.log()?
 ^        ^            ^
 |        |            |
1.0       |            |
         0.0           |
                      NaN
```

The proposed implementation is taken more or less directly from PyTorch:
https://github.com/pytorch/pytorch/blob/41977a05314bbf537e1c5d6cf5916a368d1907d9/aten/src/ATen/native/Loss.cpp#L362
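As a self-contained illustration of both the failure mode and the fix, here is a minimal sketch with plain `f64` scalars and hypothetical helper names (not candle tensors or the actual patch), contrasting the naive formulation with the standard max-based rewrite that the linked PyTorch code implements, `max(x, 0) - x*z + ln(1 + exp(-|x|))`:

```rust
/// Naive BCE-with-logits: sigmoid(100.0) rounds to exactly 1.0 in f64,
/// so (1 - sigma).ln() is ln(0) = -inf, and (1 - z) * -inf with z = 1
/// evaluates to 0 * -inf = NaN.
fn bce_naive(x: f64, z: f64) -> f64 {
    let sigma = 1.0 / (1.0 + (-x).exp());
    -(z * sigma.ln() + (1.0 - z) * (1.0 - sigma).ln())
}

/// Stable rewrite: max(x, 0) - x*z + ln(1 + exp(-|x|)).
/// exp(-|x|) never overflows, and ln_1p stays accurate near zero.
fn bce_stable(x: f64, z: f64) -> f64 {
    x.max(0.0) - x * z + (-x.abs()).exp().ln_1p()
}

fn main() {
    let (x, z) = (100.0, 1.0); // high logit, positive target
    println!("naive:  {}", bce_naive(x, z));  // NaN
    println!("stable: {}", bce_stable(x, z)); // ~3.7e-44
}
```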