
Potential PSNR calculation issue due to in-place resize operation of non-contiguous tensors #51

Closed
pengpeng-yu opened this issue Sep 30, 2024 · 1 comment


@pengpeng-yu

Hello,

Thank you for your excellent open-source work!
I’ve encountered a potential issue with the PSNR calculation under newer versions of PyTorch, which I believe is caused by an in-place resize_ operation applied to non-contiguous tensors.

Specifically, notice that slice operations are performed before calling criterion_psnr, creating non-contiguous tensors:

loss_psnr = criterion_psnr(output[:, :, 128:-128, 128:-128], target[:, :, 128:-128, 128:-128])
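
For illustration, a tiny check with made-up shapes (not the repo's actual ones) confirms that such a crop is a non-contiguous view:

import torch

output = torch.rand(1, 31, 482, 512)    # hypothetical N, C, H, W, just large enough for the crop
cropped = output[:, :, 128:-128, 128:-128]
print(cropped.is_contiguous())          # False: the crop is a strided view into the original storage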

Later, in the Loss_PSNR class, the tensors are flattened in place with resize_:

Itrue = im_true.clamp(0., 1.).mul_(data_range).resize_(N, C * H * W)
Ifake = im_fake.clamp(0., 1.).mul_(data_range).resize_(N, C * H * W)

However, torch.Tensor.resize_ reinterprets the underlying storage as C-contiguous, ignoring the current strides, which can lead to unintended data access on non-contiguous inputs. See the PyTorch 1.7.0 documentation for resize_:

Warning
This is a low-level method. The storage is reinterpreted as C-contiguous, ignoring the current strides (unless the target size equals the current size, in which case the tensor is left unchanged). For most purposes, you will instead want to use view(), which checks for contiguity, or reshape(), which copies data if needed. To change the size in-place with custom strides, see set_().

Therefore, the current implementation relies on clamp producing a contiguous output, which does not appear to be guaranteed by PyTorch's conventions.

A quick check (with im_true being one of the non-contiguous crops above) looks like this:

# Passes when the resize_ result differs from reshape's, i.e. the storage was reinterpreted:
assert (im_true.reshape(N, C * H * W) != im_true.clone().resize_(N, C * H * W)).any()
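
To make the divergence concrete, here is a minimal standalone sketch (a small made-up tensor, not the repo's code) whose behavior follows the resize_ warning quoted above:

import torch

x = torch.arange(16.).reshape(1, 1, 4, 4)
crop = x[:, :, 1:3, 1:3]      # non-contiguous view holding [[5., 6.], [9., 10.]]

print(crop.reshape(1, 4))     # tensor([[ 5.,  6.,  9., 10.]]) -- reshape copies the viewed values

crop.resize_(1, 4)            # target size differs, so the storage is reinterpreted as C-contiguous
print(crop)                   # tensor([[5., 6., 7., 8.]]) -- raw storage order from the view's offset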

Maybe we could change resize_ to reshape for robustness?
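
For reference, the suggested fix is a one-word swap in Loss_PSNR (reshape copies when the input is non-contiguous and is a free view otherwise):

Itrue = im_true.clamp(0., 1.).mul_(data_range).reshape(N, C * H * W)
Ifake = im_fake.clamp(0., 1.).mul_(data_range).reshape(N, C * H * W)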

Possibly related issues: #4, #8, #29, #45.

Thanks!

@caiyuanhao1998
Owner

Hi, thanks for reminding us! I have pinned this issue with a label.

If you find our repo useful, please help us by starring it :)
