When I measure MACs with thop, the reported MACs for `torch.matmul` are 0, even though it is one of the heaviest operations in transformer self-attention:

```python
class M(nn.Module):
    def __init__(self):
        super().__init__()

    def forward(self, x):
        out = torch.matmul(x, x.transpose(-1, -2))
        print(out.shape)
        return out
```
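A likely cause: thop counts operations via forward hooks registered on `nn.Module` instances, so functional calls such as `torch.matmul` made inside `forward` are invisible to it unless you supply a custom counter (e.g. via thop's `custom_ops` argument on a small wrapper module). As a sanity check, the expected MAC count of a batched matmul can be computed by hand from the shapes alone. The sketch below is a minimal, hedged illustration (the helper `matmul_macs` is my own name, not part of thop):

```python
def matmul_macs(a_shape, b_shape):
    """MACs for a (batched) matmul of shapes (..., M, K) x (..., K, N).

    Each of the M*N output elements needs K multiply-accumulates,
    and the count scales with the product of the batch dimensions.
    """
    *batch, m, k = a_shape
    k2, n = b_shape[-2], b_shape[-1]
    assert k == k2, "inner dimensions must match"
    total = m * n * k
    for d in batch:
        total *= d
    return total

# e.g. attention scores Q @ K^T with Q of shape (B, H, N, d):
# (2, 8, 128, 64) x (2, 8, 64, 128) -> 2 * 8 * 128 * 128 * 64 MACs
print(matmul_macs((2, 8, 128, 64), (2, 8, 64, 128)))
```

Comparing this hand count against thop's output for the module above makes the discrepancy obvious: the self-attention score matmul contributes millions of MACs that thop silently skips.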