add topk / maxk #352
Comments
Those don't strictly appear to be NNlib's area to address, though. We have topk in Metalhead.jl, IIRC.
We could also steal them from ResNetImageNet.jl
It's generally useful. PyTorch and TensorFlow have it, Deep Graph Library and PyTorch Geometric use it internally, and I need it for https://github.com/CarloLucibello/GraphNeuralNetworks.jl. I'll look into those other implementations and eventually port them here.
I can't find topk in Metalhead.jl, nor in ResNetImageNet.jl
https://github.com/FluxML/Metalhead.jl/blob/23df74c7df1bfca2ee66392a29bc3529deab9dbf/src/utils.jl#L73 (these functions should be ported back to master). I agree that these are generally useful, but not a "neural network primitive". I could see it in a more abstract library like Metalhead, or maybe Flux.
I don't think they belong in Metalhead.jl as they are not specific to vision models. The advantage of NNlib is that packages unrelated to Flux can use it. |
I consider it a differentiable transformation, a generalization of maximum, and prefer having it here for wider availability. We also have
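
As a hedged illustration of the "generalization of maximum" point: like maximum, a top-k selection admits a simple pullback that scatters the cotangent back onto the selected entries only. The name topkvals and this rrule are assumptions made for this sketch, not existing NNlib or Flux code.

```julia
using ChainRulesCore

# Illustrative only: values of the k largest entries of a vector.
topkvals(x::AbstractVector, k::Integer) = x[partialsortperm(x, 1:k; rev=true)]

# Sketch of a reverse rule: the gradient is nonzero only at the selected
# indices, exactly as for `maximum` (which is the k = 1 case).
function ChainRulesCore.rrule(::typeof(topkvals), x::AbstractVector, k::Integer)
    inds = partialsortperm(x, 1:k; rev=true)
    function topkvals_pullback(ȳ)
        x̄ = zero(x)
        x̄[inds] .= unthunk(ȳ)   # scatter cotangent to the selected positions
        return NoTangent(), x̄, NoTangent()
    end
    return x[inds], topkvals_pullback
end
```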
We should add what is called elsewhere topk or maxk, e.g. https://pytorch.org/docs/stable/generated/torch.topk.html. In Base we have partialsort and partialsortperm, which do something similar but are limited to 1d arrays.