
Add masked_fill under Tensor #2346

Open · wants to merge 2 commits into base: main
Conversation

@fgsch commented Jul 20, 2024

This unifies the masked_fill implementations under Tensor.

Addresses #2370.
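For readers unfamiliar with the operation being unified here, `masked_fill` replaces the elements of a tensor with a fill value wherever a boolean mask is set, and keeps the original values elsewhere. The following is a minimal plain-Rust sketch of those semantics over slices (it does not use candle's `Tensor` API; the helper name and signature are illustrative only):

```rust
/// Illustrative sketch of masked_fill semantics (not candle's API):
/// wherever `mask` is true, the output takes `fill`; elsewhere it
/// keeps the corresponding value from `values`.
fn masked_fill(values: &[f32], mask: &[bool], fill: f32) -> Vec<f32> {
    assert_eq!(values.len(), mask.len());
    values
        .iter()
        .zip(mask.iter())
        .map(|(&v, &m)| if m { fill } else { v })
        .collect()
}

fn main() {
    // A typical use in transformer models: overwrite masked attention
    // scores with negative infinity before a softmax.
    let scores = [0.5_f32, 1.0, -0.3, 2.0];
    let mask = [false, true, false, true];
    let filled = masked_fill(&scores, &mask, f32::NEG_INFINITY);
    assert_eq!(
        filled,
        vec![0.5, f32::NEG_INFINITY, -0.3, f32::NEG_INFINITY]
    );
    println!("{:?}", filled);
}
```

In the actual PR the same operation would be expressed over candle tensors (e.g. via a broadcasted fill tensor and a conditional select), but the element-wise behavior is as sketched above.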

@LaurentMazare (Collaborator) commented:
I'm not sure we want to have this under Tensor; the goal there is not to replicate all of PyTorch's functions but rather to keep a smaller subset of basic ones. Maybe put it in candle-transformers/src/utils.rs instead.
Also note that the tensor pre-allocation in some of the models was done on purpose: it avoids a sync point on CUDA, which was a performance problem, so we would want to keep the current implementation for those.

@fgsch (Author) commented Aug 2, 2024

@LaurentMazare thanks for the comments. I will update the PR accordingly.
