
Add Finch.jl backend #171

Open
MasonProtter opened this issue Apr 28, 2023 · 4 comments
Labels
enhancement New feature or request

Comments

@MasonProtter

cc @willow-ahrens, Finch.jl is a tensor compiler for sparse multidimensional arrays, so it'd be cool if we could bring it under the Tullio.jl umbrella!

@mcabbott added the enhancement (New feature or request) label Apr 28, 2023
@willow-ahrens

Yes, absolutely! A few questions/comments.

  1. From my initial read, the Tullio macro expresses a (possibly compound) pointwise expression, to be eventually surrounded by nested loops. Is that correct?
  2. Is there an intermediate representation I can use after parsing but before lowering? Do most Tullio backends just use the input syntax directly?
  3. Some of the modifications to indices (e.g. mod) don't currently have Finch analogues, so that will have to wait until we implement those in Finch.
  4. We may want some sparsity-specific preprocessing to make this easier to use. For example, not all sparse formats support efficient random access, so we may want to automatically transpose and reformat some inputs before running the kernel; an example of this is described here.

@mcabbott
Owner

Sorry, I thought I had replied to this, but I must not have pasted what I wrote. Maybe this indicates how active this project is -- I aim to keep it working, but don't have time for much re-thinking.

From my initial read, the Tullio macro expresses a (possibly compound) pointwise expression, to be eventually surrounded by nested loops. Is that correct?

Yes, that's exactly right. In particular, this means it deviates from Einstein notation in that something like z[i] := x[i,j]*x[i,j] + y[i] produces a single loop nest, with y inside the sum.
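
To spell that out, here is a hand-written sketch of the loop nest that expression corresponds to (illustrative only -- the actual generated code also handles index-range inference, bounds checks, and threading):

```julia
using Tullio

x = rand(3, 4); y = rand(3)

# The macro call:
@tullio z[i] := x[i,j] * x[i,j] + y[i]

# corresponds to a single loop nest, with y[i] inside the sum over j:
z2 = zeros(3)
for i in axes(x, 1), j in axes(x, 2)
    z2[i] += x[i, j] * x[i, j] + y[i]   # y[i] is added once per value of j
end
z ≈ z2  # true
```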

Is there an intermediate representation I can use after parsing but before lowering? Do most Tullio backends just use the input syntax directly?

Unfortunately this is not very modular. The three backends are Base, LoopVectorization, and KernelAbstractions. Different methods of the act! function are generated for each, all at once. Everything that is parsed/resolved ends up in store::DotDict, which is used to write the code; @tullio verbose=2 prints it out. Code for the gradients is generated at the same time, by separate code.
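
For example, something like this prints everything Tullio has parsed (the store contents and the code generated from it), which is the closest thing there is to an inspectable intermediate representation:

```julia
using Tullio

x = rand(3, 4); y = rand(3)

# verbose=2 prints the parsed `store` and the generated act! methods
@tullio verbose=2 z[i] := x[i,j] * x[i,j] + y[i]
```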

Adding a 4th backend which similarly builds loops with @finch in front would probably not be very hard. storage_type would have to recognise Finch's sparse array types, to dispatch to them.
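
To make the target concrete, here is a rough hand-written sketch of what such a kernel could look like; the tensor format constructor and loop order below are assumptions (a column-major Dense-over-SparseList tensor), not what the backend would actually emit:

```julia
using Finch, SparseArrays

# Hand-written sketch of a Finch kernel for z[i] := x[i,j]*x[i,j] + y[i].
# The Tensor format is an assumption; the point is the shape of the
# @finch loop nest that a fourth backend would need to generate.
x = Tensor(Dense(SparseList(Element(0.0))), sprand(3, 4, 0.5))
y = rand(3)
z = zeros(3)

@finch begin
    z .= 0
    # j is the outer loop so access to the column-major sparse format is concordant
    for j = _, i = _
        z[i] += x[i, j] * x[i, j] + y[i]
    end
end
```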

@willow-ahrens

Adding a 4th backend which similarly builds loops with @finch in front would probably not be very hard. storage_type would have to recognise Finch's sparse array types, to dispatch to them.

I agree, especially for the simpler expressions. Finch has similar features to storage_type in https://github.com/willow-ahrens/Finch.jl/blob/1b0539cc04556d7a92871d5af9c2221cecc099e1/src/base/broadcast.jl#L43 and similar features to https://github.com/mcabbott/Tullio.jl#notation in https://github.com/willow-ahrens/Finch.jl/blob/1b0539cc04556d7a92871d5af9c2221cecc099e1/src/transforms/wrapperize.jl#L10. The challenge will be implementing features where the semantics of the two systems may disagree. I won't have time to work on this until January, but perhaps we can start with a simple prototype that implements most of the features, and add the remaining ones later.

@willow-ahrens

We're slowly approaching the point in Finch's development where this issue can begin to move forward. We currently implement an @einsum macro (with opportunities for fused execution with other array operations):

finch-tensor/Finch.jl#454
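
For context, a sketch of what that looks like (syntax as of the linked PR; details may differ):

```julia
using Finch

# fsprand creates random sparse Finch tensors
A = fsprand(100, 100, 0.05)
B = fsprand(100, 100, 0.05)

# k does not appear on the left-hand side, so it is reduced over;
# += selects + as the reduction operator.
C = @einsum C[i, j] += A[i, k] * B[k, j]
```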
