Could you possibly add Infini-attention, and not this thing, whatever it was supposed to be?

kyegomez / Infini-attention
Implementation of the paper "Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention" from Google, in PyTorch.
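
For reference, the mechanism being requested works roughly as follows: each segment is processed with ordinary causal dot-product attention, while a compressive memory accumulates key-value associations across segments and is read back through the current queries; a learned gate mixes the memory retrieval with the local attention output. The sketch below is a minimal single-head illustration assuming the paper's ELU+1 feature map and linear memory update; the class and argument names (`InfiniAttention`, `memory`, `norm`, `gate`) are illustrative and are not this repository's API.

```python
# Minimal single-head Infini-attention sketch in PyTorch.
# Illustrative only; names and defaults are assumptions, not the repo's interface.
import torch
import torch.nn as nn
import torch.nn.functional as F


class InfiniAttention(nn.Module):
    def __init__(self, dim: int, head_dim: int = 64):
        super().__init__()
        self.q_proj = nn.Linear(dim, head_dim, bias=False)
        self.k_proj = nn.Linear(dim, head_dim, bias=False)
        self.v_proj = nn.Linear(dim, head_dim, bias=False)
        self.out_proj = nn.Linear(head_dim, dim, bias=False)
        # Learned scalar gate that blends memory retrieval with local attention.
        self.gate = nn.Parameter(torch.zeros(1))
        self.head_dim = head_dim

    @staticmethod
    def _feature_map(x):
        # ELU + 1 keeps features positive, as in linear-attention-style memories.
        return F.elu(x) + 1.0

    def forward(self, segment, memory=None, norm=None):
        # segment: (batch, seq_len, dim)
        # memory:  (batch, head_dim, head_dim) compressive memory carried across segments
        # norm:    (batch, head_dim, 1) running normalization term
        b, n, _ = segment.shape
        q = self.q_proj(segment)
        k = self.k_proj(segment)
        v = self.v_proj(segment)

        if memory is None:
            memory = segment.new_zeros(b, self.head_dim, self.head_dim)
            norm = segment.new_zeros(b, self.head_dim, 1)

        # 1) Retrieve from the compressive memory with the current queries.
        sigma_q = self._feature_map(q)                     # (b, n, d)
        a_mem = (sigma_q @ memory) / (sigma_q @ norm + 1e-6)

        # 2) Standard causal dot-product attention within the segment.
        scores = q @ k.transpose(-2, -1) / self.head_dim ** 0.5
        causal = torch.triu(
            torch.ones(n, n, dtype=torch.bool, device=segment.device), diagonal=1
        )
        a_local = scores.masked_fill(causal, float("-inf")).softmax(dim=-1) @ v

        # 3) Blend the two streams with a sigmoid gate.
        g = torch.sigmoid(self.gate)
        out = g * a_mem + (1.0 - g) * a_local

        # 4) Update the memory with this segment's key-value associations.
        sigma_k = self._feature_map(k)                     # (b, n, d)
        new_memory = memory + sigma_k.transpose(-2, -1) @ v
        new_norm = norm + sigma_k.sum(dim=1, keepdim=True).transpose(-2, -1)

        return self.out_proj(out), new_memory, new_norm
```

In use, a long input would be split into segments that are fed sequentially, with the returned `(memory, norm)` pair passed back in as the state for the next segment, which is what gives the bounded-memory "infinite context" behavior the paper describes.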
