
Get down and dirty with FlashAttention 2.0 in PyTorch: plug and play, no complex CUDA kernels required.
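The repository's own module API is not shown here, so as a minimal sketch of the same plug-and-play idea, the example below uses PyTorch's built-in `torch.nn.functional.scaled_dot_product_attention` (available in PyTorch 2.0+), which dispatches to a FlashAttention kernel on supported GPUs without any custom CUDA code. Tensor shapes and the causal setting are illustrative assumptions, not values taken from this project.

```python
# Sketch: memory-efficient attention in plain PyTorch, no hand-written CUDA kernels.
# NOTE: this demonstrates PyTorch's built-in SDPA path, not this repo's specific API.
import torch
import torch.nn.functional as F

device = "cuda" if torch.cuda.is_available() else "cpu"
dtype = torch.float16 if device == "cuda" else torch.float32

# Shapes: (batch, num_heads, seq_len, head_dim) -- assumed example sizes.
q = torch.randn(2, 8, 1024, 64, device=device, dtype=dtype)
k = torch.randn(2, 8, 1024, 64, device=device, dtype=dtype)
v = torch.randn(2, 8, 1024, 64, device=device, dtype=dtype)

# On CUDA with fp16/bf16 inputs, this call can use the FlashAttention backend.
out = F.scaled_dot_product_attention(q, k, v, is_causal=True)
print(out.shape)  # torch.Size([2, 8, 1024, 64])
```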


Creator: kyegomez
Stars: 98
License: MIT License
Repository: GitHub