Get down and dirty with FlashAttention 2.0 in PyTorch: plug and play, no complex CUDA kernels required.
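
As a rough illustration of the "plug and play" usage the description refers to, the sketch below calls PyTorch's built-in `torch.nn.functional.scaled_dot_product_attention`, which dispatches to a fused FlashAttention kernel on supported GPUs. This is an assumption for demonstration only and is not this repository's own API, whose import path and class names are not shown in this snippet.

```python
import torch
import torch.nn.functional as F

# Sketch only: PyTorch >= 2.0's scaled_dot_product_attention routes to a
# fused FlashAttention kernel on supported GPUs, so no hand-written CUDA
# is needed. This is NOT this repository's API; it just illustrates the
# plug-and-play attention call the tagline describes.

device = "cuda" if torch.cuda.is_available() else "cpu"
dtype = torch.float16 if device == "cuda" else torch.float32

batch, heads, seq_len, head_dim = 2, 8, 1024, 64
q = torch.randn(batch, heads, seq_len, head_dim, device=device, dtype=dtype)
k = torch.randn(batch, heads, seq_len, head_dim, device=device, dtype=dtype)
v = torch.randn(batch, heads, seq_len, head_dim, device=device, dtype=dtype)

# Causal self-attention in a single call; the FlashAttention backend is
# selected automatically when shapes, dtype, and hardware allow it.
out = F.scaled_dot_product_attention(q, k, v, is_causal=True)
print(out.shape)  # torch.Size([2, 8, 1024, 64])
```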


Creator: kyegomez
Stars: 90
License: MIT License
Repository: GitHub