Package profile
flash-attn
- Summary: Flash Attention: Fast and Memory-Efficient Exact Attention
- Author: Tri Dao
- Homepage: https://github.com/Dao-AILab/flash-attention
- Source: https://github.com/Dao-AILab/flash-attention (Repo profile)
- Number of releases: 73
- First release: 0.2.0 on 2022-11-15
- Latest release: 2.8.1 on 2025-07-10
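
For orientation, a minimal usage sketch of the package's `flash_attn_func` entry point. This is an illustrative example, not part of the profile data; it assumes a CUDA device and a 2.x release, and exact shape/dtype constraints (fp16/bf16, head-dimension limits) vary by version.

```python
# Minimal sketch: calling flash-attn's fused exact-attention kernel.
# Assumes CUDA is available and flash-attn 2.x is installed (pip install flash-attn).
import torch
from flash_attn import flash_attn_func

batch, seqlen, nheads, headdim = 2, 1024, 8, 64

# flash_attn_func expects (batch, seqlen, nheads, headdim) tensors in fp16/bf16 on GPU.
q = torch.randn(batch, seqlen, nheads, headdim, dtype=torch.float16, device="cuda")
k = torch.randn(batch, seqlen, nheads, headdim, dtype=torch.float16, device="cuda")
v = torch.randn(batch, seqlen, nheads, headdim, dtype=torch.float16, device="cuda")

# Exact attention computed without materializing the full seqlen x seqlen score matrix.
out = flash_attn_func(q, k, v, dropout_p=0.0, causal=True)
print(out.shape)  # (batch, seqlen, nheads, headdim)
```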