PySpect

Package profile

flash-attn

  • Summary: Flash Attention: Fast and Memory-Efficient Exact Attention
  • Author: Tri Dao
  • Homepage: https://github.com/Dao-AILab/flash-attention
  • Source: https://github.com/Dao-AILab/flash-attention (Repo profile)
  • Number of releases: 73
  • First release: 0.2.0 on 2022-11-15
  • Latest release: 2.8.1 on 2025-07-10

Releases

[Chart: dates and sizes of releases; x-axis: release date (2023–2025), y-axis: size in MB]

PyPI Downloads

[Chart: weekly downloads over the last 3 months; x-axis: date (February–June), y-axis: thousands of downloads per week]

Dependencies

flash-attn has 2 dependencies, neither of which is optional.

Dependencies of flash-attn (2):

  Dependency   Optional
  einops       false
  torch        false
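
For orientation, a minimal usage sketch follows. It assumes flash-attn 2.x with its flash_attn_func entry point, a CUDA GPU, and half-precision inputs; none of these details come from this profile, so consult the project homepage above for the authoritative API.

  # Minimal sketch, assuming flash-attn 2.x, a CUDA device, and fp16 tensors.
  import torch
  from flash_attn import flash_attn_func

  batch, seqlen, nheads, headdim = 2, 1024, 8, 64
  q = torch.randn(batch, seqlen, nheads, headdim, dtype=torch.float16, device="cuda")
  k = torch.randn_like(q)
  v = torch.randn_like(q)

  # Exact scaled-dot-product attention, computed block-by-block to limit memory use.
  out = flash_attn_func(q, k, v, causal=True)  # shape: (batch, seqlen, nheads, headdim)

Note that torch appears directly in user code, while einops is used internally by the package.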

Details