PySpect

Package profile

flash-attn

  • Summary: Flash Attention: Fast and Memory-Efficient Exact Attention
  • Author: Tri Dao
  • Homepage: https://github.com/Dao-AILab/flash-attention
  • Source: https://github.com/Dao-AILab/flash-attention (Repo profile)
  • Number of releases: 75
  • First release: 0.2.0 on 2022-11-15
  • Latest release: 2.8.3 on 2025-08-15

Releases

[Chart: dates and sizes of releases, 2023–2025; x-axis: release date, y-axis: size in MB]

Dependencies

flash-attn has 2 dependencies, neither of which is optional.
Dependencies of flash-attn (2)

  Dependency   Optional
  einops       no
  torch        no
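
Both dependencies appear in typical use of the package: tensors are created with torch and passed to the attention kernel, while einops is used by the package's internal layer and reshaping utilities. The snippet below is a minimal sketch of calling flash_attn_func as described in the project's upstream documentation; the tensor sizes, dtype, and causal flag are illustrative assumptions, and a CUDA-capable GPU is required.

  # Illustrative sketch only; assumes flash-attn is installed and a CUDA GPU is available.
  import torch
  from flash_attn import flash_attn_func

  batch, seqlen, nheads, headdim = 2, 1024, 8, 64  # hypothetical sizes

  # flash-attn operates on half-precision tensors of shape
  # (batch, seqlen, nheads, headdim) placed on a CUDA device.
  q = torch.randn(batch, seqlen, nheads, headdim, dtype=torch.float16, device="cuda")
  k = torch.randn_like(q)
  v = torch.randn_like(q)

  out = flash_attn_func(q, k, v, causal=True)  # output has the same shape as q
  print(out.shape)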

Details