Package profile
vllm
- Summary: A high-throughput and memory-efficient inference and serving engine for LLMs
- Author: vLLM Team
- Homepage: https://github.com/vllm-project/vllm
- Source: https://github.com/vllm-project/vllm
- Number of releases: 65
- First release: 0.0.1 on 2023-06-19
- Latest release: 0.10.1.1 on 2025-08-20
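
For orientation, here is a minimal offline-inference sketch using the package's `LLM` and `SamplingParams` classes. The model name `facebook/opt-125m` and the sampling settings are illustrative assumptions, not details taken from this profile.

```python
# Minimal offline-inference sketch with vLLM.
# Assumes a GPU-capable environment with vllm installed (pip install vllm).
from vllm import LLM, SamplingParams

prompts = ["The capital of France is"]
sampling_params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=32)

# Illustrative model choice; weights are downloaded on first use.
llm = LLM(model="facebook/opt-125m")

outputs = llm.generate(prompts, sampling_params)
for output in outputs:
    # Each result carries the prompt and one or more generated completions.
    print(output.prompt, output.outputs[0].text)
```

Beyond offline batch generation, recent releases also ship an OpenAI-compatible serving entrypoint (e.g. `vllm serve <model>`), which is the "serving engine" side of the summary above.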