vllm v0.18.0

pypi

A high-throughput and memory-efficient inference and serving engine for LLMs

View on PyPI →
Version           0.18.0         updated 15d ago
License           Apache-2.0     updated 15d ago
Weekly Downloads  1.5M           updated 15d ago
Dependencies      87             updated 15d ago
Required Runtime  <3.14,>=3.10   updated 15d ago
Deprecated        No             updated 15d ago
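The Required Runtime specifier above (`<3.14,>=3.10`) can be checked against an interpreter version in a few lines. A minimal sketch, assuming only the `>=` and `<` clause forms this particular specifier uses; a real project would use the `packaging` library's `SpecifierSet` instead:

```python
import sys

RUNTIME_SPEC = "<3.14,>=3.10"  # the Required Runtime value shown above

def satisfies(version: tuple, spec: str = RUNTIME_SPEC) -> bool:
    """Check a (major, minor) version tuple against a comma-separated
    specifier, handling only the >= and < clauses this spec contains."""
    for clause in spec.split(","):
        clause = clause.strip()
        if clause.startswith(">="):
            bound = tuple(int(p) for p in clause[2:].split("."))
            if version < bound:
                return False
        elif clause.startswith("<"):
            bound = tuple(int(p) for p in clause[1:].split("."))
            if version >= bound:
                return False
    return True

print(satisfies((3, 12)))            # True: within >=3.10,<3.14
print(satisfies((3, 9)))             # False: below the lower bound
print(satisfies(sys.version_info[:2]))  # whether *this* interpreter qualifies
```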


Release activity

4 releases tracked, averaging one every 7 days. Last release: 15 days ago.

v0.18.0  Mar 20, 2026  (current)
v0.17.1  Mar 11, 2026
v0.17.0  Mar 07, 2026
v0.16.0  Feb 28, 2026

Last changed

Field             Change                  When
weekly_downloads  1443689 → 1497880       15d ago
latest_version    0.17.1 → 0.18.0         15d ago
dep_count         86 → 87                 25d ago
license           (none) → Apache-2.0     35d ago
required_runtime  (none) → <3.14,>=3.10   35d ago
deprecated        (none) → false          35d ago

Use it

REST API
$ curl https://api.grounded-api.dev/v1/pypi/vllm/latest_version
{ "value": "0.18.0" }
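The same endpoint can be called from Python with the standard library. A sketch based only on the URL and response shape shown in the curl example; the assumption that other fields follow the same `{base}/{package}/{field}` pattern is mine, not documented here:

```python
import json
from urllib.request import urlopen

# Base URL taken from the curl example above.
API_BASE = "https://api.grounded-api.dev/v1/pypi"

def field_url(package: str, field: str) -> str:
    """Build a per-field endpoint URL (assumed pattern: base/package/field)."""
    return f"{API_BASE}/{package}/{field}"

def parse_value(body: str) -> str:
    """Extract the "value" key from a body like {"value": "0.18.0"}."""
    return json.loads(body)["value"]

if __name__ == "__main__":
    # Live request; requires network access.
    with urlopen(field_url("vllm", "latest_version")) as resp:
        print(parse_value(resp.read().decode()))
```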
CLI
$ npx grounded-cli vllm
vllm@0.18.0 (pypi)