Nixpkgs Security Tracker


Suggestions search

With package: vllm

Found 2 matching suggestions

updated 2 days, 21 hours ago by @jopejoe1 Activity log
  • Created automatic suggestion
  • @jopejoe1 accepted
  • @jopejoe1 published on GitHub
vLLM leaks a heap address when PIL throws an error

vLLM is an inference and serving engine for large language models (LLMs). Starting in version 0.8.3 and prior to version 0.14.1, when an invalid image is sent to vLLM's multimodal endpoint, PIL throws an error. vLLM returns this error to the client, leaking a heap address. With this leak, an attacker reduces the ASLR search space from roughly 4 billion guesses to about 8. This vulnerability can be chained with a heap overflow in the JPEG2000 decoder in OpenCV/FFmpeg to achieve remote code execution. This vulnerability is fixed in 0.14.1.
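To illustrate why a single leaked pointer is so damaging, here is a minimal sketch (not vLLM code) of the arithmetic behind the advisory's "4 billion guesses to ~8 guesses" claim. The pointer value, offsets, and the assumption that the leak sits at one of a handful of fixed allocator offsets from the heap base are all hypothetical.

```python
PAGE = 0x1000  # 4 KiB pages: the low 12 bits of the heap base are always zero

def candidate_bases(leaked_ptr, possible_offsets):
    """Given a leaked heap pointer and the small set of offsets it may
    have from the heap base, return the candidate base addresses."""
    return sorted({(leaked_ptr - off) & ~(PAGE - 1) for off in possible_offsets})

# Without a leak: the attacker must brute-force the randomized bits.
blind_guesses = 2**32  # ~4 billion, matching the advisory's figure

# With a leak: only the plausible allocator offsets remain to try.
leak = 0x7F3A_1C45_9A60  # hypothetical leaked pointer from the PIL error
offsets = [0x260, 0x1260, 0x2260, 0x3260, 0x4260, 0x5260, 0x6260, 0x7260]
informed_guesses = len(candidate_bases(leak, offsets))  # 8 candidates
```

Each candidate base is one targeted attempt instead of a blind search, which is what makes the leak a practical first stage for the OpenCV/FFmpeg overflow chain.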

Affected products

vllm
  • >= 0.8.3, < 0.14.1

Matching in nixpkgs

Package maintainers

Upstream fix: https://github.com/vllm-project/vllm/releases/tag/v0.14.1
Upstream advisory: https://github.com/vllm-project/vllm/security/advisories/GHSA-4r2x-xpjr-7cvv

Unstable fix: https://github.com/NixOS/nixpkgs/pull/483505
updated 2 weeks, 6 days ago by @LeSuisse Activity log
  • Created automatic suggestion
  • @LeSuisse removed
    4 packages
    • pkgsRocm.vllm
    • python312Packages.vllm
    • python313Packages.vllm
    • pkgsRocm.python3Packages.vllm
  • @LeSuisse accepted
  • @LeSuisse published on GitHub
vLLM affected by RCE via auto_map dynamic module loading during model initialization

vLLM is an inference and serving engine for large language models (LLMs). Starting in version 0.10.1 and prior to version 0.14.0, vLLM loads Hugging Face `auto_map` dynamic modules during model resolution without gating on `trust_remote_code`, allowing attacker-controlled Python code in a model repo/path to execute at server startup. An attacker who can influence the model repo/path (local directory or remote Hugging Face repo) can achieve arbitrary code execution on the vLLM host during model load. This happens before any request handling and does not require API access. Version 0.14.0 fixes the issue.
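The fix boils down to a gating pattern: repo-provided `auto_map` modules must never be imported unless the operator explicitly opted in. A minimal sketch of that pattern (not the actual vLLM patch; `resolve_model_class` and the returned strings are illustrative):

```python
def resolve_model_class(config: dict, trust_remote_code: bool) -> str:
    """Pick the implementation class for a model config.

    `auto_map` entries point at Python files shipped inside the model
    repo; importing them runs attacker-controlled code, so they must
    be gated behind an explicit opt-in before any import happens.
    """
    if "auto_map" in config:
        if not trust_remote_code:
            raise ValueError(
                "Model config requests dynamic code via auto_map; "
                "pass trust_remote_code=True to allow it."
            )
        # Only past this gate would the repo's custom module be imported.
        return f"dynamic:{config['auto_map']['AutoModel']}"
    # Otherwise fall back to a built-in, audited implementation.
    return f"builtin:{config.get('architectures', ['unknown'])[0]}"
```

The key property is that the check happens during model resolution, before any dynamic import, since the vulnerable code path executed at server startup with no request handling involved.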

Affected products

vllm
  • >= 0.10.1, < 0.14.0

Matching in nixpkgs

Package maintainers

Upstream advisory: https://github.com/vllm-project/vllm/security/advisories/GHSA-2pc9-4j83-qjmr
Upstream fix: https://github.com/vllm-project/vllm/commit/78d13ea9de4b1ce5e4d8a5af9738fea71fb024e5