Nixpkgs Security Tracker

Suggestions search

With package: python312Packages.keras

Found 2 matching suggestions

Untriaged
created 6 hours ago
Arbitrary File Read in Keras via HDF5 External Datasets

Arbitrary file read in the model loading mechanism (HDF5 integration) in Keras versions 3.0.0 through 3.13.1 on all supported platforms allows a remote attacker to read local files and disclose sensitive information via a crafted .keras model file utilizing HDF5 external dataset references.
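The disclosure path described above can be illustrated with a toy loader. This is not Keras or HDF5 code; the file layout, the external_datasets field, and the load_model_data function are all hypothetical stand-ins that only mimic how an external dataset reference lets a crafted model file pull arbitrary local files into its "weights":

```python
import json

def load_model_data(path):
    # Toy stand-in for HDF5 external-dataset resolution: the model file
    # names other files on disk, and the loader reads them without any
    # path validation -- so a crafted model can disclose local files.
    with open(path) as f:
        cfg = json.load(f)
    tensors = {}
    for name, ext_path in cfg.get("external_datasets", {}).items():
        with open(ext_path, "rb") as ext:   # attacker-chosen path
            tensors[name] = ext.read()      # local file contents become "weights"
    return tensors
```

Handing such a loader a model whose external dataset entry points at, say, an SSH private key returns that key's bytes as tensor data, which is the same shape of disclosure the advisory describes for HDF5 external dataset references.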

Affected products

Keras
  • <3.13.1

Matching in nixpkgs

Package maintainers

Published
updated 3 months, 1 week ago by @LeSuisse

Activity log
  • Created automatic suggestion
  • @Erethon dismissed
  • @Erethon marked as untriaged
  • @LeSuisse removed package python312Packages.tf-keras
  • @balsoft added package python312Packages.tf-keras
  • @balsoft dismissed
  • @LeSuisse accepted
  • @LeSuisse removed package python312Packages.tf-keras
  • @LeSuisse published on GitHub
Arbitrary Code Execution in Keras load_model()

The Keras Model.load_model method can be exploited to achieve arbitrary code execution, even with safe_mode=True. A specially crafted .h5/.hdf5 model archive can abuse the Lambda layer feature of Keras, which allows arbitrary Python code in the form of pickled code; loading such an archive via Model.load_model executes that code. The root cause is that the safe_mode=True option is not honored when reading .h5 archives. Note that the .h5/.hdf5 format is a legacy format supported by Keras 3 for backwards compatibility.
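Because safe_mode is not enforced for the legacy format, one hedged mitigation is to refuse .h5/.hdf5 inputs outright before a path ever reaches the model loader. A minimal sketch (the guard functions and error messages below are hypothetical helpers, not Keras API) distinguishes legacy HDF5 files from native .keras archives by their signatures: HDF5 files start with a fixed 8-byte magic, while .keras models are zip archives:

```python
import zipfile

HDF5_MAGIC = b"\x89HDF\r\n\x1a\n"  # signature at the start of every HDF5 file

def is_legacy_hdf5(path):
    # Check the HDF5 superblock signature in the first 8 bytes.
    with open(path, "rb") as f:
        return f.read(8) == HDF5_MAGIC

def checked_model_path(path):
    # Hypothetical guard: only a .keras (zip) archive passes, because
    # safe_mode=True is not honored when reading legacy .h5 archives.
    if is_legacy_hdf5(path):
        raise ValueError("refusing legacy HDF5 model: safe_mode is not enforced")
    if not zipfile.is_zipfile(path):
        raise ValueError("not a .keras (zip) archive")
    return path
```

A caller would run model paths through checked_model_path before loading; this blocks the legacy format entirely rather than trusting safe_mode to cover it.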

Affected products

keras
  • <=3.11.2

Matching in nixpkgs

Package maintainers