
Memory usage when using craft imager-x/generate -f my-superField -t=namedTransform #210

Open
martin-coded opened this issue Apr 27, 2023 · 8 comments
Labels
bug Something isn't working

Comments

@martin-coded

Just for information: when transforming a lot of images with "imager-x/generate", we run out of memory after some time. I am not sure if this is a Docker problem or a general issue. At the moment, memory usage climbs up to 12 GB, which is the maximum defined in our Docker settings. After about 275 images we are out of memory and have to restart the process.

craft imager-x/generate -f my-superField -t=articleHeaderJpeg

@martin-coded
Author

martin-coded commented Apr 28, 2023

If you have a similar problem and just want to get some sleep (or a lot of coffee) for the next several hundred thousand seconds without checking whether the task was killed because it ran out of memory, I present to you the super-bad quick-fix loop ;) This is not a solution! I only use it for the initial generation in a dev environment.

while true
do
    # Run the generate command in the background and wait for it.
    # If it exits non-zero (e.g. killed by the OOM killer), restart it;
    # if it finishes cleanly, leave the loop.
    nohup php craft imager-x/generate -f articleImage -t=articleHeaderAvif &
    pid=$!
    wait $pid || continue
    break
done

@aelvan
Contributor

aelvan commented Apr 28, 2023

I've noticed something similar to this; maybe there's some kind of memory leak related to console commands. I'll have a look.

@aelvan aelvan added the bug Something isn't working label Apr 28, 2023
@aelvan
Contributor

aelvan commented Jul 13, 2023

4.2.0 has a fix that will greatly improve memory usage when using the generate command. Let me know if that helps.

@dgsiegel

dgsiegel commented Mar 6, 2024

@aelvan I am also running into memory issues when generating all assets (4 named transforms with 3–5 sizes each, JPEG and AVIF) of a volume with php craft imager-x/generate -v [myvolume].

After about an hour, the process has consumed about 8 GB of memory and is killed by the kernel:

kernel: php invoked oom-killer: gfp_mask=0x140cca(GFP_HIGHUSER_MOVABLE|__GFP_COMP), order=0, oom_score_adj=0
[...]
kernel: Out of memory: Killed process 323612 (php) total-vm:6616156kB, anon-rss:3270960kB, file-rss:0kB, shmem-rss:0kB, UID:33 pgtables:12520kB oom_score_adj:0

From watching the script run, I can see that each transform adds a few MB of memory that never gets freed, so you inevitably run into some limit at some point.

I'd have two suggestions here:

  • There are definitely some remaining memory leaks; it might make sense to hunt for those.
  • Would it be possible to set a limit on how many transforms are done per run? That way memory would be freed between runs, and you could re-run the script until all transforms have been processed.

@dgsiegel

dgsiegel commented Mar 7, 2024

After some further testing, it turns out that most of our memory problems occur while creating AVIF images. There definitely seems to be a memory leak here. We found two workarounds.

Similar to #210 (comment), we use a loop, but kill the process after a defined time (e.g. 30 minutes) instead of waiting for it to use up all available memory:

#!/bin/sh

# Re-run the generate command until it exits successfully (exit code 0).
# timeout kills it after 30 minutes (exit code 124), which restarts the
# run before memory usage grows out of control.
while timeout 30m php craft imager-x/generate -v [YOUR_VOLUME] ; ret=$? ; [ $ret -ne 0 ]; do
  echo "Restarting generate"
done

The second workaround is to use the customEncoders setting in config/imager-x.php and define an external encoder such as avifenc (https://github.com/AOMediaCodec/libavif) or cavif (https://github.com/kornelski/cavif-rs):

<?php

return [
  [...]
  'customEncoders' => [
    'avif' => [
      'path' => '/usr/bin/cavif',
      'options' => [
        'quality' => 80,
        'speed' => 7,
      ],
      'paramsString' => '--quality {quality} --speed {speed} --overwrite -o {dest} {src}'
    ],
  ],
];

@aelvan
Contributor

aelvan commented Mar 12, 2024

@dgsiegel Could you post more details about your environment (PHP version, image driver, etc.)?

@dgsiegel

dgsiegel commented Mar 12, 2024

> @dgsiegel Could you post more details about your environment (PHP version, image driver, etc.)?

@aelvan sure thing! It's a pretty standard Debian 12 install, with:

  • Imager version & edition: 4.3.1 PRO
  • Imager transformer: craft
  • Craft version: Craft Pro 4.8.1
  • PHP version: 8.2.7
  • Image driver & version: GD 8.2.7
  • Image driver supported formats: jpg, jpeg, gif, png, webp, avif
  • Enabled external storages: no external storage configured
  • Enabled optimizers: no optimizers configured

@aelvan
Contributor

aelvan commented Apr 5, 2024

This is a hard one to confidently resolve, mostly because it's hard to make a reliable test case. Imagick's use of external libraries, and PHP's wonky garbage collection, doesn't help.

Anyway, there might've been a leak in my code related to AVIF specifically. I've swapped out my custom code and instead use the Imagine library directly (when I implemented support for avif and jxl, Imagine didn't support them, but as of Craft 4.4 it does), which should at least increase the chance that this is resolved. I'll do more tests and see if I can find more issues that need to be resolved. If you test this again, let me know if there are any improvements.

I've also added some new features to the generate command; it now has parameters for limit, offset, and queue:

./craft imager-x/generate -v images -t myTransform  --limit=20 --offset=20 --queue

When adding --queue, transforms will not be done at runtime in the console; queue jobs for the transforms will be created instead. This somewhat defeats the point of running a console command, but in cases like this it's more reliable to let a queue handler handle it and have each queue job spin up a separate PHP process for each transform.
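Combined with a small shell wrapper, --limit and --offset also make it possible to process a volume in fixed-size chunks, so each chunk runs in a fresh PHP process and any leaked memory is returned to the OS between chunks. A rough sketch, not an official recipe: the volume/transform names, the chunk sizes, and the GENERATE_CMD override are hypothetical placeholders, not part of the plugin.

```shell
#!/bin/sh
# Sketch: run imager-x/generate in fixed-size chunks via --limit/--offset,
# so every chunk gets its own short-lived PHP process.

# Command to run per chunk; overridable via the environment so the loop
# itself can be dry-run or tested without a Craft install.
GENERATE_CMD="${GENERATE_CMD:-php craft imager-x/generate -v images -t myTransform}"

generate_in_chunks() {
  total=$1   # total number of assets to process (placeholder value)
  step=$2    # chunk size handled per PHP process
  offset=0
  while [ "$offset" -lt "$total" ]; do
    # Intentional word splitting of $GENERATE_CMD into command + arguments.
    $GENERATE_CMD --limit="$step" --offset="$offset" || return 1
    offset=$((offset + step))
  done
}

# Example invocation (adjust 1000/20 to your asset count and memory budget):
# generate_in_chunks 1000 20
```

Each iteration starts a new php process, so memory leaked while transforming one chunk cannot accumulate across the whole run, which is the same effect --queue achieves via individual queue jobs.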
