Additional context

When we create, e.g., 1000 PDFs with php artisan queue:work --queue=export **--max-jobs=1** --memory=4000 --timeout=3600 step by step (one worker process per job), it works fine. Without --max-jobs=1 the issue appears. The same issue occurs when we process the jobs with Horizon using a job chunk size of 1000. The export jobs are dispatched like this:
$jobs = [];

foreach ($this->invoices()->chunk(1000) as $invoiceIds) {
    $jobs[] = new ExportInvoicesChunkJob($this->shop, $invoiceIds->values());
}

if (empty($jobs)) {
    logger("no invoices for export available");
    return;
}

// zips the files and writes the export entry to the database
$jobs[] = new ExportInvoicesFinishJob($this->shop, $this->params['range']);

Bus::chain($jobs)->catch(function (Throwable $e) {
    logger($e->getMessage());
})->onQueue('export')->dispatch();
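For reference, a minimal sketch of what the per-chunk job's handle() method might look like. The view name exports.invoice, the Invoice model, and the storage path are assumptions for illustration, not taken from this report, and the facade shown is barryvdh/laravel-dompdf's Pdf facade:

```php
use Barryvdh\DomPDF\Facade\Pdf;
use Illuminate\Support\Facades\Storage;

class ExportInvoicesChunkJob implements \Illuminate\Contracts\Queue\ShouldQueue
{
    public function __construct(
        private Shop $shop,
        private \Illuminate\Support\Collection $invoiceIds,
    ) {}

    public function handle(): void
    {
        foreach ($this->invoiceIds as $invoiceId) {
            // Every iteration renders a Blade view through dompdf in the same
            // worker process; with --max-jobs=1 each chunk gets a fresh process.
            $pdf = Pdf::loadView('exports.invoice', [
                'invoice' => Invoice::findOrFail($invoiceId), // hypothetical model
            ]);
            Storage::put("exports/{$this->shop->id}/{$invoiceId}.pdf", $pdf->output());
        }
    }
}
```

The relevant point is that without --max-jobs=1, every loadView() call runs in the same long-lived PHP process, so any state that dompdf or the view layer accumulates survives across all renders.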
I guess it has something to do with Laravel's view/caching system. Maybe the cache, but I already tried dropping the cache after every 100 PDFs.
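"Dropping the cache after every 100 PDFs" could look like the following inside the render loop. Artisan::call('view:clear') deletes the compiled Blade templates; the loop body is a hypothetical sketch (model, view name, and paths are assumptions). Since this did not resolve the overlap in our case, the accumulating state may live somewhere other than the compiled views:

```php
use Illuminate\Support\Facades\Artisan;

$rendered = 0;
foreach ($this->invoiceIds as $invoiceId) {
    $pdf = Pdf::loadView('exports.invoice', ['invoice' => Invoice::findOrFail($invoiceId)]);
    Storage::put("exports/{$invoiceId}.pdf", $pdf->output());

    // Periodically drop the compiled Blade templates (storage/framework/views).
    if (++$rendered % 100 === 0) {
        Artisan::call('view:clear');
    }
}
```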
We appreciate any help 🙏
Hi @bsweeney, thank you very much for that interesting hint! Do you have any idea how to change it for a test? My thought was to move the method prop
Describe the bug
As soon as we generate more than 1000 PDFs using PDF::loadView, the PDF layout randomly starts breaking (elements begin to overlap).
To Reproduce
Steps to reproduce the behavior:
Expected behavior (works for, e.g., the first 1000 PDFs)
![Image](https://private-user-images.githubusercontent.com/2176503/411169909-d0db26fd-58d0-4ebd-952e-bd29d2f5aca8.png?jwt=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3MzkxODkxMzAsIm5iZiI6MTczOTE4ODgzMCwicGF0aCI6Ii8yMTc2NTAzLzQxMTE2OTkwOS1kMGRiMjZmZC01OGQwLTRlYmQtOTUyZS1iZDI5ZDJmNWFjYTgucG5nP1gtQW16LUFsZ29yaXRobT1BV1M0LUhNQUMtU0hBMjU2JlgtQW16LUNyZWRlbnRpYWw9QUtJQVZDT0RZTFNBNTNQUUs0WkElMkYyMDI1MDIxMCUyRnVzLWVhc3QtMSUyRnMzJTJGYXdzNF9yZXF1ZXN0JlgtQW16LURhdGU9MjAyNTAyMTBUMTIwMDMwWiZYLUFtei1FeHBpcmVzPTMwMCZYLUFtei1TaWduYXR1cmU9Mjk1ZjY5ZjIxZjQ1NGUzNzk4YjIwZDAwZjJkNGNmOTA4NTIwNDNmOGMzODFlMDhmZGJlYWVkMmMyNzIyMzA0ZiZYLUFtei1TaWduZWRIZWFkZXJzPWhvc3QifQ.pb_eRaCbnJGXgZHXLOID1z5Zb25rMg_kCPAkU77owLU)
Issue
![Image](https://private-user-images.githubusercontent.com/2176503/411171168-cf8ba179-c88e-4dfa-913d-bdaacc7cc1a2.png?jwt=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3MzkxODkxMzAsIm5iZiI6MTczOTE4ODgzMCwicGF0aCI6Ii8yMTc2NTAzLzQxMTE3MTE2OC1jZjhiYTE3OS1jODhlLTRkZmEtOTEzZC1iZGFhY2M3Y2MxYTIucG5nP1gtQW16LUFsZ29yaXRobT1BV1M0LUhNQUMtU0hBMjU2JlgtQW16LUNyZWRlbnRpYWw9QUtJQVZDT0RZTFNBNTNQUUs0WkElMkYyMDI1MDIxMCUyRnVzLWVhc3QtMSUyRnMzJTJGYXdzNF9yZXF1ZXN0JlgtQW16LURhdGU9MjAyNTAyMTBUMTIwMDMwWiZYLUFtei1FeHBpcmVzPTMwMCZYLUFtei1TaWduYXR1cmU9NDk5OTdiNzRkNjE4ODAwMDRhNGQ1MzhkMjZiY2NlNzJiNzYwZWNhMjQzNGQwNjUyYTY0Y2U3YzZkZmU3ZTg2NiZYLUFtei1TaWduZWRIZWFkZXJzPWhvc3QifQ.nR6kUj9aPEmsINx2b8dcblCz491NsUPKGBtAXIcEBiU)