Greetings,
I recently experienced an OOM crash on my server when trying to process a 7 MB, 13,000-row Excel spreadsheet. It brought the whole server down! I've been trying to debug the issue locally, and I discovered that just loading the spreadsheet balloons the PHP process to 2.4G of resident memory, even though PHP itself reports a real peak memory usage (memory_get_peak_usage(true)) of just 38M. Is this the normal/expected behaviour? Is there anything I can do to reduce the memory usage? I'm using a read filter to limit the number of rows read at a time, but it doesn't make any difference.
My simple test script:
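A minimal sketch of the kind of loader described, assuming a hypothetical `test.xlsx` and a 1,000-row chunk size:

```php
<?php
// Sketch only: load an xlsx in chunks using a read filter.
// File name, chunk size, and row count are assumptions.
require 'vendor/autoload.php';

use PhpOffice\PhpSpreadsheet\IOFactory;

$inputFile = 'test.xlsx'; // hypothetical file name
$chunkSize = 1000;        // hypothetical chunk size

$reader = IOFactory::createReader('Xlsx');
$reader->setReadDataOnly(true);

$filter = new ChunkReadFilter();
$reader->setReadFilter($filter);

for ($startRow = 2; $startRow <= 13000; $startRow += $chunkSize) {
    $filter->setRows($startRow, $chunkSize);
    // load() re-parses the entire file on every call, even though
    // the filter discards most of the rows.
    $spreadsheet = $reader->load($inputFile);
    printf("Rows %d+: peak %.1f MB\n", $startRow, memory_get_peak_usage(true) / 1048576);
    $spreadsheet->disconnectWorksheets();
    unset($spreadsheet);
}
```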
ChunkReadFilter:
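A sketch following the chunked-reading example in the PhpSpreadsheet documentation, which a filter like this presumably mirrors:

```php
<?php
use PhpOffice\PhpSpreadsheet\Reader\IReadFilter;

// Only rows in the current chunk (plus the header row) pass the filter.
class ChunkReadFilter implements IReadFilter
{
    private int $startRow = 0;
    private int $endRow = 0;

    public function setRows(int $startRow, int $chunkSize): void
    {
        $this->startRow = $startRow;
        $this->endRow = $startRow + $chunkSize;
    }

    public function readCell(string $columnAddress, int $row, string $worksheetName = ''): bool
    {
        // Keep row 1 (headings) and any row inside the current chunk.
        return $row === 1 || ($row >= $this->startRow && $row < $this->endRow);
    }
}
```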
Output:
As an experiment, I opened the file in LibreOffice and re-saved it as an xlsx in case it was corrupted. The file size dropped to 2.3M, and loading this version with PHPSpreadsheet used only 1G of RAM. Still too high, but why was it so much lower for the exact same data?
Any insights would be appreciated.
FYI, the server has 4G of RAM, and memory_limit is set to 1G in php.ini, so I'm not sure why it crashed.
PHPSpreadsheet 4.3.1
PHP 8.4.8 (local), 8.3.7 (server)
Thank you!
Replies: 1 comment

My issue seems to be the same as SpartnerNL/Laravel-Excel#2793, where SimpleXML is the cause. Apparently PHP's memory accounting doesn't take SimpleXML's usage into consideration, which explains the huge discrepancy and why memory_limit never killed the process on the server. I'm afraid this issue may be a fundamental flaw in this library, short of rewriting it to use XMLReader. Would love to know if anyone else has come up with a solution to this.
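For reference, a minimal sketch of that streaming approach, assuming ext-zip's zip:// stream wrapper and the usual xl/worksheets/sheet1.xml part inside the workbook:

```php
<?php
// Sketch only: count worksheet rows with XMLReader's pull parser,
// which holds a single node in memory rather than a full SimpleXML tree.
$path = 'zip://test.xlsx#xl/worksheets/sheet1.xml'; // assumed paths

$reader = new XMLReader();
if (!$reader->open($path)) {
    exit("Unable to open worksheet XML\n");
}

$rows = 0;
while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->localName === 'row') {
        $rows++;
        // $reader->expand() would yield a small DOM node for just this <row>
        // if the cell values were needed.
    }
}
$reader->close();
echo "Counted $rows rows\n";
```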