Bug #83451
DatabaseRowsUpdateWizard Out of Memory on big tables (Closed)
Description
The "database migrations on single rows" wizard selects complete tables, which leads to memory exhaustion on large tables (1.6 million rows on my system) because all rows get selected. Normally the rows should only be read from the database while one row is processed after the other (as we don't use something like fetchAll), but there are PHP issues around result buffering in the MySQLi and pdo_mysql drivers.
See https://secure.php.net/manual/en/mysqlinfo.concepts.buffering.php about this (and for the MySQLi specifics see https://github.com/doctrine/dbal/blob/master/lib/Doctrine/DBAL/Driver/Mysqli/MysqliStatement.php#L192).
So we should move from selecting everything to selecting only 1,000 (or 10,000) rows at once. This will be slower but avoids this OoM issue.
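A minimal sketch of what such chunked reading could look like (illustrative only, not the actual TYPO3 implementation; table and column names are made up for the demo). It uses keyset pagination on the `uid` column instead of OFFSET, so each chunk stays fast even on large tables, and an in-memory SQLite database so the example is self-contained:

```php
<?php
// Sketch: process a large table in fixed-size chunks instead of one big SELECT.
// Keyset pagination on `uid` avoids buffering the full result set in PHP and
// also avoids slow large OFFSETs. In-memory SQLite stands in for MySQL here.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE demo (uid INTEGER PRIMARY KEY, title TEXT)');
$insert = $pdo->prepare('INSERT INTO demo (title) VALUES (?)');
for ($i = 1; $i <= 25; $i++) {
    $insert->execute(['row ' . $i]);
}

$chunkSize = 10;
$lastUid = 0;
$processed = 0;
while (true) {
    $stmt = $pdo->prepare(
        'SELECT uid, title FROM demo WHERE uid > ? ORDER BY uid LIMIT ' . $chunkSize
    );
    $stmt->execute([$lastUid]);
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
    if ($rows === []) {
        break; // all chunks consumed
    }
    foreach ($rows as $row) {
        // ... run the row updaters here; issuing further queries is safe,
        // because the current chunk is already fully buffered in PHP memory ...
        $processed++;
        $lastUid = (int)$row['uid'];
    }
}
echo $processed, "\n"; // 25
```

Memory usage is then bounded by the chunk size, not the table size, and queries issued while iterating (the problem noted below with unbuffered reads) remain possible because each chunk is fetched completely before processing starts.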
Updated by Alexander Opitz almost 7 years ago
Shortest complete answer: https://stackoverflow.com/questions/13728106/unexpectedly-hitting-php-memory-limit-with-a-single-pdo-query
But we can't switch from buffered to unbuffered read, as we do queries while iterating over the rows.
Updated by Christian Kuhn about 6 years ago
- Status changed from New to Needs Feedback
unsure on how to proceed. any ideas? note: in v9, the row updater can be called via cli, this may relax the issue a bit.
Updated by Christian Kuhn about 6 years ago
- Related to Task #86200: Migrate RowUpdater wizards added
Updated by Kevin Ditscheid over 5 years ago
Just stumbled into this while updating a huge instance.
If I set $connectionForTable->getWrappedConnection()->setAttribute(\PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false); with the pdo_mysql driver, I can step over the statement execution, but the open unbuffered query does not allow another query on the same connection:
```
An exception occurred while executing 'SELECT COUNT(*) FROM `sys_registry` WHERE (`entry_namespace` = ?) AND (`entry_key` = ?)' with params ["installUpdateRows", "rowUpdatePosition"]: SQLSTATE[HY000]: General error: 2014 Cannot execute queries while other unbuffered queries are active. Consider using PDOStatement::fetchAll(). Alternatively, if your code is only ever going to run against mysql, you may enable query buffering by setting the PDO::MYSQL_ATTR_USE_BUFFERED_QUERY attribute.
```
I am unsure how to solve this. The table sys_file_metadata alone has more than 1.5 million entries.
One idea to solve this could be to fetch only the uids beforehand and then fetch each whole row inside the for-loop.
This would however cause a massive increase in database queries: 1.5 million for sys_file_metadata alone in my case.
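The uid-first idea could look roughly like this (a sketch with hypothetical table names, again using in-memory SQLite so it runs standalone). Only integers get buffered up front, so memory stays low, at the cost of one extra query per row; the same connection and prepared statement can be reused, so it is extra queries rather than extra connections:

```php
<?php
// Sketch of the uid-first idea: select only the (small) uid list up front,
// then fetch each full row on demand inside the loop.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE demo (uid INTEGER PRIMARY KEY, payload TEXT)');
$insert = $pdo->prepare('INSERT INTO demo (payload) VALUES (?)');
for ($i = 1; $i <= 5; $i++) {
    $insert->execute([str_repeat('x', 10)]);
}

// 1. Buffer only the uids (cheap even for millions of rows).
$uids = $pdo->query('SELECT uid FROM demo ORDER BY uid')->fetchAll(PDO::FETCH_COLUMN);

// 2. Fetch and process one full row at a time, reusing one prepared statement.
$rowStmt = $pdo->prepare('SELECT * FROM demo WHERE uid = ?');
$count = 0;
foreach ($uids as $uid) {
    $rowStmt->execute([$uid]);
    $row = $rowStmt->fetch(PDO::FETCH_ASSOC);
    // ... apply row updaters to $row ...
    $count++;
}
echo $count, "\n"; // 5
```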
Updated by Benni Mack over 4 years ago
- Status changed from Needs Feedback to Closed
We've made this wizard repeatable so it works in chunks. This should go into the next v9 version. If you still encounter issues, let me know and I will re-open the ticket.
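For illustration, a repeatable chunked run can persist its position between invocations, similar in spirit to the `rowUpdatePosition` registry entry visible in the error message above. All names here are hypothetical sketch code, not the actual TYPO3 implementation:

```php
<?php
// Sketch: a resumable chunk runner that stores the last processed uid in a
// registry table, so an interrupted run continues where it left off.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE demo (uid INTEGER PRIMARY KEY)');
$pdo->exec('CREATE TABLE registry (entry_key TEXT PRIMARY KEY, entry_value INTEGER)');
for ($i = 1; $i <= 12; $i++) {
    $pdo->exec('INSERT INTO demo DEFAULT VALUES');
}

function runOneChunk(PDO $pdo, int $chunkSize): int
{
    // Resume from the persisted position (0 on the very first run).
    $pos = $pdo->query(
        "SELECT entry_value FROM registry WHERE entry_key = 'rowUpdatePosition'"
    )->fetchColumn();
    $lastUid = $pos === false ? 0 : (int)$pos;

    $stmt = $pdo->prepare('SELECT uid FROM demo WHERE uid > ? ORDER BY uid LIMIT ' . $chunkSize);
    $stmt->execute([$lastUid]);
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
    foreach ($rows as $row) {
        // ... update the row ...
        $lastUid = (int)$row['uid'];
    }
    // Persist progress so the next invocation resumes here.
    $pdo->prepare(
        "INSERT OR REPLACE INTO registry (entry_key, entry_value) VALUES ('rowUpdatePosition', ?)"
    )->execute([$lastUid]);
    return count($rows);
}

// Each call processes one chunk; calling until a chunk comes back empty
// finishes the whole table, even across separate CLI invocations.
$total = 0;
while (($done = runOneChunk($pdo, 5)) > 0) {
    $total += $done;
}
echo $total, "\n"; // 12
```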