Channel: Issues for Drupal core

Add an entity iterator to load entities in chunks


Problem/Motivation

If you need to load a lot of entities and iterate over them (pretty common), your memory usage could easily go through the roof.

For example, consider updating all nodes for some reason in a post_update hook. You want to load them all and save them, but loading all 10000000000 nodes at once will kill your machine. You would need to load them in batches to keep memory usage at a sane level.

This could also affect core directly, as we may need to update content (or config) entities in the upgrade path, with no way of knowing how many items a site will have. This could cause problems and easily exceed the PHP memory requirements we set for running Drupal.

Proposed resolution

Introduce an EntityIterator class that uses a generator to break the IDs to load into chunks. Then load just a set amount each time, clearing the static cache before each load so it only contains the items currently being iterated over.
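Against the entity storage API, the proposal might look something like the sketch below. This is not the final implementation: the class name `ChunkedIterator`, the constructor signature, and the default chunk size of 50 are all assumptions for illustration; only `EntityStorageInterface::resetCache()` and `::loadMultiple()` are existing core API.

```php
<?php

use Drupal\Core\Entity\EntityStorageInterface;

/**
 * Sketch: iterates over entities, loading them in chunks.
 */
class ChunkedIterator implements \IteratorAggregate {

  public function __construct(
    protected EntityStorageInterface $storage,
    protected array $entityIds,
    protected int $chunkSize = 50,
  ) {}

  public function getIterator(): \Generator {
    foreach (array_chunk($this->entityIds, $this->chunkSize) as $ids) {
      // Clear the static cache so only the current chunk is held in memory.
      $this->storage->resetCache();
      yield from $this->storage->loadMultiple($ids);
    }
  }

}
```

Usage would then be an ordinary foreach, e.g. `foreach (new ChunkedIterator($storage, $nids) as $node) { ... }`, with memory bounded by the chunk size rather than the total number of entities.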

We can get clever and use a generator: calling a generator function returns an iterator, so the class can implement \IteratorAggregate and simply return the generator from getIterator().
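To show the generator/\IteratorAggregate mechanics in isolation, here is a self-contained, runnable sketch. `FakeStorage` is a stand-in for the real entity storage (its `loadMultiple()`/`resetCache()` merely mimic the shape of the storage API), and the names are illustrative, not core API.

```php
<?php

// Stand-in for entity storage: loads "entities" keyed by ID.
class FakeStorage {

  public function loadMultiple(array $ids): array {
    return array_combine($ids, array_map(fn ($id) => "entity-$id", $ids));
  }

  public function resetCache(): void {
    // Stand-in for clearing the entity static cache between chunks.
  }

}

class ChunkedIterator implements \IteratorAggregate {

  public function __construct(
    private FakeStorage $storage,
    private array $ids,
    private int $chunkSize = 50,
  ) {}

  public function getIterator(): \Generator {
    // The generator yields one chunk's worth of entities at a time.
    foreach (array_chunk($this->ids, $this->chunkSize) as $chunk) {
      $this->storage->resetCache();
      yield from $this->storage->loadMultiple($chunk);
    }
  }

}

// Iterate 10 IDs in chunks of 4; foreach sees a flat stream of entities.
foreach (new ChunkedIterator(new FakeStorage(), range(1, 10), 4) as $id => $entity) {
  echo "$id => $entity\n";
}
```

Because `yield from` preserves the array keys and entity IDs are unique, the consumer still sees entities keyed by ID, exactly as with a plain `loadMultiple()` call.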

Remaining tasks

Add some tests

User interface changes

N/A - Developer only

API changes

Addition of a new iterator class.

Data model changes

N/A

