
Add an entity iterator to load entities in chunks


Problem/Motivation

If you need to load a lot of entities and iterate over them (pretty common), your memory usage could easily go through the roof.

For example, consider updating all nodes in a post_update hook: you want to load them all and re-save them, but loading all 1,000,000 nodes at once will sooner or later run the machine out of memory. One common strategy to work around this is to load the entities in batches and clear the static entity cache in between, as sketched below.
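A minimal sketch of that manual work-around, assuming a hypothetical mymodule_post_update_resave_all_nodes() hook, the node storage, and a chunk size of 50; none of these details are part of this issue:

  /**
   * Re-saves all nodes in chunks (illustrative work-around, not this patch).
   */
  function mymodule_post_update_resave_all_nodes(&$sandbox) {
    $storage = \Drupal::entityTypeManager()->getStorage('node');
    if (!isset($sandbox['ids'])) {
      // Collect all IDs once; only the IDs stay in memory, not the entities.
      $sandbox['ids'] = $storage->getQuery()->accessCheck(FALSE)->execute();
      $sandbox['total'] = count($sandbox['ids']);
    }
    // Load and re-save the next chunk of 50 nodes.
    $ids = array_splice($sandbox['ids'], 0, 50);
    foreach ($storage->loadMultiple($ids) as $node) {
      $node->save();
    }
    // Clear the static entity cache between chunks so memory stays flat.
    $storage->resetCache();
    $sandbox['#finished'] = $sandbox['total'] ? 1 - count($sandbox['ids']) / $sandbox['total'] : 1;
  }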

This could also affect core directly: we may need to update content (or config) entities in the upgrade path with no way of knowing how many items a site has, and memory usage could easily exceed the PHP memory limits we set as requirements for running Drupal.

Proposed resolution

Introduce an EntityIterator class that uses a generator to split the IDs to load into chunks. Only one chunk of entities is loaded at a time, and the static cache is cleared before each load so it only ever contains the items currently being iterated over.

Note: the memory cache is cleared with $memory_cache->deleteAll() so that all referenced entities are removed from the cache as well. A sketch of such an iterator follows.
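A minimal sketch of what such an iterator could look like, assuming the class name ChunkedIterator (matching the usage example below), a default chunk size of 50, and the entity.memory_cache service; the final API may differ:

  <?php

  namespace Drupal\Core\Entity;

  use Drupal\Core\Cache\MemoryCache\MemoryCacheInterface;

  /**
   * Iterates over entities in chunks to keep memory usage flat.
   */
  class ChunkedIterator implements \IteratorAggregate {

    /**
     * Constructs a ChunkedIterator.
     */
    public function __construct(
      protected EntityStorageInterface $entityStorage,
      protected MemoryCacheInterface $memoryCache,
      protected array $entityIds,
      protected int $chunkSize = 50,
    ) {}

    /**
     * Loads one chunk of entities at a time and yields them.
     */
    public function getIterator(): \Generator {
      foreach (array_chunk($this->entityIds, $this->chunkSize) as $ids) {
        // Remove all previously loaded entities (and anything they
        // reference) from the memory cache before loading the next chunk.
        $this->memoryCache->deleteAll();
        yield from $this->entityStorage->loadMultiple($ids);
      }
    }

  }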

Remaining tasks

User interface changes

N/A - Developer only

API changes

Addition of a new iterator class. Example usage:

  $iterator = new ChunkedIterator($entity_storage, \Drupal::service('entity.memory_cache'), $all_ids);
  foreach ($iterator as $entity) {
    // Process the entity.
  }
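Because the iterator is backed by a lazy generator, constructing it does not load anything; each chunk of entities is only loaded when iteration reaches it, and the memory cache is emptied before every chunk, so memory usage stays roughly constant no matter how many IDs are passed in.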

Data model changes

N/A

