PHP

IvanRu08, 2020-05-04 13:31:18

How to deal with WordPress memory leak?

Hello. I'm faced with the task of updating the meta data of 60k+ posts. I loop over all of them with foreach:

foreach ( $product_ids_list as $id ) {
    error_log( 'before memory usage: ' . memory_get_usage() );
    cr_product_fill_meta( $id );
    error_log( 'after memory usage: ' . memory_get_usage() );
    $i++;
}


The cr_product_fill_meta function:

function cr_product_fill_meta( $prod_id ) {
    // Can the product be edited?
    $is_product_blocked_edit = get_post_meta( $prod_id, 'is_product_blocked_edit', true );
    unset( $is_product_blocked_edit );
}
add_action( 'save_post', 'cr_product_fill_meta' );


Everything superfluous has now been removed from the function. But even this stripped-down version eats up memory very quickly, and I don't understand why. Here is what I see in the log:
[04-May-2020 10:17:24 UTC] before memory usage: 66954464
[04-May-2020 10:17:50 UTC] before memory usage: 66954464
[04-May-2020 10:17:51 UTC] after memory usage: 66992384
[04-May-2020 10:17:51 UTC] after memory usage: 67011952
[04-May-2020 10:17:51 UTC] after memory usage: 67031520
[04-May-2020 10:17:51 UTC] after memory usage: 67051088
[04-May-2020 10:17:51 UTC] after memory usage: 67070656
[04-May-2020 10:17:51 UTC] after memory usage: 67090544
[04-May-2020 10:17:51 UTC] after memory usage: 67110112
[04-May-2020 10:17:51 UTC] after memory usage: 67129680
...................................................................
[04-May-2020 10:17:55 UTC] after memory usage: 226265440
[04-May-2020 10:17:55 UTC] after memory usage: 226284416
[04-May-2020 10:17:55 UTC] after memory usage: 226303392
[04-May-2020 10:17:55 UTC] after memory usage: 226322368
[04-May-2020 10:17:55 UTC] after memory usage: 226341344
[04-May-2020 10:17:55 UTC] after memory usage: 226360320
[04-May-2020 10:17:55 UTC] after memory usage: 226379296


Eventually the script hits memory_limit (even 10k posts don't complete). Where does the memory go if I create a variable and immediately unset it? The script is triggered via AJAX, if that matters.


3 answers
Daria Motorina, 2020-05-04
@IvanRu08

Enumerating 60k elements loads 60k elements into memory; that is not the usual approach, and exceeding the memory limit is the natural result. Process the data in batches (select with LIMIT and OFFSET), so that each cycle fetches the next batch and processes it element by element. Essentially, it's pagination.
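A batched loop along these lines could look like the sketch below. It assumes the posts are WooCommerce-style `product` posts and reuses `cr_product_fill_meta` from the question; the batch size is an arbitrary choice.

```php
// Process products in batches instead of holding all 60k+ IDs' data at once.
$batch_size = 500;
$paged      = 1;

do {
    // Fetch only IDs to keep each batch light.
    $product_ids = get_posts( array(
        'post_type'      => 'product',
        'post_status'    => 'any',
        'fields'         => 'ids',
        'posts_per_page' => $batch_size,
        'paged'          => $paged,
    ) );

    foreach ( $product_ids as $id ) {
        cr_product_fill_meta( $id );
    }

    // Drop whatever has accumulated in the object cache between batches.
    wp_cache_flush();

    $paged++;
} while ( count( $product_ids ) === $batch_size );
```

The `wp_cache_flush()` call between batches keeps the per-request object cache from growing without bound, which is exactly the growth the question's log shows.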

xmoonlight, 2020-05-04
@xmoonlight

Disable caching of everything you can.
And Daria Motorina's comment is essentially the answer.

d-stream, 2020-05-04
@d-stream

Hm... why didn't a direct SQL UPDATE suit you?
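If the new meta value does not depend on per-post PHP logic, a single query via `$wpdb` avoids the 60k-iteration loop entirely. A sketch under that assumption; the meta key is taken from the question, the value is a placeholder:

```php
global $wpdb;

// Set one meta key to the same value for every product in a single query.
$wpdb->query( $wpdb->prepare(
    "UPDATE {$wpdb->postmeta} pm
     INNER JOIN {$wpdb->posts} p ON p.ID = pm.post_id
     SET pm.meta_value = %s
     WHERE p.post_type = 'product'
       AND pm.meta_key = %s",
    'new-value',
    'is_product_blocked_edit'
) );
```

The trade-off is that this bypasses WordPress hooks and cache invalidation, so any cached meta for the affected posts should be flushed afterwards.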
