How do I spread the load when removing elements from a ~1 GB JSON file?
Good afternoon. I have a JSON file with 900,000 elements, 1.2 GB in size.
What I need to do: remove several values from each element.
Question: how do I spread the load so the server doesn't go down (it's shared hosting), while still finishing as quickly as possible?
A few options I've considered:
1) Split it into separate files of 20,000 elements each (slow).
2) Use array_slice and process a different interval on each run (this seems dubious; I suspect the server will still crash, and besides, opening a 1.2 GB file on every pass is expensive).
What other options are there? Thanks!
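For illustration, here is a minimal sketch of option 1, with hypothetical file names and keys ('big.json', 'foo', 'bar' are placeholders). It assumes the one-off split step runs somewhere with enough memory to json_decode the whole file once (e.g. locally); after that, each processing pass only ever holds 20,000 elements in memory.

```php
<?php
// Step 1: one-off split (needs enough memory to hold the decoded array once).
$all = json_decode(file_get_contents('big.json'), true); // 'big.json' is a placeholder

foreach (array_chunk($all, 20000) as $i => $chunk) {
    file_put_contents("chunk_{$i}.json", json_encode($chunk));
}
unset($all); // free the big array before processing

// Step 2: process chunks one at a time; each pass holds only 20,000 elements.
foreach (glob('chunk_*.json') as $file) {
    $items = json_decode(file_get_contents($file), true);

    foreach ($items as &$item) {
        // Remove the unwanted values; 'foo' and 'bar' are placeholder keys.
        unset($item['foo'], $item['bar']);
    }
    unset($item);

    file_put_contents($file, json_encode($items));
    usleep(200000); // short pause between chunks to keep load on shared hosting low
}
```

The pause between chunks is what actually spreads the load; the chunk size and sleep interval are tunable to whatever the hosting tolerates.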
UPD: what about running it on my own computer through OpenServer? Are there limits there, or does it only depend on the machine's power?
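Regarding the UPD: locally (e.g. under OpenServer) the relevant caps are PHP's own settings rather than a hoster's, and from the CLI the execution-time limit is off by default. A hedged sketch of raising the memory limit for a single local run:

```php
<?php
// When running locally, the effective ceiling is PHP's memory_limit;
// raise it for this script (or pass -d memory_limit=4G on the php CLI).
ini_set('memory_limit', '4G');
// Beyond that, speed depends on the machine (disk and CPU).
```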