How to send a large JSON array over HTTP?
There is an API endpoint that generates a huge array of data, and the handler fails with "Allowed memory size of .. bytes exhausted". This data needs to be returned as JSON in the response. How can this be done within a single request, without raising memory_limit?
If there is not enough memory, there is no way around allocating it somewhere.
Say you request /api/products/all, wanting every product in the store.
In that case the controller implementation is roughly "SELECT * FROM products > json_encode > response".
If the store has a million products, the memory limit will be exhausted already at the database query stage.
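For illustration, a minimal sketch of that naive handler (PDO and the table name are assumptions), with the two places where the memory goes:

```php
<?php
// Naive handler: load everything, encode everything, send everything.
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');

// 1) fetchAll() materializes every row as a PHP array -- first memory hotspot.
$rows = $pdo->query('SELECT * FROM products')->fetchAll(PDO::FETCH_ASSOC);

// 2) json_encode() builds the whole JSON string on top of that array --
//    second memory hotspot, so peak usage is roughly doubled.
header('Content-Type: application/json');
echo json_encode($rows);
```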
As an option, increase the memory limit for this particular request.
Or you can return the data in pages of 1,000 items, e.g. /api/products/all?offset=1000, and have the client iterate over the pages (see the sketch below).
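Assuming PDO and a page size of 1,000 (the names here are illustrative, not a finished API), something like:

```php
<?php
// Paged output: each request returns at most $limit rows.
$pdo    = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');
$limit  = 1000;
$offset = max(0, (int)($_GET['offset'] ?? 0));

$stmt = $pdo->prepare('SELECT * FROM products LIMIT :limit OFFSET :offset');
$stmt->bindValue(':limit',  $limit,  PDO::PARAM_INT);
$stmt->bindValue(':offset', $offset, PDO::PARAM_INT);
$stmt->execute();

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));
// The client keeps requesting offset=0, 1000, 2000, ...
// until it receives an empty array.
```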
Or you can run a heavy script with a generous memory limit in the background, say once every 5 minutes, that selects all the products, serializes them and writes them to a file such as products_all.json; the handler for /api/products/all then just serves that file (or returns a link to it, so that delivering the content is offloaded to nginx). A sketch of such a job follows.
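Assuming PDO/MySQL, the background job could stream rows into the file with an unbuffered query, so even the job itself never holds the full data set in memory:

```php
<?php
// Cron job (e.g. every 5 minutes): pre-build products_all.json row by row.
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false); // fetch row by row

$tmp = '/var/www/cache/products_all.json.tmp';
$out = fopen($tmp, 'w');
fwrite($out, '[');

$first = true;
$stmt  = $pdo->query('SELECT * FROM products');
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    if (!$first) {
        fwrite($out, ',');
    }
    fwrite($out, json_encode($row)); // encode one row at a time
    $first = false;
}

fwrite($out, ']');
fclose($out);

// Atomically replace the file the API serves.
rename($tmp, '/var/www/cache/products_all.json');
```

The /api/products/all controller can then simply readfile() the prepared file, or send an X-Accel-Redirect header so that nginx streams it directly; that is what "issuing a link to the file" amounts to.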
If you put the question that way, then compress your request with zlib
https://github.com/nodeca/pako
And send it to the server
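For what it's worth, the receiving side in PHP could look roughly like this (assuming the body was produced by pako.deflate(), which emits zlib-format data by default):

```php
<?php
// Sketch: decode a zlib-compressed JSON request body sent from the client.
$raw  = file_get_contents('php://input'); // compressed bytes from pako.deflate()
$json = gzuncompress($raw);               // PHP's zlib extension understands the zlib wrapper
$data = ($json !== false) ? json_decode($json, true) : null;

if ($data === null) {
    http_response_code(400); // not valid compressed JSON
    exit;
}
```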