GrimJack, 2017-03-16 18:16:23
JavaScript

How should I organize the script in this case?

There are 7 files (they fill 7 different fields; 6 of the files have the same number of lines, say 1000, and 1 file stands on its own).
In essence, this is post generation: 1000 headings for 1 piece of content, roughly speaking. They need to be sent iteratively to a given address via POST (no problems with that part).
If all the data is sent to the server and processed there in one go, the server grinds away for a long time: timeouts and other bad things.
I thought of offloading the work by iterating on the client (a loop with Ajax inside), assembling the data there and sending it to the server, but then the browser crashes from lack of memory.
What is the best way to do this? Cron is not an option. Maybe it is worth looking into queues (I have heard of them in Laravel, but have not gotten to them yet)? Or are there simpler solutions?
I am not asking you to decide for me, just nudge me down the right path.

1 answer
Eugene Budnik, 2017-03-18
@egenik

Queues are exactly the thing. In addition, create a table, say file_handlings, in which you track the current file-processing run, with a structure something like id, total_rows, completed_rows. The flow would look roughly like this:
1. Upload the file to the server.
2. Create a row in the file_handlings table.
3. Return the id of that file_handlings row to the client.
4. Dispatch a job on the server, passing it the id from the file_handlings table.
5. In the job, process the file and write the current progress to the file_handlings table by id.
6. From the client, poll the server with Ajax from time to time and fetch, by id, how many rows have been processed (see the client-side sketch below).
......
N. All lines of the file have been processed; now do whatever you need ;)
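
The answer assumes Laravel on the server; the client half (steps 3 and 6) is plain browser JavaScript. Here is a minimal sketch of that half. The /file-handlings endpoint paths, the response field names, and the 2-second polling interval are my assumptions for illustration, not something the answer specifies:

// Client side of steps 3 and 6: upload the files, get back the
// file_handlings id, then poll for progress. Endpoint paths are
// hypothetical; adapt them to your routes.
async function uploadAndTrack(files) {
  const form = new FormData();
  for (const file of files) {
    form.append('files[]', file);
  }

  // Steps 1-4 happen on the server: it stores the files, creates the
  // file_handlings row, dispatches the queued job, and returns the id.
  const response = await fetch('/file-handlings', { method: 'POST', body: form });
  const { id } = await response.json();

  // Step 6: periodically ask the server how far the job has gotten.
  const timer = setInterval(async () => {
    const res = await fetch('/file-handlings/' + id);
    const { total_rows, completed_rows } = await res.json();
    console.log(completed_rows + ' / ' + total_rows + ' rows processed');
    if (completed_rows >= total_rows) {
      clearInterval(timer); // step N: everything processed
      // ...trigger whatever should happen when the import is done
    }
  }, 2000); // poll every 2 seconds; tune to taste
}

Because the heavy lifting runs in a queued worker on the server and the client only ever holds one small progress response at a time, this sidesteps both the request timeouts and the out-of-memory browser crashes described in the question.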
