How to process a large amount of data in a minute?
Let's say there are 1000 records in the database. A PHP script has to process all of them EVERY MINUTE and produce some result (it doesn't matter what), but in one minute the script only gets through 10 records.
Everything is done in a while () { } loop.
How can I process them all in one minute?
Create 10 scripts, allocate 100 records to each, and hang them all on cron to run every minute? But if all those scripts start at the same time, run, finish, start again, and so on, won't that put a heavy load on the CPU?
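One way to make the cron idea concrete is a single worker script launched several times in parallel, each instance given its own slice of the table. This is only a sketch under assumptions: the items table, the connection details, and the process() stub are hypothetical stand-ins for the real schema and the real work.

<?php
// worker.php — one of several identical workers started by cron, e.g.
// for 10 workers, ten crontab lines of the form:
//   * * * * * php /path/to/worker.php 0 10
//   * * * * * php /path/to/worker.php 1 10
//   ...
//   * * * * * php /path/to/worker.php 9 10
// HYPOTHETICAL names: items(id INT, payload TEXT), DSN, credentials.

[, $workerId, $workerCount] = $argv;

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Each worker takes a disjoint slice (id modulo worker count), so the
// parallel runs never touch the same record.
$stmt = $pdo->prepare('SELECT id, payload FROM items WHERE id % :n = :i');
$stmt->execute(['n' => (int) $workerCount, 'i' => (int) $workerId]);

while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    process($row); // the actual per-record work goes here
}

function process(array $row): void
{
    // ...do something with $row['payload']...
}

As for the CPU worry: ten PHP processes occupy at most ten cores, and if the slow part is the database or the network (which is typical for this kind of job), the workers spend most of their time waiting rather than burning CPU.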
There's no magic here: optimize the queries, and upgrade the server hardware or distribute the work between servers.
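One concrete form of query optimization, as a hedged sketch (the items table, the result column, and the compute() helper are all hypothetical): read all rows in one query and do all writes inside a single transaction, so the database commits once instead of a thousand times.

<?php
// HYPOTHETICAL schema: items(id INT, payload TEXT, result TEXT).
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// One read for all rows instead of a SELECT per record...
$rows = $pdo->query('SELECT id, payload FROM items')
            ->fetchAll(PDO::FETCH_ASSOC);

// ...and all writes inside a single transaction: one commit (one disk
// flush) instead of a thousand autocommits.
$pdo->beginTransaction();
$update = $pdo->prepare('UPDATE items SET result = :r WHERE id = :id');
foreach ($rows as $row) {
    $update->execute(['r' => compute($row), 'id' => $row['id']]);
}
$pdo->commit();

function compute(array $row): string
{
    return strrev($row['payload']); // stand-in for the real computation
}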
Perhaps the problem can be approached from the other side: avoid the situation where you have to walk through the proverbial thousand records every minute at all (a sketch of that follows below).
But in any case, at least a few details are needed; otherwise this is just reading tea leaves.
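What "avoiding the walk" can look like in practice, as a sketch under assumptions (the items table and its nullable processed_at column are hypothetical): mark each row when it is done, and let every cron run pick up only unmarked rows.

<?php
// HYPOTHETICAL schema: items(id INT, payload TEXT, processed_at DATETIME NULL).
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Pick up only rows that nobody has processed yet.
$rows = $pdo->query(
    'SELECT id, payload FROM items WHERE processed_at IS NULL LIMIT 100'
)->fetchAll(PDO::FETCH_ASSOC);

$done = $pdo->prepare('UPDATE items SET processed_at = NOW() WHERE id = :id');

foreach ($rows as $row) {
    process($row);                        // the real work, whatever it is
    $done->execute(['id' => $row['id']]);
}

function process(array $row): void
{
    // ...
}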
Optimize the algorithms, or else pick a platform better suited to the task; my guess is that MySQL plus PHP is a poor combination once you start thinking about Big Data and how to process it.
But as already said above, specifics are needed, not "let's say 1000".