PHP
Bur Ov, 2017-03-28 00:22:06

How to process a large amount of data in a minute?

Say there are 1,000 rows in the database. A PHP script must process all of them EVERY MINUTE and produce some result (what exactly does not matter), but in one minute the script manages to process only 10 rows.
All of the work is done in a while() { } loop.
How can all of them be processed within one minute?
Should I create 10 scripts, give each of them its own slice of the rows, and schedule them with cron every minute? But if all these scripts start at the same time, run, then run again, and so on, won't that put a heavy load on the CPU?
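The splitting the asker describes can be sketched as a simple modulo partition: each of the N cron-launched workers claims only the ids in its own shard, so no row is processed twice. The function and id range below are illustrative, not from any of the answers.

```php
<?php
// A minimal sketch of splitting 1,000 ids across 10 workers by modulo.
// Each cron-launched worker would call this with its own shard number.
function idsForShard(array $ids, int $shard, int $totalShards): array {
    return array_values(array_filter(
        $ids,
        fn($id) => $id % $totalShards === $shard
    ));
}

$allIds = range(1, 1000);
$mine = idsForShard($allIds, 3, 10); // worker #3 of 10
// $mine holds the 100 ids 3, 13, 23, ..., 993
```

In a real setup the shard and total would arrive via $argv, and the filter would live in the SQL WHERE clause (e.g. `WHERE id % 10 = 3`) rather than in PHP, so each worker only fetches its own rows.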

Answer the question


5 answer(s)
McBernar, 2017-03-28
@McBernar

There is no magic here: optimize the queries and upgrade the server hardware, or distribute the work across several servers.
Perhaps the problem can be approached from the other side: avoid the situation where you have to walk through the proverbial thousand records every minute in the first place.
In any case, at least a few details are needed; otherwise it's just guesswork.

Alexander Alexandrovich, 2017-03-28
@tatu

Optimize the queries, or process the data in real time instead of in batches.

Therapyx, 2017-03-28
@Therapyx

Optimize the algorithms, or choose a platform better suited to large data volumes. My guess is that MySQL plus PHP is a poor fit when you are thinking about big data and its processing.
But, as already mentioned above, specifics are needed, not just "let's say 1000".

xmoonlight, 2017-03-28
@xmoonlight

You need to turn the single loop into parallel streams of work. Then it becomes feasible.
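One way to read this answer is a dispatcher that replaces the single while() loop with N concurrent worker processes. The sketch below uses an inline `php -r` body to stand in for a real worker script (the worker command and count are assumptions, not something the answer specifies):

```php
<?php
// Sketch: launch N worker processes concurrently and wait for all of
// them, so a cron-driven dispatcher never overlaps its own runs.
$workers = 4;
$procs = [];
for ($i = 0; $i < $workers; $i++) {
    // in a real setup this would be ['php', 'worker.php', (string)$i, (string)$workers]
    $procs[$i] = proc_open(
        ['php', '-r', 'usleep(100000); exit(0);'],
        [],        // no pipes needed for this sketch
        $pipes
    );
}

// collect exit codes; non-zero means a worker failed
$failures = 0;
foreach ($procs as $p) {
    if (proc_close($p) !== 0) {
        $failures++;
    }
}
echo "workers finished, failures: {$failures}\n";
```

Because the workers run in parallel, the wall-clock time is roughly one worker's runtime rather than the sum of all of them, which is the whole point of splitting the loop.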

res2001, 2017-03-28
@res2001

Offload the processing to the database server; the PHP script should only kick off the job and fetch the result.
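The idea here is a single set-based statement that the database engine executes over all rows at once, instead of a PHP loop fetching and updating them one by one. A minimal sketch, using an in-memory SQLite database and a made-up schema (the real table, columns, and computation would differ):

```php
<?php
// Sketch: the DB does the per-row work in one set-based UPDATE;
// PHP only issues the statement and reads back how much was done.
// In-memory SQLite is used here purely for illustration.
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE items (id INTEGER PRIMARY KEY, value INTEGER, result INTEGER)');

$insert = $pdo->prepare('INSERT INTO items (value) VALUES (?)');
for ($i = 1; $i <= 1000; $i++) {
    $insert->execute([$i]);
}

// one statement processes all 1,000 rows inside the DB engine
$affected = $pdo->exec('UPDATE items SET result = value * 2 WHERE result IS NULL');
echo "processed {$affected} rows\n"; // prints "processed 1000 rows"
```

Pushing the work into one UPDATE avoids 1,000 round trips between PHP and the database, which is usually where the per-minute budget is lost.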
