Algorithms
iBird Rose, 2018-12-23 16:16:36

How to properly distribute tasks depending on the load?

Hello. In a nutshell: there is an array (say, 1000 elements) containing groups and their load statistics as percentages:
group_id1 - 0.2%
group_id2 - 5%
group_id3 - 61%
...
group_id1000 - 3%
i.e. the percentages of all 1000 groups sum to 100.
The task is to distribute these 1000 groups across 30 servers according to their load. In essence, it is a task of partitioning 1000 groups into an array of 30 elements.
So far I have no ideas how to approach this. Any suggestions?)
UPD: some groups sometimes account for 60% of the total load, which means that at such moments their processing needs to be moved onto a dedicated server. So simply scattering the groups at random won't work.
UPD 2: there are also groups that consume almost nothing, so a huge pile of them can be put on a single server and left to rest there.
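The situation described in the UPDs (one very heavy group plus many near-idle ones) is the classic multiway number-partitioning setting, and a common heuristic for it is greedy LPT: sort groups by load descending and always give the next group to the currently least-loaded server. A minimal sketch, assuming the loads are just a list of percentages (the example numbers are hypothetical):

```python
import heapq

def assign_greedy(loads, n_servers):
    """Greedy LPT: heaviest group first, always onto the least-loaded server."""
    # min-heap of (current_load, server_index)
    heap = [(0.0, i) for i in range(n_servers)]
    assignment = [[] for _ in range(n_servers)]
    # group ids sorted by load, heaviest first
    for gid in sorted(range(len(loads)), key=lambda g: loads[g], reverse=True):
        load, i = heapq.heappop(heap)       # least-loaded server so far
        assignment[i].append(gid)
        heapq.heappush(heap, (load + loads[gid], i))
    return assignment

# hypothetical example: one 60% group plus 999 light groups sharing the rest
loads = [60.0] + [40.0 / 999] * 999
servers = assign_greedy(loads, 30)
per_server = [sum(loads[g] for g in grp) for grp in servers]
print(max(per_server))  # the 60% group inevitably gets a server to itself
```

This naturally reproduces both UPDs: the 60% group ends up alone on one server, while the near-zero groups pile up on the others.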


2 answer(s)
longclaps, 2018-12-23
@iiiBird

You will not believe it.
If the tasks are simply distributed among the servers at random, then on average the maximum load on any one server will not exceed ~5% of the total.
Here is the proof.

from random import random, randint

N, M = 1000, 30                       # 1000 groups, 30 servers
tasks = [random() for _ in range(N)]  # random raw loads
w = sum(tasks)
for i in range(N):
    tasks[i] /= w                     # normalize so the loads sum to 1 (100%)
servers = [0.] * M
for t in tasks:
    servers[randint(0, M - 1)] += t   # assign each group to a random server
print(min(servers), max(servers))     # lightest and heaviest server load

Roman Mirilaczvili, 2018-12-24
@2ord

I'm assuming this is a message queuing (MQ) problem, in the sense that with a lot of tasks, they can be put into a queue (or several queues). A pool of queue handlers (workers) then takes tasks from the queue and processes them; having finished one task, a worker takes the next. The same works across many servers, and with this scheme the total load spreads between the servers evenly on its own. As I understand it, that is the scenario ⚡ Kotobotov ⚡ had in mind.
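A minimal single-machine sketch of that worker/queue pattern, with a thread pool standing in for the 30 servers (the task payload and names here are hypothetical stand-ins):

```python
import queue
import threading

def worker(q, results):
    # Each worker pulls the next task as soon as it finishes the previous one,
    # so the load evens out automatically without any up-front partitioning.
    while True:
        item = q.get()
        if item is None:          # sentinel: shut this worker down
            q.task_done()
            break
        results.append(item * 2)  # stand-in for real processing
        q.task_done()

q = queue.Queue()
results = []
threads = [threading.Thread(target=worker, args=(q, results)) for _ in range(4)]
for t in threads:
    t.start()
for task in range(100):          # enqueue 100 tasks
    q.put(task)
for _ in threads:                # one sentinel per worker
    q.put(None)
q.join()                         # wait until every task is processed
for t in threads:
    t.join()
print(len(results))
```

Busy workers simply take fewer tasks, and idle ones take more, which is exactly the self-balancing behavior described above.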
