Python
Matvey Nosurname, 2021-10-10 23:36:13

Why does the put method hang?

I need to run a function in parallel across several processes. I split the data between the processes, and each process puts its result into a queue. After they finish their work, I combine the data in the main process by pulling the results out of the queue. But the queue's put method hangs. Please help me fix it.

# Counting elements using multiprocessing
from time import process_time
from multiprocessing import Process, Queue, current_process


# Calculates a single matrix row
def calculate_part(params):
    p_i, q_vector = params
    return [(p_i ** 2 + q_j ** 2) ** 0.5 for q_j in q_vector]


# Computes the rows for indices [start_index, end_index) and puts them
# into the queue as a single result object
def calculate_n_parts(n_params: list, start_index: int, end_index: int, queue: Queue):
    result = []

    for p_i, q_vector in n_params:
        result.append(calculate_part((p_i, q_vector)))
    res_obj = (start_index, end_index, *result)
    queue.put(res_obj)


# Drains the queue in the main process and reassembles the full matrix
def concatenate_data(queue):
    global n_el
    result = [0 for i in range(n_el)]
    while not queue.empty():
        tup = queue.get()
        result[tup[0]: tup[1]] = tup[2]

    return result


if __name__ == '__main__':
    n_el = 1000

    processes_count = 5
    process_el_count = n_el // processes_count
    matrix = [0 for i in range(n_el)]
    data_queue = Queue()

    q_vect = [(i + 11) ** 2 for i in range(n_el)]
    p_vect = [(i * 3 + 13) * 17 for i in range(n_el)]

    prepared_data = [(p_i, q_vect) for p_i in p_vect]

    t_res = []
    for index in range(0, n_el, process_el_count):
        t_res.append((prepared_data[index:index + process_el_count], index, index + process_el_count, data_queue))

    processes = [Process(target=calculate_n_parts,
                         args=tup) for tup in t_res]

    start_time = process_time()

    for process in processes:
        process.start()

    for process in processes:
        process.join()

    print(concatenate_data(data_queue))

    print(process_time() - start_time, "seconds")

1 answer
Vindicar, 2021-10-11
@Vindicar

Perhaps the queue is full. A multiprocessing.Queue pushes items into an OS pipe through a background feeder thread, and that pipe has limited capacity: a child process will not terminate until everything it has put() on the queue has been flushed to the pipe. So when the parent join()s the children before reading from the queue, put() blocks on the full pipe and join() waits forever. (The multiprocessing docs describe exactly this pitfall under "joining processes that use queues".)
An easy way around this is that you know how many jobs were handed to the child processes: read results from the queue and count them, and once you have read as many results as there were jobs, stop reading. Only then join the processes.
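A rough sketch of that, reusing the question's own names (processes, data_queue, n_el). One extra assumption: calculate_n_parts puts (start_index, end_index, result) without the * splat; as posted, the splat flattens the rows into the tuple, so tup[2] ends up holding only the first row.

for process in processes:
    process.start()

# Read exactly one result per spawned process. get() blocks until some child
# has put its block, which in turn lets that child's put() complete.
blocks = [data_queue.get() for _ in processes]

# Every child has flushed its data by now, so joining cannot deadlock.
for process in processes:
    process.join()

# Stitch the blocks back together in index order.
matrix = [None] * n_el
for start, end, rows in blocks:
    matrix[start:end] = rows

Counting results also sidesteps Queue.empty(), which concatenate_data relies on: empty() can report True while a child's data is still sitting in the feeder pipe, so the loop can exit early and silently drop blocks.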
