Video cards
Alexander Ivanov, 2019-01-06 12:32:26

How to use 6 video cards at once for calculations?

There is a former mining farm with six 1060 video cards. Now the plan is to use it to compute face descriptor matrices with dlib. The OS is Ubuntu 18.04, but Windows can be installed instead. The question is: how do I make dlib use the power of all the video cards at once? Please be as detailed as possible. After all, mining software does use all the cards at the same time.


4 answers
Alexander Ivanov, 2019-01-08
@alexivanov77

I solved it as follows: I run a separate copy of the script for each video card. Before launching each copy, in that console: export CUDA_VISIBLE_DEVICES="0", where 0 is the number of the video card. Then: python script.py
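The manual export-then-run steps above can also be automated from a single launcher. A minimal sketch, assuming six cards numbered 0-5 and a worker script named script.py (both are placeholders from the answer, not fixed names):

```python
# Sketch: run one copy of a worker script per GPU by pinning each process
# to a single card via CUDA_VISIBLE_DEVICES. Inside each worker, the pinned
# card appears as device 0, so the script itself needs no changes.
import os
import subprocess
import sys

NUM_GPUS = 6  # assumption: six 1060 cards, as in the question


def launch_workers(script="script.py", num_gpus=NUM_GPUS):
    """Start num_gpus copies of `script`, each seeing exactly one GPU."""
    procs = []
    for gpu in range(num_gpus):
        env = os.environ.copy()
        # Restrict this child process to a single card.
        env["CUDA_VISIBLE_DEVICES"] = str(gpu)
        procs.append(subprocess.Popen([sys.executable, script], env=env))
    return procs


if __name__ == "__main__":
    # Wait for all workers to finish.
    for p in launch_workers():
        p.wait()
```

This keeps the workers fully independent, which suits dlib well: each process builds its own CUDA context on its own card, and you split the input images between the copies yourself (for example, by file-name hash or by directory).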

Dimonchik, 2019-01-06
@dimonchik2013

NVIDIA SLI links up to three cards; well, maybe something has changed by now.

Max, 2019-01-06
@MaxDukov

Start by building dlib with CUDA support. Next, look toward multi-GPU support in CUDA; it should either work out of the box or via something like NCCL.
A quick googling suggests that people have faced this task and have not found an elegant solution: https://github.com/davisking/dlib/issues/1482
