linux
delkov, 2016-02-11 11:05:22

[linux] How to allocate more memory/resources to an application?

Good day, dear ones!
Given: Linux, 4 cores, 16 GB RAM.
Task: give more resources to a Matlab process, or to a program written in C++ (main.cpp).
While the program is running it uses only 14% of the CPU and 15% of the RAM; I would like it to use more.
Are there any options for parallelization?
* The program itself is trivial: there is a ~1 GB txt file, it is read line by line, and arithmetic operations are performed on the data in each line.
Thanks in advance!


3 answers
res2001, 2016-02-11
@delkov

It makes no sense to parallelize: disk operations parallelize poorly because the disk is a slow device, and it is the main bottleneck in your case. Don't read line by line; read in large blocks whose size is a multiple of the cluster size, e.g. 4/8/16 KB. That will reduce the number of disk operations. The code will certainly become more complicated (see the sketch below). In your previous question Oleg Shevelev already told you this.
You could use asynchronous I/O for parallelism, but IMHO that would complicate the code even further, and it is most likely unnecessary for your task (reading in large chunks should already help), since the operations on each line are trivial.
And by the way, you don't have to read the file in C++ at all: if you know Matlab well, implement the same algorithm there. Since disk operations are the main bottleneck, C++ would give practically no advantage.
In general, the system gives a program as many resources as it needs; if resources are insufficient, the program usually terminates with an error.
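For illustration, a minimal sketch of reading in large blocks and splitting lines in memory; the file name data.txt, the 1 MB block size and process_line() are placeholders, not taken from the question:

#include <cstdio>
#include <string>
#include <vector>

// Placeholder for the trivial arithmetic done on each line.
static void process_line(const std::string& line) {
    (void)line;
}

int main() {
    std::FILE* f = std::fopen("data.txt", "rb");  // "data.txt" stands for the ~1 GB file
    if (!f) return 1;

    std::vector<char> block(1 << 20);  // read 1 MB at a time instead of one line at a time
    std::string tail;                  // unfinished line carried over between blocks

    size_t n;
    while ((n = std::fread(block.data(), 1, block.size(), f)) > 0) {
        size_t start = 0;
        for (size_t i = 0; i < n; ++i) {
            if (block[i] == '\n') {
                tail.append(block.data() + start, i - start);
                process_line(tail);
                tail.clear();
                start = i + 1;
            }
        }
        tail.append(block.data() + start, n - start);  // keep the partial line for the next block
    }
    if (!tail.empty()) process_line(tail);              // last line if the file has no trailing '\n'

    std::fclose(f);
    return 0;
}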

uvelichitel, 2016-02-11
@uvelichitel

Sometimes it helps to raise the priority of the process (max -20, min 19, user default 0) with the nice and renice commands (negative values require root):
when starting
bash$ nice -n -20 ./YourProgram
while it is running
bash$ renice -n -20 PIDofYourProcess
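For reference, the same thing can be done from inside the C++ program itself with setpriority() from <sys/resource.h>; a minimal sketch, not part of the original answer:

#include <sys/resource.h>
#include <cstdio>

int main() {
    // Request the highest priority, equivalent to: nice -n -20 ./YourProgram
    // Fails with a permission error when not run as root.
    if (setpriority(PRIO_PROCESS, 0, -20) != 0)
        std::perror("setpriority");
    // ... the actual work of the program goes here ...
    return 0;
}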

lovecraft, 2016-02-12
@lovecraft

While the program is running it uses only 14% of the CPU and 15% of the RAM; I would like it to use more
So the bottleneck is somewhere else )
The easiest way is to put the text file on a RAM disk, for example:
mkdir /tmp/ramdisk
mount -t tmpfs -o size=2G tmpfs /tmp/ramdisk
cp ./myfile.txt /tmp/ramdisk
Everything will immediately become much faster.
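One caveat, not in the original answer: tmpfs lives entirely in RAM, so the size=2G option only caps the ramdisk's size, its contents are lost on unmount or reboot, and the ramdisk can be removed again with umount /tmp/ramdisk.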
