How to correctly count the number of operations per second?
Let's say (oh, if only) I have an Intel Xeon E5-2699 v4 Broadwell-EP at 2200 MHz with 22 cores and an Nvidia GeForce GTX 1080 Ti at 1582 MHz with 3584 cores. The naive calculation is: (1582000000 * 3584) / (2200000000 * 22) = 117, i.e. a video card costing 50 thousand is 117 times more productive than a processor costing 220 thousand.
What is the error here and how to count correctly?
PS I'm interested in estimating the difference in password-cracking (brute-force) speed between the CPU and the GPU.
PPS Let's assume the program is properly parallelized across the cores and uses all of the available computing power. A sketch of the calculation is below.
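For illustration, here is a minimal sketch comparing that naive "clock × cores" ratio with a theoretical peak-FP32 estimate. The per-cycle throughput figures (32 FP32 FLOPs per Broadwell core with AVX2+FMA, 2 FLOPs per CUDA core per cycle) are textbook peak numbers used as assumptions here, not measurements; real cracking speed depends on the hash algorithm and memory behavior.

```python
# Sketch: naive "clock x cores" ratio vs. theoretical peak FP32 FLOPS ratio.
# Assumptions: a Broadwell-EP core peaks at 32 fp32 FLOP/cycle
# (2 AVX2 FMA units x 8 lanes x 2 ops), a CUDA core at 2 FLOP/cycle (1 FMA).

CPU_HZ, CPU_CORES = 2.2e9, 22        # Xeon E5-2699 v4: base clock, core count
GPU_HZ, GPU_CORES = 1.582e9, 3584    # GTX 1080 Ti: boost clock, CUDA cores

# Naive estimate from the question: just clock * core count.
naive_ratio = (GPU_HZ * GPU_CORES) / (CPU_HZ * CPU_CORES)

# Theoretical peak single-precision FLOPS.
cpu_flops = CPU_HZ * CPU_CORES * 32  # 32 fp32 FLOP/cycle per core (AVX2 + FMA)
gpu_flops = GPU_HZ * GPU_CORES * 2   # 2 fp32 FLOP/cycle per CUDA core (FMA)

print(f"naive ratio:      {naive_ratio:.0f}x")            # ~117x
print(f"peak FLOPS ratio: {gpu_flops / cpu_flops:.1f}x")  # ~7x
```

Even at the level of theoretical peaks the gap shrinks from 117x to roughly 7x once the CPU's SIMD width and FMA units are counted, which is why raw core counts alone say little.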
1. It depends on what operations.
Operations are VERY different.
2. It depends on what software.
If your software can be parallelized and run on a GPU, then yes, the GPU will deliver several orders of magnitude more operations.
And if, say, you check each password candidate by sending it to a server over the Internet, there is no difference at all.
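As a rough illustration of that last point, here is a hypothetical sketch (all rates and timings are invented for the example): offline hashing scales with the device, while an online check is limited by the network round trip, so the CPU/GPU ratio stops mattering.

```python
# Hypothetical numbers only: they illustrate where the bottleneck sits,
# not actual benchmark results for these devices.

gpu_hashes_per_s = 30e9   # assumed: fast hash computed offline on the GPU
cpu_hashes_per_s = 1e9    # assumed: same hash computed offline on the CPU
network_rtt_s = 0.05      # assumed: 50 ms per online login attempt

candidates = 1e6          # number of password candidates to try

offline_gpu = candidates / gpu_hashes_per_s
offline_cpu = candidates / cpu_hashes_per_s
online_any = candidates * network_rtt_s   # device speed is irrelevant here

print(f"offline, GPU: {offline_gpu:.5f} s")
print(f"offline, CPU: {offline_cpu:.5f} s")
print(f"online check: {online_any / 3600:.1f} h (same for CPU and GPU)")
```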