How many times higher is the computational complexity of the MD5 algorithm compared to the CRC32 algorithm?
Or, to put it differently, how many times slower is MD5 than CRC32?
I know that MD5 is a hashing algorithm and CRC32 is a checksum algorithm used to verify the integrity of data while it is being transmitted or written.
Right now integrity control of photos (~1 MB each) transferred from 1C to a MySQL database is done with MD5, and there is a suspicion that the computational cost of MD5 slows down the transfer and loads the server CPU. I need to understand whether it makes sense to replace MD5 with CRC32 and what the performance gain would be. I'd rather not find out by trial and error, since this rework costs money. Thank you!
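Before paying for any rework, the raw hashing cost can be estimated with a quick micro-benchmark outside of 1C. Below is a minimal Python sketch (Python is only an illustration here, not the production stack) that times MD5 against CRC32 on a 1 MB buffer; the buffer contents, size, and iteration count are arbitrary assumptions.

```python
import hashlib
import os
import time
import zlib

# One ~1 MB random buffer standing in for a photo (hypothetical test data).
data = os.urandom(1_000_000)
ITERATIONS = 200

start = time.perf_counter()
for _ in range(ITERATIONS):
    hashlib.md5(data).digest()
md5_time = time.perf_counter() - start

start = time.perf_counter()
for _ in range(ITERATIONS):
    zlib.crc32(data)
crc_time = time.perf_counter() - start

print(f"MD5:   {md5_time / ITERATIONS * 1000:.2f} ms per MB")
print(f"CRC32: {crc_time / ITERATIONS * 1000:.2f} ms per MB")
print(f"MD5 is ~{md5_time / crc_time:.1f}x slower on this machine")
```

Running it on the target server gives the actual ratio for that hardware, which is more informative than any theoretical operation count.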
By a lot!
Judge for yourself: CRC32 uses only modular arithmetic and shift operations in a single 32-bit register. MD5 also uses simple operations, but it juggles four 32-bit registers through 64 rounds for every 512-bit block of input, so offhand it does on the order of a hundred times more work.
And that is only a rough estimate, since I don't remember all the implementation details of either algorithm.
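To make the comparison concrete, here is a textbook bit-at-a-time CRC-32 (the reflected IEEE polynomial, as used by zlib) in Python. It is only a sketch to show how little state CRC32 keeps per input bit; real implementations use lookup tables or hardware instructions rather than this loop.

```python
import zlib

def crc32_bitwise(data: bytes) -> int:
    """Bit-at-a-time CRC-32 (IEEE 802.3, reflected polynomial 0xEDB88320)."""
    crc = 0xFFFFFFFF  # single 32-bit register of state
    for byte in data:
        crc ^= byte
        for _ in range(8):
            # One shift and a conditional XOR per input bit.
            crc = (crc >> 1) ^ (0xEDB88320 if crc & 1 else 0)
    return crc ^ 0xFFFFFFFF

# Standard check value for the string "123456789" is 0xCBF43926.
assert crc32_bitwise(b"123456789") == zlib.crc32(b"123456789")
```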
The article "Hash Functions all the way down" compares the performance of various hash functions on a chart.
I doubt that computing MD5 is the culprit of the slowdown. The cause should be sought elsewhere.
Profile the code, the DBMS calls, and the I/O operations.
Inefficient memory handling while working with the photos is also quite possible.
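As a rough illustration of that kind of breakdown, here is a Python sketch that times each stage of a read-hash-insert loop; the insert_photo helper is a made-up placeholder for the real 1C-to-MySQL path, so treat this as a measurement pattern, not working integration code.

```python
import hashlib
import time
from contextlib import contextmanager

timings: dict[str, float] = {}

@contextmanager
def timed(stage: str):
    """Accumulate wall-clock time per pipeline stage."""
    start = time.perf_counter()
    try:
        yield
    finally:
        timings[stage] = timings.get(stage, 0.0) + time.perf_counter() - start

def insert_photo(blob: bytes, digest: str) -> None:
    """Hypothetical placeholder for the real MySQL insert."""
    pass

def transfer_one(path: str) -> None:
    with timed("read file"):
        with open(path, "rb") as f:
            blob = f.read()
    with timed("hash (MD5)"):
        digest = hashlib.md5(blob).hexdigest()
    with timed("db insert"):
        insert_photo(blob, digest)

# After running transfer_one() over a batch of photos, print the breakdown:
# for stage, seconds in sorted(timings.items(), key=lambda kv: -kv[1]):
#     print(f"{stage:12s} {seconds:8.3f} s")
```

If the "hash (MD5)" line turns out to be a small fraction of the total, swapping MD5 for CRC32 will not buy much.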