Algorithms
val18, 2019-11-18 13:10:29

How to determine the complexity of an algorithm?

How do you correctly calculate the complexity of an algorithm? What is the complexity, in O notation, of matrix addition/multiplication? Is there a formula or method for calculating it?


3 answer(s)
Zolg, 2019-11-18
@Zolg

What is the complexity of the algorithm in O notation for matrix addition/multiplication?

Matrix multiplication is not an algorithm, but an operation.
An algorithm is a specific way to implement it.
The complexity of the 'naive' multiplication algorithm is O(n^3).
The complexity of, for example, Strassen's algorithm is O(n^2.81).
Is there any formula or method for calculating?
Elementary: count the number of elementary operations (for O-notation, the worst case) required to run the algorithm on input of size n, then drop the constant factor.
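As a sketch of this counting method (the counter function below is my illustration, not part of the original answer), here is the number of innermost multiply-add steps performed by the naive n×n matrix multiplication — exactly n^3:

```python
# Count the elementary operations (multiply-add steps in the
# innermost loop) of naive n x n matrix multiplication.
def count_naive_matmul_ops(n):
    ops = 0
    for i in range(n):
        for j in range(n):
            for k in range(n):
                ops += 1  # one multiply-add per innermost step
    return ops

for n in (10, 20, 40):
    print(n, count_naive_matmul_ops(n))  # the count equals n**3
```

Dropping the constant factor in front of n^3 gives the O(n^3) estimate quoted above.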
P.S.: 'computational complexity' only tells you how much longer the algorithm takes as the input size grows; it says absolutely nothing about the amount of computation in absolute terms. On small inputs, many 'fast' algorithms lose to 'slow' ones in absolute numbers: yes, the number of operations grows more slowly with n, but up to some n there are simply more of them.
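A tiny numeric sketch of that trade-off (the cost functions and the constants 100 and 2 are invented purely for illustration): an O(n log n) algorithm with a large constant loses to an O(n²) algorithm with a small one until n gets fairly large.

```python
import math

# Hypothetical operation counts: a "fast" algorithm with a big
# constant versus a "slow" algorithm with a small one.
def fast_cost(n):
    return 100 * n * math.log2(n)

def slow_cost(n):
    return 2 * n ** 2

# Find the first n at which the "fast" algorithm actually wins.
n = 2
while fast_cost(n) >= slow_cost(n):
    n += 1
print(n)  # crossover point: below this, the "slow" one is cheaper
```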

BasiC2k, 2019-11-18
@BasiC2k

Complexity can be translated into man-hours for implementation.
For example:
- I implemented the algorithm in 10 hours. It was complicated.
- and I implemented the same algorithm in 2 hours. It wasn't difficult.
In itself, the concept of "complexity" is subjective and depends on many factors:
- experience in implementing similar tasks;
- availability of specific knowledge;
- development environment;
- well-being, etc.

Mercury13, 2019-11-18
@Mercury13

Let's start with what the Landau symbols O(f(n)) and o(f(n)) mean as n→∞ (named after the German mathematician Edmund Landau, not the physicist Lev Landau).
g(n) = O(f(n)) if there is a constant c such that g(n) ≤ c·f(n) for all sufficiently large n.
g(n) = o(f(n)) if g(n)/f(n) → 0 as n→∞.
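A quick numeric sanity check of the first definition (the function g(n) = 2n + 7 and the constant c = 3 are made up for this example): g(n) ≤ 3n holds for every n ≥ 7, so g(n) = O(n).

```python
# Verify g(n) = 2n + 7 satisfies g(n) <= c*f(n) with f(n) = n
# and c = 3 for all sufficiently large n (here, n >= 7).
def g(n):
    return 2 * n + 7

assert all(g(n) <= 3 * n for n in range(7, 10_000))
print("2n + 7 = O(n): verified up to n = 9999")
```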
Thus…
• Landau symbols reflect the change of something (time, memory, etc.) at n→∞.
• If, for example, there are two loops nested one inside the other, each running up to n, we write O(n²). For example, to multiply n×n matrices:

for i in range(n):
    for j in range(n):
        s = 0
        for k in range(n):
            s += a[i][k] * b[k][j]
        c[i][j] = s

There are three loops nested one inside another, each over n, so we immediately write O(n³).
• If g(n) = O(n), then also g(n) = O(n²), so writing the larger bound is not a sin against the truth. But suppose it turns out that the inner loop pops items from a queue, and only 2n events in total ever pass through that queue: then although the outer loop runs O(n) times, the inner loop performs O(n) iterations in total over the whole run, and the estimate for the algorithm immediately improves to O(n).
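That queue situation can be sketched like this (a made-up event loop, assuming each outer step enqueues at most two events):

```python
from collections import deque

# The outer loop runs n times and the inner loop pops from a queue.
# Each outer step pushes at most 2 events, so at most 2n events ever
# pass through the queue: total inner work is O(n), even though a
# naive per-nesting estimate would say O(n^2).
def process(n):
    queue = deque()
    inner_steps = 0
    for i in range(n):
        queue.append(i)
        queue.append(i)  # at most 2 events per outer step
        while queue:
            queue.popleft()
            inner_steps += 1
    return inner_steps

print(process(1000))  # 2000, i.e. 2n: linear, not quadratic
```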
• Of course, we throw out the constants: both 2n² and 200n² are O(n²). We also discard terms of lower order than the largest: n² + n log n = O(n²). This, by the way, is why we do not write the base of the logarithm: logarithms with different bases differ only by a constant factor.
• But sometimes constants are important. Fast matrix-multiplication methods carry such a large constant in front of n^2.81 that in practice they only start to pay off somewhere around 100×100 matrices (or even larger). For the same reason the simple brute-force substring search algorithm survives: it has an extremely low constant on its O(n²) worst case and a couple of other useful properties.
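A back-of-the-envelope sketch of such a crossover (the cost constants 6 and 2 are invented for illustration; real constants depend on the implementation):

```python
# If a fast matrix-multiplication method costs ~6 * n**2.81
# operations while the naive one costs ~2 * n**3, the fast method
# only wins once n is fairly large.
def fast(n):
    return 6 * n ** 2.81   # hypothetical Strassen-like cost

def naive(n):
    return 2 * n ** 3

# Double n until the fast method is actually cheaper.
n = 2
while fast(n) >= naive(n):
    n *= 2
print(n)  # first power of two tried where the fast method wins
```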
