Algorithms
dc65k, 2021-06-03 17:19:26

How to properly estimate the complexity of an O(n) algorithm?

I am learning algorithms and how to estimate their complexity. In several examples I noticed something that raised the following question. I will give an abstract example.
Say the problem is solved with a single loop:

function f1() {
    const array = Array(100).fill(null);

    // reduce visits each of the n elements exactly once
    return array.reduce((accumulator, currentValue) => {
        return accumulator;
    }, []);
}

As I understand it, this algorithm has linear complexity: O(n).

If the problem is solved using a loop with another loop nested inside it:
function f2() {
    const array = Array(100).fill(null);

    // outer loop: n iterations
    for (let i = 0; i < array.length; i++) {
        // inner loop: up to n - 1 iterations for each i
        for (let j = i + 1; j < array.length; j++) {

        }
    }
}

Then the complexity here is O(n²): the inner loop runs n−1, n−2, …, 1, 0 times, which sums to n(n−1)/2 ≈ n²/2 iterations, and O(·) drops the constant 1/2.

The question is about the third case: what if the problem is solved using, for example, loops that follow one after another?
function f3() {
    const array = Array(100).fill(null);

    // first loop: n iterations
    for (let i = 0; i < array.length; i++) {

    }

    // second loop, after the first one: another n iterations
    for (let j = 0; j < array.length; j++) {

    }
}

In the examples I have seen, this is also given as linear complexity, O(n).
But it is not the same amount of work as in the first example above.
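
To make the difference concrete, here is a minimal sketch (my own illustration, not from the question) that simply counts iterations for n = 100. The nested version performs n(n−1)/2 steps, while two consecutive loops perform only 2n, and O(2n) = O(n):

function countNested(n) {
    let steps = 0;
    for (let i = 0; i < n; i++) {
        for (let j = i + 1; j < n; j++) {
            steps++; // executes n(n - 1) / 2 times in total -> O(n²)
        }
    }
    return steps;
}

function countSequential(n) {
    let steps = 0;
    for (let i = 0; i < n; i++) {
        steps++; // n iterations
    }
    for (let j = 0; j < n; j++) {
        steps++; // another n iterations
    }
    return steps; // 2n in total -> O(2n) = O(n)
}

console.log(countNested(100));     // 4950 = 100 · 99 / 2
console.log(countSequential(100)); // 200 = 2 · 100

So consecutive loops change only the constant in front of n, and that constant disappears inside O(n).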


2 answers
Mercury13, 2021-06-03
@dc65k

f(x) = O(g(x)) as x → y is the so-called Landau notation.
It means that for x sufficiently close to y, f(x) < k·g(x) for some constant k. So whether it is 2x or 1000x simply does not matter.
Hence the notation O(log n): logarithms with different bases differ only by a constant factor, which the Landau notation absorbs.
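
A quick way to see that constant (my own sketch, not part of the original answer): the ratio between logarithms of two different bases does not depend on n, so changing the base only rescales by a constant that O(·) ignores.

// log₂ n and ln n differ only by the constant factor 1 / ln 2 ≈ 1.4427,
// so O(log₂ n) = O(ln n) = O(log n).
for (const n of [10, 1000, 1e6, 1e9]) {
    console.log(Math.log2(n) / Math.log(n)); // ≈ 1.4427 for every n
}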
Why is Landau notation interesting for programmers?
1. Caches, a fast processor, "clever" programming and the like can win you a constant factor on large datasets, say, a few times. The order of complexity of the algorithm matters much, much more.
2. While Moore's law held, data volumes grew exponentially, so it quickly reached the point where a program was run on datasets it was simply never designed for.
3. Practically acceptable algorithms usually have small complexity, say up to O(n³). Roughly, a linear algorithm will process millions of elements in acceptable time, n log n - hundreds of thousands, n² - thousands, n³ - hundreds (see the sketch after this list).
4. Programmers debug on small datasets that can be checked by hand, so the gap between debug data and production data can be large, which means the order of complexity should matter more than other factors.
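
As a rough illustration of point 3 (my own sketch; the budget of ~10⁸ elementary operations per run is an assumed number for illustration, not a measurement), one can ask what the largest n is whose operation count still fits the budget:

// Largest power-of-two n whose operation count fits a budget of ~1e8 steps.
const BUDGET = 1e8; // assumed operations-per-run budget, for illustration only

function maxN(cost) {
    let n = 1;
    while (cost(n * 2) <= BUDGET) n *= 2;
    return n;
}

console.log(maxN(n => n));                // 67108864 -> tens of millions
console.log(maxN(n => n * Math.log2(n))); // 4194304  -> millions
console.log(maxN(n => n * n));            // 8192     -> thousands
console.log(maxN(n => n ** 3));           // 256      -> hundreds

The exact numbers depend on the machine and the constant factors, but the orders of magnitude are close to the estimates in point 3.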

Denis Zagaevsky, 2021-06-03
@zagayevskiy

Read the definition of big O.
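
For reference (my addition, the standard textbook form of that definition): f(n) = O(g(n)) means there exist constants c > 0 and n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀. Applied to the question: one loop (n steps) and two consecutive loops (2n steps) both satisfy this with g(n) = n, so both are O(n).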
