How to calculate the running time of an algorithm (Big O)?
There is an algorithm, and I need to estimate its approximate running time. That is, not "record the time at the start, run it, and look at the result", but "analyze the algorithm and state its approximate running time (for given input parameters)".
I started reading about Big O, but I still can't figure it out. Below is a sorting algorithm. As I understand it, it is O(n^2), because one loop is nested inside another. Logically, f(n) = n^2; for example, if the array has 100 elements, then f(100) = 10000. But what does that number give me? How do I get from it to actual time?
private void Sorting(int[] arraySort)
{
    // Selection sort: on each pass, find the minimum of the
    // unsorted tail and swap it into position i.
    for (int i = 0; i < arraySort.Length; i++)
    {
        int minID = i;
        for (int j = i; j < arraySort.Length; j++)
        {
            if (arraySort[minID] > arraySort[j])
            {
                minID = j;
            }
        }
        int temp = arraySort[i];
        arraySort[i] = arraySort[minID];
        arraySort[minID] = temp;
    }
}
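To connect f(n) = n^2 to something concrete, you can count the dominant operation instead of measuring seconds. Below is a minimal Python sketch of the same selection sort (a translation of the C# above, for illustration only) that counts element comparisons:

```python
def selection_sort_with_count(arr):
    """Selection sort that also counts element comparisons."""
    comparisons = 0
    n = len(arr)
    for i in range(n):
        min_id = i
        for j in range(i, n):
            comparisons += 1          # one comparison per inner-loop iteration
            if arr[min_id] > arr[j]:
                min_id = j
        arr[i], arr[min_id] = arr[min_id], arr[i]
    return comparisons

# The inner loop runs n, n-1, ..., 1 times, so the total is
# n + (n-1) + ... + 1 = n*(n+1)/2, which grows like n^2/2.
print(selection_sort_with_count(list(range(100, 0, -1))))  # → 5050
```

To get wall-clock time, you would then multiply the operation count by the measured cost of one operation on your machine, which is exactly the part Big O by itself cannot tell you.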
The ALGORITHM does not have a running time; it has a number of operations (average case, worst case, ...).
The running time of the PROGRAM depends on the speed of the processor, memory, and caches, as well as on compiler settings, the OS, and memory size, and not linearly: once the program no longer fits in memory, its speed drops sharply.
Big O gives you an understanding that, for example, sorting an array of 1,000,000 elements this way is hopeless, and that if sorting 1,000 elements took 1 minute, then 2,000 will most likely take about 4.
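That last point can be turned into a rough estimation rule: for an O(n^2) algorithm, t2 ≈ t1 * (n2/n1)^2. A sketch of this extrapolation (the function name and the 1-minute figure are just illustration, not part of the original answer):

```python
def estimate_quadratic_time(t1, n1, n2):
    """Extrapolate the running time of an O(n^2) algorithm
    from one measurement: t2 = t1 * (n2/n1)^2."""
    return t1 * (n2 / n1) ** 2

# Sorting 1000 elements took 60 seconds; predict 2000 elements:
print(estimate_quadratic_time(60, 1000, 2000))  # → 240.0 (i.e. about 4 minutes)
```

This is only an order-of-magnitude estimate: constant factors, caches, and memory pressure can shift the real number considerably.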