We have discussed asymptotic analysis, worst, average and best cases, and asymptotic notations in previous posts. In this post, an analysis of iterative programs with simple examples is discussed.
O(1) constant time complexity:
The time complexity of a function (or instruction set) is considered O(1) if it does not contain loops, recursion, or calls to another non-constant time function.
In other words, a set of non-recursive statements without loops runs in constant time.
Example:
- The swap() function has O(1) time complexity (a short sketch appears after the loop examples below).
- A loop or recursion that is executed a constant number of times is also considered O(1). For example, the following loop is O(1).
C
// Here c is a constant
for (int i = 1; i <= c; i++) {
    // some O(1) expressions
}
Java
// Here c is a constant
for (int i = 1; i <= c; i++) {
    // some O(1) expressions
}
// This code is contributed by Utkarsh
Python3
# Here c is a constant
for i in range(1, c + 1):
    # some O(1) expressions
# This code is contributed by Pushpesh Raj.
JavaScript
// Here c is a constant
for (let i = 1; i <= c; i++) {
    // some O(1) expressions
}
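For completeness, here is a minimal C sketch of the swap() function mentioned above (the pointer-based signature is an assumption, since the post does not show its body). It performs a fixed number of assignments regardless of input size, so it runs in O(1).
C
// Swaps two integers through pointers. It always performs exactly
// three assignments, independent of any input size, so it is O(1).
void swap(int* a, int* b)
{
    int temp = *a;
    *a = *b;
    *b = temp;
}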
O(n) linear time complexity:
The time complexity of a loop is considered O(n) if the loop variables are incremented/decremented by a constant value. For example, the following functions have a time complexity of O(n).
C
// Here c is a positive integer constant
for (int i = 1; i <= n; i += c) {
    // some O(1) expressions
}

for (int i = n; i > 0; i -= c) {
    // some O(1) expressions
}
Java
// Here c is a positive integer constant
for (int i = 1; i <= n; i += c) {
    // some O(1) expressions
}

for (int i = n; i > 0; i -= c) {
    // some O(1) expressions
}
// This code is contributed by Utkarsh
Python3
# Here c is a positive integer constant
for i in range(1, n + 1, c):
    # some O(1) expressions

for i in range(n, 0, -c):
    # some O(1) expressions
# This code is contributed by Pushpesh Raj
JavaScript
// Here c is a positive integer constant
for (let i = 1; i <= n; i += c) {
    // some O(1) expressions
}

for (let i = n; i > 0; i -= c) {
    // some O(1) expressions
}
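As a concrete instance of a linear-time function, the following sketch (the function name and parameters are illustrative) visits each of the n elements exactly once, so the total work is O(n).
C
// Sums n array elements. The loop body is O(1) and runs n times,
// so the total time is O(n).
int sumArray(int arr[], int n)
{
    int sum = 0;
    for (int i = 0; i < n; i++)
        sum += arr[i];
    return sum;
}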
O(n^2) quadratic time complexity:
An algorithm has quadratic time complexity when its running time is directly proportional to the square of the input size. For nested loops, the time complexity equals the number of times the innermost statement is executed. For example, the following sample loops have O(n^2) time complexity:
C
for (int i = 1; i <= n; i += c) {
    for (int j = 1; j <= n; j += c) {
        // some O(1) expressions
    }
}

for (int i = n; i > 0; i -= c) {
    for (int j = i + 1; j <= n; j += c) {
        // some O(1) expressions
    }
}
Java
for (int i = 1; i <= n; i += c) {
    for (int j = 1; j <= n; j += c) {
        // some O(1) expressions
    }
}

for (int i = n; i > 0; i -= c) {
    for (int j = i + 1; j <= n; j += c) {
        // some O(1) expressions
    }
}
// This code is contributed by Utkarsh
Python3
for i in range(1, n + 1, c):
    for j in range(1, n + 1, c):
        # some O(1) expressions

for i in range(n, 0, -c):
    for j in range(i + 1, n + 1, c):
        # some O(1) expressions
# This code is contributed by Pushpesh Raj
JavaScript
for (let i = 1; i <= n; i += c) {
    for (let j = 1; j <= n; j += c) {
        // some O(1) expressions
    }
}

for (let i = n; i > 0; i -= c) {
    for (let j = i + 1; j <= n; j += c) {
        // some O(1) expressions
    }
}
Example: Selection Sort and Insertion Sort have O(n^2) time complexity.
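Selection Sort, for instance, can be written as the following common sketch (one typical way to implement it): the outer loop runs n - 1 times and the inner loop scans the unsorted suffix, so the innermost comparison executes roughly n^2/2 times, which is O(n^2).
C
// Selection sort: repeatedly selects the minimum of the unsorted
// suffix and swaps it into place. The nested loops perform about
// n^2/2 comparisons in every case, giving O(n^2) time.
void selectionSort(int arr[], int n)
{
    for (int i = 0; i < n - 1; i++) {
        int minIndex = i;
        for (int j = i + 1; j < n; j++)
            if (arr[j] < arr[minIndex])
                minIndex = j;
        int temp = arr[minIndex]; // swap the minimum into position i
        arr[minIndex] = arr[i];
        arr[i] = temp;
    }
}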
Logarithmic Time Complexity O(Log n):
The time complexity of a loop is considered O(log n) when the loop variable is divided/multiplied by a constant amount. Similarly, for the recursive calls in the recursive function shown further below, the time complexity is considered O(log n).
C
for (int i = 1; i <= n; i *= c) {
    // some O(1) expressions
}

for (int i = n; i > 0; i /= c) {
    // some O(1) expressions
}
Java
for (int i = 1; i <= n; i *= c) {
    // some O(1) expressions
}

for (int i = n; i > 0; i /= c) {
    // some O(1) expressions
}
// This code is contributed by Utkarsh
Python3
UE
=
1
while
(ue <
=
NORTE):
# some O(1) expressions
UE
=
UE
*
C
UE
=
norte
while
(ue >
0
):
# some O(1) expressions
UE
=
UE
/
/
C
# This code is contributed by Pushpesh Raj
JavaScript
for (let i = 1; i <= n; i *= c) {
    // some O(1) expressions
}

for (let i = n; i > 0; i /= c) {
    // some O(1) expressions
}
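To see why these loops are logarithmic: in the first loop, i takes the values 1, c, c^2, c^3, and so on, so after k iterations i = c^k. The loop stops once c^k > n, i.e. once k exceeds log_c(n), so the number of iterations is about log_c(n), which is O(log n). The recursive-function case referred to above is shown below: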
C
// Recursive function
void recurse(int n)
{
    if (n == 0)
        return;
    else {
        // some O(1) expressions
    }
    recurse(n - 1);
}
Java
// Recursive function
static void recurse(int n)
{
    if (n == 0)
        return;
    else {
        // some O(1) expressions
    }
    recurse(n - 1);
}
// This code is contributed by Utkarsh
Python3
# Recursive function
def recurse(n):
    if (n == 0):
        return
    else:
        # some O(1) expressions
        pass
    recurse(n - 1)
# This code is contributed by Pushpesh Raj
JavaScript
// Recursive function
function recurse(n)
{
    if (n == 0)
        return;
    else {
        // some O(1) expressions
    }
    recurse(n - 1);
}
Example: Binary Search (refer to its iterative implementation) has O(log n) time complexity.
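A typical iterative binary search looks like the following sketch in C (assuming the array is sorted in ascending order; the names are illustrative). Each iteration halves the search range, so the loop runs at most about log2(n) times.
C
// Iterative binary search on a sorted array. The range [low, high]
// halves on every iteration, so at most about log2(n) iterations
// run, giving O(log n) time.
int binarySearch(int arr[], int n, int target)
{
    int low = 0, high = n - 1;
    while (low <= high) {
        int mid = low + (high - low) / 2; // avoids integer overflow
        if (arr[mid] == target)
            return mid;      // found
        else if (arr[mid] < target)
            low = mid + 1;   // discard the left half
        else
            high = mid - 1;  // discard the right half
    }
    return -1; // not present
}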
Double-logarithmic time complexity O(Log Log n):
The time complexity of a loop is considered O(log log n) if the loop variable is increased/decreased exponentially by a constant amount.
C
// Here c is a constant greater than 1
for (int i = 2; i <= n; i = pow(i, c)) {
    // some O(1) expressions
}
// Here fun() is sqrt or cbrt or any other constant root
for (int i = n; i > 1; i = fun(i)) {
    // some O(1) expressions
}
Java
// Here c is a constant greater than 1
for (int i = 2; i <= n; i = (int)Math.pow(i, c)) {
    // some O(1) expressions
}
// Here fun() is sqrt or cbrt or any other constant root
for (int i = n; i > 1; i = fun(i)) {
    // some O(1) expressions
}
// This code is contributed by Utkarsh
Python3
# Here c is a constant greater than 1
UE
=
2
while
(ue <
=
NORTE):
# some O(1) expressions
UE
=
UE
*
*
C
# Here is the fun sqrt or cuberoot or any other constant root
UE
=
norte
while
(ue >
1
):
# some O(1) expressions
UE
=
funny (me)
# This code is contributed by Pushpesh Raj
JavaScript
// Here c is a constant greater than 1
for (let i = 2; i <= n; i = i ** c) {
    // some O(1) expressions
}
// Here fun() is sqrt or cbrt or any other constant root
for (let i = n; i > 1; i = fun(i)) {
    // some O(1) expressions
}
See this post for the mathematical details.
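In brief, for the first loop above: i starts at 2 and is raised to the power c each iteration, so after k iterations i = 2^(c^k). The loop stops once 2^(c^k) > n, i.e. once c^k > log2(n), i.e. once k exceeds log_c(log2(n)). Hence the number of iterations is O(log log n).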
How to combine time complexity of consecutive loops?
For consecutive loops, we calculate the time complexity as the sum of the time complexities of the individual loops.
C
for (int i = 1; i <= m; i += c) {
    // some O(1) expressions
}

for (int i = 1; i <= n; i += c) {
    // some O(1) expressions
}
// The time complexity of the above code is O(m) + O(n), which is O(m + n).
// When m == n, the time complexity becomes O(2n), which is O(n).
Java
for (int i = 1; i <= m; i += c) {
    // some O(1) expressions
}

for (int i = 1; i <= n; i += c) {
    // some O(1) expressions
}
// The time complexity of the above code is O(m) + O(n), which is O(m + n).
// When m == n, the time complexity becomes O(2n), which is O(n).
// This code is contributed by Utkarsh
Python3
for i in range(1, m + 1, c):
    # some O(1) expressions

for i in range(1, n + 1, c):
    # some O(1) expressions
# The time complexity of the above code is O(m) + O(n), which is O(m + n).
# When m == n, the time complexity becomes O(2n), which is O(n).
JavaScript
for (let i = 1; i <= m; i += c) {
    // some O(1) expressions
}

for (let i = 1; i <= n; i += c) {
    // some O(1) expressions
}
// The time complexity of the above code is O(m) + O(n), which is O(m + n).
// When m == n, the time complexity becomes O(2n), which is O(n).
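The same additive rule applies when one of the consecutive pieces is itself a nested loop: the costs add, and the largest term dominates. For example, an O(n) loop followed by an O(n^2) nested loop takes O(n + n^2) time, which is O(n^2).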
How to calculate the time complexity when there are many if-else statements inside loops?
As discussed here, worst-case time complexity is the most useful among the best, average, and worst cases, so we have to consider the worst case. We evaluate the situation where the values in the if-else conditions cause the maximum number of statements to be executed.
For example, consider the linear search function (sketched below), where we consider the case in which the element is present at the end or not present at all.
If the code is too complex to handle all the if-else cases, we can get an upper bound by ignoring if-else and other complex control statements.
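A minimal linear search sketch in C (the names are illustrative) makes the worst case concrete: when the target sits at the last position or is absent, the loop body executes all n times.
C
// Linear search: scans the array left to right.
// Best case: target at index 0, O(1).
// Worst case: target at the end or absent, n comparisons, O(n).
int linearSearch(int arr[], int n, int target)
{
    for (int i = 0; i < n; i++)
        if (arr[i] == target)
            return i;
    return -1;
}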
How to calculate the time complexity of recursive functions?
The time complexity of a recursive function can be written as a mathematical recurrence relation. To calculate the time complexity, we need to know how to solve recurrences. We will soon discuss recurrence-solving techniques in a separate post.
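For example, binary search discards half of the remaining range per step, so its running time satisfies the recurrence T(n) = T(n/2) + O(1), which solves to O(log n).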
Algorithm Cheat Sheet:
Algorithm | Best Case | Average Case | Worst Case |
Selection Sort | O(n^2) | O(n^2) | O(n^2) |
Bubble Sort | O(n) | O(n^2) | O(n^2) |
Insertion Sort | O(n) | O(n^2) | O(n^2) |
Tree Sort | O(n log n) | O(n log n) | O(n^2) |
Radix Sort | O(dn) | O(dn) | O(dn) |
Merge Sort | O(n log n) | O(n log n) | O(n log n) |
Heap Sort | O(n log n) | O(n log n) | O(n log n) |
Quick Sort | O(n log n) | O(n log n) | O(n^2) |
Bucket Sort | O(n+k) | O(n+k) | O(n^2) |
Counting Sort | O(n+k) | O(n+k) | O(n+k) |
Algorithm Analysis Quiz
For more details, see: Design and Analysis of Algorithms.
Write comments if you find something wrong or if you want to share more information on the topic discussed above.
FAQs
How do you analyze the time complexity of an algorithm?
In general, you can determine the time complexity by analyzing the program's statements (going line by line). However, you have to be mindful of how the statements are arranged: whether they are inside a loop, make function calls, or even recurse. All of these factors affect the runtime of your code.
How do you find the time complexity of a loop?
The time complexity of a loop is equal to the number of times the innermost statement is executed. When i=0, the inner loop executes 0 times; when i=1, it executes 1 time; and so on until i=n-1, when it executes n-1 times. In total, that is 0 + 1 + ... + (n-1) = n(n-1)/2 executions, which is O(n^2).
How do you find the time complexity of an algorithm with a while loop?
As the nested loops always run to completion and there is a constant amount of work within the inner loop, the time complexity can be determined by taking the sum of the number of inner-loop iterations in the worst case.
How do you find the complexity of two for loops?
The statements in the inner loop execute a total of N * M times. Thus, the complexity is O(N * M). In a common special case where the stopping condition of the inner loop is j < N instead of j < M (i.e., the inner loop also executes N times), the total complexity for the two loops is O(N^2).
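As a sketch of the nested case described above (N and M denote the two loop bounds):
C
// The inner statement runs N * M times in total, so the two
// nested loops together take O(N * M) time.
for (int i = 0; i < N; i++) {
    for (int j = 0; j < M; j++) {
        // some O(1) expressions
    }
}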
What are the 3 algorithm analysis techniques?
In Sections 1.3 through 1.6, we explore three important techniques of algorithm design: divide-and-conquer, dynamic programming, and greedy heuristics.
How will you analyze an algorithm? Explain in detail.
Determine the time required for each basic operation. Identify unknown quantities that can be used to describe the frequency of execution of the basic operations. Develop a realistic model for the input to the program. Analyze the unknown quantities, assuming the modelled input.