Asymptotic Notations and Analysis

One of the most important yet often skipped topics when studying algorithms is “Asymptotic Notations and Analysis”. Writing an efficient algorithm is possible only if you know how to measure its efficiency, and this is where asymptotic notations and analysis come into the picture.

Why do we need to know the time complexity?

Computers were designed to solve problems. To solve a problem, we first design an algorithm and then develop it into a program that the computer can run. The problem size may range from single-digit values to millions, and this is where the need for an efficient algorithm pops in. The efficiency of an algorithm is determined by the amount of time it takes to run and the amount of space it occupies in the computer's memory.

The amount of time consumed is known as the “Time Complexity” whereas the amount of space consumed is known as “Space Complexity”.

In this article, we will be dealing with the “Asymptotic Notation and Asymptotic Analysis”.

What is the need for analysis of algorithms?

Suppose there are two algorithms for the same problem. One might work well only for small inputs, whereas the other might work well for both small and large inputs. This is when we need to decide which algorithm to use for the problem.

In asymptotic analysis, the performance of the algorithm is judged. Here, we do not measure the actual running time; instead, we express it as a function of the input size. In other words, we describe how the time complexity and space complexity change with the size of the input.
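To make this concrete, here is a small sketch (the task and both functions are illustrative examples, not from the article): two ways of summing 1 to n, where we count basic operations instead of timing anything, so the cost is expressed purely as a function of the input size.

```python
def sum_loop(n):
    """Sum 1..n with a loop: the operation count grows linearly with n."""
    total, ops = 0, 0
    for i in range(1, n + 1):
        total += i
        ops += 1          # one addition per element
    return total, ops

def sum_formula(n):
    """Sum 1..n with Gauss's formula: constant work regardless of n."""
    return n * (n + 1) // 2, 1

# The answers agree, but the operation counts scale very differently:
for n in (10, 1000, 100000):
    _, loop_ops = sum_loop(n)
    _, formula_ops = sum_formula(n)
    print(n, loop_ops, formula_ops)
```

The loop's count scales with n (linear growth), while the formula's count stays fixed (constant growth); asymptotic analysis is precisely this comparison of growth rates.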

Types of Cases for analyzing an algorithm:-

  1. Worst Case
  2. Average Case
  3. Best Case

In Worst Case analysis, we take the upper bound of the algorithm's running time, i.e., the maximum time it can take on any input of a given size.

In Average Case analysis, we take inputs of all types, whether large or small, and calculate the average running time. However, calculating the average case is not always feasible.

In Best Case analysis, the lower bound of the algorithm's running time is analyzed, meaning the minimum running time is calculated.
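Linear search is a classic example where all three cases differ (a small illustrative sketch; the array and targets are chosen here, not taken from the article):

```python
def linear_search(arr, target):
    """Scan left to right; return (index, comparisons made), or (-1, n) if absent."""
    for i, x in enumerate(arr):
        if x == target:
            return i, i + 1
    return -1, len(arr)

arr = [4, 8, 15, 16, 23, 42]

best = linear_search(arr, 4)     # target is first: 1 comparison  -> (0, 1)
worst = linear_search(arr, 99)   # target absent: n comparisons   -> (-1, 6)
# Average case: if the target is equally likely to be at any position,
# the search makes about n/2 comparisons on average.
```

Best case the first element matches (constant time); worst case the whole array is scanned (linear time); the average sits in between.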

Asymptotic Notations

What does asymptotic mean?

The word asymptotic means approaching a certain value, which can be considered the limit. The value can range from any finite natural number to infinity.


Asymptotic notation is used to describe the running time of an algorithm when the input size n is very large, i.e., n → ∞, where n belongs to the set of natural numbers.

What are the notations?

  1. Theta Notation (Θ)
  2. Big O Notation (O)
  3. Big Omega Notation (Ω)
  4. Little O Notation (o)
  5. Little Omega Notation (ω)

Theta Notation (Θ) (Also known as Asymptotic Tight Bound)

This notation is used to determine both the upper bound and the lower bound of the algorithm. In other words, this notation gives the minimum and maximum bounds within which the algorithm's running time lies.

Θ(g(n)) = {f(n) : there exist positive constants c1, c2, and n0

such that 0 <= c1.g(n) <= f(n) <= c2.g(n) for all n >= n0}.
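As a quick sanity check of this definition, here is a sketch with f, g, and the constants chosen for illustration: f(n) = 3n^2 + 2n is sandwiched between 3n^2 and 4n^2 once n >= 2, so f(n) = Θ(n^2).

```python
def f(n): return 3 * n * n + 2 * n   # the function being bounded
def g(n): return n * n               # the candidate growth rate

# Witness constants: c1 = 3, c2 = 4, n0 = 2.
# 3n^2 <= 3n^2 + 2n always, and 3n^2 + 2n <= 4n^2 once n^2 >= 2n, i.e. n >= 2.
c1, c2, n0 = 3, 4, 2
assert all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10000))

# Below n0 the sandwich may fail, which is why n0 exists:
assert f(1) > c2 * g(1)   # 5 > 4 at n = 1
```

The constants only need to work from some n0 onward; small inputs are allowed to violate the bounds.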

Big O Notation (O) (Also known as Asymptotic Upper Bound)

This notation is used to determine the asymptotic upper bound of the algorithm. In other words, it is used to express the worst-case time complexity of the algorithm. This bound is especially useful when the input is very large, say of the order of 10^9.

O(g(n)) = {f(n) : there exist positive constants c and n0 such that 0 <= f(n) <= c.g(n) for all n >= n0}.

Note:- If f(n) = Θ(g(n)), then f(n) = O(g(n)).
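A quick check of the Big O definition on a chosen example (the function and constants here are illustrative): f(n) = 6n^3 + n^2 is O(n^3), witnessed by c = 7 and n0 = 1, since n^2 <= n^3 once n >= 1.

```python
def f(n): return 6 * n**3 + n**2   # e.g. a cubic-time cost function
def g(n): return n**3              # candidate upper-bound growth rate

# 6n^3 + n^2 <= 7n^3  iff  n^2 <= n^3  iff  n >= 1.
c, n0 = 7, 1
assert all(0 <= f(n) <= c * g(n) for n in range(n0, 5000))
```

Note that only the dominant term matters: the n^2 term is absorbed into the constant c once n is large enough.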


Big Omega Notation (Ω) (Asymptotic Lower Bound)

Big Omega notation gives the lower bound of the function that represents the algorithm's running time.

Ω(g(n)) = {f(n) : there exist positive constants c and n0 such that 0 <= c.g(n) <= f(n) for all n >= n0}.

Note:- If f(n) = Θ(g(n)), then f(n) = Ω(g(n)).
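And a matching check of the lower-bound definition (again, the function and constants are chosen here for illustration): f(n) = n^2/2 − 3n is Ω(n^2), witnessed by c = 1/4 and n0 = 12.

```python
def f(n): return n * n / 2 - 3 * n   # quadratic with a negative lower-order term
def g(n): return n * n

# n^2/2 - 3n >= n^2/4  iff  n^2/4 >= 3n  iff  n >= 12.
c, n0 = 0.25, 12
assert all(0 <= c * g(n) <= f(n) for n in range(n0, 5000))
```

The negative −3n term drags f below c·g for small n, which is exactly what the n0 threshold accounts for.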

Theorem 1:

For any polynomial P(n) = a_d.n^d + a_(d-1).n^(d-1) + … + a_1.n + a_0, where the a_i are constants and a_d > 0, we have P(n) = Θ(n^d).

Theorem 2:

For any two functions f(n) and g(n),

f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).

Little o Notation (o)

This notation is used to denote an upper bound which is not asymptotically tight.



o(g(n)) = {f(n) : for any positive constant c > 0, there exists a constant n0 > 0 such that 0 <= f(n) < c.g(n) for all n >= n0}.

Little Omega Notation (ω)

This notation is used to denote a lower bound which is not asymptotically tight.

ω(g(n)) = {f(n) : for any positive constant c > 0, there exists a constant n0 > 0 such that 0 <= c.g(n) < f(n) for all n >= n0}.

E.g., n^3 = ω(n).
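This example can be checked directly against the definition: for any c > 0, we have n^3 > c·n as soon as n > √c, so a suitable n0 exists for every c (a small sketch; the c values are chosen here for illustration).

```python
import math

def n0_for(c):
    """Smallest integer n0 with n^3 > c*n for all n >= n0, i.e. n > sqrt(c)."""
    return math.isqrt(c) + 1

# For several values of c, verify the strict lower bound from n0 onward.
for c in (1, 100, 10**6):
    n0 = n0_for(c)
    assert all(0 <= c * n < n**3 for n in range(n0, n0 + 1000))
```

The key difference from Ω is the quantifier: the bound must hold for every constant c, not just for one, which is why n^3 = ω(n) but n = Ω(n) is not ω(n).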

Theorem 1:-

If lim (n → ∞) f(n)/g(n) = c for some constant c > 0, then

f(n) = Θ(g(n))

Theorem 2:-

The relation f(n) = o(g(n)) implies that

lim (n → ∞) f(n)/g(n) = 0

Theorem 3:-

The relation f(n) = ω(g(n)) implies that

lim (n → ∞) f(n)/g(n) = ∞
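These limit behaviors can be observed numerically by evaluating the ratio f(n)/g(n) at a large n (the example functions below are chosen here for illustration):

```python
def ratio(f, g, n):
    """Evaluate f(n)/g(n) at a single large n to approximate the limit."""
    return f(n) / g(n)

n = 10**6

# Theta-style pair: f(n) = 3n^2 + 2n vs g(n) = n^2 -> ratio approaches c = 3.
assert abs(ratio(lambda n: 3 * n * n + 2 * n, lambda n: n * n, n) - 3) < 1e-3

# little-o pair: f(n) = n vs g(n) = n^2 -> ratio approaches 0.
assert ratio(lambda n: n, lambda n: n * n, n) < 1e-3

# little-omega pair: f(n) = n^3 vs g(n) = n -> ratio grows without bound.
assert ratio(lambda n: n**3, lambda n: n, n) > 1e6
```

A single large n is only a heuristic, of course; the theorems are statements about the limit, not about any finite evaluation.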

Special thanks to Ushmita Dutta for contributing to this article on takeUforward.