
# Best Guide on Asymptotic Notations in Algorithms

## Asymptotic Notations Definition

• Asymptotic notations are a language that lets us analyze an algorithm’s running time by describing how it behaves as the input size grows.
• They are also known as the algorithm’s growth rate.
• Asymptotic notations give us a method for classifying functions according to their rate of growth.
• If we have more than one algorithm for the same task, the one with the lower complexity should be selected. Asymptotic notations are used to represent these complexities.
• To estimate and compare the time consumption of programs, we need a way to describe the running time of algorithms that is independent of any particular computer.
• They let us make statements like “algorithm A is better than algorithm B”, meaning: “for sufficiently large inputs, a program that implements algorithm A will run faster than a program that implements algorithm B”.

## Best, Worst and Average-case Complexity

• Worst case: the function defined by the maximum number of steps taken on any instance of size n. This is the curve passing through the highest point in each column.
• Best case: the function defined by the minimum number of steps taken on any instance of size n. This is the curve passing through the lowest point in each column.
• Average case: the function defined by the average number of steps over all instances of size n.
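To make the three cases concrete, here is a minimal sketch (my own illustration, not from the article; the function name and data are hypothetical) that counts the comparisons a linear search performs on one instance:

```python
def linear_search(items, target):
    """Return (index, comparisons); index is -1 if target is absent.
    The comparison count is the 'number of steps' for one instance."""
    comparisons = 0
    for i, value in enumerate(items):
        comparisons += 1
        if value == target:
            return i, comparisons
    return -1, comparisons

data = [4, 8, 15, 16, 23, 42]  # an instance of size n = 6

# Best case: target is the first element -> 1 comparison.
print(linear_search(data, 4))    # (0, 1)
# Worst case: target is absent -> all n = 6 comparisons.
print(linear_search(data, 99))   # (-1, 6)
```

The average case for this search falls between the two, since a present target is found after roughly n/2 comparisons on average.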

## Types of Notations for Complexity

• Let the basic operation count of an algorithm be C(n). To compare, contrast, and rank the order of growth of algorithms,
• we use three notations:
1. O (Big Oh),
2. Ω (Big Omega),
3. Θ (Big Theta).

Assume t(n) and g(n) are non-negative functions defined on the set of natural numbers. Here t(n) is the algorithm’s running time, usually indicated by its basic operation count C(n), and g(n) is any simple function against which we compare C(n).

### Big Oh Notation

• Big Oh expresses an upper bound on the complexity of an algorithm. An algorithm whose running time does not change with the input size is O(1).
• An algorithm with O(1) is said to have constant time complexity: it takes the same amount of time even if the input size is doubled, tripled, or increased to any level.
• If ‘N’ is the size of the input and the running time grows in proportion to N, the complexity of the algorithm is O(N).
• A linear-time method is O(N).
Example:
```
If processing 100 elements takes 100 steps,
and processing 200 elements takes 200 steps,
then the algorithm is said to be O(N).
```
• The complexity of an algorithm is O(N²) if the time taken by the algorithm increases quadratically as ‘N’ increases.
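The difference between linear and quadratic growth can be seen by counting basic operations directly. This is a sketch I am adding for illustration (the function names are mine, not from the article):

```python
def linear_steps(n):
    """One pass over the input: the step count equals n -> O(N)."""
    steps = 0
    for _ in range(n):
        steps += 1
    return steps

def quadratic_steps(n):
    """A nested pass over the input: the step count equals n * n -> O(N^2)."""
    steps = 0
    for _ in range(n):
        for _ in range(n):
            steps += 1
    return steps

print(linear_steps(100))     # 100
print(quadratic_steps(100))  # 10000
```

Doubling N doubles the work of the linear function but quadruples the work of the quadratic one, which is exactly what O(N) versus O(N²) predicts.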

Note:

• Big-O expressions do not contain constants or low-order terms, because when N gets large enough, constants and low-order terms don’t matter.
Example:
```
O(N² + 100) is equivalent to O(N²)    /* ignore the constant 100 */
O(N² + N)   is equivalent to O(N²)    /* for large N, the N term is negligible */
```
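A quick numeric check (my own illustration) shows why the low-order term vanishes: the ratio (N² + N) / N² equals 1 + 1/N, which approaches 1 as N grows.

```python
# As N grows, (N**2 + N) / N**2 = 1 + 1/N approaches 1,
# so the low-order term N contributes nothing asymptotically.
for n in (10, 1_000, 1_000_000):
    ratio = (n**2 + n) / n**2
    print(n, ratio)   # the ratio shrinks toward 1 as n grows
```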

### Big Omega

• It is like >=: the rate of growth is greater than or equal to a specified value.
• Omega notation represents the algorithm’s lower bound; it gives the asymptotic lower bound.
• Big Omega (Ω) – best case.
• Big-Ω describes the smallest amount of time the algorithm could possibly take to complete, in contrast to the upper bound given by Big-O.

### Big Theta

• It is like ==: the rate of growth is equal to a specified value.
• Theta notation bounds a function from both above and below, so it describes the exact asymptotic behavior.
• Big Theta (Θ) – average case.
• Big-Θ combines Big-O and Big-Ω: when the upper and lower bounds match, Θ gives the exact growth rate of the time the algorithm takes to complete.
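A function that is both O(N) and Ω(N) is Θ(N). Summing a list is a simple sketch of this (my example, not from the article): every input of size n performs exactly n additions, so the upper and lower bounds coincide.

```python
def sum_all(items):
    """Perform exactly one addition per element, so the step count
    is n for EVERY input of size n -> Theta(n)."""
    total, additions = 0, 0
    for x in items:
        total += x
        additions += 1
    return total, additions

print(sum_all([1, 2, 3, 4, 5]))   # (15, 5)
print(sum_all(list(range(10))))   # (45, 10): always n additions for n items
```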


##### Naveed Tawargeri

Hi everyone, Naveed Tawargeri here. I am a graduate of Information Science Engineering and work as a Digital Marketing Executive.