# Data Structures – Asymptotic Analysis

Asymptotic analysis of an algorithm refers to defining the mathematical bound of its run-time performance. Using asymptotic analysis, we can conclude the best case, average case, and worst case scenario of an algorithm.

Asymptotic analysis is input bound i.e., if there's no input to the algorithm, it is concluded to work in a constant time. Other than the "input" all other factors are considered constant.

Asymptotic analysis refers to computing the running time of any operation in mathematical units of computation. For example, the running time of one operation is computed as f(n), and for another operation it may be computed as g(n^{2}). This means the first operation's running time will increase linearly with the increase in n, and the second operation's running time will increase quadratically when n increases. Likewise, the running times of the two operations will be nearly the same if n is significantly small.
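As a sketch of this contrast (the step-counting loops below are illustrative, not any particular algorithm), a linear f(n) = n operation and a quadratic g(n) = n^{2} operation can be compared directly by counting their steps:

```python
def linear_steps(n):
    """Count the steps of an operation whose running time is f(n) = n."""
    steps = 0
    for _ in range(n):
        steps += 1
    return steps

def quadratic_steps(n):
    """Count the steps of an operation whose running time is g(n) = n^2."""
    steps = 0
    for _ in range(n):
        for _ in range(n):  # inner loop makes the total n * n
            steps += 1
    return steps

# For small n the two counts are close; for large n they diverge sharply.
for n in (2, 10, 100):
    print(n, linear_steps(n), quadratic_steps(n))
```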

Usually, the time required by an algorithm falls under three types −

- **Best Case** − Minimum time required for program execution.
- **Average Case** − Average time required for program execution.
- **Worst Case** − Maximum time required for program execution.
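A linear search illustrates all three cases (the function name and sample data below are illustrative, not from the text):

```python
def linear_search(items, target):
    """Return the index of target, scanning left to right; -1 if absent."""
    for i, value in enumerate(items):
        if value == target:
            return i  # comparisons made so far = i + 1
    return -1

data = [7, 3, 9, 1, 5]
# Best case:    target is the first element  -> 1 comparison.
linear_search(data, 7)
# Worst case:   target is last or absent     -> n comparisons.
linear_search(data, 5)
linear_search(data, 4)
# Average case: target equally likely at any
# position -> about n/2 comparisons.
```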

## Asymptotic Notations

Following are the commonly used asymptotic notations to calculate the running time complexity of an algorithm.

- Ο Notation
- Ω Notation
- θ Notation

### Big Oh Notation, Ο

The notation Ο(n) is the formal way to express the upper bound of an algorithm’s running time. It measures the worst case time complexity or the longest amount of time an algorithm can possibly take to complete.

For example, for a function *f*(n)

Ο(f(n)) = { g(n) : there exists c > 0 and n_{0} such that g(n) ≤ c.f(n) for all n > n_{0} }
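For instance, a running time of 3n + 2 is Ο(n): with the witnesses c = 4 and n_{0} = 2 (constants chosen here purely for illustration), the inequality 3n + 2 ≤ 4n holds for every n > n_{0}. A quick numeric check of the definition:

```python
def f(n):
    """Illustrative running time f(n) = 3n + 2, claimed to be O(n)."""
    return 3 * n + 2

c, n0 = 4, 2  # witness constants for the Big-O definition

# Verify f(n) <= c * n for all n > n0 over a large sample range.
assert all(f(n) <= c * n for n in range(n0 + 1, 10_000))
```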

### Omega Notation, Ω

The notation Ω(n) is the formal way to express the lower bound of an algorithm’s running time. It measures the best case time complexity or the best amount of time an algorithm can possibly take to complete.

For example, for a function *f*(n)

Ω(f(n)) = { g(n) : there exists c > 0 and n_{0} such that g(n) ≥ c.f(n) for all n > n_{0} }

### Theta Notation, θ

The notation θ(n) is the formal way to express both the lower bound and the upper bound of an algorithm’s running time. It is represented as follows −

θ(f(n)) = { g(n) : g(n) = Ο(f(n)) and g(n) = Ω(f(n)) for all n > n_{0} }

## Common Asymptotic Notations

Following is a list of some common asymptotic notations −

| Name | Notation |
| --- | --- |
| constant | Ο(1) |
| logarithmic | Ο(log n) |
| linear | Ο(n) |
| n log n | Ο(n log n) |
| quadratic | Ο(n^{2}) |
| cubic | Ο(n^{3}) |
| polynomial | n^{Ο(1)} |
| exponential | 2^{Ο(n)} |
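To see how quickly these classes diverge, a small sketch tabulating each growth function for a few input sizes (the exponential class is left out of the loop, since 2^{n} is unreadably large even for n = 100):

```python
import math

sizes = [10, 100, 1000]
growth = {
    "constant":    lambda n: 1,
    "logarithmic": lambda n: math.log2(n),
    "linear":      lambda n: n,
    "n log n":     lambda n: n * math.log2(n),
    "quadratic":   lambda n: n ** 2,
    "cubic":       lambda n: n ** 3,
}

# Print each class's (rounded) value at every sample size.
for name, f in growth.items():
    print(f"{name:12}", [round(f(n)) for n in sizes])
```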