# Different Notations

Now that we know, on a high level, what time & space complexities are, it's a good time to look at different ways to represent them and try to make sense of them practically.

Let's clear up the *make sense of it* part right now. It's simple: the worse the time complexity, the more time the code takes to execute given the same input. So, given the constraints and time limit of a problem, we'll have to work out the worst time complexity our code can have and still pass, & of course, make sure we find an algorithm that works within that time complexity.
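For instance, here's a rough back-of-the-envelope check. The 10^8 operations-per-second figure is a common rule of thumb, not an exact number, and `fits_time_limit` is a hypothetical helper I made up for illustration:

```python
import math

# Assumption: a typical judge runs roughly 10**8 simple operations per second.
OPS_PER_SECOND = 10**8

def fits_time_limit(ops, limit_seconds=1.0):
    """Return True if `ops` operations roughly fit within the time limit."""
    return ops <= limit_seconds * OPS_PER_SECOND

n = 10**5
print(fits_time_limit(n * n))             # O(n^2): 10^10 ops -> too slow
print(fits_time_limit(n * math.log2(n)))  # O(n log n): ~1.7 * 10^6 ops -> fine
```

So for n up to 10^5 with a 1-second limit, an O(n^2) solution is out of the question, while O(n log n) fits comfortably.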

The same goes for space complexity, but as I wrote in the last article, it isn't a concern in more than 95% of cases.

## Diving into the Notations

Asymptotic notations are mathematical notations used to describe an algorithm's time complexity (and space complexity).

There are mainly three asymptotic notations that we'll talk about:

- Big Oh notation (`O`) - upper bound
- Omega notation (`Ω`) - lower bound
- Theta notation (`Θ`) - exact bound

Among these three, Big O and Theta are particularly useful while analyzing algorithms.

### Big Oh Notation - (O)

Mathematically, we say that f(n) = O(g(n)) if there exist positive constants N₀ and C such that f(n) <= C·g(n) ∀ n >= N₀.

Let's not get into the mathematical definition for now, the crux (in easily understandable terms) is that the Big O notation represents the **upper bound** of the running time of an algorithm.

Since it gives the worst-case running time of an algorithm, it is widely used to analyze an algorithm as we are always interested in the worst-case scenario.
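As a quick illustration (a standard linear search, my own example rather than one from the article): in the worst case, the loop below has to inspect all n elements, so its running time is O(n).

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if it's absent."""
    for i, value in enumerate(items):  # worst case: runs all n times
        if value == target:
            return i
    return -1

print(linear_search([4, 2, 9], 9))  # found at the last position
print(linear_search([4, 2, 9], 7))  # absent: n comparisons, the worst case
```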

Here are some common time complexities (ordered best to worst):

- O(1): Constant time complexity
- O(log n): Logarithmic time complexity
- O(n): Linear time complexity
- O(nlogn): Linearithmic time complexity
- O(n^2): Quadratic time complexity
- O(2^n): Exponential time complexity
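To make a few of these classes concrete, here are small self-contained sketches (my own examples, not from the article) of functions in three of them:

```python
def first_item(items):
    # O(1): one step, regardless of input size
    return items[0]

def total(items):
    # O(n): touches each element exactly once
    result = 0
    for x in items:
        result += x
    return result

def has_duplicate(items):
    # O(n^2): compares every pair of elements
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False
```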

### Omega Notation - (Ω)

Mathematically, we say that f(n) = Ω(g(n)) if there exist positive constants N₀ and C such that f(n) >= C·g(n) ∀ n >= N₀.

Again, the crux is that the Ω notation represents the **lower bound** of the running time of an algorithm.
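One way to see upper and lower bounds side by side (an instrumented search I wrote for illustration): linear search is O(n) in the worst case, but Ω(1), because the best case finds the target on the very first comparison.

```python
def linear_search_count(items, target):
    """Return (index, comparisons) so the bounds are visible."""
    comparisons = 0
    for i, value in enumerate(items):
        comparisons += 1
        if value == target:
            return i, comparisons
    return -1, comparisons

print(linear_search_count([7, 3, 5], 7))  # best case: 1 comparison -> Ω(1)
print(linear_search_count([7, 3, 5], 9))  # worst case: n comparisons -> O(n)
```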

### Theta Notation - (Θ)

Mathematically, we say that f(n) = Θ(g(n)) if there exist positive constants N₀, C₁ and C₂ such that C₁·g(n) <= f(n) <= C₂·g(n) ∀ n >= N₀.

In simpler terms, if an algorithm has, say, Θ(N^2) time complexity, it basically means it's an **exact bound**, because the time taken by the algorithm will never grow faster than C₂·N^2 or slower than C₁·N^2.
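For example (again my own sketch): summing an array does exactly one loop iteration per element, no matter what the values are, so its running time is Θ(n): never asymptotically faster, never slower.

```python
def sum_with_steps(items):
    """Sum the list and count loop iterations."""
    result, steps = 0, 0
    for x in items:
        result += x
        steps += 1  # always exactly len(items) iterations: Θ(n)
    return result, steps

print(sum_with_steps([1, 2, 3, 4]))  # (10, 4): steps == n for every input of size n
```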

## Summing Things Up

If an algorithm has O(N) time complexity, it means, for a sufficiently large input:

- The code may run within Θ(log N) time.
- It may run within Θ(N) time.
- However, it will never take Θ(N^2) time (because n^2 grows faster than n).

If an algorithm has Ω(N) time complexity, it means, for a sufficiently large input:

- The code may run within Θ(N^2) time.
- It may run within Θ(N) time.
- However, it will never take Θ(log N) time (because *log n* grows slower than *n*).

If an algorithm has Θ(N) time complexity, it means, for a sufficiently large input:

- The code will complete execution in Θ(N) time.
- It will never take Θ(N^2) time (because n^2 grows faster than n).
- It will never take Θ(N^3) time (because n^3 grows faster than n).

Basically, in the case of Theta, if a program has a certain time complexity, say Θ(log n), we can be sure that its running time grows neither faster nor slower than log n (up to constant factors).
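To build intuition for how quickly these classes pull apart, here's a quick numeric comparison. The numbers are illustrative step counts, not measured running times:

```python
import math

# How fast each growth function blows up as n increases.
print(f"{'n':>8} {'log n':>8} {'n log n':>12} {'n^2':>14}")
for n in (10, 1_000, 100_000):
    print(f"{n:>8} {math.log2(n):>8.1f} {n * math.log2(n):>12.0f} {n**2:>14}")
```

At n = 100,000, log n is still under 17 while n^2 is already 10^10: exactly why the notation matters when picking an algorithm for a given input size.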