# Big Omega Functions and Examples – Complete Guide

Sometimes, we want to say that an algorithm takes at least a certain amount of time, without providing an upper bound. For this we use **big-Ω** notation; that's the Greek letter "**omega**."

## Big Omega definition

A function t(n) is said to be in Ω(g(n)), denoted t(n) ∈ Ω(g(n)), if t(n) is bounded below by some positive constant multiple of g(n) for all large n, i.e., if there exist some positive constant c and some nonnegative integer n0 such that

t(n) ≥ c·g(n) for all n ≥ n0

- The Big Omega (Ω) function is **used in computer science to describe the performance or complexity of an algorithm**. If a running time is Ω(f(n)), then for large enough n, the running time is at least k·f(n) for some constant k.

- Big Omega describes the lower bound of an algorithm.

- Big-Omega, commonly denoted by Ω, is an asymptotic notation used in the best-case analysis of an algorithm.

- Ω notation provides an asymptotic lower bound.
- Sometimes we want to say that an algorithm takes at least a certain amount of time, without providing an upper bound; big-Ω notation expresses exactly that.
- If the running time is Ω(f(n)), then for large enough n, the running time is at least k·f(n) for some constant k.
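The lower-bound condition in the bullets above can be spot-checked numerically. The sketch below is illustrative only: the helper `satisfies_omega` and the sample functions are our own choices, and checking a finite range of n is evidence, not a proof.

```python
def satisfies_omega(t, g, c, n0, n_max=1000):
    """Check t(n) >= c * g(n) for every integer n in [n0, n_max].

    A True result only supports the claim on the tested range;
    a proof still requires the algebraic argument.
    """
    return all(t(n) >= c * g(n) for n in range(n0, n_max + 1))

# Example: t(n) = 3n^2 + 2n is Omega(n^2) with the witnesses c = 3, n0 = 1.
print(satisfies_omega(lambda n: 3 * n**2 + 2 * n, lambda n: n**2, c=3, n0=1))

# Counter-example: t(n) = n is NOT Omega(n^2); no c works for large n.
print(satisfies_omega(lambda n: n, lambda n: n**2, c=1, n0=1))
```

The first check prints `True`, the second `False`, matching the definition: a valid pair (c, n0) must make the inequality hold for *every* n ≥ n0.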

**Note:**

- The best-case Ω of an algorithm describes a lower bound over all combinations of input: the running time can never get better than this bound. For example, when sorting an array, the best case is when the array is already sorted.
- The worst-case Ω describes a lower bound for the worst-case input combination, which is possibly greater than the best-case bound. For example, when sorting an array, the worst case is when the array is sorted in reverse order.
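The sorting example in the note can be made concrete by counting key comparisons in insertion sort (a hypothetical illustration we add here, not something prescribed by the text): a sorted input needs about n comparisons, a reverse-sorted input about n²/2.

```python
def insertion_sort_comparisons(a):
    """Sort a copy of `a` by insertion sort and return the comparison count."""
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1          # one key comparison
            if a[j] > key:
                a[j + 1] = a[j]       # shift larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons

n = 100
best = insertion_sort_comparisons(range(n))           # already sorted input
worst = insertion_sort_comparisons(range(n, 0, -1))   # reverse-sorted input
print(best, worst)  # 99 comparisons vs 4950 comparisons
```

For n = 100 this prints `99 4950`: the best case grows like n (so it is Ω(n)), while the worst case grows like n²/2 (so it is Ω(n²)), matching the two bounds described above.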

**Big Omega Examples:**

**n³ + 4n² = Ω(n²)**

**Proof:** Here, we have f(n) = n³ + 4n² and g(n) = n².

It is not too hard to see that if n ≥ 0,

n³ ≤ n³ + 4n²

We have already seen that if n ≥ 1,

n² ≤ n³

Thus, when n ≥ 1,

n² ≤ n³ ≤ n³ + 4n²

Therefore,

1·n² ≤ n³ + 4n² for all n ≥ 1

Thus, we have shown that n³ + 4n² = Ω(n²)

(by definition of Big-Ω, with n0 = 1, and c = 1.)
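As a sanity check on the chain of inequalities in this proof, the one-liner below (our own addition) verifies n² ≤ n³ ≤ n³ + 4n² over a finite range of n; the algebraic proof above is what actually establishes it for all n ≥ 1.

```python
# Spot-check the proof's inequality chain for 1 <= n <= 10000.
ok = all(n**2 <= n**3 <= n**3 + 4 * n**2 for n in range(1, 10_001))
print(ok)  # True
```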

**Let f(n) = 5.5n² – 7n.**

We need to verify whether f(n) is Ω(n²).

Let c be a constant such that 5.5n² – 7n ≥ cn². Dividing by n (for n > 0) gives (5.5 – c)n ≥ 7, i.e., n ≥ 7/(5.5 – c).

Fix c = 3 to get n ≥ 2.8. So n0 = 2.8 and c = 3. This shows that there exist positive constants c = 3 and n0 = 2.8 such that 0 ≤ cn² ≤ f(n) for all n ≥ n0.
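The witnesses c = 3 and n0 = 2.8 found above can be spot-checked numerically (a sketch we add for illustration; the smallest integer n satisfying n ≥ 2.8 is 3):

```python
# Verify f(n) = 5.5n^2 - 7n >= 3n^2 for integer n >= 3 (i.e., n >= n0 = 2.8).
f = lambda n: 5.5 * n**2 - 7 * n
holds = all(f(n) >= 3 * n**2 for n in range(3, 10_001))
print(holds)  # True
```

At the boundary n = 3, f(3) = 28.5 ≥ 27 = 3·3², so the bound holds with little room to spare, which is consistent with n0 = 2.8 being tight.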

## Big Omega function

Similar to big O notation, the big Omega (Ω) function is used in computer science to describe the performance or complexity of an algorithm: where big O gives an upper bound, big Omega gives a lower bound.