Say hello to Big (O)h!

Oh or rather O, chosen by Bachmann to stand for Ordnung, meaning the Order of approximation.

"Wait, what is this guy saying?"

Helloooo and welcome back! Do not mind my curtain-raiser for this article, for what we are going to be covering is super easy: the Big O notation.

According to Wikipedia, the Big O notation is a mathematical notation that describes the limiting behavior of a function as its argument tends towards a particular value or infinity. It characterizes functions according to their growth rates, and different functions with the same growth rate may be represented using the same O notation.

Ooh oh! Not again. 😅
In simple terms (as used in Data Structures), the Big O notation is a way of comparing two pieces of code mathematically. Super simple, right?

Comparison of two pieces of code

Assuming Code 1 and Code 2 accomplish exactly the same thing, how would you compare one to the other? Code 1 might be more readable while Code 2 might be more concise. Big O is a way of comparing Code 1 and Code 2 mathematically on how efficiently they run.
Assume, too, that you have a stopwatch. We run Code 1 with the stopwatch and it completes in 15 seconds; we reset the stopwatch, run Code 2, and it completes in 1 minute. Based on this, we can say Code 1 is more efficient than Code 2. This is called Time Complexity.
The interesting thing about Time Complexity is that it is not measured in time. If you took the same code and ran it on a computer that is twice as fast, it would complete twice as fast, which does not make the code better; it makes the computer better. Time Complexity is therefore measured in the number of operations it takes to complete something, a topic we will look deeper into with more examples in this course.

In addition to Time Complexity, we also measure Space Complexity. Despite Code 1 running faster, it might be taking a lot of memory, while Code 2, despite taking more time to execute, might be taking less memory. If preserving memory is your most important priority and you don't mind some extra Time Complexity, maybe Code 2 is better.
It is therefore best to understand both concepts so that, when given an interview question about either one, you can address the stated priority: a program that runs as fast as possible, or one that conserves memory.
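To make the trade-off concrete, here is a minimal sketch (my own illustration, not from the original article) of two functions that do the same job with different Space Complexity:

def sum_with_list(n):
    # O(n) space: materializes all n numbers in memory before summing
    numbers = list(range(n))
    return sum(numbers)

def sum_running_total(n):
    # O(1) space: keeps only a single running total, however big n gets
    total = 0
    for i in range(n):
        total += i
    return total

Both take O(n) time, but the first one's memory use grows with n while the second one's stays constant.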

When dealing with Time Complexity and Space Complexity, there are three Greek letters one will interact with: Ω (omega), Θ (theta), and O (omicron).

  • Assuming you have a list with seven items, and you are to build a for loop to iterate through it to find a specific number:
#Assuming we are looking for number '1', number '4' and number '7' in the list below:

[1, 2, 3, 4, 5, 6, 7]

To get the number '1', we find it in one operation, hence that's our best-case scenario. But when we are looking for the number '7', we have to iterate through the entire list to find it, hence that's our worst-case scenario. Looking for the number '4' is our average-case scenario.
When one talks about the best-case scenario, that's Ω (omega); the average-case scenario is Θ (theta); and the worst-case scenario is O (omicron). Hence, technically, when talking about Big O we are mainly talking about the worst-case scenario (see the sketch below).
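A minimal linear-search sketch (my own illustration, not code from the article) showing where those three cases come from:

def linear_search(items, target):
    # Best case Ω(1): the target is the first item checked.
    # Average case Θ(n): on average, about half the items are checked.
    # Worst case O(n): the target is the last item, or missing entirely.
    for index, value in enumerate(items):
        if value == target:
            return index
    return -1

linear_search([1, 2, 3, 4, 5, 6, 7], 1)  # 1 operation:  best case (omega)
linear_search([1, 2, 3, 4, 5, 6, 7], 4)  # 4 operations: average case (theta)
linear_search([1, 2, 3, 4, 5, 6, 7], 7)  # 7 operations: worst case (omicron)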

To start with, we will mainly focus on Time Complexity.
We are gonna discuss the different types of Big O notations as printed on my 'sweatshirt' below 😅 (never mind! haha):

My sweatshirt

Below are images summarizing, for each notation, some of the complexity characteristics we will discuss:

Big-o Complexity chart

Common Data Structure Operations

Array Sorting Algorithms

Disclaimer: I have used Python code to explain each complexity.

  • O(n)

This is one of the easiest Big Os to explain. Let's use some code to illustrate it:

def print_items(n):
    # the loop body runs once for each number from 0 to n-1: n operations
    for i in range(n):
        print(i)
print_items(10)

0
1
2
3
4
5
6
7
8
9

The above print_items function is an example of something that is O(n): we passed the number n into the function and it ran n times, i.e. n operations. The number of operations grows in direct proportion to n.


Visualizing this on a graph:

   |                              *
   |                    O(n) *
 T |                    *
 i |               *
 m |          *
 e |     *
   |_*_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
                input size

O(n) is proportional: the number of operations grows in a straight line with the input size.

  • Drop Constants

There are a few ways in which we can simplify the Big O notation. The first one is known as drop constants.
Using our previous example, the print_items function, we are gonna add a second, identical for loop:

def print_items(n):
    for i in range(n):
        print(i)

    for j in range(n):
        print(j)
print_items(10)

0
1
2
3
4
5
6
7
8
9
0
1
2
3
4
5
6
7
8
9

The above function ran n + n times, i.e. 2n operations, which we write as O(2n). We can simplify this by dropping the constant 2 and writing it as O(n). We can therefore drop all constants, e.g. O(10n), O(100n), and O(10000n) all simplify to O(n).

  • O(n^2)

Adding a loop inside another loop (a nested for loop), using our previous example:

def print_items(n):
    for i in range(n):          # outer loop: n passes
        for j in range(n):      # inner loop: n passes per outer pass
            print(i, j)         # runs n * n times in total
print_items(10)

0 0
0 1
0 2
...
0 9
1 0
1 1
...
9 8
9 9

(output truncated: 10 * 10 = 100 lines in total)

The above function runs n * n == n^2 times, hence we get O(n^2). By adding an extra for loop and taking the function a level deeper (my apologies, the output gets long, so it is truncated below, but it easily helps you understand how this works out):

def print_items(n):
   for i in range(n):
      for j in range(n):
         for k in range(n):
            print(i, j, k)
print_items(10)
0 0 0
0 0 1
0 0 2
...
0 0 9
0 1 0
0 1 1
...
9 9 8
9 9 9

(output truncated: 10 * 10 * 10 = 1,000 lines in total)

The above function runs n * n * n times, hence it becomes O(n^3). Note that the exponent is not a constant, so it cannot be dropped: O(n^3) is a distinct, even steeper complexity than O(n^2).

Notice the first and last elements in the output of each function: O(n) - 0 and 9 || O(n^2) - 0 0 and 9 9 || O(n^3) - 0 0 0 and 9 9 9.

Visualizing it on a graph:

   |          *                  
   |  O(n^2) *              
 T |        *          
 i |       *     
 m |      *  
 e |    * 
   |_*_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
                input size

The O(n^2) graph is steeper than the O(n) graph. This means that O(n^2) is a lot less efficient than O(n) from a Time Complexity standpoint.

   |          *                  *
   |  O(n^2) *              * 
 T |        *          *
 i |       *      * O(n)
 m |      *   *
 e |    * *
   |_*_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
                input size
  • Drop Non-Dominants

The second way to simplify the Big O notation is known as drop non-dominants.
Using the nested for loop example:

def print_items(n):
   for i in range(n):
      for j in range(n):
         print(i, j)
   for k in range(n):
      print(k)
print_items(10)
0 0
0 1
0 2
...
0 9
1 0
1 1
...
9 8
9 9

(nested-loop output truncated: 100 pairs in total, followed by the single loop's output)
0
1
2
3
4
5
6
7
8
9

The above function has two parts. The first, nested loop (i and j) is O(n^2), while the second, single loop (k) is O(n). Hence the total number of operations is O(n^2) + O(n), which can also be written as O(n^2 + n).
The main question we care about is: what happens when 'n' becomes a very large number? E.g., when n is 100, n^2 is 10,000, while the stand-alone n is still only 100 - a very small portion of the total number of operations, hence insignificant. In the expression O(n^2 + n), n^2 is the dominant term and the stand-alone n is the non-dominant term; we drop the non-dominant term and the notation simplifies to O(n^2), as the quick check below shows.
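A quick sanity check of that arithmetic (my own snippet, not from the article):

n = 100
total_ops = n**2 + n       # 10100 operations in total
print(total_ops)           # 10100
print(n / total_ops)       # ~0.0099: the stand-alone n is under 1% of the work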

  • O(1)

With the other Big Os we have covered, the number of operations increases as 'n' gets bigger, but with this Big O...:

def add_items(n):
    return n + n   # a single operation, regardless of the size of n
add_items(5)
10

...there is always a single operation (one addition), and an increase in 'n' does not lead to an increase in the number of operations; it remains constant. Even return n + n + n would only be two operations, i.e. O(2), which simplifies to O(1).
O(1) is also called constant time, because as 'n' increases, the number of operations remains constant.
Visualizing it on a graph:

   |                              
   |                    
 T |                    
 i |               
 m |          
 e |              O(1) 
   |_*_*_*_*_*_*_*_*_*_*_*_*_*_*_*_*_ _ 
                input size

As 'n' increases, there is no increase in Time Complexity. This makes it the most efficient Big(O). Any time you can make something O(1), it is as optimal as you can make it.

   |          *                  *
   |  O(n^2) *              * 
 T |        *          *
 i |       *      * O(n)
 m |      *   *
 e |    * *           O(1)
   |_*_*_*_*_*_*_*_*_*_*_*_*_*_*_*_ _ _ 
                input size
  • O(log n)

To demonstrate and explain this, we will start with an example.
Assume we have a list and we are looking for the number '1' in it:

[1, 2, 3, 4, 5, 6, 7, 8]

The numbers must be sorted for this to work.
What we are going to do is find the most efficient way of locating the number '1' in the list.
We first divide the list in half and check whether the number is in the first half or the second half. We keep the half containing the number and discard the other, then repeat the process until we find the number.

[1, 2, 3, 4] [5, 6, 7, 8] --- 1 is in the first half
[1, 2] [3, 4] --- 1 is in the first half
[1] [2] --- 1 is in the first half
1 --- We finally remain with the number

We then count how many steps we took to find the number:

1. [1, 2, 3, 4] [5, 6, 7, 8] --- 1 is in the first half
2. [1, 2] [3, 4] --- 1 is in the first half
3. [1] [2] --- 1 is in the first half

We took 3 steps to find the number out of a total of 8 items.
2^3 = 8, which is the same as saying log2(8) = 3.
8 -- total items
3 -- steps taken to arrive at the final number after repeatedly halving the list
A bigger example: log2(1,073,741,824) = 30, meaning you can halve a list of over a billion items only 30 times before you are down to a single item, no matter which item you're looking for.
O(log n) is more efficient than O(n) but slightly less efficient than O(1).
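The halving strategy we just walked through is binary search. Here is a minimal sketch of it (my own illustration; the article only describes the idea):

def binary_search(sorted_items, target):
    # Each pass halves the remaining search range, so at most
    # about log2(n) comparisons are needed: O(log n).
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            low = mid + 1      # target can only be in the right half
        else:
            high = mid - 1     # target can only be in the left half
    return -1                  # target is not in the list

binary_search([1, 2, 3, 4, 5, 6, 7, 8], 1)  # found after ~3 halvings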

  • O(n log n)

This is a Big(O) used by some sorting algorithms like merge sort and quicksort. It is not the most efficient Big(O) on the chart; however, it is the most efficient you can make a general-purpose sorting algorithm that sorts various types of data besides numbers. A sketch follows below.
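To make the O(n log n) shape concrete, here is a minimal merge sort sketch (my own illustration, not code from the article):

def merge_sort(items):
    # O(n log n): the list is halved about log2(n) times, and each
    # level of halving does O(n) work to merge sorted halves back.
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])    # one side may still have leftovers
    merged.extend(right[j:])
    return merged

print(merge_sort([7, 3, 8, 1, 5, 2, 6, 4]))  # [1, 2, 3, 4, 5, 6, 7, 8]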

Different terms for inputs is a Big(O) concept that interviewers like to use as a kind of gotcha question.

An example is the function below with different parameters:

def print_items(a, b):
    for i in range(a):
        print(i)

    for j in range(b):
        print(j)

When covering O(n), we mentioned dropping constants, and hence you might think that because this function has two loops, it can be equated to O(2n) == O(n), WHICH IS WRONG.
Because the function has two independent parameters, we cannot assume 'a' is equivalent to 'b': one loop runs a times and the other runs b times, hence the Big(O) is O(a + b).
Similarly:

def print_items(a, b):
    for i in range(a):
       for j in range(b):
          print(i, j) 

...the Big(O) is O(a * b).

Big(O) of Lists

Assuming we have a list my_list = [11, 2, 23, 7]:
If we append the number 17 to the end (my_list.append(17)), there is no re-indexing of the existing items in the list.
The same applies when we pop the last number from the list (my_list.pop()): no re-indexing occurs. Both are therefore a Big(O) of O(1).

However, when we pop the number at index 0 (my_list.pop(0)), or insert a number at index 0 (my_list.insert(0, 13)), re-indexing of all the items in the list occurs. This is therefore a Big(O) of O(n).

What about inserting an item in the middle of the list? (my_list.insert(1, 'Hi'))
Re-indexing occurs for all the items to the right of the inserted item. Being in the middle does not make it O(n/2): first, Big(O) measures the worst case, and second, 1/2 is a constant, so we drop it. Therefore the Big(O) of inserting into or popping from the middle of a list is O(n). All of these operations are summarized below.
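Here is the same set of operations in one place, with the complexity of each noted in comments (my own summary snippet):

my_list = [11, 2, 23, 7]

my_list.append(17)       # O(1): added at the end, no re-indexing (amortized)
my_list.pop()            # O(1): removed from the end, no re-indexing
my_list.pop(0)           # O(n): every remaining item shifts one index left
my_list.insert(0, 13)    # O(n): every existing item shifts one index right
my_list.insert(1, 'Hi')  # O(n): all items to the right of index 1 shift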

  • Looking for an item in a list

When looking for the number 7 in the list my_list = [11, 3, 32, 7] by value, we iterate over the entire list until we find it, hence the Big(O) is O(n). But when we look a number up by its index (my_list[3]), we jump straight to it without iterating, hence the Big(O) is O(1).
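In code (my own snippet), the difference looks like this:

my_list = [11, 3, 32, 7]

my_list.index(7)  # O(n): scans the list from the start until it finds 7
my_list[3]        # O(1): jumps straight to index 3, no iteration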

Wrapping Up

When we take n to be 100:
O(1) == 1
O(log n) == 7
O(n) == 100
O(n^2) == 10000

This means that O(n^2) compared to the other 3 is very inefficient. The spread becomes even bigger when 'n' becomes larger.

When we take n to be 1000:
O(1) == 1
O(log n) == 10
O(n) == 1000
O(n^2) == 1000000
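A quick way to reproduce those numbers (my own snippet; log n is rounded up to a whole number of operations):

import math

for n in (100, 1000):
    # O(1) is always 1 operation; the others grow with n
    print(f"n={n}: O(1)=1, O(log n)~{math.ceil(math.log2(n))}, O(n)={n}, O(n^2)={n**2}")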

  • Terminologies used to refer to the Big(O)s:

O(1) - Constant
O(log n) - Divide and Conquer
O(n) - Proportional (will always be a straight line)
O(n^2) - Loop within a loop

Summing it all up: we can now understand the chart below, which we saw earlier in the course. Note that we didn't cover O(n!) and O(2^n), as they are not common and one has to write intentionally bad code to achieve them.

Big-o Complexity chart

Till Next time... Bye bye


This content originally appeared on DEV Community 👩‍💻👨‍💻 and was authored by Mark Gatere

