Chandra Prakash Tiwari

What is Big-O Notation? Understand Time and Space Complexity in JavaScript.

As we know, there is often more than one solution to a problem, but it is hard to say which approach to a programming problem is the best.
Writing an algorithm that solves a given problem becomes more difficult when we need to handle a large amount of data, and how we write every line of code matters.

There are two main measures that help us choose the most efficient way of writing an algorithm (a quick sketch of the difference follows this list):

1. Time Complexity - the time an algorithm takes to run.

2. Space Complexity - the total space or memory an algorithm takes up.
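
To get a quick feel for space complexity, here is a minimal sketch (the function names are made up for this illustration): one version allocates a new array as large as the input, so it needs O(n) extra space, while the other reuses the input array and only needs O(1) extra space.

```javascript
// O(n) extra space: builds a brand-new array as large as the input
function doubledCopy(arr) {
  const result = [];
  for (const value of arr) {
    result.push(value * 2);
  }
  return result;
}

// O(1) extra space: reuses the input array, adding only a loop counter
function doubleInPlace(arr) {
  for (let i = 0; i < arr.length; i++) {
    arr[i] *= 2;
  }
  return arr;
}
```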

When we write an algorithm, we give our machine a set of instructions to perform some task, and completing every task takes some time. Yes, it is usually very little, but it is still time. So here a question arises: does time really matter?

Let's take an example: suppose you search for something on Google and it takes about 2 minutes to get the result. Normally this never happens, but if it did, what do you think would be going on in the back-end? Developers at Google understand time complexity, and they write smart algorithms so that a search executes in the least possible time and returns results as fast as it can.

So a challenge arises: how can we define time complexity?

What is Time Complexity?

It quantifies the amount of time taken by an algorithm to run. We can understand differences in time complexity with an example.

Suppose you need to create a function that takes a number and returns the sum of all numbers from 1 up to that number.
E.g. addUpTo(10);
It should return the sum of the numbers 1 to 10, i.e. 1 + 2 + 3 + 4 + 5 + 6 + 7 + 8 + 9 + 10 = 55.

We can write it this way:

function addUpTo(n) {
  let total = 0;
  for (let i = 1; i <= n; i++) {
    total += i;
  }
  return total;
}

addUpTo(5);    // it will take less time
addUpTo(1000); // it will take more time

Now you can see why the same function takes a different amount of time for different inputs. This happens because the loop inside the function runs according to the size of the input: if the argument is 5 the loop runs five times, but if it is 1,000 or 10,000 the loop runs that many times.
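
For contrast, the same result can also be computed with the arithmetic-series formula n(n + 1) / 2. This version performs the same few operations no matter how big n is, so its running time stays essentially constant (the name addUpToFast is just for this sketch):

```javascript
// O(1): one addition, one multiplication, one division — regardless of n
function addUpToFast(n) {
  return (n * (n + 1)) / 2;
}

addUpToFast(5);    // 15
addUpToFast(1000); // 500500
```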

But there is a problem: different machines record different timings, because the processor in my machine is different from yours, and the same goes for every other user.
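
You can see this for yourself by timing the loop-based addUpTo, for example with performance.now() (available in browsers and modern Node.js). The numbers you get will differ from machine to machine, and even between runs on the same machine:

```javascript
// Naive timing of the loop-based addUpTo from above
const t1 = performance.now();
addUpTo(1000000000);
const t2 = performance.now();
console.log(`Time elapsed: ${(t2 - t1) / 1000} seconds`);
```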

So, how can we measure this time complexity?

Here, Big O notation helps us solve this problem. According to Wikipedia, Big O notation is a mathematical notation that describes the limiting behavior of a function when the argument tends towards a particular value or infinity. The letter O is used because the growth rate of a function is also referred to as the order of the function.

Using Big O notation, we can express time complexities as follows (a few JavaScript sketches of each class come right after this list):

  1. If the complexity grows linearly with the input, it is O(n). Here 'n' is the size of the input the algorithm has to process.
  2. If the complexity stays constant no matter how large the input is, the Big O notation is O(1).
  3. If the complexity grows quadratically with the input, the Big O notation is O(n^2). You can pronounce it as "O of n squared".
  4. If the complexity grows logarithmically with the input (the inverse of exponential growth), the Big O notation is O(log n).
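
Here are minimal JavaScript sketches of each growth rate (the function names are made up for illustration):

```javascript
// O(1): constant — the work does not depend on the input size
function first(arr) {
  return arr[0];
}

// O(n): linear — one pass over the input
function sum(arr) {
  let total = 0;
  for (const value of arr) total += value;
  return total;
}

// O(n^2): quadratic — a full pass over the input for every element
function allPairs(arr) {
  const pairs = [];
  for (let i = 0; i < arr.length; i++) {
    for (let j = 0; j < arr.length; j++) {
      pairs.push([arr[i], arr[j]]);
    }
  }
  return pairs;
}

// O(log n): logarithmic — the remaining work is halved at every step
function countHalvings(n) {
  let count = 0;
  while (n > 1) {
    n = Math.floor(n / 2);
    count++;
  }
  return count;
}
```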


We can simplify these expressions as shown below. When working out Big O notation, we drop constant factors and lower-order terms and focus on the term that grows fastest as the input grows (a small sketch follows this list). So,

  1. instead of O(2n), prefer O(n);
  2. instead of O(5n^2), prefer O(n^2);
  3. instead of O(55 log n), prefer O(log n);
  4. instead of O(12n log n), prefer O(n log n);
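
To see why the constants are dropped, compare a function that loops over the input once with one that loops over it twice. The second does roughly 2n operations instead of n, but both still grow linearly, so both are O(n) (the names are made up for this sketch):

```javascript
// Roughly n operations
function printOnce(arr) {
  for (const value of arr) console.log(value);
}

// Roughly 2n operations — still O(n), because doubling the input
// still doubles the running time
function printTwice(arr) {
  for (const value of arr) console.log(value);
  for (const value of arr) console.log(value);
}
```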


For a better understanding, let's look at some everyday "algorithms" that have O(n), O(n^2), and O(log n) complexities.

On Quora, Mark Gitters said:
"
O(n): buying items from a grocery list by proceeding down the list one item at a time, where "n" is the length of the list

O(n): buying items from a grocery list by walking down every aisle (now "n" is the length of the store), if we assume list-checking time is trivial compared to walking time

O(n): adding two numbers in decimal representation, where n is the number of digits in the number.

O(n^2): trying to find two puzzle pieces that fit together by trying all pairs of pieces exhaustively

O(n^2): shaking hands with everybody in the room; but this is parallelized, so each person only does O(n) work.

O(n^2): multiplying two numbers using the grade-school multiplication algorithm, where n is the number of digits.

O(log n): work done by each participant in a phone tree that reaches n people. Total work is obviously O(n), though.

O( log n ): finding where you left off in a book that your bookmark fell out of, by successively narrowing down the range
"
and Arav said:
"
If you meant algorithms that we use in our day to day lives when we aren't programming:

O(log n): Looking for a page in a book/word in a dictionary.
O(n): Looking for and deleting the spam emails (newsletters, promos) in unread emails.
O(n ^ 2): Arranging icons on the desktop in an order of preference (insertion or selection sort depending on the person)."
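
The dictionary example is essentially binary search: every comparison throws away half of the remaining pages. Here is a minimal sketch over a sorted array (assuming the input is already sorted):

```javascript
// O(log n): binary search on a sorted array of numbers
function binarySearch(sorted, target) {
  let low = 0;
  let high = sorted.length - 1;
  while (low <= high) {
    const mid = Math.floor((low + high) / 2);
    if (sorted[mid] === target) return mid;
    if (sorted[mid] < target) low = mid + 1;
    else high = mid - 1;
  }
  return -1; // not found
}

binarySearch([1, 3, 5, 7, 9, 11], 7); // 3
```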

I hope you are now familiar with these complexities.
This article does not cover the whole topic; I will write a follow-up in the future.
If you have any questions or suggestions, please leave a comment or feel free to contact me.

Thank you for taking the time to read this article.

Top comments (9)

Lars Gyrup Brink Nielsen

We will never be able to express the complexity of space-time with JavaScript 😄

Chandra Prakash Tiwari

Thank you Lars. Would you please explain why? It will really help me learn more.

Lars Gyrup Brink Nielsen

It's a joke 😄 I was referring to space-time.

Kyle-uk

Thanks, those were some nice examples.

I have tried a few times to understand Big-O notation and logarithms, but I never have and don't think I ever will.

Do you know of any resource in particular that helped you?

Chandra Prakash Tiwari

I learned from Udemy. It's the best online resource out there.

Baptiste Prunot

I think there is a mistake in your article:
you should prefer O(log n) to O(n); it's the principle of a binary tree 😁

Chandra Prakash Tiwari • Edited

Yes, thanks Baptiste for the correction. Please have a look at the graph. I think there is a little difference between the two.

Ynoa Pedro

Awesome post

Chandra Prakash Tiwari

Thanks Ynoa