What is Big O Notation?
Big O notation is a mathematical notation that describes the limiting behavior of a function when the argument tends towards a particular value or infinity.
What is Time Complexity?
Time complexity is defined as the amount of time taken by an algorithm to run, as a function of the length of the input.
You can check out the most common time complexities in the table below:
Time Complexity Name | Complexity |
---|---|
Constant | O(1) |
Linear | O(n) |
Logarithmic | O(log n) |
Quadratic | O(n²) |
Exponential | O(2^n) |
Factorial | O(n!) |
You can see the Big O notation of common functions here.
To get a better feel for these time complexities, we can plot the corresponding functions with a graphing calculator such as Desmos. I created a simple demo, which you can check out here.
By calculating the time complexity of an algorithm, we get information about its scalability and performance. If an algorithm runs in quadratic or exponential time, we can try to optimize it so that it scales.
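For example, a duplicate check written with two nested loops runs in O(n²), while the same check done with a Set runs in O(n). The snippet below is a minimal sketch of that kind of optimization (the function names are just for illustration, they are not from this post or any library):

```js
// O(n²): compares every pair of elements with nested loops
let hasDuplicateQuadratic = (arr) => {
  for (let i = 0; i < arr.length; i++) {
    for (let j = i + 1; j < arr.length; j++) {
      if (arr[i] === arr[j]) return true
    }
  }
  return false
}

// O(n): each element is checked against a Set in (average) constant time
let hasDuplicateLinear = (arr) => {
  const seen = new Set()
  for (const x of arr) {
    if (seen.has(x)) return true
    seen.add(x)
  }
  return false
}
```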
You can see some examples of the time complexities of loops below:
```js
// O(n): the loop body runs n times
for (let i = 0; i < n; i++) {
}

// O(n*m): the inner loop runs m times for each of the n outer iterations;
// when m equals n, this becomes O(n²)
for (let i = 0; i < n; i++) {
  for (let j = 0; j < m; j++) {
  }
}

let s = 0; // O(1): a single assignment takes constant time

// O(log n): i is halved on every iteration, so the loop runs about log₂(n) times
for (let i = n; i > 0; i = Math.floor(i / 2)) {
}
```
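The exponential and factorial rows of the table usually come from recursion rather than from simple loops. As a minimal sketch (these are the classic textbook examples, not code from this post), naive Fibonacci is O(2^n) and generating all permutations is O(n!):

```js
// O(2^n): naive recursive Fibonacci, every call branches into two more calls
let fib = (n) => {
  if (n <= 1) return n
  return fib(n - 1) + fib(n - 2)
}

// O(n!): there are n! permutations of an n-element array to generate
let permutations = (arr) => {
  if (arr.length <= 1) return [arr]
  let res = []
  for (let i = 0; i < arr.length; i++) {
    const rest = [...arr.slice(0, i), ...arr.slice(i + 1)]
    for (const p of permutations(rest)) {
      res.push([arr[i], ...p])
    }
  }
  return res
}
```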
When we check out the graph examples in the Desmos calculator, we get a better picture of how algorithms scale. And this is not limited to algorithms; we can use the same knowledge for database queries. For example, if we have a table with no index defined, the time complexity of a search is O(n). But if we define an index, the time complexity of a search becomes O(log n), because with an index we don't have to check every row in the table; instead, a divide-and-conquer strategy can narrow down the rows by looking at the index.
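We can show the same idea directly in code: a linear search has to visit every element, while a binary search on sorted data halves the search space on each step, which is roughly what an index lets the database do. This is only a minimal sketch of that comparison:

```js
// O(n): linear search checks elements one by one until it finds the target
let linearSearch = (arr, target) => {
  for (let i = 0; i < arr.length; i++) {
    if (arr[i] === target) return i
  }
  return -1
}

// O(log n): binary search halves the sorted range on every iteration
let binarySearch = (sortedArr, target) => {
  let low = 0
  let high = sortedArr.length - 1
  while (low <= high) {
    const mid = Math.floor((low + high) / 2)
    if (sortedArr[mid] === target) return mid
    if (sortedArr[mid] < target) low = mid + 1
    else high = mid - 1
  }
  return -1
}
```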
Calculation
Let's see how we can calculate the Big O notation of an algorithm.
```js
let sum = (n) => {
  let res = 0 // O(1)
  for (let i = 0; i < n; i++) { // O(n)
    res += i // O(1)
  }
  return res // O(1)
}
```
When we sum up all of these time complexities, we get:

sum = 1 + n*1 + 1 = n + 2 => O(n)

The constant terms are dropped because Big O only describes how the running time grows as n becomes large.
Here is an example of O(n*log(n)) time complexity:
```js
let sumDivide = (n) => {
  let res = 0 // O(1)
  for (let i = n; i > 0; i = Math.floor(i / 2)) { // O(log(n)): i is halved each time
    for (let j = 0; j < n; j++) { // O(n)
      res += j // O(1)
    }
  }
  return res // O(1)
}
```
sumDivide = 1 + log(n)*n*1 + 1 = n*log(n) + 2 => O(n*log(n))