When it comes to programming, there is one thing we devs all suck at... well, mostly the new devs. And that's Big O notation. So here's what Big O is and why it matters.
❓What is Big O notation❓
Imagine this (and imagine only): you are at a buffet where you can eat all you want without paying (see why you only have to imagine it?), eyeing that sweet, sugary, diabetes-giving chocolate stuff. Now say you want to know how quickly you can scoop up all that diabetes onto your plate. That's what Big O notation is all about! No, not chocolate: how quickly you can do one thing.
Of course, this was just imagination, and in programming it's not us doing the thing; the computers do our dirty work! So Big O notation tells us how fast an algorithm runs as its input grows.
📊 Types of Big O notation 📊
There are different types of Big O notation, though I wouldn't call them types; more like stages.
O(1) - The myth: Constant Time ⏱️
Now say you have to pick something up from a toolbox and you have 2 tools (not a handyman, are you?); you wouldn't take much time. Say you have 15 tools in the toolbox: still basically no extra time. You take the same amount of time in both cases, because all you have to do is pick it up!
AND IF YOU CAN'T EVEN DO THAT WELL THAT'S A SKILL ISSUE!
This is called constant time, or O(1). How do you say that? That's another skill issue.
There are different ways of saying it: Big O of 1, O of 1, order of 1, and blah blah blah, and I hate all of that, so I call it... constant time... yeah, don't even ask.
In code, it would look kind of like this:

```javascript
const firstElement = randomStupidArray[0];
```

Exactly like picking a hammer from a toolbox, this code grabs a hammer (element) from a toolbox (array) in a single step, no matter how many tools are in there.
And that's just how simple O(1) run time complexity is. Simply put, it's fast, and it's just another myth that we believe in, because we all end up writing code in O(n) run time complexity anyway.
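To be fair, O(1) isn't entirely mythical: hash-based lookups are (amortized) constant time too. A tiny sketch, with made-up names:

```javascript
// Both lookups below are O(1): the time doesn't grow with the number of items.
const toolbox = ["hammer", "wrench", "screwdriver"];
const firstTool = toolbox[0]; // array indexing: one step

// Map lookups are amortized O(1) as well
const shelf = new Map([["hammer", 3], ["wrench", 7]]);
const hammerShelf = shelf.get("hammer"); // still one step with a million tools
console.log(firstTool, hammerShelf); // "hammer" 3
```

Whether it's 2 tools or 15 (or a million), the number of steps stays the same.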
O(n) - The one we all do: Linear time 📈
Now O(1) sounds really nice; it's the dream you have for your code. But sadly, we devs are just too optimistic: we only write O(1) code in our dreams, while our real codebase looks like it takes more time than a tortoise.
That's O(n), that useless-yet-seen-everywhere organism that just doesn't make sense, like those humans outside your house (but really, do you think you are a robot?).
Imagine, and again just imagine, that you have to cut 10 ropes, one by one, each taking you, say, 1s. That would be 10s in total. Now say you have to cut 100 ropes at the same 1s each: how much time will you take now? No! You are wrong! Unless you are right. It would take 100s in total. The time grows in a straight line with the number of ropes.
And there you have it, that's O(n) run time complexity!
Of course, we also need the code example for the newbies!

```javascript
// real life code (note: it's a for loop, not an if!)
const arr = [10, 20, 30]; // some array
for (let i = 0; i < arr.length; i++) {
  console.log(i);
}
```
And this will take exactly... O(n) time! Why? Because it takes a certain amount of time to `console.log(i)`, and it has to do that `n` times, once per element. Thus it takes O(n) to execute.
O(n^2) - The killer: Quadratic Time 🔳
Once again let's imagine and only imagine!
Imagine you have to make pairs of socks, and there are exactly 10 pairs (so 20 socks) in a jumbled-up pile. For each sock you pull out, you have to dig through the whole pile to find its match. Say checking one sock takes 1s (fast for a robot): matching one sock can take up to 20 checks, and you have to repeat that for every sock, so in the worst case you're looking at around 20 × 20 = 400 checks, not 20. Double the socks and the time quadruples. That's quadratic growth.
Mathematically (prepare to die) it means that we are doing `n × n` operations: for each of the `n` elements, we do the same work over all `n` elements again... yeah, you know I am talking about nested loops.
```javascript
const arr = [1, 2, 3]; // some array
for (let i = 0; i < arr.length; i++) {
  for (let j = 0; j < arr.length; j++) { // note: j++, not i++!
    console.log(i * j);
  }
}
```
Yeah, I have no idea why, but this code makes sense. We have a for loop inside another for loop, and for each of the n outer iterations the inner loop runs n more times, so the work is n × n and the run time grows quadratically. Professional, isn't it?
O(log(n)) - The Hero: Logarithmic Time 📉
Let's take a book... never heard of it? Are you even real?
Say I want to open page 69... cause why not.
One way is to go over each page one by one and check if it's page 69. But sadly, this method takes a bunch of time... or run time, if you may.
And yes, it's slow; for some godforsaken reason even 10ms is slow!
And in case you tried to work out what run time complexity that is... well, comment below, that's your homework!
Now, to make this fast, there is one way we can do it.
Think about it, it's really simple. All the pages are in sorted order, right? Page 4 comes after page 3, and page 69 comes after page 68. So if we pick a page in the middle and check whether it's greater than or less than the target page, we can ignore all the pages on the wrong side of it and repeat the process on what's left till we reach page 69!
In fact, it has its own name: the binary search algorithm. It takes the middle item of the array and checks if it is greater or smaller than the target. If it is greater, it ignores all the items after that one (if smaller, all the items before it), then picks the middle of this new, smaller range and repeats: divide, check the middle, repeat. And there you have it! You can find page 69 in O(log(n)) run time complexity!
Yeah, in code it's like this:

```javascript
function binarySearch(arr, x) {
  let low = 0;
  let high = arr.length - 1;
  let mid;
  while (high >= low) {
    // overflow-safe midpoint
    mid = low + Math.floor((high - low) / 2);
    if (arr[mid] === x)
      return mid;
    if (arr[mid] > x)
      high = mid - 1; // target is in the left half
    else
      low = mid + 1; // target is in the right half
  }
  // in case the item does not exist... edge case!
  return -1;
}
```
And there you have it! This code runs in O(log(n)), so it is fast. But keep in mind, it only works if the array is sorted; if not... well, sort it first, and don't you use the default sorting method!
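About that "don't use the default sorting method" jab: JavaScript's default `Array.prototype.sort` compares elements as strings, so numbers come out in the wrong order and binary search breaks. A quick sketch (the search function is repeated here so the snippet runs on its own):

```javascript
// Same algorithm as above, repeated so this snippet is self-contained.
function binarySearch(arr, x) {
  let low = 0, high = arr.length - 1;
  while (high >= low) {
    const mid = low + Math.floor((high - low) / 2);
    if (arr[mid] === x) return mid;
    if (arr[mid] > x) high = mid - 1;
    else low = mid + 1;
  }
  return -1; // not found
}

// Default sort compares as strings: [10, 2, 1] becomes [1, 10, 2]. Whoops.
const stringSorted = [10, 2, 1].sort();
// Always pass a numeric comparator before binary searching:
const pages = [5, 1, 69, 42, 7].sort((a, b) => a - b); // [1, 5, 7, 42, 69]
console.log(stringSorted);              // [1, 10, 2]
console.log(binarySearch(pages, 69));   // 4 (the last index)
```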
And there you have it! That's all about Big O!
Now keep in mind there is more to Big O than just this. Creating an efficient algorithm takes a lot of thinking, which I guess none of you do... well, even I don't! But it is quite a fun thing if you ask me. Big O is useful as it can help you reduce your AWS bill from $1.000000001B to $1B. Now of course that's not how it works, but you get it, right?
For a proper example, say your company has an AWS EC2 server running 24 hours a day, and the only piece of code you wrote is a sorting algorithm that takes 69ms to sort an array of 10 items. Now that's a skill issue if I have ever seen one: not only will the latency go up, your AWS bill will grow like O(n^2)! And that's a crucifixion!
To fix this, you can just make your algorithm a bit faster, and voila! Your algorithm now runs 10x faster and your bill is now more payable! Finally, you can pay your AWS bills without robbing your neighbors!
If you think you have what it takes to solve a problem, try solving this one in O(n)!
Given an array of n + 1 numbers containing values between 1 and n, find the duplicate in that array in O(n) run time complexity. Assume there is only 1 duplicate number, but it can repeat multiple times; find that number.
And of course the array: [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 1, 12, 13, 14, 15]
And if you did solve it... good. You did a good job, because I sure love making other people do my work!
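And if you want to check your answer (spoiler warning!), here's one possible O(n) approach using a Set. There are fancier constant-space tricks, but this is the simplest sketch:

```javascript
// O(n): one pass, remembering what we've seen; the first repeat is the duplicate.
function findDuplicate(arr) {
  const seen = new Set();
  for (const num of arr) {
    if (seen.has(num)) return num; // already seen: that's our duplicate
    seen.add(num);
  }
  return -1; // no duplicate (shouldn't happen per the problem statement)
}

console.log(findDuplicate([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 1, 12, 13, 14, 15])); // 1
```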
Top comments (14)
The whole concept of Datastructures and Algorithms is the most essential aspect every developer should be conscious of.
Yeah, if you don't do DSA you won't get a job! Of course, there is more to it, but it's mostly the job factor, and it surely becomes a lot easier to understand a problem from a more code-oriented perspective. It truly helps.
Yes, absolutely, for the job hunting. However, not every developer needs it for day-to-day work, and with limited time one should be careful about what to study. I hate that DSA is standard in our job interviews. It tells me whether someone is good or bad at DSA, not whether they have more or less potential to be a great engineer, which is a lot more than coding. Just saying.
I know, right? Job interviews are DSA-focused most of the time. They just don't seem to get it.
My friend, I spent so much time meditating on the concept of "them"
Who are those people? Going a bit offtopic: I had so much struggle looking for a job in tech, and always had this thought. Why can't "they" seem to get it?
I believe "they" are actually you and me.
I have seen so many developers who, as soon as they are promoted, keep doing as everyone else does, not changing or challenging the status quo.
Thus, nothing ever changes.
So, I don't know you, Ezpie, but if you are ever in a position to influence the interview process, maybe you should do it. And if a job requires knowledge of DSA, there is absolutely nothing wrong with testing one's ability to do it.
All the best to you. I am doing my part to the best of my abilities.
Niko
By "they" I meant both the interviewers and us... don't mind my English, it's bad, I know. And I agree and disagree with this take.

My take is that DSA is great, but only up to a certain limit (I'm not talking about something like CSS here; no one does a CSS interview, though that doesn't mean it's not important). I don't like it when the interviewer goes, "Can you do Dijkstra's algorithm?", which to me is a concerning question, because it tests whether the candidate can solve a problem requiring Dijkstra's algorithm, but not whether the candidate is capable of really solving a problem. I mean, how many variations of Dijkstra questions is one going to make? I just have to learn the algorithm, do a bunch of Dijkstra questions, and done, I can pass the interview.

Now keep in mind, I am not saying DSA is useless; it is useful. But asking it in an interview? You can, but within limits. I prefer role-specific questions, because those are the ones that can tell whether a candidate fits the role or not.

For a more proper POV: here in India we have IIT, the standard university for entering software engineering. In fact, if you are an IIT graduate, big tech companies come straight to you for an interview; you don't even have to apply. And I can understand why, because we Indians are really grade-focused: here in India, if you don't score above 93%, you are considered almost useless, I would say. And that has its own side effects. Most IIT grads know their DSA really well, except that they don't really know how to write production-ready code; they can solve problems in theory but not in practice. And that's a downfall.
This is more of my opinion, not sure how it really goes but this is what I have seen.
Wow. Thanks for sharing this. Learned a lot. I agree with literally everything.
This thread has been most educative; even I have learned a lot, and it is a beautiful thing.
It would be best if you stuck around because I am currently working on a very toxic post... it's clean code. That's the most misleading topic and I would like to cover it and show how real production-ready code is written. Of course, now I am not a professional so, that's just an opinion.
Great.
don't forget to please star my opensource project dataDisk. version 1.1.0 is on pypi ready to use, just
Currently working on 1.2.1 with more interesting features and machine learning capabilities.
sure give me the link and I will check it out.
Here is the dataDisk project link
dataDisk
I'm not sure I agree that O(1) is a myth - for the entire program? Yes, then it's not possible. But optimising a search using O(1) rather than O(log n) is a massive improvement at the expense of memory and is something you should consider frequently.
If you have 1,000,000 (call it n) items in a set and you want to check 1,000 items against it (call it m) then using a "HashSet" for the n items is going to be dramatically faster than a binary searched sorted array.
HashSet: m × O(1) = O(m)
Binary Search: m × O(log n) = O(m log n)
In the example, the binary search is ~20x slower than the HashSet. Clearly, the HashSet implementation will have "some" impact on that improvement, but that's why Big O is so useful to rule-of-thumb things.
Also, creating that sorted array in the first place is ~20 times slower than adding them all to a HashSet, so unless it's already stored that way, the O(1) approach is much preferred.
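In JavaScript terms (reading the commenter's HashSet as roughly a `Set`), the idea looks like this:

```javascript
// Checking m needles against n stored items.
// Set.has is ~O(1) per check, so the whole scan is ~O(m) instead of O(m log n).
const n = 1_000_000;
const haystack = new Set();
for (let i = 0; i < n; i++) haystack.add(i * 2); // store the even numbers

const needles = [0, 1, 2, 999_999, 1_999_998];
const found = needles.filter((x) => haystack.has(x));
console.log(found); // [0, 2, 1999998]
```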