The story starts with the invention of the computing machine. The easiest (but, surprisingly, not the only) way to build such a machine and scale it up and down was to use binary coding, both for storing data and for the instructions that manipulate that data. The era of computing and IT was born. The core of every computational device is a processor, or CPU (Central Processing Unit). It reads binary-encoded instructions, and those instructions tell it what to do with binary-encoded data. Simple, hey? We just need to write those instructions and provide the data.
Yet you can obviously see that if you want the CPU to do something more complex than just move bytes from one register to another, you'd need to write a lot of code, and writing it in raw binary has its flaws: you quickly lose track of what your code is doing, and handing it over to another developer would be a nightmare.
So you have to be inventive here. And the first idea might be to write down the machine code not as 1s and 0s, but as hexadecimal numbers, one byte at a time.
This is already a HUGE improvement. For a start, your code (in its written-on-paper form) shrinks to roughly a quarter of its length (two hex digits per byte instead of eight bits), and given how good humans are at reading symbols (well, any symbols), you can kinda understand the code better, but... well yeah, still not ideal.
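To make that shrinkage concrete, here's a tiny Python sketch (Python and the sample string are just my choice of illustration, not part of the original story) that prints the same eleven bytes first in binary, then in hex:

```python
# Compare binary vs. hexadecimal notation for the same bytes, using the
# ASCII bytes of "Hello world" as illustrative data (any bytes would do).
data = b"Hello world"

binary_form = " ".join(f"{byte:08b}" for byte in data)  # 8 binary digits per byte
hex_form = " ".join(f"{byte:02X}" for byte in data)     # 2 hex digits per byte

print(binary_form)  # 01001000 01100101 01101100 ...
print(hex_form)     # 48 65 6C 6C 6F 20 77 6F 72 6C 64
print(len(binary_form), len(hex_form))  # 98 vs 32 characters for the same 11 bytes
```

Same information, far fewer symbols to copy down and stare at.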
Let's think about how we could improve on this further. What do we actually know about machine code and instructions?
Alright, now we can tell our processor what to do in terms we understand - take this byte, put it in that register, trigger an interrupt... But is that really how we want to do things here? Let's step up and look at the code again: it's a bunch of instructions for the processor to execute, but what do we get at the end, what is the end result?
Bingo! That's it. We want our computer simply to put "Hello world" on the screen, so why don't we say it in a form that's clear to us humans, so that when I want to share my program's code, another human being can understand me just fine. Sure, to be able to do that we first have to write an assembler in machine code, then use assembly language to write a compiler, then use the compiler to write an interpreter...
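To give a feel for that last step, here's a toy sketch (entirely my own illustration, with a made-up instruction set) of a minimal interpreter that walks a list of "instructions" the way a CPU walks machine code, and whose only visible result is exactly what we wanted all along:

```python
# A toy illustration (not a real instruction set): a tiny "interpreter"
# that steps through a program of (opcode, operand) pairs.
program = [
    ("LOAD", "Hello world"),  # put a value into our single "register"
    ("PRINT", None),          # write the register's contents to the screen
    ("HALT", None),           # stop execution
]

def run(program):
    register = None
    for opcode, operand in program:
        if opcode == "LOAD":
            register = operand
        elif opcode == "PRINT":
            print(register)
        elif opcode == "HALT":
            break

run(program)  # -> Hello world
```

The point isn't the instruction set (I made it up); it's the gap between spelling out the machine-level steps and the single high-level statement `print("Hello world")` that says the same thing to another human.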
And the quest doesn't end here. As we create more and more complex systems, we can always take a step up, look at our system from a higher level, find some patterns, invent new concepts that describe how the system works, and then explain it to another human being so they can understand it. IMHO, at some point we got stuck with textual descriptions as the bridge between the human and processor worlds: although many complex concepts have been introduced into programming languages, we still use visualizations like charts and diagrams only for human-to-human communication, and then we write the code down in textual form (syntax highlighting was a huge leap in visualizing code, though). For me this is an area of opportunity for the development of programming tools, but that's another story for another day...