I use one service from time to time: I need to upload some files there (the name of the service does not matter, because, frankly, they are all the...
I don't know what it's like now, but when I ran "hello world" in React a couple of years ago it extracted a quarter of a gigabyte of files.
I mostly agree with you (and as I'm in your father's age range that might not be surprising) but I do recognise that everything we do is standing on the shoulders of those who came before. We have things like Electron to genericise making applications and while everyone knows it's bloated, it works, and it works for almost everyone because it does everything you don't need in case someone else does.
I find the weird background tasks applications use upsetting as well. I have no idea why every proprietary application needs to run at startup and have "update" and "maintenance" processes running when I'm not trying to use it. It's another reason to steer clear of proprietary software wherever possible; free software almost never does this, or if it does it asks you first.
The answer to this is simple, whether you like it or not - it's the users. They expect the software to be updated automatically. They never do it manually, but they are the first to complain that something is not working on their not-updated-for-two-years version.
Isn't that what package management is for, though? It's not the job of an application to handle things like window decorations or storage quotas... or updates.
It is. Does Windows have a package manager that is popular among its users? Isn't the most popular way to install things on macOS just downloading a .dmg from a website?
On Mac, it's the App store or homebrew (third party) (or Ports, if that's still maintained). On Windows, it's the MS Store or chocolatey (third party) (I think?)
I don't think many people download .exe or .dmg files any more.
Literally nobody is using Chocolatey or the MS Store. Some people use the Mac App Store, but most applications aren't there. Only developers use Homebrew or MacPorts. Most people download from websites.
@moopet When there are tens of thousands of software packages in a distribution, updating a program with hundreds of dependencies becomes quite difficult, especially for non-free software. Yes, all sorts of snap and flatpak packages are used to solve this problem, but they completely eliminate the advantages of shared libraries, which are designed to avoid code duplication.
@katafrakt it doesn't matter for the purposes of this whether many people use it or not, it's the only practical way for applications to behave unless (as @mariamarsh notes) we all move to statically-linked behemoths.
Oh yeah, of course, if reality does not matter, keep wondering why every application has this kind of background process.
The question is, "what's wrong with code in 2022" and the culture of downloading separate things for everything is part of that.
It works, but most of the time it sucks. And lately, the user does not care what the software is written in, as long as it works. For users, the quality bar has already sunk underground, so when they find something that works quickly even on old hardware and does not take up a hundred megabytes, they are really surprised.
Imagine if almost all popular instant messaging clients were written in Word macros and shipped with a copy of Word. In my opinion, Electron is just as ridiculous.
But despite the general disappointment that there is so much software built on Electron, it is still really the leader in cross-platform desktop development... And there are certain reasons for that which other frameworks haven't solved.
Electron got so popular because it meant you could write an application once for the web and then never have to care about cross-platform compatibility because that stuff is all taken care of for you. It had a great developer experience and a good-enough user experience.
I personally regret the rise of Electron because it's led to slow, bloated applications that take half a gigabyte of memory instead of a few megabytes, but at the same time I understand why it's gotten so popular.
It's kind of like money -- the less you have, the more conservative you are with it. The more you have, the more you get conditioned to irresponsibly spend it and you start to care less about how much you are spending. It's not responsible, necessarily, but you've got enough to the point where you don't really care that you could be spending less; and I think that's kind of what's happened with the current problem of software bloat.
Yes, exactly as you say
Reasonable alternative to Electron:
github.com/cubiclesoft/php-app-server
Produces similar results at a fraction of the size (e.g. 85KB app size for Linux). And PHP is available to write code for the server side of the app.
How popular is this product? I haven't heard of it before, but it looks interesting. Are there any examples of software built on it? What features have you personally used while working with it?
P.S. Sorry, I did not notice that you are a representative of this software.
That's okay. It's obviously not as popular as Electron but a number of people have clicked star/fork, so it's slowly gaining traction. I don't advertise my software very often and just expect people to run into my stuff. As to software that uses PHP App Server, there's this:
file-tracker.cubiclesoft.com/
A commercial project I wrote and actively use/maintain.
Thanks for sharing 😊
I'll be sure to explore your products, and hope you'll find your audience 😌
I checked my "hello world" / "MMORPG PoC" application written in Next.js (React + backend): the memory footprint was 2.5 MB. That still counts as huge, but I think it's quite fine in our memory-gobbling times.
The ./server size is 256 KB.
./static (graphic files, maybe a few unwanted ones still left): 1.3 MB.
The worst part is the node_modules folder: 350 MB.
This way of programming is far more distant from my Z80 assembly game code, written around 1987 on a Videoton TV Computer.
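For anyone who wants to reproduce that kind of measurement, here is a minimal Node/TypeScript sketch (assuming Node 18.17+ for the recursive readdir; point it at whatever directory you want to weigh) that just sums file sizes under a folder:

```ts
// Sum the size of every file under a directory, e.g. node_modules.
import { readdirSync, statSync } from "node:fs";
import { join } from "node:path";

function dirSizeMB(dir: string): number {
  let bytes = 0;
  // `recursive` requires Node 18.17+; entries are paths relative to `dir`
  for (const entry of readdirSync(dir, { recursive: true }) as string[]) {
    const stat = statSync(join(dir, entry), { throwIfNoEntry: false });
    if (stat?.isFile()) bytes += stat.size;
  }
  return bytes / (1024 * 1024);
}

console.log(`node_modules: ${dirSizeMB("node_modules").toFixed(1)} MB`);
```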
We could use far less code if browsers supported other languages besides JavaScript
Actually, you can write for the browser in any language compiled to WASM/JS/HTML. By the way, if you create a JS/HTML application, it also needs to be compiled to a proper JS target, mainly ES5 for compatibility reasons.
I think today's browsers are already such complex applications, with tons of unused APIs (surprise!), that any extra language would raise that complexity unnecessarily.
The good question is:
Why can't WASM reach those APIs and the HTML page without any JS code?
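To make the question concrete: today even the smallest WASM module needs a JS shim like the sketch below before it can touch fetch() or the DOM, because WebAssembly has no direct access to Web APIs. This is just an illustration; add.wasm and its exported add() are made-up names, and the top-level await assumes a module script.

```ts
// Minimal JS glue for a WASM module: the part that currently cannot be removed.
const { instance } = await WebAssembly.instantiateStreaming(fetch("add.wasm"));
const add = instance.exports.add as (a: number, b: number) => number;
// DOM access has to happen here, on the JS side, not inside the WASM module.
document.body.textContent = `2 + 3 = ${add(2, 3)}`;
```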
Thanks for sharing 🌈
Just look at the calculator, which used to require less than a megabyte, but now it eats up more than 150 MB of RAM at a time with the same functionality.
I don't know, maybe in Photoshop this is justified: there is new, useful and diverse functionality, albeit with bloated code implementing it. But in the case of a calculator, this is really just indecent.
A 20-year-old calculator did not have to think about hidpi, 4K, multi-monitor configurations, touch control, support for a dozen interface themes, synchronization with the cloud, loading exchange rates from the Internet, and a whole lot of functionality that an individual user may not need but other users do, and writing millions of versions for each is unprofitable.
The fact is that a modern calculator does not have to think about these either. All the themes/touch/hidpi concerns and others like them are handled by the operating system's libraries. And in terms of functionality it has not gone far beyond a 20-year-old calculator, yet it eats resources like AutoCAD or MATLAB did 20 years ago.
It is precisely the libraries that are part of the calculator that occupy those megabytes
They are part of the OS, why separate them? And it is highly doubtful that touch control support will increase the code by 40-50 MB.
This is the characteristic mindset of some modern programmers, justifying bloat with incredible complexity: all those "touch/hidpi/unicode" words are just a mantra to stop thinking. "What are you doing? Don't go there! It's difficult, there are touch, hidpi, currency conversion and some other important things!"
SpeedCrunch: 4MB RAM. Has a portable version in the Portable Apps platform.
It does way more than Windows Calculator. The only thing it doesn't do is graph equations. For graphing, I will occasionally fire up:
desmos.com/calculator
I still run Calculator though when I need "as soon as I type it in" conversions between bases (binary, decimal, octal, and hex). That's something SpeedCrunch doesn't do very well.
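Funny enough, the conversions themselves are only a few built-ins away in most languages; here is a tiny TypeScript illustration (just to show how little code the feature itself needs, not a replacement for the live "as you type" UI):

```ts
// Binary/decimal/octal/hex conversions with nothing but built-ins.
const n = parseInt("101010", 2); // read binary input -> 42
console.log(n.toString(10));     // "42"
console.log(n.toString(8));      // "52"
console.log(n.toString(16));     // "2a"
console.log(parseInt("ff", 16)); // 255, hex back to decimal
```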
They look like really useful tools, thanks for the recommendation 🤗
4 MB is much better than 150.
I have a better example. To turn off the RGB lighting of my Aorus graphics card (Gigabyte), I need to install 0.5 GB of software. It's such shitty software... 400 megabytes to turn off the RGB.
It's funny. How much does the installer weigh? And how much space does the installed utility take up?
The RGB Fusion 2.0 installer is 253 MB.
When installing and uninstalling the program, it says that it takes 170 MB.
In fact, 2 utilities are needed there, and as I understand it, the 2nd one used to be separate, and then became a plugin for the first one. 0.5GB is the total weight of the installers.
Can't you physically disconnect the RGB cable?
In my case, the card must be disassembled for this.
It would suck to lose the warranty.
It's a pity... on some of them, the wire can be traced and unhooked without unscrewing a single screw.
I have a Gigabyte card like this too. I'm pretty sure you can install the software, use it, then uninstall it and the light settings will remain (not really the content of this thread, I know).
When I mentioned .25GB for React, I was mostly talking about the total size of files it added to my
node_modules
directory. The "installer" was technically only a few MB, and the compressed files it downloaded might have been (guessing) 50 MB, but after unpacking them all to text files, using up a zillion inodes, and creating directory structures you need a rope and a torch to explore, the end effect is a lot more than the initiator.
I think it'd be beneficial to talk about all these things separately (size of download, size of installation, amount of resources used when running), because if the installation process only downloaded the files it needed, and everything else was run with libraries existing on the system, then that would be... great? Right? But you can have an application which behaves that way during installation and turns out to chew its way through all your available RAM when running, for example. These things are different, and if they were the responsibility of different actors, they might be better optimised?
This is a similar topic, but I think you are right and it is better to consider it separately, since my page will not withstand that much additional discussion in the comments 😁
There's no need to install a 500MB application to turn off RGB lighting. A piece of electrical tape works wonders. It's non-conductive and comes in a variety of colors with black being the most popular.
I use painter's tape, electrical tape, and even the sticky part of sticky notes to cover up the blindingly bright LEDs that all modern technology gadgets seem to come with. How much light I want to let through decides which route I go with to cover up the LEDs. When I want nothing showing, electrical tape gets used to great effect.
Using electrical tape for such purposes is incredible bullshit; I'd rather install 0.5 GB of software. My video card is not a 4090, of course, but how much do you have to disrespect your hardware to do such things to it?
Here I can support you; I would not do such "modifications" to my hardware, in my opinion it's a "village custom". But for someone whose PC is just a work machine, this is quite an effective option.
That is, you must first install the Aorus Engine so that it installs RGB Fusion. If you install RGB Fusion separately, it doesn't work for some reason. Moreover, with that utility installed separately, Aorus Engine will not install it. Such crap...
500 MB is about the weight of the Windows 2000 installation files, which had "a little" more functionality 🙃
This meme comes to mind:
programmerhumor.io/programming-mem...
Two elite astronauts landed on the moon with 2 KB of RAM, and now over 20 million untalented people run Slack with their 15 GB of RAM. I'm not sure which is better, but certainly the former is more inspiring 😄
Now the first one just blows my mind. 🤯
I always complain because we sent man to the moon with waaay less memory, but my IDE and other daily software freezes from time to time and for no reason...
Well, don't use an IDE. Also, run Portable Apps wherever possible.
I run Crimson Editor (last updated in 2004) + Command Prompt + Windows File Explorer. That's my "IDE." Never freezes on me. Well, the first two don't. File Explorer likes to barf on network shares from time to time necessitating a reboot once every 6-8 months.
What OS are you using?
Lol that's right 🤣
Modern computers can absolutely waste a ton of CPU and memory, but at the highest levels, organizations don't care about solving these "minor" issues. Why does my laptop need 15 gigabytes of RAM to have a spreadsheet, Microsoft Teams and one Chrome tab open? Couldn't tell you. I absolutely hate Windows for how useless it feels. Can't even open Outlook without it spinning up to 100% CPU usage.
But things aren't always so obvious; software is objectively a challenging problem that requires a lot of well-thought-out plans.
For instance: how do you write one project and compile it to work on other people's computers? If you were still writing C, you would have to write a C codebase and use a compiler to support a given architecture and hope the operating system can run your code. Then you have to test it and all that other fun stuff.
But then suddenly, you need to target an entirely different platform that you've never seen before, with a different compiler. Suddenly your C code doesn't work, because the platform is different: the compiler's types are different sizes, the endianness goes the other direction, certain standard library functions don't exist or take different parameters, and now you're stuck writing C preprocessor macros hoping that your pains go away.
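For what it's worth, byte order is easy to see for yourself even without C; here is a small TypeScript illustration of the endianness difference mentioned above (typed arrays use the host's byte order, which is exactly the kind of platform detail portable C code has to guard against):

```ts
// Detect the host's byte order by writing a 16-bit value and inspecting its bytes.
const buf = new ArrayBuffer(2);
new Uint16Array(buf)[0] = 0x0102;   // typed arrays use the platform's byte order
const firstByte = new Uint8Array(buf)[0];
console.log(firstByte === 0x02 ? "little-endian host" : "big-endian host");
```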
This was what coding was like in the awful days, and it can still be like this if you write C/C++. No compiler will make you happy. During this period, everyone either acknowledged C/C++ was awful and moved on to other things, or simply stuck around and said "this isn't so bad, guys" as they wrote high-performance software dealing with the issues that C/C++ brings. Java won a lot of fans because it was very portable with its "write once, run anywhere" mantra, while C++ is still hated to this day by many.
With Java came other languages that offered more dynamic and flexible programming, like Python and Ruby, which most people scoffed at when thinking about building full-fledged software in. The performance metrics of these two languages aren't great, but sometimes people write Python/Ruby code that can interop with C-world and get decent performance. Fast forward one or two decades and now we have insane machine learning libraries that you can use with Python and are used at Fortune 500 companies.
The web is popular, but golly does it suck to write things for it. HTML pages aren't dynamic, so you need a language to create dynamic pages that can retrieve information from a database; in comes PHP, which sold itself as a solution to a problem web devs were having with FastCGI and Perl. PHP proved itself an okay solution, and somehow companies around the world threw their million-dollar industries at it, and it got us decently far. But the browser was the real pain in the butt, so in came JavaScript, and JavaScript took off to the moon.
Where am I going with all this? I'm mostly stating that things that alleviate headaches for programmers are far more popular than things that don't. Writing and maintaining separate language codebases with different purposes is not necessarily better to some people, who would prefer a "monolithic" repository of code that can do everything without making things complex. JavaScript can now control the front end and the back end; it can be used to design games, GUI applications, and so much more, without ever leaving the comfort of the JavaScript language itself.
Facebook, Google, Apple and all the other tech companies are the foremost "leaders" of the technical world, and when Facebook publishes a library called "React" whose goal is to make the web easier to develop, what happens? Everyone's going to write React. Google makes Go and uses Go? Go devs will pop up in random places. Increasing your odds of being a potential hire at a Fortune 500 is a very promising idea for many aspiring programmers.
But this doesn't mean everything will appear pretty at the bottom of the totem pole. Organizationally, no one gets promoted for fixing memory leaks. Sometimes I have to close my Discord because it leaks memory after several hours and gets slow. Is anyone going to fix it at Discord? No, because it's an Electron problem (probably), and Discord isn't there to fix Electron problems, they're around to fix Discord problems.
YouTube, a site owned and ran by Google, has to polyfill in a ton of extra JavaScript for non-Chrome browsers, making the performance of YouTube on non-Chrome browsers not all that great. Google runs and upgrades Chrome with non-standard libraries so they can move fast, and in turn, makes other browsers perform worse on their sites. I can admit that sometimes other browsers are very slow at upgrading their features (Firefox), but it's not likely that Google will care about the non-Google browsers, as Chrome is included in every single Android device, and is renowned for being the most popular browser on the internet. Why? Probably because it's a Google browser lol, remember that part about everyone using React?
So in summation, we live in a society where poorly-built software is so common-place that people have to upgrade computers to go on Facebook of all things. The lowest common denominators of computer hardware are not the targets of big business, and probably never will be all that important. Our ten-year old laptops are deemed unimportant, and everyone is expected to upgrade their smartphone every two years. Why? Probably because nobody wants to write fast software made with slow phones in mind!
/rant
@sleibrock This is valuable insight; you really show how complicated and ugly things get, and fast... and it's true, our world of software and technology is a mess, and you really bring clarity to a messy topic.
But to talk to @mariamarsh real quick, there are a couple of things I want to point out:
I assume you wrote this in reference to Windows computers. Windows is my background and it's honestly a frustrating and messy OS, for many reasons I won't get into. It's not Linux, and definitely not the prime pick to run containers in. But it is an extremely complicated software platform where I've touched things I didn't even know existed that affect things I didn't even know component X could. However, since Windows 7, they've much improved their act and it's a much more reliable platform.
Where are you getting your 99.9% figure from? How do you calculate resource waste? Is this in regards to software that runs on windows, or you talking about the OS itself? Or both?
I don't think, necessarily, file size is the end-all, be-all indicator of resource usage/waste. For example, compare the file size of the exact same data as .csv and .xlsx (Excel): which one is smaller? File compression and other factors play into this.
Libraries. Yes, there are built-in libraries in the OS, but that's not the end of the story. What about different versions of the same library/software? I think it was Windows XP where you had to download different versions of .NET and .NET components for every piece of software. But the same thing applies to every language and dependency version.
Version mismatch and Dependency Hell are very real things that cause issues, from personal to corporate environments to this day. All software is built on software before it, and once you dive deep into what dependencies everything is built on, you'll never reach the bottom. Do you remember how one developer removed left-pad from NPM and broke the internet? That's the situation we are in with all of our software on any operating system. This is a joke tweet, but also, so very true:
"the most consequential figures in the tech world are half guys like steve jobs and bill gates and half some guy named ronald who maintains a unix tool called 'runk' which stands for Ronald's Universal Number Kounter and handles all math for every machine on earth" - twitter.com/6thgrade4ever/status/1...
This is a big problem with any modern operating system, whether it's Linux or Windows.
XUbuntu eats up 500-800 MB just after startup, and it needs 1.5-2 GB for any significant work. Win2K started up and ran in 128 MB, WinXP in 256. Well, that's not entirely fair, because they were 32-bit: just to CALL a full address, we now need a 64-bit address, but XUbuntu differs even from WinXP by a factor of 8. In fairness, a Linux workstation still doesn't eat up more than the conventional 1-2 GB after startup, but Win10/11 can easily eat 2-3.
On Linux, all the problems manifest in exactly the same way; just try to build any open source project and it will immediately pull in a billion of the same open source libraries. And many of them are needed only for the sake of one or two functions.
If it were not for the SSD, the available RAM, and the hardware instructions in the processors and their multithreading, the operation of computers running any modern OS would be a sad sight.
The main resources are eaten not by the bare machine, but by the applications on it. Websites are almost entirely crap code, and one page "weighs" a hundred megabytes. This needs to be optimized on the server side, or nothing will change. IDEA, VSCode and a bunch of other applications eat about the same (a lot), almost regardless of the OS. Another example for you is JetBrains Toolbox, a little application for downloading and updating the IDEs. It eats up 200-500 MB of RAM. What? How? Why?
Dependency hell also exists on Linux; I would not install 2 different versions of openssl or libjpeg without jumping through hoops. Look at the npm and Composer dependencies of any site. Previously, jQuery was enough for all your JS, but what about now? An npm folder can easily reach several gigabytes, and then the bundler chokes on too many files and falls over. Great!
As for the 99.9%, maybe I'm exaggerating, but the absolutely irrational loss of resources applies to both software and the OS.
Thank you for your questions 🤗
I also advise you to read the comments of other users, there are a lot of interesting thoughts and opinions 🌈
"If it were not for the SSD, the available RAM, and the hardware instructions in the processors and their multi-threading, the operation of computers running any modern OS would be a sad sight." Well, yes, things are written for the current hardware standards of the day. People(developers, commuters, pedestrians) will "fill the space" of where they are. People naturally use the tools at their fingertips.
Nothing you've described is particularly new to me, but it feels like you are just describing the state of software in 2022. So, since I'm not sure what you are comparing everything to, I have to ask:
P.S. - Linux Dependency hell is particularly frustrating because if you try to update your packages, and one of those packages was installed by pip / is dependent on something installed by pip, the package manager could fail to update anything.
It feels like you are a very curious young man 😏
$500 and we'll face you in a 1v1 Discord battle to see who wins, the Dark Side or the Light Side 🔴⚔️💚You will be in the role of Darth Vader 👾
But I have a condition: I will take my father Chewbacca with me 🤣
Some of these conversations can be tagged under the "static linking versus dynamic linking" category and others probably file under "software bloat". What do you think your approach to application development is with respect to static/dynamic linking? Ship with deps, or ship targeting deps on a host environment?
We can assemble a separate blog on this topic from the comments under my post haha 🤣
With each new comment some new information is added and it's really cool 👍
Thanks for sharing your thoughts 🤩
I was doing "premature optimizations" at the beginning of my career, optimizing and minimizing whatever was possible, and later I learned that it's not something that is appreciated in our space: they won't pay you more for more optimal code, won't thank you, and most likely won't even notice. Clients will appreciate it if you finish a task in the shortest time, no matter how it's implemented and how much RAM and CPU it consumes. So being a good developer requires sacrificing performance for development velocity. Code quality matters a lot so we can maintain the project instead of rewriting it in the future, and this also has its price in performance.
Clients and companies are also not the ones to blame: they are focused on building valuable projects with a limited budget in a reasonable time, and they have much more important things to consider than resource usage.
I hate Reddit from a technical perspective: it's slow, buggy, inconvenient, and fails with 500 on a regular basis, but I still keep using it because I simply don't know another place with a large community for IT topics. We, as users, want something that can serve its purpose at least somehow and is easy to find. Maybe there are much better alternatives to what we use, but we just don't want to spend time searching for them. Gmail, for example, seems improved now, but it was a masterpiece of bloatware some years ago. Maybe there is a lightweight and even free Slack alternative, but no one is using it for work.
Nobody is really to blame in this situation. I think team leads should try to convince the business to allocate time (money) for optimization, but it's not always possible.
I'm learning Rust and there is a little hope that it will keep becoming more and more popular. The desktop framework Tauri uses Rust, and it promises to produce 600 KB executables with the same HTML/CSS/JS abilities as Electron. But to be realistic, Rust is much more damn complex than JS. Between a 600 KB executable and faster development plus easier-to-find developers, business will pick the latter.
Thank you for such an extensive feedback. 🥰
I agree with you, and I also mentioned in one of my comments that companies value cheapness and speed of execution, but not optimization. And that users use low-quality products because there are simply no others on the market.
I hope my post encourages more developers to try to write cleaner code, and I'm glad it's gaining popularity.
Have a nice day 😊
If that were true, if a better alternative were missing from the market, someone would definitely create it. And they have: I'm sure there are alternatives to FB, Reddit, Twitter, Gmail, Slack, but people just don't need something better; they use what others use. Some people were still using IE only a couple of years ago. And this works even in development: people use popular libs and frameworks even when something better has existed for a long time.
That's a great intention for writing, developers should strive for building a better products, so thank you for posting!
Yes, there are probably many projects that do not receive due attention, and this is sad 😔
Thank you for your support, I really appreciate it 😊
You're absolutely right! I think the root of the problem is that large companies that were once not so big started making development tools that intentionally make life easier for programmers outside those companies, primarily for marketing purposes. The idea was that developers outside the big companies would put in the least effort and get the most out of it, while the developers inside the big companies would stay very experienced and keep building those tools. This situation led to outside developers, after a while, demanding that development be as simple as possible. The simpler the development environment (and the languages and technologies, of course), the more outside developers the large companies attracted and, as a result, the more profit (Flash is a prime example). New developers now arrive in environments where simplicity is above all, and they are simply afraid to go down a level; they think black magic is going on there, and when they encounter a problem at a lower level, they think it is enough to go to Stack Overflow or write an issue on GitHub, and the dark lords of the lower levels, versed in black magic, will just solve the problem. In fact, software development has been swallowed up by the same problem that has existed and will continue to exist in many industries. Initially the bubble is small and invisible, but it grows bigger and bigger. And once it bursts... it's all just a matter of time.
A very interesting thought. You almost wrote a whole article, thanks for such a detailed feedback. 😍 Have a good weekend 😊
Honestly, I work as a dev for a mid-sized company and it's pretty apparent why this happens.
It's a spiral of the business value/impact of your software versus the time crunch. You need to deliver a very impactful feature within unrealistic deadlines. You lean towards one and you lose the other (getting a feature in within the deadline might mean losing a bit of its impact, and conversely, getting the highest impact means spending more time on it).
Combine this with the fact that most companies ironically care about time (deadlines) the most, and the fact that you need to work with multiple other devs who might not be on the same page as you. Devs are never pushed the other way; they're never asked to push themselves to explore, to truly be passionate and care for the product, and to create something that maximizes impact and performance. Hence our current industry situation.
The best way around this is for devs to keep honing their own skills, work on their own little side projects, and explore the crazy advancements in the web, tinkering with styles and trying to get lower latency and faster TTFB.
I agree, it is often not the programmers who are to blame, but the business. Now the market value of an employee, and their chance of landing a lucrative position, is determined not by the efficiency of their code, but by knowledge of the list of trendy frameworks and libraries.
Beyond that, business decisions are driven by consumers: businesses do what drives sales and growth.
But consumers do not know the limits of the possible and do not understand that the product they are offered is bad, because they have not seen the good.
And in this closed circuit, programmers (and only a few of them) are the only ones who have an understanding of what a product can be.
Inexperienced devs with too little knowledge, using garbage frameworks and libraries, resulting in bloatware on top of bloatware. Devs are getting spoiled. They have these monster CPUs and terabytes of memory, and believe the purpose is to consume as much as possible, instead of conserving as much as possible. I made a point of ensuring Magic can run on "dust" when I implemented its backend parts, and Hyperlambda, optimising it into oblivion, to the point where you can have the thing running on a Raspberry Pi if you wish. The result is that it performs 5x faster than PHP and 10x faster than Python, even though technically it should run slower due to its internal architecture ...
The irony ... :/
There is a chain reaction: new software increases the amount of code, so the user has to upgrade the hardware, as a rule with a surplus over what the software requires. New software is then developed for that surplus: why optimize if there are so many resources? And at some stage the resource is exhausted, and the user buys new hardware.
And if the user doesn't buy it and stays on older versions, then questions of format and script compatibility begin to arise; you have to install some incomprehensible garbage to view or play a file, which either can't be put on the old hardware/OS, or eats up resources like crazy.
In fact, even in 2022, Microsoft itself is selling "modern" Surface Go laptops that start with 4 GB of RAM. The user puts Slack, Discord and other "masterpieces" on there, and that's it.
And what you wrote about Hyperlambda is very cool.
I would like to correct the author: the first Elite occupied not 64 but 48 kilobytes, of which the first 16 KB were the computer's ROM with some analogue of an operating system.
And yes, in 2004 we had a 533 MHz Pentium III at work, and one Flash browser game worked perfectly on it. Flash Player had a bad habit of being updated a couple of times a month, and after each update it worked slower and slower. After 5 years the game was already barely running on a Celeron 1700.
To be even more precise, the original Elite version took only 22 KB. Its ZX Spectrum version almost doubled in size, up to 40 KB. But on the other hand, many different enemy starships were added.
How is that even possible? I just can't believe 🤯
Thank you for this remark. This makes it even sadder 😔
Yes, we've come a long way from the days when thousands of developers around the world would reinvent ten thousand wheels every day, and every single one of them was convinced their wheel was the best wheel. Today we have embraced the concept of open source, where code libraries are reviewed, commented on, and improved by thousands of developers who see the value in contributing to the best wheel we can build with our collective knowledge. Sure, there are inexperienced devs who choose (often for bad reasons) to use code libraries that are not well maintained. The other reason why software has more files and bytes now is that the capabilities it can make use of have increased dramatically. For example, web browsers can render 3D animations now. No computer could render anything in 3D in the 1970s.
Yes, you are right, this is the problem with libraries, where everyone is trying to add something of their own. And when the business requires a ready-made solution here and now, of course it is easier for a developer to pull in such a ready-made library, of which he needs one feature out of 5000 🤪
Here we can smoothly switch to browsers: the problem is not new features, because most sites do not use 3D, and in fact surfing ordinary text pages has not moved very far from the 2000s. The problem is that sites are almost entirely composed of crappy code, which is why one page eats up hundreds of megabytes.
Web browsers themselves are no better - they, like many applications, grow over time until they turn into monsters that contain another operating system of their own (sometimes two or more). It seems that in the end they must die under tons of their own unsupported code. But for some reason they never die. 😵
I propose to call this problem the "Problem of the French Scribes". There is a legend explaining why French words have many more letters in writing than are pronounced (sometimes only one sound is pronounced out of 5 letters): in ancient times, when there were very few literate people and paper documents were already in circulation, scribes charged clients per letter. So they squeezed decent sums out of writing simple texts. Smart-ass puffed-up turkeys 😄 This is only speculation of course, but it is quite logical.
By the way, as far as I remember, when writing The Three Musketeers, Dumas was paid by the number of lines (like some programmers). Therefore, he deliberately came up with monosyllabic characters with a bunch of stingy dialogues, for example Grimaud, Athos's servant.
In the end, the writer was told that lines that took up less than half a column would not be paid. Then he even thought about removing Grimaud from the story.
But ten years later, the writer was paid for the number of words - and Grimaud became more talkative. So the French trace is confirmed, and this problem can also be called "Dumas syndrome" 😲
This is a curious fact 🧐
This name sounds interesting and mysterious heh 🤖
Just in the process of rewriting one of my web applications to get rid of as many npm libraries as possible, because when I examined them in detail they were bloated and chain-dependent as hell.
I also decided to go with SvelteKit because its usage of native web features (forms, HTTP requests, progressive enhancement) seems really appealing to me. Using the default built-in technologies and as little JS in the browser as possible seems like the way to go forward, or rather to go back to the roots.
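As a sketch of what "using the platform" looks like in practice, here is a plain form enhanced with a few lines of TypeScript instead of a framework; the #contact form and /api/contact endpoint are made up for the example:

```ts
// A normal HTML form keeps working with JS disabled (the browser just POSTs it);
// with JS enabled, we intercept the submit and send it via fetch instead.
// Assumes markup like: <form id="contact" action="/api/contact" method="post">
const form = document.querySelector<HTMLFormElement>("#contact");
if (form) {
  form.addEventListener("submit", async (event) => {
    event.preventDefault();
    const res = await fetch(form.action, { method: "POST", body: new FormData(form) });
    form.hidden = res.ok; // minimal enhancement: hide the form on success
  });
}
```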
You are one of the few who are on the Light side 💚💙💜
Cost of Dev Time > Cost of CPU/Ram/Disk. I think that summarizes it.
Back in the old days, hardware was expensive, hence optimization was crucial. Now that hardware is dirt cheap, what is "optimized" is the developer's time. Bloat is a "natural" side effect of that.
Another source is certainly scope creep: adding feature after feature slowly and silently eats up resources and grows the codebase.
I think that relatively cheap hardware is really one of the reasons.
But what about the features: not all software has significant improvements with each new version. We've already looked at the calculator and websites here, and there are many more examples. In most cases, "new" features do not justify such a large increase in code.
I use a Mac. It's better, but not as much as it should be.
The problem is that most developers are mediocre. This is not a dis on devs. Most people are mediocre at most things -- when they are not entirely incompetent.
This is unavoidable. Intelligence, talent, etc. are distributed on a bell curve. Most people are in the middle, hence mediocre, which means average.
So most devs copy and paste code, which isn't a problem unless you don't really understand what you're copying and pasting. Most devs turn to libraries, frameworks, and other dependencies as a first resort. They only code it themselves as a last resort. And then badly because using dependencies and copying and pasting doesn't teach you to code well.
Because most devs (like everyone else) are mediocre and, frankly, infantile (need supervision, essentially), there is an enormous overhead of process (and people to monitor that process).
Because we all (infants all) really just look out for ourselves, personal gain takes precedence over the group's ostensible goals. Most employees are working against the system, and those who try to do good work are generally frustrated and exhausted or driven out.
But the real question that no one asks is this: Do we really need all this garbage? What if we only made what we actually need, and then shared everything so that everyone was fed, housed, kept healthy, entertained, etc.?
Nope! Nose to the grindstone slaves! We must crank out another million lines of tech debt this week! Extinction is right around the corner. MUST. WRITE. MORE. USELESS. CODE.
You definitely need to write books haha 😄
I really like your comments 😍
Can't agree with you more! I am still deciding whether to get only MacPorts or also Homebrew. It also brings up the question of hand-coding from scratch versus using a content management system.
🤪
You might find it interesting that writing small, fast code is not as dead as it might seem:
js1k.com/
Plus there is this:
icpc.global/
ICPC requires quickly writing code to correctly solve complex problems. Submissions get something like 30 seconds of CPU wall clock time and a certain amount of RAM to solve a problem. Code that takes too long to execute will get ejected. Most of the problems are designed with sneaky edge cases that will burn CPU cycles. Each failed submission causes the team to take a penalty.
Also, Assembly, C, C++, Rust, and Go communities are still pretty focused on writing really fast code. It's only within the interpreted language space (e.g. Javascript, Python, PHP) and compiled languages that come with massive libraries (e.g. C#, Java) where significant laziness starts creeping in. That's because the hardware has been abstracted away from the developer to such a degree that the developer has no idea what it takes to allocate a chunk of RAM or how data structures are stored or how instructions are executed by the CPU.
At this point, it's really only a matter of time until someone drafts a law that takes note of the inefficiencies in computer software and wasteful use of today's computer hardware. Such a law would require developers to write efficient software within certain specifications. The larger the organization, the more efficient the software would have to be written plus a scaled fine schedule up to 10% of annual revenues for any violation. Wasting CPU cycles = more electricity usage and also more heat. More electricity/heat = more global warming. Wasting RAM = more hardware. More hardware = more landfill e-waste. That law might not see the light of day in the U.S., but it might find purchase somewhere more receptive such as Sweden or Denmark. All it would take is one country somewhere to pass such a law and companies writing and releasing inefficient, wasteful software would find their bank accounts drained shortly afterward. Everyone else would realize "they're next" and optimize their own software. Software development is currently a largely unregulated industry but the bloated software being produced today makes it more and more likely to turn it into a regulated industry.
It is a pity that js1k.com has not been held since 2019, it was a very cool competition.
As for ICPC, where did you find those requirements? I looked through their documentation and didn't find any similar info 🤔
You have a rather interesting idea about the regulation of development, I think it makes sense with the right approach, and maybe someday we will get to that. The main thing is not to create an even worse problem, as is usually the case.
Do you know what we do when things don't work, my friend? We RESTART.
In the first step, let's scrap all frameworks (Angular, React, Vue and a thousand of their friends), scrap that webpack stuff, scrap Vite, and go back to the plain HTML + CSS + JS days.
In the second step, scrap Windows 10 with its shitty search and thousands of bloatware programs running in the background, and jump back to Windows 7.
In the third step, we set up a committee for every programming language to evaluate and approve libraries. If a library does not solve a genuine problem, no approval. This committee could be a centralized body or a decentralized institution thanks to blockchain.
In the fourth step, we completely revamp the mobile OS; nobody needs 1 TB of storage and 64 GB of RAM on a phone. Most of the apps are full of bloatware. The way you remove this is by making everything backwards-incompatible and building the OS from the ground up again. This way the main OS sheds all the bloat.
This is a very bold idea; I think that with proper implementation, it has a right to exist.🤔
Using the decentralization charms of blockchain technology, you could fix many things in this world. But unfortunately, even among crypto projects there are only a few DEX/DeFi projects with correct execution, so normal real-life use is not worth thinking about for the next 5 years. 🚀🐕😏
Software is bloated because we no longer feel the pressure from hardware constraints. It's so much easier to pull in an existing library and write adapters to force solutions.
Code gets bloated because developers only optimize their code, and never think about optimizing 3rd party libraries.
Another problem is a lot of devs ship code with dependencies, source code, or a whole directory of config files.
Also, if you think the web is bad, let me remind you that Call of Duty and other AAA games are over 200 GB!
This is true, we have already discussed this in other threads.
230 MB is more than Windows 2000 or Windows 98.
Those were entire operating systems, 32- and 16-bit, 20 years ago.
But since then, almost nothing has changed in the user experience.
You are too categorical with "nothing has changed in the user experience". A lot has changed, ranging from more convenient and intuitive interfaces that do not require reading tutorials, to the increased functionality of programs.
I do not deny the problem that Maria raised, and I agree with her position. But this is the price for the fact that development becomes cheaper, which means that the user has a much greater choice of software that he can use. You can write a program as close to the hardware as possible and improve performance, sometimes it is justified. But then some company comes and says "how much does it cost to write a corporate website? $1k?? Why is it so expensive?!" and the developer takes a ready-made framework (bloated because it is universal) and quickly writes this site on it.
To make a good “intuitive interface”, you just need to be able to make good interfaces. It has almost nothing to do with the code.
And at the same time, not every interface is convenient and intuitive now. Take a look at the "simplest" Lightroom or Krita: not everyone is able to open a file without reading tutorials.
It will all blow up one day eventually, I believe. And we'll start building apps properly again.
Yes, it will definitely happen sometime. Roman described it well above.
I'm sure it will greatly change the user experience.
I wish you good code 🌈
Haha nice 😂😂😂
This is exactly what I'm talking about 🤪
Those are pretty interesting thoughts, but in your opinion, how should code develop in the future? What are the prospects?
We can only guess, express our thoughts and try to make our own code better.
I think we perfectly understand who is leading this game.😏
But I really like the idea of Roman and Fedor, and I think there is a high probability that everything will go according to this scenario.
What are your thoughts about it?🤔
Cool
Thanks! Have a great day!😊
it's ok i think
💋
👏🏻
🤗
I think games on old Nokia and Samsung phones weighed less than games now on the Play Market or App Store. Even if the quality of new games has improved, there are still too many megabytes, in my opinion.