Header photo by mali maeder from Pexels.
"I have but one lamp by which my feet are guided, and that is the lamp of experience. I know of no wa...
All 20 quotes are good.
Though I would say they don't really give you a full picture of what happened.
In your storytelling it seems that technology always wins! If that were the case, predicting the future would be really easy indeed.
But here is a techno-skeptical prediction that has held up really well over the last 30 years.
So what about another list of technological claims that flopped hard?
I think you would find a lot to say as well.
Please complete the list
It's called Firefox.
Nope.
The rewrite of Netscape was called Mozilla, and it was a disaster, as explained in this classic article:
Things You Should Never Do: rewrite your software from scratch (Joel on Software)
Firefox came later and was a simplification of Mozilla.
So Firefox was the rewrite of Mozilla, which was the rewrite of Netscape. Meaning that, one iteration removed, Firefox is a rewrite-from-scratch of Netscape. And it turned out well. ;)
Jean-Michel is right - it was a disaster for Netscape. It was the cause of the company's bankruptcy (that's what their ex-employees say themselves!). Mozilla/Firefox source code was given away for free when the company was practically gone. And the software was way behind Internet Explorer in those days.
It took developers 2 or 3 years to make Mozilla/Firefox a decent product and it took several more years until one could say that it was a success. Netscape was long gone by then.
Hilarious!
Just one thought...
Maybe it ought to be? It's actually appalling how much memory (not to mention processing power) is wasted in modern programs, purely from lazy coding practices and unnecessary UI "chrome" and effects.
I need the shiny, Jason. I need it.
We can afford that though, since memory has become so cheap that even a few MB more or less don't really matter to anyone anymore.
My prediction (which I know may end up on a similar list in 20 years): RAM prices will keep falling and SSD speeds will keep rising, so much so that we can just put a couple of TB of flash memory in our computers to serve as both persistent storage and working memory. This would open up some interesting new possibilities for software too, basically eliminating all storage-related loading times (software and OS startup, loading screens in games, etc.).
Look up Intel Optane drives... basically enterprise-grade SSD drives that are fast enough to work as a RAM cache (to my very limited understanding). I have one in my room that I've been meaning to play with, but I lack the proper adapter to connect it to one of my rack-mount servers.
EDIT: Fixed product name
You forget that managing memory costs CPU time.
High memory usage is often paired with a high number of dynamic (de)allocations. This causes memory fragmentation, and de-fragmenting memory costs even more effort.
A high rate of dynamic (de)allocation is not bad in itself. Or at least, not in the Java world. Just do not keep the data around for a long time.
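A minimal sketch of that point (my own illustration, assuming the JVM's default generational collector): allocating millions of short-lived objects is cheap because a young-generation collection only copies the few survivors, whereas keeping every object reachable is what really costs heap space and GC effort.

```java
// Minimal sketch (not from the thread above): why a high allocation rate is
// usually fine on the JVM, assuming a generational collector such as G1.
// Objects that die young are reclaimed almost for free; objects kept
// reachable are what actually create heap pressure.
import java.util.ArrayList;
import java.util.List;

public class AllocationDemo {

    record Point(double x, double y) {}

    // Allocates many temporary objects that become garbage immediately;
    // young-generation collections only copy the few survivors.
    static double shortLived(int count) {
        double sum = 0;
        for (int i = 0; i < count; i++) {
            Point p = new Point(i, i * 2.0); // dead after this iteration
            sum += p.x() + p.y();
        }
        return sum;
    }

    // Same kind of allocation, but every object stays reachable, so the
    // collector has to keep tracking (and eventually promoting) all of it.
    static List<Point> longLived(int count) {
        List<Point> all = new ArrayList<>(count);
        for (int i = 0; i < count; i++) {
            all.add(new Point(i, i * 2.0));
        }
        return all;
    }

    public static void main(String[] args) {
        System.out.println(shortLived(50_000_000));      // cheap despite 50M allocations
        System.out.println(longLived(2_000_000).size()); // retained data is the real cost
    }
}
```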
That'd be nice, but I'll counter that it probably won't work out that way. As always, our lazy coding habits and unnecessary bells-and-whistles will take up all the available memory, even if there are terabytes of RAM available. Consider that a single tab in a web browser now takes up more memory than was available for the entire Apollo mission.
Also, I never trust flash/SSD for primary persistent storage. You can't recover data from it when it fails. This is why to this day, HDDs are still often used for persistent data where recovery is a necessary possibility; the recoverability is a side-effect of the physical characteristics.
Unfortunately, we'll always be limited -- to some extent -- by memory hierarchies and physical distances in the machine:
electronics.stackexchange.com/a/82...
Indeed. There's a hard physical limit, at least until someone cracks the code for making a consumer-friendly system that stores at the atomic level...and even that has its limits.
It makes me value even more the memory we have. The average computer has more memory and CPU power than the supercomputers of the 80s. Wasting it on poor coding practice and unnecessary graphical fireworks is such a shame! We could be funneling all that wasted memory into more useful things, making our computers do far more than they do now, and far more efficiently.
Related: qr.ae/TWo1hM
Very fun.
Seems like the Cloud companies would all love to make this true in their favor
Charles Darwin Junior invented serverless before it was cool!
I was thinking something really similar
#2 has come true. Who uses telephones anymore?
#17 was true. At that time Apple was quite worthless as a company.
Indeed - it would not be fair to laugh at #17, because Apple was a completely different company before the return of Jobs. You can hate him for many things, but you can't argue away that turning around Apple's fate when nobody believed in the company anymore was his achievement alone. Who could have anticipated that?
To be fair, the Apple Jobs "saved" was the Apple Jobs helped create. Jobs needed this "time out" to get his priorities straight again. Although NeXT was not a success either, Jobs did figure out again that you also need to ship products, and that you cannot wait until you have the perfect thing.
Apple failed most when it axed Jobs and also threw out the culture that had made Apple what it was back in the day. The board and CEOs only wanted to play it safe and not invest in new technology.
To me it feels like Apple has been returning to the mid 90s version of itself. The main difference is they are now huge.
Very entertaining! It just shows how little we actually know, even those who are very smart by educational standards... I for one can't wait to get a nuclear-powered hoover.
In 2014 I was working as an assistant professor, and in one lecture about game development I said that in the future we would have 3D games that people could be immersed in and play virtually. A professor from Australia said no. About two years later, we had VR games.