The best software and programming concepts have put a lot of thought into preventing developers from shooting themselves in the foot again and again.
Should we learn from the best? Sure, but most often we don't actually realize how much effort has been put into preventing those errors. We are focused on the task at hand (thanks to them!). At best we are grateful that the thing we wanted to do wasn't as complex as we feared.
If learning from the best doesn't work, what we can do is learn from the worst.
And that's where I need your stories and insights.
What are the best tools/concepts that empower developers to shoot themselves in the foot again and again?
The kind of things that could just as well have been designed by an evil genius curious to see how much confusion it can infuse in the world.
It could be anything (IT-related):
- a concept considered as harmful as GOTO (nullable booleans?)
- some wonderfully confusing piece of interfaces (git's cli?)
- some overly complex architecture principles (microservices?)
And what lesson can we learn from this? (Because they didn't in fact do it on purpose, so we could make the same mistakes too.)
Please use your keyboards, release your frustrations and let's celebrate the worst of the worst!
And vote for the worst answers.
Top comments (38)
How to quit a program in Unix?
vim has become a meme for being harder to quit than smoking or alcohol.
But is it really fair?
Here is how you can quit vim: the `:` part opens a command palette, much like in Visual Studio Code or IntelliJ, then `quit`.
Seems... utterly reasonable. So what's the big deal?
One issue is code golf: developers striving to advertise the shortest possible solution to a particular problem (quitting without saving). So they will tell you to use `:q!`. Code golf is bad. Strive for clarity, not for shortness.
But the true issue is UNIX's masterfully simple yet evil decision to let each program decide for itself what it feels is the best way to quit.
That seems high-minded. Aren't app developers adults?
Adults don't approach decision making this way. Instead, to evaluate whether a decision is good, they look at one thing and one thing only: the effects that particular decision has in the real world.
Here the decision forces the user to remember:
- vim? That's `:q!` for you.
- nano? That's `^X N`. Supposedly much easier?
- man? That's `q`. Shorter than vim, code golf win!
- emacs? That's `C-x C-c`.
- python? That's `>>> quit()`.
- node? That's `.exit`.
- perl? That's `^C`.
- telnet? That's `^]`. For some reason?
- An ssh session? That's `exit`.
- `echo <<HERE`? That's `HERE`.
- Java? That's `killall java`.
- `kill`? Often programs will choose for themselves how to interpret your signals, very nice and handy.

I am sure I forget some, but it's enough, I think, to see the masterfully evil decision UNIX made.
The evil decision being not to take any decision.
Which is also a decision.
And often a very bad one.

How to quit `ed`? That's `q` :)

Abstracting too early, when you don't yet have enough knowledge of what to abstract, is my go-to recipe for disaster.
Hype-driven development: when you adopt a tool or a concept just because you heard it's cool, without understanding it or the need for it. With that, you can easily shoot yourself in the foot with perfectly valid things, like microservices, except you introduced them from the start and misidentified what should be a separate microservice. Or GraphQL used in a way that basically exposes your database schemas to the frontend without much thought or control, which ends up being a security disaster.
Awesome answers, I agree with all of them, thanks!
Didn't know it was called "hype-driven development" but I always get a kick out of people jumping ship to some new "blazingly fast" web framework. It's like the web dev community has an epiphany every week. Or perhaps building a framework nowadays is some sort of business model. Weird.
I think the term comes from this amazing talk.
We have had the one web framework to rule them all for over a decade now, probably two... and there's a new one every year. The web is a crap environment to program in, and it's constantly producing solutions to its own ineptitude.
So true! It's a blessing and a curse for sure. I'm pretty sure all the packages save time because I don't have to reinvent the wheel. But at the same time, I find myself digging through their source code almost more than my own. Gotta love open source, haha!
Now that I think about it, I do the same thing with other languages: PHP, Rust, and Python. Oh no!
Partially agree with you.
These are different things; the reason I bring them together is that all of them have to be learned.
I agree that even if Unix had done the right thing, there wouldn't be only one thing.
But maybe only three: `:quit` for quitting, Ctrl-C for force quitting, Ctrl-D for end of input.

maximilianocontieri.com/null-the-b...
NULL
The worst concept
Evil advocate: so how do you deal with accessing data from a `Map<Key, Value>`? Or is there never a good use case for maps?

Data is accidental.
If the data is not present, we should deal with "not present".
Null does not mean "not present".
I beg to differ, and to claim that in a modern programming language, `null` is actually the simplest correct way, which means the better way, to represent "not present" in something like `map[key]` or `list.find { someCriteria }`.
See "Null is your friend, not a mistake" by Roman Elizarov.
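The `map[key]` point can be sketched in TypeScript, where `Map.get` returns `V | undefined` under `strictNullChecks` (the function and data names here are mine, not from the thread):

```typescript
// Sketch: the absent case is part of the return type, so the caller
// must deal with "not present" before touching the value.
const ages: Map<string, number> = new Map([["ada", 36]]);

function describeAge(name: string): string {
  const age = ages.get(name); // type: number | undefined
  if (age === undefined) {
    return `${name}: unknown`; // the "not present" branch, handled explicitly
  }
  return `${name}: ${age}`;
}
```

Whether you call that value `null`, `undefined`, or `None`, the argument above is that this is the simplest honest encoding of a lookup that can fail.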
OK, you can continue using null.
Good luck with your null pointer exceptions!
BTW, Optionals are code smells too.

No worries, the compiler prevents NPEs for me.

What kind of compiler can prevent you from such a runtime exception? :0
The compiler of any statically typed language that doesn't replicate Java's error of allowing `null` ("not present") as a valid member of all types.
TypeScript is a good example.
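A minimal sketch of what that prevention looks like in TypeScript with `strictNullChecks` enabled (the function is hypothetical): the commented-out line would be rejected at compile time because `msg` is possibly `null`.

```typescript
// With "strictNullChecks": true, null is not a member of string,
// so the compiler forces a check before the value can be used.
function shout(msg: string | null): string {
  // return msg.toUpperCase(); // compile error: 'msg' is possibly 'null'
  if (msg === null) {
    return "(silence)";
  }
  return msg.toUpperCase(); // ok: msg has been narrowed to string
}
```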
And for me, Kotlin.
Null safety | Kotlin

I don't know Kotlin a lot, but from reading these docs it looks like a "nullable type" is in fact an option type, just hidden behind smart syntax.
This is a core feature of `null`. Taking something different and calling it `null` does not change the original sentiment towards `null`, in my opinion.

It's strange to say that a mistake is a feature.
Interestingly, they didn't make this mistake for `boolean`, which can only be `true` or `false` but never `null`. I prefer when booleans have only two values. Don't you?

You could decide that `String` does not accept `null`, like in Kotlin, and return an `Absent` value instead when something is not in a map: `type StringOrAbsent = String | Absent`. Or you could drop `Absent` and use `type StringOrNull = String | null`, defining `String?` as being `String | null`... and do the same for all types.

Not at all. Lots of languages and frameworks have features that are mistakes.
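The two spellings discussed above can be sketched in TypeScript (all names are hypothetical illustrations, not from any library): an explicit `Absent` sentinel versus the plain `null` union.

```typescript
// Spelling 1: a dedicated sentinel type instead of null.
const Absent = Symbol("Absent");
type Absent = typeof Absent;
type StringOrAbsent = string | Absent;

// Spelling 2: the shorthand most languages pick, where `String?`
// simply means `string | null`.
type StringOrNull = string | null;

// A lookup using the sentinel spelling: the caller must compare
// against Absent, exactly as it would compare against null.
function lookup(db: Record<string, string>, key: string): StringOrAbsent {
  return key in db ? db[key] : Absent;
}
```

Structurally the two are the same union-type idea; the disagreement in the thread is only about what the "not present" member should be called.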
Because it's a primitive? Primitives cannot be `null`, IIRC from writing Java some 10 years ago. But `Boolean` can be `null`.

Again, you described an option type hidden behind smart syntax. That's not the `null` people talk about when saying that `null` is a billion-dollar mistake.

That's the technical reason, yes.
My point is that it's the right thing to do. Only the worst psychopaths use nullable "booleans", an insult to mathematical sanity.
No, I'm not. `Option<String>` is a monad (i.e. it has `.map()` and `.flatMap()`); it's stored as a wrapper class in memory. `String` and `String?`, on the other hand, are more like a union type: the values are stored just as a string, no extra memory allocation.

I happen to have referenced Tony Hoare's exact quote in an article:
Android's billion-dollar mistake(s)
Jean-Michel Fayard 🇫🇷🇩🇪🇬🇧🇪🇸🇨🇴 ・ Sep 25 '19 ・ 10 min read
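The wrapper-versus-union distinction made above can be sketched in TypeScript (a minimal hand-rolled `Option`, purely illustrative; Kotlin's actual point is that `String?` costs no extra allocation):

```typescript
// Wrapper style: every value sits inside an Option object.
class Option<T> {
  private constructor(private readonly value: T | null) {}
  static some<T>(value: T): Option<T> { return new Option(value); }
  static none<T>(): Option<T> { return new Option<T>(null); }
  map<U>(f: (t: T) => U): Option<U> {
    return this.value === null ? Option.none<U>() : Option.some(f(this.value));
  }
  getOrElse(fallback: T): T {
    return this.value === null ? fallback : this.value;
  }
}

// One allocation per Option wrapper.
const viaOption = Option.some("hello").map(s => s.length).getOrElse(0);

// Union style: no wrapper object, the value is just a string or null.
const raw: string | null = "hello";
const viaUnion = raw === null ? 0 : raw.length;
```

Both compute the same thing; the union version simply has no wrapper object for the runtime to allocate.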
That goal is perfectly implemented in a type system that allows null, but only explicitly, and where it then has to be handled safely. Not implicitly and everywhere, and good luck.
I think you're kind of talking past each other.
Null pointers (think C) are a necessary foot-gun because the language needs the low-level concept to be both performant and simple. Pointers, in this context, are just numbers that we interpret as memory addresses, and numbers can be 0.
Null objects, on the other hand, make no sense at all. If your variable is a `User`, and your `User` class has a `name` attribute, your variable should necessarily always have that `name` attribute.
Nullable types, that is, union types with a special non-value type, make about as much sense as union types in general: it's debatable whether they are a good idea, but they work well for real-life programming. The point is that no sane type system will let you put something nullable where a non-nullable type is expected, which mostly fixes the foot-gun problem.
Perfect summary, completely agree 💪🏻
ORM, Repository Pattern and Active Record.
By the time you realise they won't scale with complexity, you are already committed and the refactoring cost is too high.
What's your issue with the "repository pattern"?
The first time I used an ORM it sounded like a good idea. Now, a dozen ORMs into my career, I realize I wasted my time learning 12 variations of SQL, each with its own quirks, instead of learning just the quirks of SQL itself.
My issue with the Repository Pattern is the technology that is normally part of the same stack, namely the ORM. It also assumes that the shape of the data in the database is the same as your OOP class, which in larger systems is unlikely.
I'm a database guy so I would much prefer that you update the database using a stored procedure where I can carry out additional validation, optimisation and logging that is not practical if you are injecting new records or updating existing records using a magic abstraction of the data layer.
Thanks, that makes more sense with a detailed context!
"I'm just gonna put up a quick JavaScript hack inside 48 hours"
Years later, still debugging that JS code...
Yes, that's called human interface guidelines, and no coalition of individual developers can get this right; it must be defined by the system owner.
developer.apple.com/design/human-i...
Exception Handling
You mean for things that are actually not exceptional, like `httpClient.get(url)`?
Yeah, why would anyone not assume that network calls always work?
Technical debt. It turns out for most programmers, technical debt is "code written by others."
How to shoot yourself in the foot? Start programming.
JK but you can easily waste weeks trying to manage project dependencies. Even with a package manager, bugs can propagate into a release and now your code is broken and it isn't even your fault.
That's a nice and simple answer :)