Jean-Michel πŸ•΅πŸ»β€β™‚οΈ Fayard

What are the best tools/concepts to shoot yourself in the foot again and again?

The designers of the best software and programming concepts have put a lot of thought into preventing developers from shooting themselves in the foot again and again.

Should we learn from the best? Sure, but most often we don't actually realize how much effort has been put into preventing those errors. We are focused on the task at hand (thanks to them!). At best we are grateful that the thing we wanted to do wasn't as complex as we feared.

If learning from the best doesn't work, what we can do is learn from the worst.

And that's where I need your stories and insights.

What are the best tools/concepts that empower developers to shoot themselves in the foot again and again?

The kind of thing that could just as well have been designed by an evil genius curious to see how much confusion it can infuse into the world.

It could be anything (IT-related):

  • a concept considered as harmful as GOTO (nullable booleans?)
  • some wonderfully confusing interface (git's CLI?)
  • some overly complex architecture principle (microservices?)

And what lessons can we learn from this? (Because they didn't in fact do it on purpose, so we could end up doing the same.)

Please use your keyboards, release your frustrations and let's celebrate the worst of the worst!

And vote for the worst answers.


Top comments (38)

Jean-Michel πŸ•΅πŸ»β€β™‚οΈ Fayard • Edited

How to quit a program in Unix?

vim has become a meme for being harder to quit than smoking or alcohol.


But is it really fair, though?

Here is how you can quit vim:

:quit
Save changes to "/private/tmp/hello.txt"?
[Y]es, (N)o, (C)ancel:
  • the : part opens a command palette, much like in Visual Studio Code or IntelliJ
  • quit seems... utterly reasonable.
  • then the user may have unsaved changes; what's the right thing to do? That's... asking to save, discard or cancel. Utterly reasonable as well.

So what's the big deal?

One issue is code golf: developers striving to advertise the shortest possible solution to a particular problem (quitting without saving). So they will tell you to use :q!. Code golf is bad. Strive for clarity, not for shortness.

But the true issue is UNIX's masterfully simple yet evil decision to let each program decide for itself what it feels is the best way to quit.

That seems high-minded. Aren't app developers adults?

Adults don't approach decision-making this way. Instead, to evaluate whether a decision is good, they look at, and only care about, one thing: the effects that particular decision has in the real world.

Here the decision forces the user to remember:

  • Want to quit vim? That's :q! for ya'
  • Want to quit nano? That's ^X N. Supposedly much easier?
  • Want to quit man? That's q. Shorter than vim, code golf win!
  • Want to quit emacs? That's C-x C-c
  • Want to quit python? That's >>> quit()
  • Want to quit node? That's .exit
  • Want to quit perl? That's ^C
  • Want to quit telnet? That's ^]. For some reason?
  • Want to quit an ssh session? That's exit
  • Want to quit echo <<HERE? That's HERE
  • Want to quit a java program? That's sometimes killall java
  • Want to quit something with kill? Often programs will choose for themselves how to interpret your signals. Very nice and handy.

I am sure I forgot some, but I think it's enough to see the masterfully evil decision UNIX made.

The evil decision being to not take any decision.

Which is also a decision.

And often a very bad one.


NaveenKumar Namachivayam ⚑

How to quit ed? That's q :)

PaweΕ‚ ŚwiΔ…tkowski

Abstracting too early, when you don't have enough knowledge of what to abstract, is my go-to recipe for disaster.

Hype-driven development: when you take a tool or a concept just because you heard it's cool, without understanding it or the need for it. With that, you can easily shoot yourself with completely valid things, like microservices that you introduced from the start while misidentifying what should be a separate microservice. Or GraphQL used in a way where you basically just expose your database schemas to the frontend without much thought or control, which ends up being a security disaster.

Jean-Michel πŸ•΅πŸ»β€β™‚οΈ Fayard

Awesome answers, I agree with all of them, thanks!

Wade Zimmerman

Didn't know it was called "hype-driven development", but I always get a kick out of people jumping ship to some new "blazingly fast" web framework. It's like the web dev community has an epiphany every week. Or perhaps building a framework nowadays is some sort of business model. Weird.

Jean-Michel πŸ•΅πŸ»β€β™‚οΈ Fayard

I think the term comes from this amazing talk.

Nikolaus

We have had the one web framework to rule them all for over a decade now, probably two... and there's a new one every year. The web is a crap environment to program in, and it's constantly providing solutions to its own ineptitude.

Wade Zimmerman

So true! It's a blessing and a curse for sure. I'm pretty sure all the packages save time because I don't have to reinvent the wheel. But at the same time, I find myself digging through source code almost more than my own code. Gotta love open source, haha!

Now that I think about it, I do the same thing with other languages: PHP, Rust, and Python. Oh no!

Jean-Michel πŸ•΅πŸ»β€β™‚οΈ Fayard

Partially agree with you.
These are different things; the reason I bring them together is that all of them have to be learned.
I agree that even if Unix did the right thing, there wouldn't be only one way to quit.
But maybe only three: :quit for quitting, Ctrl-C for force quitting, Ctrl-D for end of input.

Maxi Contieri

maximilianocontieri.com/null-the-b...

NULL
The worst concept

Jean-Michel πŸ•΅πŸ»β€β™‚οΈ Fayard • Edited

Evil advocate: so how do you deal with accessing data from a Map<Key, Value>?

Or are there never any good use cases for maps?

Maxi Contieri

Data is accidental.

If the data is not present, we should deal with "not present".
Null does not mean "not present".

Jean-Michel πŸ•΅πŸ»β€β™‚οΈ Fayard • Edited

I beg to differ and claim that in a modern programming language, null is actually the simplest correct way, which means the better way, to represent "not present" in something like map[key] or list.find { someCriteria }.

See Null is your friend, not a mistake by Roman Elizarov

Fear of null leads to some extreme practices. There are Java coding styles that forbid null completely, forcing heinous workarounds upon developers. Have you seen a codebase where every domain object implements a special Null interface and must provide manually coded β€œnull object” instance? Lucky if you have not, but I bet you’ve seen Optional wrappers that pollute Java code only for the sake of avoiding nulls.
(...)
The truth is that the concept of null is not a mistake, but Java's type system, which considers null to be a member of every type, is.
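
A minimal Kotlin sketch of what I mean (the map contents are made up):

fun main() {
    val ages: Map<String, Int> = mapOf("alice" to 30)
    // Indexing the map returns Int?: null is the type-safe signal for "not present"
    val age: Int? = ages["bob"]
    // The compiler forces us to handle the null case before using the value
    println(age?.plus(1) ?: "bob is not present")
}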

Maxi Contieri

ok. You can continue using null.

Good luck with your null pointer exceptions!
BTW, Optionals are code smells, too

Jean-Michel πŸ•΅πŸ»β€β™‚οΈ Fayard

No worries, the compiler prevents NPEs for me.

Denis

What kind of compiler can prevent such a runtime exception? :0

Jean-Michel πŸ•΅πŸ»β€β™‚οΈ Fayard

The compiler of any statically typed language that doesn't replicate Java's error of allowing 'null' (not present) as a valid member of all types.

TypeScript is a good example.

And for me, Kotlin.

PaweΕ‚ ŚwiΔ…tkowski

I don't know Kotlin a lot, but from reading these docs it looks like a "nullable type" is in fact an option type, but hidden behind a smart syntax.

The truth is that the concept of null is not a mistake, but Java's type system, which considers null to be a member of every type, is.

This is a core feature of null. Taking something different and calling it null does not change the original sentiment towards null, in my opinion.

Jean-Michel πŸ•΅πŸ»β€β™‚οΈ Fayard • Edited

This is a core feature of null

It's strange to say that a mistake is a feature.

Interestingly, they didn't make this mistake for boolean, which can only be true or false but never null. I prefer it when booleans have only two values. Don't you?

  • Let's say the type String does not accept null, like in Kotlin
  • Let's say you don't like the letters "n-u-l-l", so you define a singleton Absent instead for when something is not in a map
  • Let's say you have union types, so you define type StringOrAbsent = String | Absent
  • At that point you realize that being interoperable with Java is very important, so you give up Absent and use type StringOrNull = String | null
  • At that point you realize that's a very common use case, and you implicitly define String? as meaning String | null... and do the same for all types
  • At that point you build lots of clever syntax shorthands to work with null (sketched below). Adding null safety to the language no longer makes it clunky the way Option does, and people love it.
  • null is your friend, not a mistake.
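
A minimal Kotlin sketch of those shorthands, with made-up values:

fun main() {
    val name: String? = readLine()              // String? behaves like String | null
    val length: Int? = name?.length             // safe call: propagates null instead of crashing
    val display: String = name ?: "anonymous"   // elvis operator: provide a default when null
    if (name != null) {
        println(name.length)                    // smart cast: name is a plain String here
    }
    println("$display has length $length")
}
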
PaweΕ‚ ŚwiΔ…tkowski

It's strange to say that a mistake is a feature.

Not at all. Lots of languages and frameworks have features that are mistakes.

Interestingly, they didn't make this mistake for boolean, which can only be true or false but never null.

Because it's a primitive? Primitives cannot be null, IIRC from writing Java some 10 years ago. But Boolean can be null.

null is your friend, not a mistake

Again, you described an option type hidden behind smart syntax. That's not the null people talk about when saying that null is a billion-dollar mistake.

Jean-Michel πŸ•΅πŸ»β€β™‚οΈ Fayard • Edited

Boolean doesn't accept null because it's a primitive?

That's the technical reason, yes.

My point is that it's the right thing to do. Only the worst psychopaths use nullable "booleans", an insult to mathematical sanity.

Again, you described an option type hidden behind smart syntax.

No, I'm not.
Option<String> is a monad (aka it has .map() and .flatMap()), and it's stored as a wrapper class in memory.
String and String?, on the other hand, are more like a union type. The values are stored just as strings, with no extra memory allocation.
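
A minimal Kotlin/JVM sketch of that difference (the values are made up):

import java.util.Optional

fun main() {
    // Optional<String>: an extra wrapper object with monadic operations
    val wrapped: Optional<String> = Optional.of("hello")
    val upperWrapped: Optional<String> = wrapped.map { it.uppercase() }
    // String?: no wrapper object is allocated; the reference is either a String or null
    val plain: String? = "hello"
    val upperPlain: String? = plain?.uppercase()
    println(upperWrapped.orElse("absent"))
    println(upperPlain ?: "absent")
}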

That's not the null people talk about when saying that null is a billion-dollar mistake.

I happen to have referenced Tony Hoare's exact quote in an article:

I call it my billion-dollar mistake. It was the invention of the null reference in 1965. At that time, I was designing the first comprehensive type system for references in an object oriented language (ALGOL W). My goal was to ensure that all use of references should be absolutely safe, with checking performed automatically by the compiler. But I couldn't resist the temptation to put in a null reference, simply because it was so easy to implement.

That goal is perfectly implemented in a type system that allows null, but only explicitly, and then forces it to be handled safely. Not implicitly and everywhere, and good luck.

π’ŽWii πŸ³οΈβ€βš§οΈ

I think you're kind of talking past each other.

Null-Pointers (think C) are a necessary foot-gun because the language needs the low-level concept to be both performant and simple. Pointers, in this context, are just numbers that we interpret as memory addresses, and numbers can be 0.

Null-Objects, on the other hand, make no sense at all. If your variable is a User, and your User class has a name attribute, your variable should necessarily always have that name attribute.

Nullable Types, that is, union types with a special non-value type, make about as much sense as union types in general: it's debatable whether they are a good idea, but they work well for real-life programming. The point is that no sane type system will let you put something nullable where a non-nullable type is expected, which mostly fixes the footgun problem.

Jean-Michel πŸ•΅πŸ»β€β™‚οΈ Fayard

Perfect summary, completely agree πŸ’ͺ🏻

Aaron Reese

ORM, Repository Pattern and Active Record.
By the time you realise they won't scale with complexity, you are already committed and the refactoring cost is too high.

Jean-Michel πŸ•΅πŸ»β€β™‚οΈ Fayard

What's your issue with the "repository pattern"?

The first time I used an ORM, it sounded like a good idea. Now, over my career, I have used a dozen ORMs, and I realized I wasted my time learning 12 variations of SQL, each with its own quirks, instead of learning just the quirks of SQL.

Aaron Reese

My issue with RP is the technology that is normally part of the same stack, namely the ORM. It also assumes that the shape of the data in the database is the same as your OOP class, which in larger systems is unlikely.
I'm a database guy, so I would much prefer that you update the database using a stored procedure, where I can carry out additional validation, optimisation and logging that is not practical if you are inserting new records or updating existing records through a magic abstraction of the data layer.

Jean-Michel πŸ•΅πŸ»β€β™‚οΈ Fayard

Thanks, that makes more sense with the detailed context!

Nikolaus

"I'm just gonna put up a quick JavaScript hack inside 48 hours"

Years later, still debugging that JS code...

Jean-Michel πŸ•΅πŸ»β€β™‚οΈ Fayard

Yes, that's called human interface guidelines, and no coalition of individual developers can get this right; it must be defined by the system owner.

developer.apple.com/design/human-i...

Sriram.Mahadevan

Exception Handling

Jean-Michel πŸ•΅πŸ»β€β™‚οΈ Fayard • Edited

You mean for things that are actually not exceptional, like httpClient.get(url)?

Yeah, why would anyone not assume that network calls always work?
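
A minimal Kotlin sketch of the point, using Java's built-in HttpClient; the fetch helper is made up, and runCatching turns the very ordinary network failure into an explicit Result value:

import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun fetch(url: String): Result<String> = runCatching {
    val request = HttpRequest.newBuilder(URI.create(url)).build()
    HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())
        .body()
}

fun main() {
    fetch("https://example.com")
        .onSuccess { body -> println(body.take(80)) }
        .onFailure { e -> println("Network calls do fail: $e") }
}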

Bob Rundle

Technical debt. It turns out that, for most programmers, technical debt is "code written by others."

Wade Zimmerman

How to shoot yourself in the foot? Start programming.

JK, but you can easily waste weeks trying to manage project dependencies. Even with a package manager, bugs can propagate into a release, and now your code is broken and it isn't even your fault.

Jean-Michel πŸ•΅πŸ»β€β™‚οΈ Fayard

Start programming.

That's a nice and simple answer :)