About a year ago, right after I quit my big tech role to manage my burnout, I had what seemed like a brilliant idea at the time. In the months leading up to that moment, I had grown more interested in AI, so it made sense to go through several online specializations. A couple of months and a few dozen courses later, I received my last certificate. I don’t regret it, but I’d be hard-pressed to claim it has helped me all that much. Instead, it was an almost chaotic mix of blogs, books, videos and working on an actual product that made a real difference.
No surprise, then, that I wasn’t impressed when, a few days ago, I came across a post listing nearly a thousand technical certifications. On the surface, that might even sound like a conservative figure, since the post highlighted only prominent institutions. Behind every major source lies an abundance of lesser-known alternatives, and once you also factor in casual content creators, the numbers add up quickly. Yet each day is still only 24 hours long.
Which raises the question: why would you need to revisit the exact same content so many times, usually presented in almost exactly the same ways? Wasn’t the original treatment supposed to cover everything you needed to be productive? Why introduce incremental changes that give you a false hope of novelty, if the primary resources already meet the audience’s needs? Perhaps they don’t.
If you’ve been in tech-related spaces for more than a minute, you should be well acquainted with information overload. It’s not a novel problem by any means, yet one that has grown in recent years, in part due to misplaced incentives. So you doubt or blame yourself, wondering whether you’re smart enough or have put in the right amount of hours. You open yet another tutorial, get halfway through it, only to face familiar issues: either it stops right after covering the very basics, or it simply repeats some more advanced ideas along with a quick copy-paste of some code or diagrams. And there you are, yet again at the end, encouraged to like and subscribe for more of the same. Why settle for dozens of redundant Medium articles when a single Reddit or X thread explains everything? There’s no need for things to be this complicated.
You could play devil’s advocate and argue that curiosity is a virtue and that it drives personal growth. Fair enough. Except, as you grow older, exploring these topics competes with your other priorities for your time. The straight-out-of-college luxury of spending tens of hours each week filtering through random materials dwindles a couple of decades into your career. Yet in a field where significant shifts occur every five to ten years and the fundamentals will only get you so far, the expectations stay constant or even increase. That cost of lost time has to fall on someone: either you or your employer.
What causes this disparity?
After all, most people are perfectly capable of learning basic concepts, and most things can be broken down into relatively simple building blocks with reasonably low effort.
While your experience may vary significantly, some things tend to come up over and over again. For example, the strong bias against generalist knowledge tends to favor specialization. Hence the frequently cited, yet often misquoted, saying ending in “...master of none”. That isn’t inherently problematic, especially when depth and complexity matter more than broad general knowledge.
Nevertheless, some of the same mechanics and motivations can lead to terminology inflation, particularly when owning a concept boosts one’s professional image and, ultimately, their bottom line. As competition intensifies and spaces become more crowded, this starts to clash with the choice of more banal, yet intuitive, names. It’s no surprise when you go through entire books of “lore” only to realize there’s little new insight. Another couple of days down the drain.
Background knowledge is certainly useful. Multiple terms for similar concepts, not so much: they only add more noise and confusion, resulting in a cacophony of jargon. Conversely, an excessive reliance on tradition, and assuming that what’s familiar must also be intuitive, is equally detrimental. An established practice followed for generations is not inherent proof of its validity, just as tenure and years of experience do not automatically guarantee anyone’s performance.
Which naturally brings us to another major aspect affecting how accessible a piece of information is to the average consumer. There’s frequently insufficient interest in explaining the “why” behind choosing a particular technical solution or doing things in a certain way. Moreover, it’s almost an open secret that simply inquiring about the “backstory” can get you in hot water, since it’s perceived as challenging authority and a sign of a confrontational personality. Similarly, it’s often assumed that the “how” can serve as a substitute. That’s seldom the case. The former acts like a retrospective of your journey, while the latter is just the destination you settled for.
From there to flame wars around the effectiveness of code comments is just a small step. There’s little value in adding comments that simply repeat what the code is already doing. However, when they capture the reasoning or the underlying business logic, the situation changes. You’re providing essential context that can’t be easily inferred otherwise: the thing that everyone conveniently forgets to write down and that slowly gets lost as people leave the team. “Why don’t we delete that piece of code that hasn’t been used anywhere else in years? No idea, but let’s play it safe and leave it there, just in case.”
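To make the contrast concrete, here is a small, purely illustrative Python sketch. The function, the gold-tier rule and the contract reference are all hypothetical, invented only to show the difference between a comment that restates the code and one that records the reasoning behind it.

```python
def apply_discount(price: float, customer_tier: str) -> float:
    # Redundant comment: "multiply the price by 0.9 for gold customers"
    # would only repeat what the next two lines already say.
    #
    # Useful comment: gold-tier customers get 10% off because of a
    # (hypothetical) 2019 enterprise contract; the figure is hard-coded
    # here rather than pulled from config so that any change to it stays
    # visible in code review.
    if customer_tier == "gold":
        return price * 0.9
    return price
```

The code itself is trivial either way; the second style is what keeps the “why” from walking out the door with the next departing teammate.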
“That’s just how we do it”, “deal with it”, “take it or leave it, plenty of others waiting in line for this opportunity” can certainly end a conversation but probably not for the right reasons. Yet, being able to anchor a piece of information in sound logic yields more enduring benefits.
Now’s the point where you could easily label this as just another snowflake rant or a weak, stream-of-consciousness puff piece built around anecdotal evidence, and scroll down to the next item in your feed. After all, if you really want something, you have to make some sacrifices, right?
This reminds me of an internal DEI training from a few years ago. At the time, I was still naive enough to think there were sincere intentions beyond merely protecting against legal risks, but that’s a story for another time. What particularly stuck with me were the findings from several studies about the business impact of neglecting a diverse audience. Once you strip away the veneer of false concern about equity, the studies made it abundantly clear how exclusionary practices often fail to tap into lucrative sources of revenue.
What's the impact?
Education and upskilling are often treated as secondary aspects of a business. That doesn’t lessen the negative effect that counterproductive, entrenched approaches have on outcomes. Even a cursory glance reveals the ways in which they impact individuals, businesses and researchers alike. Let’s take a closer look, shall we?
From a personal perspective, it’s crucial to recognize that different people learn in different ways. What feels and sounds intuitive to someone might be completely obscure to someone else, even when engaging on a level playing field. Once you factor in learning disabilities, things become even more complicated.
Take, for instance, the often-cited observation that individuals with ADHD tend to favor non-linear information processing, while those with high-functioning autism need explicit context. Meanwhile, individuals with dyslexia may struggle with consuming text but handle audio and video much better. As with fast fashion, one-size-fits-all might be cost-conscious, but it’s seldom sustainable or beneficial.
No matter what, when you look at this through the lens of practical results, there’s a strong case against unchecked complexity and the compulsive obsession with completeness. Just because these things serve a genuine role for research and archival purposes doesn’t mean they translate equally well into a real world focused mostly on achieving fast results.
Having worked in tech for long enough, and provided your choices haven’t been fully subsidized by free ZIRP-era money, you should know fairly well that any new piece you introduce carries added weight and is only justified if it serves a clear purpose. Otherwise, why commit to anything bound to increase complexity, slow things down and potentially generate tech debt?
You could argue the same principle applies to creating any kind of documentation, formal or not. Unless it makes things simpler, more memorable or contextually relevant, it’s just another energy drain, each new piece chipping away at someone’s willpower and competing for what is, arguably, a limited conscious context. Good luck managing this without a solid foundation of meaningful abstractions to anchor what you’re learning.
As with anything in life, as much as we fail to acknowledge it or hate to admit it, we are opportunistic creatures. At any given time, we’re weighing how profitable or feasible an endeavor is. When your intuition tells you there’s no point in trying, you’re less likely to start. Even if you pass this initial challenge and take a chance, you may still quit halfway through if one too many hurdles pile up along the way. You conclude you might just not be suited for it and forfeit, once again, to imposter syndrome.
Still, I can only assume the worst of all is when you manage to trick yourself into a false sense of understanding and self-confidence by equating memorization with comprehension. This can lead to mimicking proficiency without truly being able to generalize. Similar to basic AI systems.
Things steer well into toxic, counterproductive territory once you notice the otherwise obvious imbalances. We often blame knowledge consumers, implying a lack of initiative, laziness, superficiality or even insufficient cognitive ability. A dissonant stance coming from the same science that has publicized how differences in neurotype influence how we process, model and recall information.
Whether that’s an unhappy artefact of academic elitism, an overly pedantic attitude or a certain level of scientific machismo is still up for debate. One thing is clear though: bad traditions die hard, especially in the absence of any awareness of, or interest in, even acknowledging them. Worse yet when it comes as a result of “I went through this and it made me stronger, so you should too”.
But wait, there’s obviously more, given how these things permeate well beyond personal spaces. Since tech relies so strongly on constant learning, it’s no surprise that inefficiencies in one space bleed into other interconnected spaces.
Let’s consider the industry’s infatuation with platforms like Leetcode for recruiting. At first glance, relying on theory and artificial problems masquerading as practical challenges can seem like a quick and painless solution. The results are measurable and the approach scales fairly well across a wide range of potential candidates. Thus, it’s frequently hailed as the most effective method available.
After all, what kind of professional are you without a good grasp of the fundamentals: frog jump, house robber and best time to buy ice cream?
Upon closer inspection, it starts to look more like a cop-out. You realize this kind of assessment offers no guarantees about practical knowledge or a candidate’s ability to perform well in a fast-changing environment. It strongly favors new grads, or anyone willing to devote copious amounts of time to gaming the system instead of widening their hands-on skills.
It often relies on little more than good memory and limited pattern recognition that fades unless exercised. Shouldn’t core knowledge be more enduring? And is a party trick all that valuable if an AI system can replicate it for just $9.99, in a couple of minutes, with just a few retries, give or take?
Still uncertain about this issue? It ultimately places an extra burden on your existing roadmap. Then it slowly shifts the culture. Within a few generations, teams learn to rely heavily on a limited set of recipes for success, which dissuades individuals from innovating or acting independently.
Are you sure your team can afford to implement a big tech architecture when you lack either the funding or the business case to warrant such a “robust” solution? Once that’s in place, you finally get to rest and vest. When things go south, you just pass the hot potato somewhere else. Change companies and let others deal with it. Rinse and repeat.
This results in the wrong set of incentives taking hold of recruitment, followed closely by a thriving, parasitic interview-prep industry. We’ve seen this before, one too many times. For example, in QA teams driven to prioritize reporting as many bugs as possible rather than finding real-world issues. All as a result of misaligned success metrics.
For the sake of the argument, let’s assume you’re thriving despite relying on these aspects. More power to you. Think the effects stop there? Think again.
The same incentives that encourage sloppy or overly dogmatic processes, disconnected from the needs of their intended audience, also impact a team’s daily operations. For example, once a bias takes hold against proper technical writing, or toward doing it just for the sake of it, things become dangerous. This leads to a gradual loss of knowledge and syncopated onboarding that invisibly reverberate into other activities, wasting precious resources.
It’s not uncommon, and I’ve certainly been through it: documentation that looked like a goldmine on first inspection, yet made it nearly impossible to find solutions to specific problems. You’d be forced to simply go through tens of pages and try to distil them into a working example. This adds up significantly when multiplied by thousands of developers over the course of a whole year. Was it worth the added cost just so a select few could feel good about their overly academic treatment? I’d argue probably not.
As a direct consequence, it’s also only logical to expect an over-reliance on a well-intended, albeit overly prescriptive, set of “clean” patterns. Nothing wrong with that, unless they become the ever-present hammer of choice used to turn every problem into a nail, contributing to bloated codebases and processes.
This highlights the need for introspection around why we do things in the first place. Once a critical mass of biases favoring performative completeness over pragmatic selectiveness takes hold, there’s no easy way back. A more complex solution is no guarantee you’ll escape tech debt. Sometimes it’s quite the opposite, especially if your system is changing fast.
Similarly, an overly academic internal culture can foster a strong preference for ceremony and the documentation equivalent of play-pretend and make-work. Which brings us back to one of the original ideas: just because something is comprehensive doesn’t mean it’s effective.
The hard truth is that clients seldom care about what happens behind the scenes. They have their own problems and goals, and usually only care whether your product works well enough to make their lives a little, or a lot, easier. It makes little difference if every invisible thing is just right or if your product carries a full collection of peer-review stamps. In fact, it’s unsurprising when a culture with these characteristics also pushes back against customer feedback. After all, “what do the customers know, we’re the professionals”, right? Sometimes that’s justified, and you certainly shouldn’t bend to every request. Other times, it’s just a dangerous, lazy excuse.
Conversely, an excessive focus on scientific rigor, when left unchecked, can negatively impact research just as much. Obviously there’s nuance. However, a good example of things going south is when you start to equate a paper’s scientific value with the density of its formulas and expressions.
As AI fills more of these gaps, intuition and connecting diverse ideas become equally important. Sharing a common language with your peers certainly helps. However, an overly dogmatic preference for jargon can end up excluding talent that might have otherwise contributed equally well. After all, the history of science is filled with stories of providential encounters and flashes of inspiration. Are they all made up?
How do we address these issues?
While there’s no definitive solution, attempting to resolve these issues is still worthwhile. Addressing the root cause is a fairly easy optimization and a good starting point. You’re essentially taking on a finite amount of initial work, which, for all intents and purposes, can be outsourced or crowdsourced. This saves your audience from repeating that same work, simply by making the material good enough from the get-go.
In other words, if your documentation’s accessibility depends on more than a couple of courses or tutorials, maybe it’s time to reconsider either the audience or the presentation. Closely related is the over-reliance on “the curious reader” to fill in the blanks. Providing the necessary context and references, especially for content consumed online, can remove a lot of friction.
Fewer obstacles now remain to switching to a more interactive, real-time approach instead of the cold monologues we’ve grown used to so far.
While information-heavy manuals and references can still act as a fallback source of truth, maybe we should favor just-in-time documentation, customized to the specific scenarios a user encounters daily. That’s where AI-powered solutions combining LLMs, RAG and knowledge graphs enter the picture, lowering hallucinations to a much more tolerable level while quickly sifting away contextual noise. Things become even more engaging once an AI system is able to understand and adapt to your own learning style.
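For anyone who hasn’t wired one of these up, here is a minimal, illustrative sketch of the retrieval side of such a setup. It assumes nothing about a specific vendor: `embed_text` and `ask_llm` are hypothetical placeholders for whatever embedding model and LLM endpoint your stack actually provides, and the whole thing is a sketch of the idea rather than a production implementation.

```python
import numpy as np

def embed_text(text: str) -> np.ndarray:
    """Hypothetical placeholder: return an embedding vector for the text."""
    raise NotImplementedError

def ask_llm(prompt: str) -> str:
    """Hypothetical placeholder: send a prompt to an LLM, return its answer."""
    raise NotImplementedError

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    # Cosine similarity, the usual way to rank snippets against a question.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def answer_from_docs(question: str, snippets: list[str], top_k: int = 3) -> str:
    # Rank every documentation snippet by similarity to the question.
    q_vec = embed_text(question)
    ranked = sorted(snippets, key=lambda s: cosine(q_vec, embed_text(s)), reverse=True)

    # Ground the model in the few most relevant snippets instead of the whole
    # manual, which is what keeps answers focused and hallucinations in check.
    context = "\n\n".join(ranked[:top_k])
    prompt = (
        "Answer the question using only the context below. "
        "If the context is not enough, say so.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    return ask_llm(prompt)
```

The same similarity ranking also lets you rephrase a vague question and immediately see different snippets surface, which ties into the exploratory benefit described next.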
However, there are two more benefits that often get lost in the hype. For one, an automated system provides a non-judgmental, unbiased set of interactions. There are no wrong questions, or if there are, you won’t get penalized for asking them. A safe space without raised eyebrows or rolled eyes. Then there’s the added bonus of similarity search, which lets you correct and refine your original requests when you’re not quite sure what you’re looking for. Once again, little to no judgement attached, so there’s more incentive to explore freely, course-correct and make progress fast while staying focused.
As AI becomes more prevalent, providing a wider range of formats catering to different audiences becomes increasingly straightforward. Leaning into multi-modal learning also helps tap into different senses and ways of modeling information. By casting a wider net in terms of presentation, you’re more likely to hook into an effective personal mechanism that improves the chances of success. Don’t just blindly assume things about your audience.
On the surface, this might look like just another way to package the same thing, but there’s nuance here too. A new presentation often ends up subtly restructuring the narrative, sometimes relying on different triggers and stimuli, and eventually forcing your brain to anchor the information from different perspectives. No wonder immersive experiences have been touted as the next best thing in areas like language learning.
Beyond the tools themselves, let’s consider other, less tangible changes that can enhance the learning experience. First, keep in mind that, at the end of the day, even in the most technical context, you’re creating for people, in areas that either impact or involve other people. That means you should expect some messiness and certainly a constant degree of change.
As an author, there’s little value in being the only one able to decipher the knowledge you’re sharing. Much like that US compiler course infamous for having a surprisingly high failure rate among some of the brightest students in the world. Is everyone really that unprepared, or is the course simply not doing a great job of making the information accessible? That’s why it pays to stop and ask yourself, “Does anyone actually get this?”.
Circling back to one of my earlier points, if you get it right the first time, you won’t force people to jump through the same hoops over and over again. This goes hand in hand with understanding the context in which you operate and adapting your delivery to it. If you’re acting in a practical, commercial space, then prioritizing full scientific rigor over business impact is probably not a very good approach.
At the end of the day, the main takeaway is that learning is a deeply personal journey. While it often starts with paying attention and taking notes, it can easily turn into a pointless time sink and a hoarding exercise. You’re probably far better off simply experimenting and combining different tools until you figure out what works best for you. And if you end up on the other side of the table, you might find more fulfilment in facilitating access to knowledge than in the fleeting high of shouting “I am the documentation”.