As software developers, learning is a vital part of our role. We work in an ever-changing field where new technologies and ideas are continually being introduced. The high level of complexity that we deal with means there is always a better way of tackling a given problem. We wear so many hats and need so many different skills that it can be challenging just keeping up to date, let alone expanding our knowledge.
Despite the dynamic nature of our chosen field, I often see developers who don't feel they are learning as much as they want to. They don't feel they have sufficient opportunities to keep their skills relevant and up to date, or they have exhausted what is available. They grow frustrated and become passive, delaying their growth while waiting for the right set of circumstances to present itself.
It seems that many developers have accumulated misconceptions about learning in the working world. Perhaps we failed to transition from the methods that were effective while we were learning at university. We mistakenly think that learning has to occur in our own time, and wait for the time, energy, and motivation to start a side project. We think that our manager should be providing us with training resources and a learning plan. We believe that the only things worth learning are the newest and shiniest technologies, and so wait for our company to start adopting them before we resume learning.
In saying this, I don't want to tell you how you should or shouldn't learn. Nonetheless, it is crucial to recognize when you want to learn but are feeling stuck, looking in the wrong places, or stressing and burning yourself out spinning your wheels going nowhere.
Thinking that learning has to happen outside of work
We have become surrounded by this mythos of coding in our personal time. Blog posts present aspirational stories of what the authors have been able to put together. Tales of people who've built their own company from a side project. Open source projects that we use daily that are built and maintained in someone's own time. Our co-workers show off their side projects and participate in programming competitions. Online threads perpetuate the idea that 'the best' developers also code at home. The recent trend of listicles titled "Five side-project ideas that will turn you into a great developer!" We treat writing software as a hobby as well as a job. Finally, add to the mix a tendency towards individualism and self-reliance, and the result is the idea that we should be coding, learning, and working all hours of the day.
Now, don't get me wrong; personal projects can be very satisfying. You get to build things the way that you want to. You can tinker and hack away at interesting problems. You get to build something that makes your life easier or gives back to the community. Perhaps you're working in a terrible dead-end job, and writing code at home is a way to remind yourself that you still enjoy creating software. And when it is finally time to change jobs, you need to deal with fussy recruiters who expect you to come pre-loaded with experience in a long list of technologies you have never used.
But what we miss is that the people doing all of these things in their own time are usually doing it because they enjoy it. That's not to say the rest of us don't like developing software, but maybe eight hours a day is enough, or perhaps what we enjoy most is the teamwork and collaboration. We each have hobbies and interests outside of software and code.
The trap we risk falling into is attempting to adopt two opposing viewpoints: that learning and growth can only occur in our own time, and that coding is the last thing we want to do in our free time. This contradiction paralyzes us. Either we give up learning, or it leads us in the wrong direction: rather than looking for opportunities for learning and growth that can happen at work, we try to figure out how to convince ourselves to start a side project. I have fallen into this situation more times than I can count and have a directory littered with side projects that I started because I thought I should, but without having the time or motivation to make any real progress.
Finally, I suspect we also overestimate the value of learning from home. At work, we are working on a decently sized project; there are many good examples available to learn from, and its size provides us with lots of challenging design and structural problems to tackle. We have co-workers with knowledge and experience that we can leverage, and who can review our code. What we are learning can be immediately applied. We can spend more time getting things correct, as there is usually a lower tolerance for bugs and technical debt. Compare this to personal projects, which can take a long time before they get big enough that we are learning something besides rote memorization of method names and project structure, and where, when we need help, we can spend hours on Stack Overflow trying to get unstuck.
Expecting our managers to organize our learning
The pendulum can swing too far in the opposite direction, where we decide that not only do we not want to learn at home, but we don't want to be responsible for our learning at all. We adopt the view that education should be the responsibility of our employer and come in the form of training courses, conferences, Pluralsight subscriptions, or allocated "learning time" on the job. We expect that our managers are responsible for organizing training, initiatives, and learning plans, or for introducing new technologies into the workplace for the sake of our learning. When this inevitably doesn't happen, we shift into holding mode, waiting for our manager to see the value of education, or to open up the purse strings.
Now, I don't want to suggest letting managers off the hook entirely here, as they do play an essential role. But rather than directly organizing learning plans and training resources, they can facilitate learning by influencing the culture and values of the development teams. They can create a 'learning culture.' An environment where finding the best solution is encouraged, even when it requires some additional time upfront to learn or research alternatives. An environment that encourages employees to take the initiative to share knowledge. Where 10% time is seen as valuable, not just for paying down technical debt, but because of the learning opportunities it provides. Where teams are trusted to acquire the knowledge they need in the ways that they see fit.
Regardless of the availability of learning resources or the presence of a learning culture, it is essential to recognize whether you have put your continuing growth on hold. Have you elected to remain stationary while you wait for others to take action for you, or for the right situation to present itself?
Placing too much emphasis on learning frameworks and languages
The third mistake I see developers make when thinking about learning is to place all of our focus on tools and technologies. By focusing on the new and shiny, we devalue the technology stack we are already using. Rather than fixing our current problems, we fixate on the idea of rewriting and upgrading to the very newest, holding off learning until this happens. With so many tools we could learn, we underestimate what a narrow part of our discipline learning new technologies covers. And when we pick out technologies we want to use in the workplace ahead of time, we risk making bad decisions and introducing inconsistency that makes life at work more difficult.
At least initially, when learning a new language and framework, we stay at the surface level, memorizing the expected structures and the names of common methods and APIs, copy-pasting pieces together out of tutorials and Stack Overflow comments. It takes a while before we get deep enough to start thinking about structure or design or tackling the tricky problems. As with personal projects, it takes time to reach the threshold where learning a new tool becomes worthwhile. As we flit about, moving from framework to framework, we risk staying at this surface level, feeling like we are being productive, but never really getting deep enough to take anything away from what we are doing. We get stuck, continually learning new things, but never really getting anywhere.
When we fall into the trap of focusing on the new and shiny, we end up warping our understanding of software development. I have worked with many developers who become frustrated when they don't get to work on the hottest new technologies. They think that working on or mastering even slightly old technologies is a waste of time that will cause them to get left behind and hurt their careers. This attitude can also lead them down the path of thinking the grass is always greener: that the current solution is unfixable or too hard to work with, and that we need to tear it all down and start again. We think that problems are solved by switching to the newest technology rather than by polishing what we already have. And when developers only want to learn new technologies, and the only options at the workplace are slightly old, they give up on learning instead.
When your team does finally start a rewrite or a new greenfield project, it is worth considering just how useful it is to be "pre-prepared," having learned some frameworks and tools ahead of time. While there is a lot of value in having someone with some experience who can help everyone else get up to speed, this foreknowledge can also risk anchoring the decision-making. Frameworks and technologies aren't just a matter of taste. They come loaded with enough differences and considerations that the choice is too crucial to be swayed by what a single person happened to learn in advance. It is vital to choose the most appropriate tool for the problem and the entire team.
The reality is that no technology stays current for long, and something newer and better will always be around the corner. In any case, as you grow in experience, learning new frameworks and languages becomes trivial. There is little need to prepare in advance once you get to the point where you can become productive with a new tool or technology in a matter of hours or days. Further, you start to recognize that the most transferable pieces are the hard bits: the things you only learn once you have spent the time mastering your current tools.
Once we recognize these traps, we can realize that learning doesn't require our personal time, our manager to be our teacher, or for us to keep up to date with the latest and greatest technology stacks. Instead, we can take advantage of our co-workers' knowledge, improve our skills in the technologies we are already using, work on our soft skills, or find areas of our current application that need some love and attention. Looking in the right places, we can always find something new to learn or an existing skill to practice and recognize the opportunities that already surround us.