
Michael Levan


Will AI Take Kubernetes And Developer Jobs?

At what point do we all stop thinking and just let AI do the thinking for us?

Well, no time soon (at least not until General and Super Intelligence are no longer just theory).

The truth is, LLMs/GenAI/Prompt-Based AI is really no different from Googling something, at least from an information perspective. You just get the answer faster… and that doesn’t mean it’s any less wrong than what you’d read on Google.

In this blog post, we’ll break down where AI currently stands and what its potential could be in the future.

Where AI Currently Exists

It’s pretty hard to find a place in tech where AI doesn’t exist. Whether you’re talking about chatbots or automated workflows, the power of LLMs seems to be everywhere.

Before diving deeper, you should know at a high level what LLMs are. In short, they’re no different from the data models we’ve always had, they’re just bigger (that’s why multi-billion-dollar AI factories are being built). Because they’re bigger, they need more power. LLMs are still trained on data that already exists. They can, however, “make up an answer” to give what they feel is the most educated guess (like humans do).

From a programming/engineering perspective, the biggest place you may see GenAI/LLMs is with assistants like GitHub Copilot.

Will that actually help us though?

The Idea Of AI In Programming

There was a time when programmers had to perform their own garbage collection and compiling. Now, compiling code is as simple as a single command, and most modern languages handle garbage collection for you automatically.

It’s no different with AI. GenAI/LLMs/Chatbots are really just helping us with the low-hanging fruit.

Can it write the code for you so you can copy/paste? Sure, just like you can Google and copy/paste code from StackOverflow.

Does that mean you should? No, absolutely not. Remember - GenAI/LLMs are not anywhere near the General/Super Intelligence realm. They’re not thinking for themselves. GenAI/LLMs are pulling from data models that already exist, and those data models were created by, you guessed it, humans… and as we all know, humans can be wrong.

Between the security concerns, the need for humans to check the results, chatbots sometimes producing wrong answers, and quality assurance, humans are still needed for programming. We’re just “going up a level,” like we did when we no longer had to worry about garbage collection. This is kind of a good thing when you think about it.

For example - do you really have to write your umpteenth method/function that does the same thing from a structure perspective? Or can you just have an AI assistant take care of the “template” for you while you focus on the logic? Let’s not forget the power of pair programming. If you don’t have anyone around you that you can partner up with, you can use an AI assistant (just remember to still leave your house every once in a while).
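As a hypothetical illustration, here’s the kind of boilerplate an assistant can scaffold for you. Only the logic in the middle really changes between functions like this; the function name and URL below are made up for the example:

```javascript
// Hypothetical boilerplate an assistant might scaffold: validate input,
// make the request, check the response, return the data.
// The endpoint and function name are placeholders for this example.
async function getUserById(id) {
  if (!id) {
    throw new Error("id is required");
  }

  const response = await fetch(`https://api.example.com/users/${id}`);
  if (!response.ok) {
    throw new Error(`Request failed with status ${response.status}`);
  }

  return response.json();
}
```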

In the next two sections, you’ll see two examples of where an AI assistant may be able to help.

Where AI Currently Helps With Kubernetes

First, there’s the realm of Kubernetes, and to be honest, AI assistants help a ton here. The truth is that not a lot of people wake up in the morning and think to themselves, “oh boy, I can’t wait to write some YAML today!” or “writing this same manifest for the millionth time is going to be so much fun!”.

Instead, you can have an AI assistant put the template together for you, and you can revise it as you go.

Below is an example from GitHub Copilot (I ran this in VS Code).

[Screenshot: GitHub Copilot generating a Kubernetes manifest in VS Code]

Here’s another example from Google Gemini. I asked the same question and got the same result.

[Screenshot: Google Gemini generating the same Kubernetes manifest]

A few things to consider here. For example, notice how it automatically put in the latest container image tag. Do you know that you want the latest image? How about the name of the app and the port?
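As a rough sketch of what that template tends to look like (representative, not the exact output from the screenshots above; the name, image, and port are placeholders):

```yaml
# Representative assistant-generated Deployment template.
# The name, the :latest image tag, and the port are guesses you will want to verify.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-app
spec:
  replicas: 3
  selector:
    matchLabels:
      app: my-app
  template:
    metadata:
      labels:
        app: my-app
    spec:
      containers:
        - name: my-app
          image: nginx:latest   # do you actually want "latest"?
          ports:
            - containerPort: 80 # is this really your app's port?
```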

You, as the human, will need to modify the template to work as expected.

Where AI Currently Helps With Programming

Here’s a more programmatic example.

Background: I’m currently writing a new product feature to incorporate support for HashiCorp Vault. The problem is that the two most commonly used JS libraries appear to no longer be supported. Releases haven’t gone out in a very long time and GitHub issues aren’t being answered.

So, what is one to do? Well, I asked a few colleagues if they knew of any other libraries, but I also asked the AI Assistant (GitHub Copilot).

First, I pretty much said that it looks like the libraries are no longer supported.

[Screenshot: the prompt to GitHub Copilot about the unsupported libraries]

It responded with another unsupported library and a link to a library that didn’t exist (yeah…).

[Screenshot: Copilot suggesting another unsupported library and a nonexistent one]

After that, I asked it to confirm that the node-vault library was no longer supported. I was then told to fend for myself (go figure it out).

[Screenshot: Copilot's response about node-vault]

Last but certainly not least (and this is the cool part), it told me that I should just interact with Vault directly using the axios library.

[Screenshot: Copilot suggesting direct Vault interaction via axios]

In a world where third-party libraries losing support and breaking your environment is such an unfortunate reality, getting this suggestion from GitHub Copilot made sense.
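To make that concrete, here is a minimal sketch of what talking to Vault directly with axios could look like. It assumes a KV v2 secrets engine mounted at `secret/`, with the Vault address and token in the `VAULT_ADDR` and `VAULT_TOKEN` environment variables; the secret path is made up for the example:

```javascript
// Minimal sketch: read a secret straight from Vault's HTTP API with axios.
// Assumes a KV v2 engine mounted at "secret/" and VAULT_ADDR / VAULT_TOKEN env vars.
const axios = require("axios");

async function readSecret(path) {
  const response = await axios.get(
    `${process.env.VAULT_ADDR}/v1/secret/data/${path}`,
    { headers: { "X-Vault-Token": process.env.VAULT_TOKEN } }
  );

  // KV v2 nests the key/value pairs under data.data
  return response.data.data.data;
}

readSecret("my-app/config")
  .then((secret) => console.log(secret))
  .catch((err) => console.error("Vault request failed:", err.message));
```

No library-specific wrapper to depend on, just HTTP calls against an API that HashiCorp maintains.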

Closing Thoughts

The world changes rapidly. The world of tech changes even more rapidly. The truth is, we can’t stay stagnant; it’s impossible. We have to “grow with the times,” and that can potentially be exciting.
