I’m not in academia, and I’m not here to tell you how Artificial Intelligence can be better used in learning institutions; I don’t think I have the standing to do so. What I’m here to tell you, though, is that in conversations about AI, using it for personal learning is one of the most underrated facets of that conversation. Yeah, I get why it is underrated: we love to talk about trending things, and this facet is just not among the trending AI-related things to talk about.
A recent, somewhat comprehensive study by San Diego-based National University showed that just 19 percent of AI users use AI to summarize long or complex documents, which is the closest thing you’ll find in the research to using AI for learning. Mostly, people use AI to "get productive", which can mean a lot of things in that context. But we need to start discussing, and littering the internet with discussions about, using AI for personal learning, not just because it should be an essential facet of that conversation but also because using AI for learning is fun and good as heck. I'll start with the littering.
I have been more amazed by the possibilities of AI when using it for learning and troubleshooting than when using it for work. In fact, using AI for work makes me wonder about its long-term success. But using it to clarify concepts I have a vague understanding of, or as a learning aid for tools or programming languages I am trying to get the hang of? That moves me. What makes AI a very effective tool for learning is the ability to endlessly tune the specificity of its answers. You could ask AI any form of question, whether obscure, normal, or seemingly dumb, and you would get a precise answer. Not only that, you could probe the answer further, getting as specific and comprehensive as possible, until your curiosity is satisfied.
A very recent example of how AI has helped me learn and make sense of things in the very dynamic world of technology is this:
The background: I've been learning Bash scripting with a blog tutorial published on Linux Config. The tutorial is beginner-friendly, but in my experience it often glosses over obscure concepts, which is expected because it is designed to be practical. Also, if you are like me, whose curiosity always leads to accidentally discovering related but intermediate concepts while still at a beginner level (basically jumping the gun), you'd know that self-inflicted complexity is the typical price of such an act.
Well, I had figured out a relatively complex way to do something, and when the tutorial introduced me to a simpler and better way to do the same thing, I got confused, which led to the images.
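The post doesn't show the original screenshots, but to give a flavor of the pattern being described, here is a hypothetical Bash example (not the author's actual case): a beginner invents a roundabout solution, then the tutorial reveals a simpler built-in one.

```shell
#!/usr/bin/env bash
# Hypothetical illustration of "complex way vs. simpler way" in Bash.

a=5
b=3

# The "relatively complex" way a beginner might stumble into:
# shelling out to the external expr utility to add two numbers.
sum_external=$(expr "$a" + "$b")

# The "simple and better" way a tutorial would show:
# Bash's built-in arithmetic expansion, no external process needed.
sum_builtin=$(( a + b ))

echo "external: $sum_external, builtin: $sum_builtin"   # both are 8
```

Seeing that both produce the same result, and then asking an AI assistant *why* one is preferred, is exactly the kind of follow-up probing the post describes.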
I didn't do anything special that I couldn't have done with a Google search, but I definitely wouldn't have gotten responses this precise and comprehensive as quickly using Google. To get the level of satisfaction I got using ChatGPT, I would have had to skim multiple web pages and rewrite my queries in various ways.
Having a dedicated learning aid you can converse with anytime, one that gives immediate responses that answer your question or at least open a doorway to an answer, is incredible, especially when you consider the alternative (Seriously). But that should be about it, because using AI as a primary means of fundamental learning unlocks a frustration that everyone who uses AI can relate to. Actually, this is a typical problem with every possible thing you could do with AI. AI is best used as a secondary source: in some cases you are the primary source, and in other cases a specialized source acts as the primary one. This avoids the inaccuracies and misinformation that AI tools are prone to, and keeps the tool from getting overwhelmed (yes, they get overwhelmed) and generating incoherent gibberish.
For instance, if I want ChatGPT to create a summary of everything I have written so far, I would have to first feed it the writing, tell it what I want the summary to include beyond the intrinsic elements of any summary, then tell it how brief or not-so-brief I want the summary to be; in this case, I'm the primary source. Another instance: recently, while attempting a Windows USB installation for the first time using a Reddit tech-support wiki document, I faced a problem the wiki didn't account for, and I had to troubleshoot using ChatGPT.
After quite a long conversation with ChatGPT and multiple Google searches, I figured out the solution was a USB-C compatibility issue: I just had to flip the flash drive and use the USB 2.0 edge. In this case, I had multiple sources to combine with ChatGPT, including a primary source (the r/tech wiki), and that eventually led to a moment of realization.
The logic of complementing AI to bring out the best in it, instead of burdening it, should be applied to everything we do with AI. Hence, AI should be used as a learning optimization tool, not a primary source of learning.
One of the biggest problems with AI usage today is that people still don't know how to use it safely and effectively, which is mostly a growing pain that comes with novelty. We need to start talking about incorporating AI literacy the same way we talk about incorporating financial literacy. People need to learn how to use AI to enhance learning, fix basic technological issues, and effectively perform simple tasks. While that is or isn't happening, consider incorporating AI into your personal learning experience, not just because it makes your life easier but also because it is fun as heck!