The advent of LLM tools like ChatGPT sparks many questions. Will these tools improve our lives, take our jobs, or just change them? Are we close to AGI? ...
A more philosophical question is: what do they tell us about ourselves? They force us to look at our jobs and hobbies differently, maybe even at ourselves and our identities.
These tools show us that many of the tasks we thought required intelligence and highly skilled workers maybe just don't.
In many cases it is possible to substitute intelligence with vast amounts of data. We even call the category of jobs most at risk of replacement knowledge work, not intelligence work.
We all know this deep down, and most of us have taken advantage of it in school. Either you pay attention during lectures and try to actually understand the subject, or you cram everything into your short-term memory for the test. Which is better in the long run, at least for us humans, is clear. But machines have the advantage of unlimited memory and near-instantaneous, perfect recall.
It's kinda sobering to realize that there doesn't seem to be that much to what we thought was highly qualified, uniquely human work.
So what is left for us? Have we created all the data that is needed to replace us? Have we already done everything that is to be done? Is everything yet to come just a recombination of what already was, or will LLMs stagnate without our continuous input? Is empathy maybe the only uniquely human (or animal) trait - from homo sapiens to homo empathia? And what exactly is intelligence anyway?