Ingo Steinke, web developer

Artificial "Intelligence" and Controversial Ideas about Future Technology

This article is intended for developers with some prior knowledge of web technology. A less technical article with a greater focus on computer-generated art, cyberpunk, and augmented reality can be found in my weblog (open-mind-culture.org). I published this as a DEV post just a few days before the hype about ChatGPT, which I will discuss in an updated last chapter. Spoiler: I don't fear that it will make my job obsolete any time soon.

The cover image features original photography side by side with seemingly similar imagery created by a computer. Note that the artificial versions of myself are wearing a black hat that does not exist in reality, but only as a detail of the street art mural I had been standing next to.

Discussing the Future of the Web and Technology

In the ongoing discussion about new technologies, some of which are often subsumed under the umbrella term "Web3", I try to keep an open mind for legitimate and helpful use cases, as I was fascinated by early metaverse and cyberpunk ideas in science fiction literature. I also loved the idea of an open, decentralized, interconnected global network, which is what the internet is, or rather what it was before it became more commercialized and centralized. Its "western" zone is now dominated by an oligopoly of companies mostly based in California, USA. This might be better than the censorship and state control in some other parts of the world, but it is still far from the original concept.

Intelligence vs. Machine Learning

Another popular misconception, apart from "Web3", is "AI" or "artificial intelligence", a misnomer for applications making use of machine learning. The latter term makes it more evident at first sight what is actually happening: much like a search engine or a diligent student, "intelligent" applications reproduce and remix existing knowledge and artwork. Consequently, they also reproduce misconceptions, stereotypes, racism, ableism, and other forms of bias, often unknowingly and unnoticed.

In the current heated discussion, I wonder why so many fellow developers keep getting overly excited by the new features, fearing that they will become obsolete due to new technology, or otherwise criticizing OpenAI for the wrong reasons, becoming easy prey for the bootlickers of the latest fad.

Getting Upset for the Wrong Reasons

Despite all of the other reasons to get upset (like climate change, war, poverty, politics, and pandemics), and despite so many positive advancements on the other hand (eco-tech startups, non-profit communities, diversity, and some specifically developer-related advancements like new CSS features and coding assistance based on machine learning), I dedicate one (and probably only one) blog post to the latest hype, to add my own limited experience and some inspirational artwork.

Let's have a look back into history, when many of today's conveniences were nothing but science fiction:

Cyberpunk Literature

Screenshot of image search results for early cyberpunk literature

There are several blog posts about futurist novels that include descriptions of virtual reality and global communication, like The Sheep Look Up by John Brunner and Snow Crash by Neal Stephenson.

While some of our current technology and future research might have been inspired by literary sources, it falls short of its potential.

As I replied to A.J. Sadauskas on mastodon.social,

the current Web3 enthusiasts offer no efficient alternatives to the commercial and centralized Web 2.0, while the fediverse and IndieWeb movements focus on the decentralized and robust principles that the internet was built upon and which made email, Usenet (NNTP, described in RFC 977 in February 1986), HTTP, and HTML such a success in the first place.

Just to mention some noteworthy reads again: the web has no version numbers, and "Web3" is going great (not!)

Is Artificial "Intelligence" dumb and biased?

Machine learning refers to the fact that we can train algorithms on input data, which not only accelerates the development of complex applications, but also allows us to create interfaces that generate unexpected output in a way that makes them seem sentient and intelligent.

But feeding large amounts of mainstream culture's output into machines tends to reproduce the undesirable prejudice and bias found in our society and in our culture, past and present.

This phenomenon is not limited to machine learning, but when it manifests in code and machines built and documented by human teams, it might be easier to point out and adjust.
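
To make that point a bit more concrete, here is a minimal, purely illustrative TypeScript sketch (the tiny corpus and the example phrases are made up): a "model" that does nothing more than count word frequencies will confidently reproduce whatever skew its training text contains.

```typescript
// Purely illustrative: the corpus below is invented to show how a
// frequency-based "model" reproduces the skew of its training data.
const trainingCorpus = [
  "the engineer said he fixed the bug",
  "the engineer said he shipped the release",
  "the nurse said she helped the patient",
];

// Count which word directly follows a given context phrase in the corpus.
function nextWordCounts(corpus: string[], context: string): Map<string, number> {
  const counts = new Map<string, number>();
  for (const sentence of corpus) {
    const index = sentence.indexOf(context + " ");
    if (index === -1) continue;
    const nextWord = sentence.slice(index + context.length + 1).split(" ")[0];
    counts.set(nextWord, (counts.get(nextWord) ?? 0) + 1);
  }
  return counts;
}

// The "model" has not learned anything about engineers or nurses,
// it just mirrors the bias baked into the three training sentences.
console.log(nextWordCounts(trainingCorpus, "the engineer said")); // Map(1) { 'he' => 2 }
console.log(nextWordCounts(trainingCorpus, "the nurse said"));    // Map(1) { 'she' => 1 }
```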

Read how Dr. Joy Buolamwini is fighting bias in algorithms and how to make technology serve all of us, not just the privileged few.

Digital Artwork thanks to Machine Learning

The AI art movement around tools like OpenAI's DALL·E, on the other hand, has already created tons of detailed high-resolution images, either photo-realistic or looking like handcrafted paintings, often creating an "uncanny valley" effect due to misplaced details and seemingly unimportant errors that no human artist would ever come up with. While many digital artists seem to favor a sinister, dystopian gamer aesthetic that has already been popular on platforms like DeviantArt, other artists, like my friend Andy "Retinafunky" Weisner, experiment with said flaws and different aesthetics.

Screenshot of AI artwork by retinafunky on instagram: "Cubism city , inspired by Picasso, imagined with Midjourney AI"

Criticizing AI Art for the wrong reasons

Some traditionalists criticize AI art for the wrong reasons, denouncing it as plagiarism or as no real work, because they fail to see its innovation and creative effort.

Looking back at history, many famous artists had assistants; most of them stayed anonymous, while some became famous themselves. So we might conclude that you can be a brilliant artist without being the one painting every single brushstroke.

When photography was not Art

Photography, now an accepted art form exhibited in galleries, was criticized in its early days in a discussion much like many of today's controversies about algorithmic art and AI art. Back when photography was not art, "the fear has sometimes been expressed that photography would in time entirely supersede the art of painting. Some people seem to think that when the process of taking photographs in colors has been perfected and made common enough, the painter will have nothing more to do."

Just like with photography, to produce a stunning work of art using modern algorithmic tools, you either have to be extremely lucky, or you have to experiment and be inspired, taking your time to learn the tools and parameters and to evolve and improve your art over time.

But what are my points about AI art then? I fear that with AI art, we are passing too much power to algorithms, thus losing control and losing touch with the real world, nature, people, and social, ethical, and environmental topics.

Nothing left but random squares?


Here is an image of myself in front of a poster showing the United Nations' 17 Sustainable Development Goals (SDGs). All variations of this picture created by the public version of OpenAI's DALL·E distort the icons and text, replacing them with illegible symbols or random letters.

I fear that this might sum up how the current "AI" systems view our world, and it shows that they are either not that intelligent after all, or that they prioritize aspects that strip the relevant meaning out of our culture, valuing style, aesthetics, and presentation much higher than content and context.

But then again, this proves the point that we still need actual artists, and that DALL·E, Midjourney, and other tools are power tools, but still not very useful without human interaction.

I think that time will tell, and that, if practiced seriously, AI can be a valuable and innovative tool for digital artists.

Other forms of Digital Art

There is creational art, the creation of images, objects, or text, done by creative artists like bleeptrack. And there is augmented reality art, where actual real-world artworks seem to come alive when an enhanced painting is viewed through AR apps like Artivive.

I will follow up and dig deeper into the details of the artistic aspects of digital technology in a post at open-mind-culture.org.

Now let's revisit an aspect of "Web3" that I am 99% critical about: NFTs and the energy consumption of some of the latest trends in information technology. Training a single AI model can emit as much carbon as five cars do in their lifetimes. It seems hard to calculate the CO₂ emissions of NFT "mining", but the popular cryptocurrency Ethereum uses about 31 terawatt-hours (TWh) of electricity a year, about as much as the whole of Nigeria, according to an estimate based on the Ethereum Energy Consumption Index.
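
To put that figure into perspective, here is a quick back-of-the-envelope conversion, using only the 31 TWh per year estimate quoted above and no other data:

```typescript
// Convert an annual energy figure (31 TWh/year, as quoted above)
// into the continuous average power it corresponds to.
const annualEnergyTWh = 31;
const hoursPerYear = 365 * 24; // 8760 hours
const averagePowerGW = (annualEnergyTWh * 1000) / hoursPerYear; // TWh -> GWh, then GWh/h = GW
console.log(`≈ ${averagePowerGW.toFixed(1)} GW of continuous power demand`); // ≈ 3.5 GW
```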

NFT: a Good Idea Misused by Scammers?

You may have seen the various images of a "bored ape" cartoon character, often used as a profile picture by people who do not even create any artwork whatsoever, but try to profit from the hype around blockchain, cryptocurrencies, and non-fungible tokens (NFTs), investing a lot of money in hope of a return on investment.

Bored ape images and a news headline "Someone buys a Bored Ape, gets scammed out of it two hours later"

Wasting energy to deceive people with NFT and the Metaverse

NFTs and cryptocurrencies offer people a way to participate in the profitable art market and other investments without having a bank account or a credit card, or without being recognized by established gallery owners. But NFTs have also created a large black market used to scam aspiring artists, developers, and other hopeful individuals. Mining cryptocurrency using energy-consuming calculations in a blockchain is a waste of energy that could better be used for other purposes, even more so in the face of human-made climate change already starting to destroy our planet.
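
For anyone who has not looked at how "mining" works under the hood, here is a minimal proof-of-work sketch in TypeScript (a simplification, not the implementation of any specific blockchain): the only way to find a valid block is to try hash after hash by brute force, and that brute force is exactly where the energy goes.

```typescript
import { createHash } from "node:crypto";

// Simplified proof-of-work: keep hashing until the result starts with
// enough leading zeros. There is no shortcut, only trial and error.
function mine(blockData: string, difficulty: number): { nonce: number; hash: string } {
  const target = "0".repeat(difficulty);
  let nonce = 0;
  while (true) {
    const hash = createHash("sha256").update(blockData + nonce).digest("hex");
    if (hash.startsWith(target)) {
      return { nonce, hash };
    }
    nonce++; // every failed attempt is energy spent for nothing
  }
}

// Each additional hex digit of difficulty multiplies the expected work by 16.
console.log(mine("example block data", 4));
```

Real networks add competition between many miners and automatic difficulty adjustment on top of this, which only scales the waste up.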

"Recreating" extinct species and places in a virtual enviroment, like Tuvalu "uploading itself into the metaverse" only accessible by using visual technology (headsets, cameras, displays) does not help either, and the current pitiable state of the so-called metaverse makes it look even more ridiculous. It reminds me of the final scene of the dystopian science fiction film Soylent Green.

A Second Life as boring as the first one

A virtual reality does have its benefits, but unlike a real environment, it does not provide sunlight, fresh air, or delicious food. And have you ever tried to dance or swim in a virtual environment? You might remember Second Life: most people seemed to enjoy creating a second life as boring as their first one.

The same goes for #ChatGPT and visual image generation: many people seem to get excited about output that looks impressive at first sight. But just as the images have their uncanny-valley artifacts, you should take a good look at generated code, unless all you plan to do is submit a solution to a coding kata.
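
To illustrate what "take a good look" means in practice, here is a made-up example of the kind of snippet a coding assistant could plausibly suggest: it passes a quick kata-style check, yet it hides a classic JavaScript pitfall.

```typescript
// Looks reasonable and passes a superficial test, but is subtly wrong:
// Array.prototype.sort() compares values as strings by default,
// and it also mutates the input array as a side effect.
function topScores(scores: number[], count: number): number[] {
  return scores.sort().reverse().slice(0, count);
}

console.log(topScores([3, 1, 2], 2));    // [ 3, 2 ] (looks correct)
console.log(topScores([9, 80, 100], 2)); // [ 9, 80 ] (wrong, expected [ 100, 80 ])

// A correct version sorts numerically and leaves the input untouched:
// scores.slice().sort((a, b) => b - a).slice(0, count)
```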

Summary

Web3 misses the point of its alleged goals like decentralization, equity, and helping to create a better, more diverse, and creative world through digitization. Web3 enthusiasts waste energy, compromise the security and integrity of data and money, and give up freedom and agency, allowing themselves to be manipulated by algorithms and greedy companies.

Copilot, Tabnine, and ChatGPT as tools for developers

We can use artificial "intelligence" as a tool: as a tool for artists, as a tool for copywriters, and as a tool for coding, too. I admit that I have been using @tabnine, GitHub Copilot, JetBrains context actions, and static code analysis like ESLint, stylelint, PHPStan, and PHP_CodeSniffer. I also use Grammarly to improve my writing, especially when posting in English, which is not my native language. I have been using all of those tools at the same time, and they have saved me some debugging detours, some keystrokes, and some Stack Overflow searches for generating boilerplate code and generic documentation. I will also evaluate how ChatGPT might come in handy. But I don't fear that any of those tools might seriously put my job as a senior developer in danger.

There are hopeful digital innovations that might help us build a better tomorrow despite everything, so let's take our time and find out how!

Top comments (2)

oOosys

Yes ... this aligns perfectly with my own experience: "But feeding large amounts of mainstream culture's output into machines tends to reproduce undesirable prejudice and bias found in our society and our past to present culture." A reason why I have quit using ChatGPT after the AI helped me to improve my language skills and clarity of mind to a level making it possible to see clearly its limitations of not being able to provide other help on starting the oOo journey as encouraging to follow this path to success.

val von vorn

Is Artificial "Intelligence" dumb and biased? It is - on purpose. Someone profits.