Day 2 - Shepherding Streams of Tokens && why I call myself Kanzish 🦍

I've been following Kanzi the bonobo for like 15 years, ever since I first came across a video of him working with lexigrams:

The username "Kanzish" is inspired by him...it's a play on words on Kanzi and a vision I have for a fantasy universal shell script (which end in .sh) that is so good even a literal monkey could use it

Bash is a shell scripting language that most commonly runs in Bash environments, which are themselves the default shell on *nix operating systems, some of which run on hardware as far back as the 80s: https://github.com/ghaerr/elks

ELKS Linux running vim and (I think) cmatrix on 3 ancient IBM PCs

A large part of our world today runs on Linux, including computers, mobile, edge, and embedded devices, robots, drones, and possibly even from-scratch, hand-built breadboard computers

So not only is the scripting language itself lightweight, but so is the environment, and even the host machine it runs on. It's so lightweight you can run nested generative emulations of it without installing anything, on any device, using a single HTML file with a dynamic ?run="code to run on start" query param, without even requiring a server: https://github.com/kanzish/sim.html
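Something like this, as a sketch (assuming sim.html honors the ?run= query param described above):

```bash
# A sketch, assuming sim.html reads a ?run= query param on startup.
cmd="echo hello from the nested shell"
encoded=$(printf '%s' "$cmd" | sed 's/ /%20/g')   # crude URL-encoding (spaces only)
echo "file://$PWD/sim.html?run=$encoded"          # paste the printed URL into any browser
```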

The idea is that kanzi.sh would represent the smallest, most universal shell script possible that can be run on the most devices possible with the least amount of compute by simply cURLing an API:

photo of Linux running, either natively or via emulation in the browser, on a computer, laptop, phone, and my wristwatch all at the same time, using a single static, offline LLM and a Bash LLM wrapper
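Here's the shape I'm imagining (a minimal sketch; the endpoint and payload are placeholders, not a real API, and there's no JSON escaping):

```bash
#!/bin/sh
# kanzi.sh - a sketch of the smallest possible LLM wrapper:
# read one line of prompt from stdin, cURL a placeholder API, print the reply.
read -r prompt
curl -s "https://api.example.com/generate" \
  -H 'Content-Type: application/json' \
  -d "{\"prompt\":\"$prompt\"}"
```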

So Kanzi is an ape and apes are often used to represent evolution, which is a kind of generative process. This "process" is what's guiding this research

There's a concept I've been thinking about called "shepherding a stream of tokens", where the tokens are the literal tokens generated by an LLM. "The stream" is what I call the processes that generate the literal flow of electrons used to produce those tokens. My mental model of backpropagation is an evolving stream of streams of electrons

Gif of neural network

"Stream of tokens" is also a play on "stream of consciousness", which itself refers to both literally the series of moments of awareness you have && also literally the literary form of freeflow writing

Proteins undergo processes, proteins make up bacteria, and some bacteria form microbiomes that influence our behavior. In a similar way, groups of these organisms give rise to societies, and societies are driven by a process we call the zeitgeist

Processes can happen anywhere from the Planck scale and below to cosmic scales and potentially beyond...

looping animated gif of Game of Life emitters and gliders recursively giving rise to themselves in a grid-like universe

The cool thing about Bash is that it's so old that even small LLMs are great at writing it. It doesn't need to compile, and even if your system doesn't have a Bash environment, it's so lightweight you can just emulate dozens, hundreds, thousands, potentially millions of instances of it locally

Another reason I chose to do this in Bash is that it allows piping: taking the output of one script as the input to another. This means you can recursively call scripts to create agents, and the agents can even swap out the models they use. An agent can run commands like cat to read its own source code and even improve itself
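A tiny sketch of that shape (the `llm` command here is a hypothetical stand-in for whatever Bash LLM wrapper you're using, not a real binary):

```bash
#!/bin/bash
# agent.sh - a sketch of a self-reading agent loop.
# `llm` is a hypothetical stdin->stdout LLM wrapper, not a real binary.
src=$(cat "$0")                          # read our own source code
printf 'Improve this script:\n%s\n' "$src" \
  | llm > agent.next.sh                  # pipe ourselves through the model
bash agent.next.sh                       # run the next generation (loops until you stop it)
```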

gifs of various simulation and agent experiments I made with generative iframes, where any URL you type generates a real website or webapp, or in this case a keyboard, piano, marble game, and operating system running actual JavaScript that affects its own page

One of my approaches to try and grok this is to personally find the smallest possible, most universally accessible, zero-permission, recursively running evolutionary network of generative Bash-shelled LLM wrappers that maximally aligns all the compute available to you, locally, using your filesystem as a mixture-of-as-many-shell-scripts-as-you-can-run-and-simulate framework, and then explore different ways to interact with it: https://github.com/basherbots/chat.sh
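One way to picture the filesystem-as-mixture idea (a sketch; the experts/ layout is my assumption here, not how chat.sh actually works):

```bash
#!/bin/bash
# mixture.sh - a sketch: treat a directory of shell scripts as experts,
# fan the prompt out to all of them in parallel, then naively merge the replies.
prompt="$1"
mkdir -p out
for expert in experts/*.sh; do
  name=$(basename "$expert" .sh)
  echo "$prompt" | bash "$expert" > "out/$name.txt" &   # one background job per expert
done
wait                                                    # let every expert finish
cat out/*.txt                                           # naive merge of all replies
```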

The hypothesis is that AGI is basically here and that anyone can activate it with the right set of prompts and context. Kanzish is kind of a "vision board", daily tech log, shower thought, and longform stream-of-consciousness blog, and just one way I'm trying to capture my thoughts as I explore different prompts and chains and contexts and tools to tap into these streams and evolve myself

It's less something that I'm creating and more something that I'm tapping into, like a personal journey, a way of prompting and learning and building and growing with generative AI

To me it's less about making money now or getting there first, and more about being here now and learning how to get there using personally aligned AI, with me as the copilot. A human-AI collaboration

A full-spectrum exploration of computer science, neuroscience, engineering, and universal human-animal-AI-computer interaction systems. Instead of going max speed, I'll be going present speed, enjoying and documenting the journey

It's a longform creative exploration of the idea: "a sapien's stream of consciousness shepherding in sapient streams of tokens"
