"Perhaps the prettiest number system of all is the balanced ternary notation" - Donald E. Knuth, The Art of Computer Programming, Vol. 2.
It's pretty common knowledge that computers store and operate on data using the binary number system. One of the main reasons for this can be found in the circuitry of modern computers, which is made up of billions of easily mass-producible transistors and capacitors that can together represent two states: high voltage (1) and low voltage (0).
Such a design is so ubiquitous nowadays that it's hard to imagine that computers could operate in any other way. But, in Soviet Russia during the 1950s, they did. Enter Setun, a balanced ternary computer developed in 1958 by a small team led by Nikolay Brusentsov at Moscow State University.
OK, so before we continue talking any more about Brusentsov and Setun, let me explain a little bit about balanced ternary.
Balanced Ternary
Ternary, or base-3, is a number system in which there are three possible values: 0, 1 and 2. In balanced ternary, these three possibilities are -1, 0 and +1; often simplified to -, 0 and +, respectively.
So, in its balanced form, we can think of the ternary values as being "balanced" around the mid-point of 0. The same rules apply to ternary as to any other numeral system: the right-most symbol, R, has its own value, and each successive symbol has its value multiplied by the base, B, raised to the power of its distance, D, from R.
Uhh, right, maybe I'll just give you an example. Let's encode 114:
+++-0 = (1 * 3^4) + (1 * 3^3) + (1 * 3^2) + (-1 * 3^1) + 0
= 81 + 27 + 9 + -3
= 114
And in binary (base-2):
1110010 = (1 * 2^6) + (1 * 2^5) + (1 * 2^4) + 0 + 0 + (1 * 2^1) + 0
= 64 + 32 + 16 + 2
= 114
And, just to be sure, here are the same rules applied in decimal (base-10):
114 = (1 * 10^2) + (1 * 10^1) + (4 * 10^0)
= 100 + 10 + 4
= 114
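To make these encoding rules concrete, here's a small Python sketch (the function names `to_bt` and `from_bt` are my own) that converts an integer to a balanced-ternary string and back. It uses the usual repeated-division-by-three approach, turning any remainder of 2 into a -1 digit plus a carry:

```python
def to_bt(n: int) -> str:
    """Encode an integer as a balanced-ternary string of +, 0 and -."""
    if n == 0:
        return "0"
    digits = []
    while n:
        n, r = divmod(n, 3)   # Python's divmod floors, so r is always 0, 1 or 2
        if r == 2:            # a digit of 2 becomes -1, carrying 1 upward
            r = -1
            n += 1
        digits.append({1: "+", 0: "0", -1: "-"}[r])
    return "".join(reversed(digits))

def from_bt(s: str) -> int:
    """Decode a balanced-ternary string back to an integer."""
    value = 0
    for ch in s:
        value = value * 3 + {"+": 1, "0": 0, "-": -1}[ch]
    return value

print(to_bt(114))        # +++-0, as in the worked example above
print(from_bt("+++-0"))  # 114
```

Note that the same code handles negative numbers with no special casing, which is exactly the point of the balanced representation.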
Cool?
Now, what if we wanted to represent -114? Well, in binary and decimal, we need to introduce a new symbol: the sign. In the main memory of a binary computer, this translates to either storing a leading bit to specify the sign or significantly reducing the number of values we can represent 1. This is the reason we speak about signed and unsigned integers in programming languages.
But in balanced ternary, as we're about to discover, to represent the inverse of a number we simply swap all +'s with -'s and vice versa. We don't need any extra information to represent the sign!
Check it out:
---+0 = (-1 * 3^4) + (-1 * 3^3) + (-1 * 3^2) + (1 * 3^1) + 0
= -81 + -27 + -9 + 3
= -114
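In code, this sign-swap trick is a one-liner. Here's a tiny Python sketch (helper names are my own), with a decoder included just to verify the result:

```python
def negate_bt(s: str) -> str:
    """Negate a balanced-ternary number by swapping every + with - and vice versa."""
    return s.translate(str.maketrans("+-", "-+"))

def from_bt(s: str) -> int:
    """Decode a balanced-ternary string of +, 0 and - to an integer."""
    value = 0
    for ch in s:
        value = value * 3 + {"+": 1, "0": 0, "-": -1}[ch]
    return value

print(negate_bt("+++-0"))           # ---+0
print(from_bt(negate_bt("+++-0")))  # -114
```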
In a moment we will see that this property of balanced ternary (and a couple of others) gives us some very interesting computational efficiencies. But, for now, let's go back to talking about the Setun computer.
The Birth of Setun
The late 50's were an exciting time in computing: Nathaniel Rochester and his team at IBM had recently developed the first mass-produced, stored-program (a.k.a. "modern") computer - the IBM 701. John Backus and his team had invented FORTRAN, the first high-level programming language to see widespread use. And, perhaps most important of all, the first fully-transistorized computers, such as the TX-0 and the Philco Transac S-2000, were starting to be developed. The path was set for the development of the binary computing machines that have dominated the market ever since.
But that was North America.
At the same time in Russia, a group of mathematicians and engineers under the guidance of Nikolay Brusentsov and his colleague Sergei Sobolev were devising different ways in which computers could operate 2. Brusentsov and co. surveyed the array of computers and technological advancements emerging from the West and understood their use of transistors to represent binary data. But, let's remember, this was Russia - transistors were not readily available behind the Iron Curtain. And vacuum tubes sucked as much in Russia as they did in the West!
So Brusentsov devised an element from miniature ferrite cores and semiconductor diodes that was able to act as a controlled current transformer, which proved to be an effective base for implementing ternary logic 3. It was found that these elements, compared with their binary counterparts, provided more speed and reliability and required less power to operate.
The team of ten literally built Setun from scratch, working in a small room filled with laboratory tables (which they also built!). Every morning, the team members began their day by assembling five basic machine elements. They started with a ferrite core and, using an ordinary sewing needle, wound 52 coils of wire onto it. The cores were then handed off to technicians who completed the assembly process and mounted them onto blocks.
Ternary logic was implemented by combining two of these ferrite elements and wiring them in such a way that they could represent three stable states. This approach was successful but did nothing to reduce the number of elements required since, in reality, those two ferrite cores could potentially represent two binary bits, which encode more information (2^2 = 4 states) than one ternary "trit" (3^1 = 3 states). Alas, at least their power consumption was reduced!
Setun operated on numbers of up to 18 trits, giving 3^18 = 387,420,489 distinct values - meaning one could represent anything between -193,710,244 and 193,710,244. A binary computer, on the other hand, would need at least 29 bits to reach this capacity.
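These figures are easy to check: 18 trits encode 3^18 distinct values, symmetric around zero, so the largest representable magnitude is (3^18 - 1)/2, and a binary machine needs 29 bits to cover as many values. A couple of lines of Python confirm this:

```python
# 18 trits encode 3**18 distinct values, symmetric around zero.
values = 3 ** 18
largest = (values - 1) // 2    # biggest magnitude an 18-trit word can hold
print(values)                  # 387420489
print(largest)                 # 193710244
print(values.bit_length())     # 29 -- since 2**28 < 3**18 <= 2**29
```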
Overall, development of Setun lasted two years - although it was operational after just ten days of testing, which was unheard of at the time. In total, only about 50 machines were produced. And even though Setun machines proved capable of operating successfully for years on end in Russia's extreme climate, the entire project was racked with controversy.
This was primarily due to the fact that the manufacturing plant was unable to justify the mass-production of what it deemed a cheap intellectual endeavor and the "fruit of university fantasy". I think it's safe to assume that Russia was simply not ready to comprehend the eventual importance of computing machines. Ultimately, the Setun machines were replaced with binary counterparts that were able to calculate with similar efficiency but at more than twice the cost of operation!
But why Ternary?
As I demonstrated briefly above, in ternary there is no need to store a leading bit, ahem trit, to indicate sign. Therefore, there is no concept of signed or unsigned integers - everything is just an integer. So subtraction is achieved by simply inverting the operand and applying addition (which is implemented much as it is in binary machines). This plus-minus consistency also cuts down on the number of carries required in multiplication operations.
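To illustrate, here's a minimal Python sketch of a trit-wise ripple adder - my own toy model, not Setun's actual circuit. Digits are -1/0/+1, stored least-significant first, and carries propagate exactly as in a binary adder; subtraction is then just "flip every trit and add":

```python
def bt_add(a, b):
    """Add two balanced-ternary numbers given as lists of -1/0/+1 trits,
    least-significant trit first, using a ripple carry."""
    out, carry = [], 0
    for i in range(max(len(a), len(b))):
        s = (a[i] if i < len(a) else 0) + (b[i] if i < len(b) else 0) + carry
        carry = (s + 4) // 3 - 1     # carry is -1, 0 or +1
        out.append(s - 3 * carry)    # digit stays within -1..+1
    if carry:
        out.append(carry)
    return out

def bt_sub(a, b):
    """Subtract by inverting every trit of b and adding -- no sign bit needed."""
    return bt_add(a, [-d for d in b])

def bt_value(trits):
    """Helper: evaluate a least-significant-first trit list as an integer."""
    return sum(d * 3**i for i, d in enumerate(trits))

n114 = [0, -1, 1, 1, 1]            # +++-0, least-significant trit first
n7   = [1, -1, 1]                  # +-+ = 9 - 3 + 1
print(bt_value(bt_sub(n114, n7)))  # 107
```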
Another useful trait of balanced ternary (or any balanced numeral system for that matter!) is the fact that one can implement rounding on floating-point numbers by blatant truncation, which allows for a simplified implementation of division. This is because of the way balanced ternary represents the fractional portion of real numbers.
Let me give you a simple example. Encoding 0.2 looks like this:
0.+--+ = 0 + (1 * (3^-1)) + (-1 * (3^-2)) + (-1 * (3^-3)) + (1 * (3^-4))
≈ 0.33 + -0.11 + -0.03 + 0.01
≈ 0.2
And to encode 0.8 we must start with a + in the most significant digit and then simply invert the fractional portion (e.g. 1 + -0.2):
+.-++- = 1 + (-1 * (3^-1)) + (1 * (3^-2)) + (1 * (3^-3)) + (-1 * (3^-4))
≈ 1 + -0.33 + 0.11 + 0.03 + -0.01
≈ 0.8
Above we can see that truncating the trits to the right of the radix point is equivalent to rounding: 0.2 becomes 0 whereas 0.8 becomes 1. Cool!
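We can check this truncation-as-rounding behaviour with a short Python sketch (the function name is mine). It scales a real number by 3^k, rounds to the nearest integer, converts that to balanced ternary, and splits the digits at the radix point - dropping the fractional trits then lands on the nearest integer:

```python
def to_bt_digits(x, frac=4):
    """Balanced-ternary digits of x with `frac` fractional trits.
    Returns (integer_digits, fractional_digits), most significant first."""
    scaled = round(x * 3**frac)           # x measured in units of 3**-frac
    digits = []
    while scaled:
        scaled, r = divmod(scaled, 3)
        if r == 2:                        # a digit of 2 becomes -1 with a carry
            r = -1
            scaled += 1
        digits.append(r)
    digits += [0] * max(0, frac + 1 - len(digits))
    digits.reverse()
    return digits[:-frac], digits[-frac:]

print(to_bt_digits(0.2))  # ([0], [1, -1, -1, 1])   i.e. 0.+--+  -> truncates to 0
print(to_bt_digits(0.8))  # ([1], [-1, 1, 1, -1])   i.e. +.-++-  -> truncates to 1
```

The reason this works is that the fractional trits can contribute at most 1/3 + 1/9 + 1/27 + ... = 1/2 in either direction, so the integer part is always the nearest integer.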
Programming with trits and trytes!
OK, back to Setun one last time. In the late 60's, Brusentsov developed a more modern machine called "Setun-70" that embodied its ternarity more directly. The "tryte" was introduced, which accounted for 6 trits (roughly 9.5 bits). The Setun-70 computer was a stack machine, so rather than having machine instructions that specifically named registers for input and output, all operations worked on two stacks - one for operands (input) and one for return values (output). To accommodate this design, machine instructions were written in reverse Polish notation (a.k.a. postfix).
In the late 70's, Brusentsov and some of his students developed a programming language for the Setun-70 computer called the Dialog System for Structured Programming (DSSP). In my research 4, I have discovered that it is a stack-based language (no surprise there) similar to Forth that uses reverse Polish notation. This allows one to write programs in a relatively high-level language yet still feel "close to the metal". So close, in fact, that the authors had the following to say:
DSSP was not invented. It was found. That is why DSSP has not versions, but only extensions.
Take the following DSSP program that sums a bunch of numbers:
1 2 3 4 DEEP 1- DO +
Let's try to break it down. In the first column we have the instruction, in the second we have the machine state after execution (the operand stack), and in the third I've provided an explanation:
1 [1] Push 1 to the stack.
2 [2 1] Push 2 to the stack.
3 [3 2 1] Push 3 to the stack.
4 [4 3 2 1] Push 4 to the stack.
DEEP [4 4 3 2 1] Push "the depth of the stack" (4) to the stack.
1- [-1 4 4 3 2 1] Push -1 to the stack.
DO [4 3 2 1] Initiate LOOP, pop top two items from stack. To control
the loop, the first item is applied to the second item until
it reaches 0.
+ [] Apply the "+" operation repeatedly until the LOOP terminates,
each time popping the top item from the operand stack,
applying + and then pushing the output to the return stack.
At the end of execution, the operand stack will be empty and the return stack will contain [10].
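Since English references for DSSP are scarce, here's a speculative Python model of the trace above. It follows the article's reading of DEEP, 1- and DO, so treat the semantics as my own reconstruction rather than documented DSSP behaviour (lists hold the stacks with the top at the end):

```python
operands = [4, 3, 2, 1][::-1]      # after the pushes 1 2 3 4 (top is last)
operands.append(len(operands))     # DEEP: push the depth of the stack (4)
operands.append(-1)                # 1-: push -1
returns = []                       # DSSP's separate return stack

step, counter = operands.pop(), operands.pop()   # DO: pop the top two items
while counter != 0:                # apply `step` to `counter` until it hits 0
    total = operands.pop() + (returns.pop() if returns else 0)
    returns.append(total)          # +: accumulate onto the return stack
    counter += step

print(operands, returns)           # [] [10]
```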
You can read a bit more about DSSP over at the website of Ivan Tikhonov (original authors Sidorov S.A. and Shumakov M.N.).
The future
The development of balanced ternary machines has all but faded into a small footnote in the annals of computer history. And whilst research into memory cells able to efficiently represent three distinct states has been relatively minimal, there have been some efforts in this area.
Particularly, researchers in Japan in the late 90's described the possibility of using a Josephson Junction to implement ternary logic. This could be achieved by circulating superconducting currents, either clockwise, counterclockwise, or off. They found that this gave the memory cells a "capability of high speed computation, low power consumption and very simple construction with less number of elements due to the ternary operation".
But, for the moment at least, I don't think that you'll be hearing much more about balanced ternary computers in the near future. Nor do I think DSSP is going to be the next big thing in programming language fanboyism. But I do believe that much wisdom can be sought by looking into the past 5.
Thanks for making it this far. Corrections and feedback are very welcome. You can also read more here, here and here. :)
- This depends on how the machine in question represents numbers. Two's complement is a common notation that would allow one to represent -((2^n) / 2) to ((2^n) / 2) - 1 in n bits.
- Although Setun was the first electronic device to operate on balanced ternary, it's worth mentioning that the idea of using balanced ternary in computing devices was first popularized over 100 years before. In 1840, Thomas Fowler built a calculating machine entirely from wood that operated on data in the balanced ternary notation.
- A far more accurate description can be found on the Russian Computer Museum website.
- Reference material for DSSP in English is not widely available so I'd like to provide a warning here that my knowledge is very limited and just may contain some guesswork.
- My own contribution can be seen at computerpionee.rs.
- Cover image, originally from the Moscow University Supercomputing Center, depicts one of the Setun machines in operation.
Top comments (15)
This is a fantastic article. FYI the IOTA project has a prototype ternary microprocessor (called JINN) and they are working on plans for widescale adoption in Internet-of-Things devices. Specifically it will provide low-power high-velocity nanotransactions. They have their own "cryptocurrency" token with it as well. So we may see a resurgence in ternary coming soon. (or maybe not, who knows?)
Thanks for an interesting read. Given what you've pointed out about the relative efficiency of ternary over binary, it brought two thoughts to my mind:
1) That there might be a lot to be gained from a more cooperative, open communication about advanced research in both science and engineering across what are artificial national/cultural bounds. That said, with an understanding of why we weren't necessarily pushing computer "secrets" to each other in the late 50s, and also understanding that "momentum" has other advantages in capturing a market besides being "best of breed."
2) I wonder what kind of encoding patterns we search for in the SETI data. Did we use binary on Project Voyager? What if everybody else "out there" speaks ternary?
Quite an interesting read. Though I must say I probably have a bigger understanding of balanced ternary than most, given that I am chief developer of the SBTCVM project, a project that aims to create a balanced ternary virtual machine. The Setun computers are quite a fascinating bunch.
SBTCVM really started with one night browsing Wikipedia, coming across balanced ternary, and in turn an article on Setun. The basic idea was I wanted to work with balanced ternary on binary computers.
When I started the actual VM itself, I had gotten an integer mathematics library up and running, and figured I'd put it to use. And while SBTCVM has its own architecture design (that's mainly a result of the development process), the Setun machines were and are a key inspiration.
Wow! Phenomenal article. I had been under the massively mistaken impression that ternary computation required quantum computers (if they exist) - clearly I fundamentally misunderstood both quantum and more importantly ternary computation.
Do you know of any particularly good (preferably mathematics-heavy) books on alternative or "non-standard" modes of computation such as those discussed here?
Thank you!
Great stuff! Heard about it for the first time and I'm really surprised that it wasn't adopted widely before....
The only technical advantage of binary I can imagine is that, without an "off" state as a value representation, you don't need any special check for invalid "trits" due to physical damage.
Interesting! Heard of Setun before, but was looking for more information on DSSP. Thanks for the code and link. The DO ... [] structure looks more elegant than traditional/standard Forth's DO ... LOOP (but Charles Moore didn't like DO LOOP anyway). I'm a bit sorry to hear there's not much information on it, but as it appeared in the late 70s, it could simply be a clone of Forth, which appeared over 5 years earlier. If not, or perhaps even if it was, it's amusing that DSSP was said to be "found", because Charles Moore also described himself as the discoverer rather than inventor of Forth. :) Oh, Ivan Tikhonov described DSSP as "inheriting" things from Forth.
Regarding the hardware, I had a bit of a go in a cheap simulator, and was not impressed. Not being very familiar with inductors, I used transistors, and was not able to find a configuration equal to binary CMOS. I'm not surprised to hear the Setun needed 2 coils per trit, which is actually only 3/4 the packing density of binary core memory. Binary core memory was itself huge, requiring about a cubic foot per kilobyte! Add to all this the very high cost of producing practical software in the Setun's era, and I can entirely understand the Soviet administrators' decision to replace the Setun with computers more nearly compatible with foreign software. Low power consumption seems to be the Setun's only advantage, and from my experiments I'm almost certain this could not have been maintained into the transistor era. I was not able to find anything close to the power- or space-efficiency of CMOS.
I sometimes wonder how to efficiently emulate a balanced ternary computer, but it really is an academic fantasy as the Soviet administrators said. The Josephson junction thing has me a little bit curious, but I'm cautious because I don't know what logic could actually be done with it and what support each junction would require.
Pay attention to IOTA and Jinn processors. Ternary is coming.
Very cool article! Thanks
Great article! I wasn't aware of this.