Russians did this in the 50's?
Did the Russians not do this in the 50's (vastly different technology)
Reminded me of an old Computer Science lecturer telling us all about this.
Scientists in Singapore and Taiwan have developed an organic molecule which can have three electrically-readable states, making a ternary rather than binary device possible. Binary devices have two electrically readable states, corresponding to a one or zero. Ternary devices have three: zero, one or two. Consequently they …
Even today, ternary systems are widely used in combination with binary ones. For example, the UK0 interface on German phone lines uses ternary voltage levels to transmit binary signals.
So essentially a final product probably would have binary connections, but convert that to ternary internally.
In theory, the trinary nature of RAM/CPU wouldn't have to affect software.
Most software we write these days doesn't care about number bases at all.
It would be reasonable to run existing Java applications in base 3.
The whole x86 architecture could be ported to base 3 internally such that the software doesn't see the difference.
Certain opcodes, such as SHL, ROR, which are base 2 would be less efficient, but still possible on a base 3 (or base 6) machine.
Mind you, keeping the x86 architecture alive is a terrible waste of man power.
As you've added another state there's another outcome...
There are 11 types of people in the world. Those that understand ternary, those that understand binary, those that understand both and those that understand neither.
Which is a truly awful mess of a geek joke ;-)
"It's very interesting science, but the entire binary computing infrastructure would have to alter to use it."
However, a little tweaking of ternary would render the entire binary computing infrastructure a capture to its proprietary instructions/System Wipes and ReWrites.
* for Virtually Advanced Post Modernist Confucianism and CyberIntelAIgent Taoism.
And bet against that if you want to walk naked as a J.
Nobody is going to run 3-logic computers. Not now, not in 1000 years.
Even if some development makes 3-logic hardware attractive, it would be WAY WAY easier to just ignore and remove the 3rd state than to deal with 3-logic.
By the time someone comes up with a good 3-logic design, there will be a faster 2-logic machine, making 3-logic forever obsolete.
4-logic or 8-logic might be nice though.....
Maybe not for large applications such as PCs, but there are smaller applications that would benefit. I thought ternary was traditionally balanced, i.e. -1, 0 and 1, anyway, so a ternary system could always be backwards compatible (by ignoring the sign). Code could be compressed to use ternary, making the negative bit negligible - an instant saving in memory for common storage.
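For anyone who's never played with it, balanced ternary is easy to sketch. This is just my own illustration (nothing from the article): converting an integer into digits -1, 0, 1 by treating a remainder of 2 as -1 with a carry.

```python
def to_balanced_ternary(n):
    """Convert an integer to balanced-ternary digits (-1, 0, 1),
    least significant trit first."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 3
        if r == 2:              # write 2 as -1 and carry 1 into the next trit
            digits.append(-1)
            n = n // 3 + 1
        else:
            digits.append(r)
            n //= 3
    return digits

def from_balanced_ternary(digits):
    """Inverse: sum of digit * 3^position."""
    return sum(d * 3**i for i, d in enumerate(digits))
```

Note that negative numbers need no sign bit at all, which is the "backwards compatible by ignoring the sign" point: negation is just flipping every trit.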
One of the limitations of binary-based electronics is how fast you can switch between "1" and "0" voltage levels and still accurately tell which you had (against noise). I imagine any gain from using ternary bits would be lost in speed, plus the associated logic would need to be more complex - almost getting towards old analogue electronics!
Still, nice to see the Chinese trying to develop new stuff.
For storage, you can use two ternary "trits" to store three bits (with one unused combination).
Ternary logic could also be used for asynchronous binary logic, which has three natural states: 0, 1 and not ready.
But I doubt we will see ternary numbers replace binary numbers in computers.
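The two-trits-for-three-bits packing mentioned above works because 3^2 = 9 >= 2^3 = 8, leaving exactly one trit pair spare. A quick sketch of my own (hypothetical helpers, not anything from the article):

```python
def bits_to_trits(b2, b1, b0):
    """Pack three bits (each 0/1) into two trits (each 0-2).
    Values 0..7 map into the 9 trit pairs; pair (2, 2) goes unused."""
    v = b2 * 4 + b1 * 2 + b0        # 0..7
    return divmod(v, 3)             # (high trit, low trit)

def trits_to_bits(hi, lo):
    """Unpack two trits back into three bits."""
    v = hi * 3 + lo                 # 0..8; 8 is the unused combination
    return (v >> 2) & 1, (v >> 1) & 1, v & 1
```

The spare combination could even serve as the "not ready" marker for the asynchronous-logic idea in the same comment.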
Robert Heinlein used this idea in his novels, calling it 'trinary' - instead of one and zero, his scheme had values zero, 'unit' and 'pair'. He also proposed that future storage should be on a 'tell-you-three-times' basis (or 'tell-you-six-times' for really critical info - reminds me of Arthur C Clarke's Ramans). Though I think that Heinlein got the 'tell-me-three-times' from Edward Lear ('what I tell you three times is true')?
Base-3 is closest to the most efficient encoding representation which is base 2.718..., but fractional bases are not possible in practice. Heat dissipation is a major problem for chip designers and base-3 would enable them to use a lot fewer components (assuming the power consumption of a base-3 device was comparable to a base-2 device), so there could be a major cost/performance incentive in years to come to use base-3 devices.
We humans are derived from a base-4 encoding, i.e., DNA.
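The "most efficient base is e" claim comes from radix economy: the cost of representing a number N in base b is roughly (digits needed) x (states per digit) = b * log_b(N), and the base-dependent factor b / ln(b) is minimised at b = e. A one-liner of my own to check that base 3 beats base 2 (and base 4):

```python
import math

def radix_economy(base):
    """Base-dependent cost factor b / ln(b) of representing numbers
    in the given base; minimised at base = e ~ 2.718."""
    return base / math.log(base)

# base 2 and base 4 tie at ~2.885; base 3 scores ~2.731, closest to e
for b in (2, 3, 4):
    print(b, radix_economy(b))
```

So among integer bases, 3 is indeed the winner, if only by about 5% over base 2.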
"Base-3 is closest to the most efficient encoding representation which is base 2.718"
Not sure how you can prove e is the most efficient coding representation, but (on paper at least) number systems based on non-integer bases *are* possible. You might like to browse the back issues of the J. American Mathematical Association. A key issue for a viable number system is whether it can *uniquely* convert a decimal number into its number base and back without ambiguity.
Mine will be the one with some photocopies in it.
> Not sure how you can prove e is the most efficient coding representation ...
The inestimable Hacker's Delight covers that IIRC (outstanding book, a real eye opener). Another link is wiki, where else <http://en.wikipedia.org/wiki/Radix_economy>
> ...number systems based on non integer bases *are* possible
and negative nums. Maths teacher, a long time ago, tried to show us how to count in base minus ten. Wiki link? Certainly <http://en.wikipedia.org/wiki/Negative_base>.
Ternary could never replace binary as traditional timing circuits avoid certain frequencies and timings (e.g. 10Hz / .1s) as they cannot be accurately stored as binary numbers - much like 1/3 in decimal. When you introduce base 3 timing, you eliminate most of the frequencies in use today.
So I believe. I understand that the optimal base is e (2.718...), and that a church dude settled on trinary to account for his alms payments to the poor, apparently building a wooden 'computer' (this would be very pre-20th century) that used trinary. I understand there's very little record of this thing.
useful root link <http://en.wikipedia.org/wiki/Ternary_numeral_system> gets you to radix economy etc.
And bless wiki, here's the wooden pooter ref <http://en.wikipedia.org/wiki/Thomas_Fowler_%28inventor%29>
"Tell you three times" is Lewis Carroll, [The Hunting of the Snark]. Probably online. Stated, never justified. However, the Space Shuttle has at least three computers that have to agree on navigation et cetera.
"0, 1 and not ready" is feminine logic (ducks)
I think Star Trek had three-state computers, or maybe I'm thinking of something else, such as James Blish's adapted stories. They had duotronics which evidently was electronics with knobs on, they had brand-new multitronics which was the subject of "The Ultimate Computer" and didn't go well at all, but they also had tricorders.
How do you do parity with trits?
"What I tell you three times is true (why?)"
One computer may get it wrong. Of two computers, one may get it wrong, but you wouldn't know which. With three computers, the correct answer wins by popular vote. The chance of a rare-as-it-is fluke of getting it wrong in the first place, TWICE, is an "acceptable margin of error" (most likely in the realm of <0.0000000000001%).
Why do you think any kind of true vote-based system uses odd numbers? There will always be a tie-breaker vote, since "yea" or "nay" is binary after all.
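The three-computers scheme above is triple modular redundancy, and the voter is trivial. A minimal sketch of my own (not anyone's actual flight software):

```python
def majority_vote(a, b, c):
    """Triple-modular-redundancy voter: the value reported by at least
    two of the three replicas wins, so a single faulty unit is outvoted."""
    if a == b or a == c:
        return a
    return b  # either b == c, or all three disagree (unrecoverable;
              # a real system would flag this rather than guess)
```

As the comment says, the odd replica count guarantees a strict majority whenever only one unit fails.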
Thomas Fowler built a wooden calculating machine using balanced ternary in 1840.
Someone has built a copy recently, see:
Strangely, the machine appears in a stained glass window at St Michael's Church in Torrington, Devon.
I believe the Russians tried ternary machines in a university sometime in the late 1950's.
IIRC (Electronics Weekly/Electronics Times, mid-80s) either or both of Ferranti and GEC did some chip designs, especially for gate arrays, using some kind of 3-level logic. Think it *might* have some links into Current Mode Logic, a cousin of ECL for very high-speed logic apps at lower power.
3 levels meant you could use fewer gates to do the same function. Handy if you have tight limits on number of gates per chip and each gate is pretty thirsty in the current department.
Should this *ever* get out of the lab, it would give the same benefits in terms of simplified manufacturing (fewer "gates" to lay down and connect).
Keep increasing the number of values that can be carried on a single conductor and you'll end up with a new computing concept - maybe call it Analogue (or Analog for those that can't spell). For specific applications they are still much faster than any digital solutions yet invented.
Mine's the one with the patch cords hanging out of the pocket.
Biting the hand that feeds IT © 1998–2021