@SuricrasiaOnline
I don't think we're presently stuck; in classical computing, people are experimenting with processors backed by huge banks of nonvolatile RAM, to the point that there ceases to be a meaningful distinction between RAM and disk, and as cache scales up, surely something will erase that distinction too.
Then there's FPGAs.
@SuricrasiaOnline @ontploffing I seem to recall ternary computers would have better energy consumption and handle negative values a little more elegantly, which might be nice for some computation tasks
@kellerfuchs @SuricrasiaOnline @ontploffing oh, right, that must have been where I got the titbit from
Still got the whole "for negative numbers use -1 instead of chucking another bit on there" thing though, which sounds pretty neat
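A rough sketch of what that looks like, assuming balanced ternary (digits -1, 0, +1); the little converter below is purely illustrative, not how any real machine does it:

```python
def to_balanced_ternary(n: int) -> list[int]:
    """Encode an integer as balanced-ternary 'trits' (-1, 0, +1), least significant first."""
    if n == 0:
        return [0]
    trits = []
    while n != 0:
        r = n % 3
        if r == 2:              # a 2 becomes -1 with a carry into the next trit
            trits.append(-1)
            n = n // 3 + 1
        else:
            trits.append(r)
            n //= 3
    return trits

# Negating a number is just flipping every trit; no sign bit required.
print(to_balanced_ternary(5))   # [-1, -1, 1]  ->  -1 - 3 + 9 = 5
print(to_balanced_ternary(-5))  # [1, 1, -1]   ->   1 + 3 - 9 = -5
```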
@troubleMoney @SuricrasiaOnline @ontploffing
Not really?
The reason we do binary is that mapping a physical quantity (voltage, current, pressure) onto a logic level gets more ambiguous (and more error-prone) as you add logic states.
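To make that concrete, here's a back-of-the-envelope sketch (not a real circuit model): assume a fixed 1 V swing with an arbitrary 50 mV of Gaussian noise, split the swing into L evenly spaced levels, and estimate how often noise pushes a signal past the nearest decision threshold.

```python
import math

# Back-of-the-envelope sketch, NOT a real circuit model: with L evenly spaced
# logic levels across a fixed voltage swing, the margin to the nearest decision
# threshold is half the level spacing, so Gaussian noise crosses it more often
# as L grows.
V_SWING = 1.0       # assumed total voltage range, volts
NOISE_SIGMA = 0.05  # assumed zero-mean Gaussian noise std-dev, volts

def misread_probability(levels: int) -> float:
    spacing = V_SWING / (levels - 1)
    margin = spacing / 2
    # P(|noise| > margin) for a zero-mean Gaussian with std-dev NOISE_SIGMA
    return math.erfc(margin / (NOISE_SIGMA * math.sqrt(2)))

for levels in (2, 3, 4):
    print(f"{levels} levels: misread probability ~{misread_probability(levels):.1e}")
```

The exact numbers are meaningless, but the trend is the point: every extra level shrinks the noise margin, and the misread rate climbs by many orders of magnitude.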
The Soviet Setun computer was a 1950s ternary machine which had lower power consumption than some alternatives at the time, but I'm not aware of strong reasons to ascribe that to the use of ternary.
50s-vintage computer design was definitely not at an optimum: we hadn't even invented VLSI, and the machines of the era were room-sized, guzzling tremendous amounts of energy :D
I'm kinda annoyed by the whole meme because it's being used to peddle a bunch of craptography, in particular by iota (a cryptocurrency).