everyone keeps asking me why x86 uses an 8-bit byte, but I'm struggling to find an explanation that makes sense to me. Can any of you help?

what I've found so far:
- it looks like x86 evolved from the intel 8008 (from 1972), which was an 8-bit CPU
- the 8008 came after the 4004, which was 4-bit

some questions I have:
- was the reason to build an 8-bit CPU to increase the size of the instruction set? or something else?
- did x86 really directly evolve from the intel 8008?

would love any links!

or maybe the reason is just that the 8008 was a popular microprocessor, and it happened to use an 8-bit byte, so it became the foundation for all of intel's future microprocessors? in theory they could have built a microprocessor with a 10-bit byte instead and that would have been fine too?

so far the reasoning I'm getting for the 8-bit byte seems to be:

1. you want your byte size to be a power of 2. This is EXTREMELY believable to me, but I don't understand why exactly you want this; my understanding of CPU design is very bad. Maybe it's because of buses? (what's a bus?)
2. 4 bits is too small: you can't fit a character into 4 bits (there's a quick arithmetic sketch after this list)
3. you also don't want your bytes to be too big, and 8 bits was working well, so the byte size never got bigger after the move from 4 to 8
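
to convince myself of point 2 (and the "power of 2" thing in point 1), here's a tiny back-of-the-envelope Python sketch. it's just counting, nothing about actual hardware: a 4-bit byte only distinguishes 16 values, ASCII's 128 characters need 7 bits, and 8 is the next power of two up from 7. the character-set sizes are the standard ASCII/extended-ASCII ones; the helper function name is just something I made up.

```python
import math

def bits_needed(num_values):
    # smallest number of bits that can give each value its own code
    return math.ceil(math.log2(num_values))

print(2 ** 4)             # 16  -> a 4-bit byte can only represent 16 distinct values
print(bits_needed(128))   # 7   -> ASCII's 128 characters need 7 bits
print(bits_needed(256))   # 8   -> 8 bits gets you 256 values, enough for "extended" character sets
```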

@b0rk My guess would've been that it took off more because of network and file format standards, but I am having trouble finding info on the development of these. There were Ethernet and EBCDIC, which were 8-bit, but there was also ASCII, which was 7-bit.
