The release of Intel's 8086 microprocessor in 1978 was a watershed moment for personal computing. The DNA of that chip is likely at the heart of whatever computer--Windows, Mac, or Linux--you're using to read this, and it helped transform Intel from merely one of many chip companies into the world's largest.

The 8086 was based on the design of the Intel 8080 and 8085 (it was assembly-source compatible with the 8080), with a similar register set expanded to 16 bits. The Bus Interface Unit fed the instruction stream to the Execution Unit through a 6-byte prefetch queue, so fetch and execution were concurrent--a primitive form of pipelining (8086 instructions varied from 1 to 6 bytes). It had four 16-bit general registers, each of which could also be accessed as a pair of 8-bit registers, plus four 16-bit index registers (including the stack pointer). The data registers were often used implicitly by instructions, which complicated register allocation for temporary values. It supported 64K 8-bit (or 32K 16-bit) I/O ports and fixed vectored interrupts.

Rather than just supplying the missing upper address bits, as most segmented processors do, the 8086 shifted the segment register left 4 bits and added it to the 16-bit offset to form a 20-bit physical address. As a result, segments overlapped, and it was possible for two pointers with the same value to address different memory locations, or for two pointers with different values to address the same location, as the sketch below illustrates. Many programmers considered this a brain-damaged design.
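To make the segment arithmetic concrete, here is a minimal C sketch of the real-mode address calculation described above, assuming the standard (segment << 4) + offset rule with results wrapping to the chip's 20 address lines; the physical_address helper is a hypothetical name for illustration, not anything from Intel's documentation.

    #include <stdio.h>
    #include <stdint.h>
    #include <inttypes.h>

    /* Hypothetical helper: the 20-bit physical address the 8086 forms
       from a segment:offset pair. The segment register is shifted left
       4 bits and added to the offset; the sum wraps to 20 bits,
       matching the chip's 20 address lines. */
    static uint32_t physical_address(uint16_t segment, uint16_t offset)
    {
        return (((uint32_t)segment << 4) + offset) & 0xFFFFF;
    }

    int main(void)
    {
        /* Different pointers, same location: 1234:0010 and 1235:0000
           both resolve to physical address 12350. */
        printf("1234:0010 -> %05" PRIX32 "\n", physical_address(0x1234, 0x0010));
        printf("1235:0000 -> %05" PRIX32 "\n", physical_address(0x1235, 0x0000));

        /* Same offset, different locations: the same 16-bit offset
           names a different byte depending on the segment in effect. */
        printf("1000:0100 -> %05" PRIX32 "\n", physical_address(0x1000, 0x0100));
        printf("2000:0100 -> %05" PRIX32 "\n", physical_address(0x2000, 0x0100));
        return 0;
    }

Because consecutive segment values are only 16 bytes apart, every physical address below the top of memory can be named by thousands of distinct segment:offset pairs--the aliasing the sketch demonstrates.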
Problems
But the 8800 project was in trouble. It had encountered numerous delays as Intel engineers found that the complex design was difficult to implement with then-current chip technology. And Intel's problems didn't stop there--it was being outflanked by Zilog, a company started by former Intel engineers. Zilog had quickly ...