
A new article about x86 processors

vol.litwr

I'd like to share a link to my translation from Russian of an article about x86 - https://litwr.livejournal.com/436.html
It is part of a series - https://litwr.livejournal.com/tag/processor (the original is https://habr.com/post/410591/)
It would be interesting to get some comments and additions. I am especially interested in why the 80186 was used so rarely. Was there any reason for this? I am not a native English speaker, so please excuse my somewhat imperfect English. Indeed, I will be glad to receive any corrections of grammar or style. Thank you.
 
The original 80186 had many incompatibilities with the IBM PC architecture. Because it integrated things like the Programmable Interrupt Controller (PIC), Intel made decisions that broke PC BIOS compatibility, such as fixing the hardware IRQ mappings to vectors that collided with IBM's BIOS interrupt numbers. It was mainly used as an embedded processor.
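
To make the conflict concrete, here is a rough map of where the original 80186's fixed internal vectors land versus what the IBM BIOS put on the same vector numbers. This is a hedged sketch: the assignments are recalled from the 80186 datasheet and the 5150 technical reference, so verify them before relying on this.

Code:
/* Illustrative only: 80186 fixed internal vector types vs. IBM BIOS use. */
struct vec_conflict {
    unsigned char type;         /* interrupt vector number              */
    const char   *i186_source;  /* fixed 80186 internal source          */
    const char   *ibm_bios_use; /* what IBM assigned to the same vector */
};

static const struct vec_conflict conflicts[] = {
    { 0x08, "timer 0 (fixed)", "IRQ0 system timer tick" },
    { 0x0A, "DMA channel 0",   "IRQ2"                   },
    { 0x0B, "DMA channel 1",   "IRQ3 (COM2)"            },
    { 0x12, "timer 1 (fixed)", "get memory size"        },
    { 0x13, "timer 2 (fixed)", "disk services"          },
};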
 
I am especially interested in why the 80186 was used so rarely. Was there any reason for this?
The 80186 (and 80188) contain an integrated PIC, timer, and DMA controller, but these aren't fully compatible with the chips used on the IBM PC mobo.
Which means 80186/80188 computers can't be fully compatible with the IBM PC.
Anyway, these chips weren't really rare; they were pretty common in various controllers and similar devices, only rare as a PC CPU.
 
Intel was, at the time, competing with itself. The 8086/8088 was already an established product with a full line of (mostly 8-bit) peripheral support chips. If you'd done an 8085 design, the 8088 wasn't terribly different. (The 8257/37, 8259, and 8253/54 were designed for the 8080 world.)

At the time, Intel was also advertising its "Micro Mainframe" iAPX432 chipset as the next step-up in power. That project ended disastrously.
At the same time Intel was working on the 80186, the 80286 project was in the works and Intel made sure that its customers were aware of it. Many chose to wait for the 80286 as it had significant features over the 8086. With the 80186, it was difficult to see where the chip fit into the overall picture--was it intended as a microcontroller or a microprocessor? When IBM went from the 8088 to the 80286, that pretty much locked out the 80186 from the PC picture, although it continued to be used quite widely in embedded designs.
 
I am especially interested in why the 80186 was used so rarely. Was there any reason for this?

More factors at play: The 80186 was released more or less simultaneously with the 80286, so there was no prolonged period of time in which the 186 represented the pinnacle of Intel's technology. It had the exact same pin package as the 80286, so it had no packaging advantage. Maybe it was cheaper than the 80286 (the 80188 certainly would have been)...that could have been its only impetus for use in computers.
 
At the time the 80186 was released (1982), it wasn't clear that the IBM PC would make any sort of market impact, so at least in the beginning, PC compatibility wasn't a consideration for most manufacturers. Nobody yet knew the world was doomed to run MS-DOS for years, so it wasn't even clear that the incompatibilities would matter (based on earlier experience with CP/M, machine-specific OEM ports of an OS were the norm).

Just trying to present a balanced picture from one who was there. We worked with pre-release steppings of the 186, so I remember the discussions pretty well.
 
For the millionth time: the 80188/80186 can be incorporated into a fully IBM-compatible design. No, you can't utilize most of the onboard peripherals, so you can't have your cake and eat it too. There were a limited number of units that incorporated them rather successfully.
 
And the natural question would be "what's the point, then?" True, you get some speed over an 8 MHz 8086 and a slightly expanded instruction set, but that's scarcely a compelling reason to go with a design that will cost significantly more.
 
Didn't the V20/V30 offer all the instruction-set and clock efficiencies of the '18x, without the compatibility problems? Research Machines made the RM Nimbus with it for quite a while, along with their own versions of MS-DOS and Windows for it.
 
Yes, it did--and boasted an 8080 instruction emulation feature as well. The V40/50 are even closer to a PC with their integrated peripherals, something that didn't come about in the 186 line until the 186EB. When the 80186 debuted, it was sole-sourced, whereas the 8086/88 had several second sources (witness the number of 5150s that shipped with someone else's 8088).

There's also the danger that some bit of malicious or careless code will stumble upon your Control Block and play with it. I wonder how many 186 vendors just kept the control block at I/O port 0ff00h...
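
For anyone curious what moving the control block actually involves, here's a minimal sketch in C. The bit layout of the relocation register is from the 80186 datasheet as I remember it, and outw() is just a placeholder for whatever port-output primitive your toolchain provides, so treat this as illustrative, not gospel:

Code:
/* The 256-byte Peripheral Control Block (PCB) occupies I/O ports
 * FF00h-FFFFh after reset; the relocation register sits at offset FEh
 * within it, i.e. port FFFEh by default.  Bit 12 selects memory (1)
 * vs. I/O (0) space; bits 11:0 take bits 19:8 of the new base. */
extern void outw(unsigned port, unsigned value);  /* placeholder port write */

#define PCB_RELOC_PORT 0xFFFEu   /* power-on default location */

void relocate_pcb_to_memory(unsigned long base)  /* base: 256-byte aligned */
{
    unsigned reloc = 0x1000u | (unsigned)((base >> 8) & 0x0FFFu);
    outw(PCB_RELOC_PORT, reloc); /* PCB is now memory-mapped at 'base' */
}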
 
Didn't the V20/V30 offer all the instruction-set and clock efficiencies of the '18x, without the compatibility problems? Research Machines made the RM Nimbus with it for quite a while, along with their own versions of MS-DOS and Windows for it.

And people still can't grasp it. There were no compatibility problems. Regardless of whether there was a point in using it or not, it wasn't an incompatible chip. The Nimbus may or may not have utilized the 186's onboard peripherals. Most of the incompatibilities had to do with the video subsystem, and other issues. The Tandy 2000, for instance, had all the same Intel peripheral chips that a 5160 mono had, but different video ICs. That was usually the deal breaker.
The TI Pro went as far as to use the 6545, compatible with the 6845, but configured it differently; the resolution was different, etc. Hell, the Ampro Little Board with the 80186 could boot stock MS-DOS 3, but was still largely not a compatible. That's my understanding, anyway.
 
And people still can't grasp it. There were no compatibility problems.

There were plenty of compatibility problems with the PC architecture - specifically, BIOS conventions. The 8086/8088 had a single maskable interrupt line and relied on external chips like the PIC to place the vector index on the bus during an interrupt acknowledge cycle. The vector base could be set in each PIC, so IRQs could theoretically be mapped to any vector. While the 80186 had the same functionality, it also implemented the internal peripherals' IRQ lines with fixed assignments that conflicted with what IBM chose in their PC BIOS for other software services.
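
To illustrate the difference in C (Turbo C style port I/O; the init values are from the 5150 BIOS listing as I recall them, so double-check before relying on this):

Code:
#include <dos.h>   /* outportb() */

/* On a PC, IRQ0-7 land on INT 08h-0Fh only because the BIOS programs
 * the 8259's vector base that way at power-on.  It is a choice. */
void init_pc_pic(void)
{
    outportb(0x20, 0x13);  /* ICW1: edge-triggered, single 8259, ICW4 needed */
    outportb(0x21, 0x08);  /* ICW2: vector base 08h, movable in principle    */
    outportb(0x21, 0x09);  /* ICW4: 8086 mode, buffered                      */
}
/* The original 80186 offers no such register for its internal sources:
 * timer 0 is wired to type 08h, DMA0/DMA1 to 0Ah/0Bh, INT0-INT3 to
 * 0Ch-0Fh, and timers 1 and 2 to 12h/13h.  Nothing can move them. */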

One of the worst examples: in IBM's PC BIOS, interrupts 12h and 13h are 'get memory size' and 'BIOS disk services', respectively. On the original 80186, PIT timer 1 and timer 2 routed to those vectors and could not be moved. So if you planned on using those internal timers, and many 'work-alike' machines did for things like memory refresh, you had to put BIOS disk services (INT 13h) somewhere else. Tandy chose INT 56h for BIOS disk services and modified their DOS 2 OEM distribution to use that instead of INT 13h. That's a pretty major change required in stock DOS, well beyond a typical vendor's IO.SYS customization.
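
In code terms, a program (or DOS itself) that wanted the BIOS disk service on such a machine would have to do something like this sketch (Turbo C style int86x() from dos.h; the 56h vector comes from the Tandy story above, and the register usage mirrors the standard INT 13h read call, so this is purely illustrative):

Code:
#include <dos.h>   /* int86x(), segread(), union REGS, FP_SEG/FP_OFF */

/* Read one sector, as you would via INT 13h AH=02h on an IBM PC,
 * but through the relocated Tandy 2000 vector instead. */
int read_first_sector(unsigned char drive, void far *buf)
{
    union REGS   r;
    struct SREGS s;

    segread(&s);               /* start from the current segment regs */
    r.h.ah = 0x02;             /* BIOS: read sectors                  */
    r.h.al = 1;                /* sector count                        */
    r.x.cx = 0x0001;           /* cylinder 0, sector 1                */
    r.h.dh = 0;                /* head 0                              */
    r.h.dl = drive;
    s.es   = FP_SEG(buf);      /* ES:BX -> destination buffer         */
    r.x.bx = FP_OFF(buf);
    int86x(0x56, &r, &r, &s);  /* would be 0x13 on a true compatible  */
    return r.x.cflag ? -1 : 0; /* carry flag set on error             */
}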

That's just one example of how extremely difficult it is to implement PC BIOS compatibility on an original 80186. Then there are hardware incompatibilities: all the on-board peripherals' register files are lumped into a Peripheral Control Block (PCB), and you can't separate the PIC/DMA/PIT register files out to non-contiguous locations like in the original PC hardware. It's a mess...
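
Spelled out, the address-map problem looks something like this (PC port assignments per the 5150 technical reference; the PCB offsets are representative values from memory, so verify against the datasheet):

Code:
/* IBM PC: each peripheral's register file sits at its own, IBM-chosen,
 * non-contiguous I/O range. */
#define PC_DMA_8237  0x00    /* ports 00h-0Fh */
#define PC_PIC_8259  0x20    /* ports 20h-21h */
#define PC_PIT_8253  0x40    /* ports 40h-43h */

/* 80186: the equivalent registers all live inside one 256-byte PCB
 * (FF00h-FFFFh by default) at fixed offsets relative to each other.
 * You can move the whole block, but you cannot scatter its pieces to
 * match the PC map above. */
#define PCB_BASE     0xFF00
#define PCB_INT_CTRL (PCB_BASE + 0x20)   /* interrupt controller regs */
#define PCB_TIMERS   (PCB_BASE + 0x50)   /* timer 0/1/2 regs          */
#define PCB_DMA      (PCB_BASE + 0xC0)   /* DMA channel regs          */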

Later variations of the 80186, as Chuck pointed out, like the EB/EC, were more flexible, but the configuration register access was incompatible with the original version, requiring the bootstrap ROM code to be rewritten.

-Alan
 
My understanding from inside Intel was that MS was told not to use specific interrupts but ignored Intel's advice. It was the horse driving the cart, or maybe the other way around. In the early days, Intel and MS were not the best of friends. Nowadays they still take different steps, but it is not as obvious as it was then.
Dwight
 
So if there were so many problems, how were essentially 100% IBM compatible computers made??? You cite numerous incompatibilities IF the onboard peripherals were utilized. They didn't have to be. And that's the whole point.
 
So if there were so many problems, how were essentially 100% IBM compatible computers made??? You cite numerous incompatibilities IF the onboard peripherals were utilized. They didn't have to be. And that's the whole point.

I believe you are mistaken. And if you feel you are not, please correct the Internet for us. I cannot find an example of an 80186 stand-alone machine that was 'essentially 100% IBM compatible'. The T2K wasn't. The RM Nimbus wasn't, as Research Machines, like Tandy, had to provide heavily modified versions of DOS and Windows customized to the specific machine architecture. Can you make one by disabling everything in the PCB? Yes, Orchid seems to have done it with the PC Turbo 186. Did anyone make a ground-up machine using an 80186 that was PC compatible and not just MS-DOS compatible (like the T2K and Nimbus)? ...please let us know.

You can't boot a DOS that wasn't OEM'd by Research Machines on a Nimbus, thus it is NOT 100% IBM compatible, or even close.
 
I mean, seriously, every obscure 30+ year old computer can be easily found on the net in 2018???

Look up Computer Products United. They produced an 80186-based computer around 1989. Someone on the board has that mobo or something close. And don't forget the HP palmtops and the IBM PC Radio. The Ampro Little Board/PC used a V40, so I guess that doesn't count.
 
Thanks a lot for the help! I couldn't even imagine that Intel could produce an IBM PC incompatible chip, because I can't imagine the history of Intel without the IBM PC architecture. The article also has a chapter on the Intel 8080. So if anybody knows the reason why the 8085 was never used in a PC, it would be precious to me to learn it.

There were plenty of compatibility problems with the PC architecture - specifically, BIOS conventions. [...]

Thank you very much for this detailed information.
 
I can't imagine the history of Intel without the IBM PC architecture.
Intel's first microprocessor was from 1971; the first IBM PC, from 1981.
So there was an entire decade of Intel history without the IBM PC.
The first x86 computers weren't IBM PCs, either; they were probably those SCP machines DOS was originally made for.
 
So if anybody knows the reason why the 8085 was never used in a PC, it would be precious to me to learn it.
I'm not sure I understand...
If by "PC" you mean "IBM PC", then it's a machine based on an x86 CPU.
The 8085 is not x86.
But it's possible that the 8085 was used on some controller cards for the PC; there were plenty of various 8-bit processors used for that.
 