
Why wasn't the 386sx CPU developed as a drop-in replacement for the 286?

digger
Experienced Member
Joined: Jul 9, 2008
Messages: 395
Location: Amsterdam, the Netherlands
It is well-known that Intel developed the 386sx as a more affordable variant of the 80386 (later retroactively called the 386DX), to allow the development of lower-cost 386 systems using 16-bit components from the 286 era. Yet the 386sx has a different socket than the 286, and apparently it's not trivial to make a socket converter between the two. Correct me if I'm wrong on that, though.

So why didn't Intel release the 386sx in a form factor that would allow it to be installed in any 286 motherboard as a drop-in replacement? It would have allowed Intel to sell even more 386sx CPUs, not just to OEMs, but also as upgrade kits to 286 owners wishing to upgrade their systems.

Why introduce a new socket, which required OEMs to develop dedicated motherboards for it? Didn't that at least partially defeat the purpose? It's not like the 386sx had a wider address bus than the 286, right? Weren't they both limited to 16MB RAM (24 bits)?
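
To put quick numbers on that 24-bit figure (just illustrative arithmetic):

```python
# Both the 286 and the 386sx have a 24-bit physical address bus,
# so both top out at the same amount of addressable memory:
address_lines = 24
max_bytes = 2 ** address_lines
print(max_bytes)             # 16777216 bytes
print(max_bytes // 2 ** 20)  # 16 MB
```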

So what gives? Why a new incompatible socket? And why weren't there many common Overdrive-like 386sx upgrade kits made available back in those days? The ability to future-proof their investments by adding compatibility with 32-bit software through a simple CPU swap (and perhaps a BIOS update) would likely have been an attractive option for many 286 owners.
 
Quite simple: the 386sx needs at least 20 more pins than the 286, so there's no way to use the same socket. That's also why drop-in 386sx upgrades for 286 sockets are rather complex.
 
Most of those extra pins on the 386sx are additional Vcc and ground pins, not signals. There was a thread on this topic fairly recently, and while my memory is a bit fuzzy, the long and short of it is that the SX is *almost but not quite* 100% electrically compatible (if you used an interposer between the different footprints), but the behavior of a couple of the signals is different enough to need some glue logic if the chipset doesn't know about it. The SX also has no built-in cache control signals.

There’s a picture out there of a prototype 80386sx in PGA format; Intel demoed the SX in this format, but it’s not clear whether they ever intended to sell it as a pin-compatible upgrade part and changed their mind.
 
I see. Thanks.

This site about the Kingston SX Now! CPU upgrade modules also provides some information relating to this topic. Apparently, the 286 CPU existed in more than one form factor as well, so a one-size-fits-all drop-in replacement wouldn't have been possible anyway.

Any idea what those 20 extra pins in the 386sx socket were for, other than an external CPU cache?

I tried one of these in an Intel Multibus 286/10 board, and while it ran, there was something different about the way it handled interrupts (can't remember the detail now), such that it hung every time an IRQ was used by an external device. (I do remember getting it to complete by using a wire to short out between the I/O card and the CPU.)

It was a shame; even without the cache enabled, it ran CUTLASS code faster than a vanilla 286/10A.
 
@Eudimorphodon That's a very clear explanation. Thanks! :)

@Chuck(G) Wow, with so many different physical variants of the 286, I guess it made perfect sense for Intel to "start afresh" with a new socket for the 386sx. If anything, that must have brought more sanity to the industry.

Considering how rare those aftermarket CPU upgrades such as those from Kingston are these days, I wonder how much effort it would be to develop a drop-in CPU upgrade module for such older vintage 286 and 386 motherboards, using an FPGA with the ao486 core (or at least the CPU part of it, since ao486 is an entire SoC). But that's a topic that deserves a thread of its own. Perhaps such a project already exists somewhere on this forum? Or elsewhere on the Internet?
 
Now you guys have me wanting to figure out how to build a thing to put a 386sx into my PS/2 Model 25-286.
 
I think the 386 prototypes were tested by being inserted into 286 motherboards and run in 16-bit mode. The need for a different BIOS and speed issues probably precluded making a shared 286/386SX socket. Clock multiplying was a necessary part of most of the 386SX upgrades that plugged into a 286 socket; otherwise the 386SX would be stuck at a very low clock speed.
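
The clock-multiplying point can be made concrete with illustrative numbers (the MHz figures below are assumptions for the sake of example, not from any specific board):

```python
# Without a multiplier, a 386SX dropped into a 286 socket is stuck
# at whatever clock the host board supplies; upgrade modules
# typically doubled it on-module.
host_bus_mhz = 12.5          # hypothetical 286 board clock
multiplier = 2               # on-module clock doubling
sx_core_mhz = host_bus_mhz * multiplier
print(sx_core_mhz)           # 25.0
```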
 
Note that many 386 BIOSes have code to emulate the 80286 LOADALL variant--the instructions of the same name weren't the same between the 80286 and 80386.
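
As a rough illustration of what that emulation involves (a toy sketch, not real BIOS code; the opcode encodings are the documented ones, but the `classify` helper is purely hypothetical):

```python
# A 386-era BIOS could hook the invalid-opcode handler: the 286
# LOADALL encoding (0F 05) faults on a 386, and the handler then
# rebuilds equivalent state using the 386's own LOADALL (0F 07),
# which reads its state block from ES:EDI instead of the 286's
# fixed physical address 0x800.
LOADALL_286 = bytes([0x0F, 0x05])   # undocumented 286 instruction
LOADALL_386 = bytes([0x0F, 0x07])   # 386 counterpart, different encoding

def classify(opcode: bytes) -> str:
    """Name the LOADALL variant, if any, for a two-byte opcode."""
    if opcode == LOADALL_286:
        return "286 LOADALL (state block at physical 0x800)"
    if opcode == LOADALL_386:
        return "386 LOADALL (state block at ES:EDI)"
    return "not a LOADALL encoding"

print(classify(bytes([0x0F, 0x05])))
```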
 

I had a Packard Bell 286 years ago that I really liked to play with. Back in the late 90s, I found a BIOS upgrade chip for the mobo which let me use larger ATA HDs, like maybe 455MB or such. I went further with an Evergreen 486 drop-in, and it worked, but the setup was doggy in performance. Something eventually went tango-uniform on the mobo, but I still have that BIOS chip and the Evergreen 486. Maybe I'll get around to trying it again, but I can't remember the mobo specs, so that BIOS chip is probably useless.
 
Something a lot simpler, from the same maker, is shown [here].

Would that work in a Model 25? It specifically calls out 50/60/AT in the silkscreen and has what I assume to be a power connector, which the model 25 does not have to spare.
 
I expect that to be a suck-it-and-see thing. For example, maybe some bad behaviour is caused by the 25's POST and BIOS executing faster.
 
As to "why?", the answer is simple. The 80386SX was developed after the 80386DX debuted. The idea was to reduce the pin count and bus width to save money and power while using basically the same core as the DX. I doubt that there was any serious consideration given to using it as a drop-in replacement for an older CPU. Although there were PGA versions of the SX, the standard package was QFP, meant to be soldered down to the PCB, not socketed.
 
The 386 Hardware Reference Manual indicates the idea of having a 386SX was there from the beginning. Multiple chapters explain how to run the 386 with completely 16-bit interfaces and, indeed, almost plug it into an existing 286 design.

Intel couldn't build the 386 in sufficient volume for several years to follow their usual method of having a cut-down version of the new processor fill out the high-speed budget lineup instead of improving an older CPU. By the time Intel could get 386SXs to market, AMD and Harris had fast 286 designs just waiting to fill high-speed 16-bit motherboards at much lower prices. A shared 286/386SX socket might have cost Intel sales. The 386SX was $200, and the extra memory to take advantage of the V86 mode was another $100 per megabyte. A 20 MHz 286 at $80 looks like a much better value.
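
The price argument works out like this (CPU and memory figures as quoted above; the one-extra-megabyte assumption is mine):

```python
# Back-of-the-envelope version of the 386SX-vs-fast-286 price gap,
# in then-current dollars:
sx_cpu = 200          # 386SX CPU
ram_per_mb = 100      # extra memory to exploit V86 mode
extra_mb = 1          # assume one extra megabyte
fast_286 = 80         # 20 MHz 286 from AMD/Harris

sx_total = sx_cpu + ram_per_mb * extra_mb
print(sx_total, fast_286)   # 300 vs. 80
```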
 
Add to that, that initially, very few people were running the 386 in 32-bit mode--many used V86 mode at best. So there was no compelling reason for a 386, particularly when typical system memory was a couple of megabytes at best. Running 8086 code, which CPU is faster--a 20MHz 80C286 or a 20MHz 80386SX? I seem to remember that it was a pretty close contest.
 