View Full Version : A list of support chips for the Intel 4004



Floppies_only
November 16th, 2010, 02:36 PM
Gang,

A friend of mine noted the anniversary of the release of the Intel 4004 and listed its support chips. It's a brief but interesting read:

http://wcg.livejournal.com/763461.html

Sean

Chuck(G)
November 16th, 2010, 02:54 PM
Personally, I think too much is made of the significance of the first MPUs.

People tend to treat them as if they arrived like a bolt from the blue, but in fact they were just part of an evolutionary process. From SSI circuits, complexity kept increasing. I recall that the 1-chip ALU, the 74181, was greeted with quite a bit of interest, but it was bipolar because MOS was slow-slow-slow. For more ideas on where bipolar design was going, consider the Fairchild Macrologic family.

Let's not forget that the early micros, right up through the 8080, required a fair amount of external chip support for clock generation and system bus control. It's a stretch to call the 4004, 8008, or even the 8080 a "complete" microprocessor.

A year or two after the 4004, National Semi came out with the IMP-16, a chipset that implemented a 16-bit minicomputer. Far more versatile than the 8008 or even the 8080. Yet almost no one remembers it because it wasn't a one-chip solution.

saundby
December 5th, 2010, 11:17 AM
I agree with Chuck. After all, the 8080 didn't even really start to get traction until the latter half of 1975. The IMP and PACE were both more capable, but didn't have the "mindshare". The 1801 was another good CPU that got overlooked because it was a two-chip solution: low power, with a powerful register set. (When I read the 8080 data sheets, my familiarity with the 1800 series actually made me laugh out loud when Intel called their registers "general purpose".) These chips were RISC before there was RISC.

The adulation of the 4004 baffles me, to be honest. It was a step in the process, but its contemporary impact was far less than is made out. It's only in retrospect that it can be made to appear important. It was an expensive, complex, multichip solution. I avoided it back in the day and I'd never want to build around one today unless I was feeling especially masochistic. What it did the SC/MP did better in every possible way. I'd rather base a system on an 8041 or 8048 than a 4004.

The 8008 was crippled as well. Again, why waste your time with a chip that Intel's limited packaging abilities had crippled and spread across so many packages, when you could get a nice two-chip solution like the 1801? I have to say that what I was really looking for in a processor at the time was a 12-bit processor rather than an 8-bit one. I ended up doing plenty with 8-bit chips and having a great time with them, more than I expected coming from an IBM/DEC background, but I wasn't alone among those who at the time looked at the 8008 and 8080 and said "when are you going to make a real computer chip?"

The 8085 was the first of the Intel line that I really liked, and by the time it shipped in quantity the Z-80 was already out. The 8085 finally had a single supply, no external bus expander required, and for good measure an internal clock. But other CPU manufacturers had beaten Intel to market on all of these features.

The single-chip CPU was coming, 4004 or not. There were pressures from several areas pushing toward the integration of the different functions. Witness the differences between the different early CPUs, like the F8 and 1802 and so on. Each was shaped by a different view of how the processor should be organized and used, and by each company's manufacturing abilities. What later began distorting the market was the effect of the software base. That has continued to this day, making certain processors stand out more in retrospect than they did in their own day.

Anyway, the 4004 is an ugly, ugly, ugly chip set. The instruction set is narsty as can be. It's technically interesting and it has an interesting history, but it wasn't the pinnacle of achievement in 1971-2 by any means--even the dates are mythical retro-inventions. But that's another story. ;)

Chuck(G)
December 5th, 2010, 05:19 PM
Now that I've thought about it, I'll go one farther by saying the one-chip microcomputer was not the greatest evolution of the early 1970s. In my opinion it was the MOS DRAM.

Consider a terminal that used the 8008, such as the Beehive SuperBee. What to use for memory for a screen buffer (24x80, with several pages)? Core was a possibility but was power-hungry and a bit difficult to use (after a read, core needs to be rewritten, as reading is destructive). There were other types of storage, such as magnetostrictive (too slow and too sensitive to mechanical shock). In the end, a recirculating shift register was used. 1Kbit MOS shift registers were fairly inexpensive and fast, as long as you could wait for your bit to come along.
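That "wait for your bit to come along" behavior can be sketched with a toy model (my own illustration, not from the thread; the 1 Kbit size matches the chips mentioned above, everything else is made up):

```python
# Toy model of a 1 Kbit recirculating shift-register memory. Bits stream
# past a single tap; any access has to wait until the wanted bit comes
# around the loop.
class ShiftRegisterMemory:
    def __init__(self, nbits=1024):
        self.bits = [0] * nbits
        self.pos = 0                      # bit currently at the tap

    def tick(self):
        # Advance recirculation by one bit time.
        self.pos = (self.pos + 1) % len(self.bits)

    def read(self, addr):
        # Wait for bit `addr` to reach the tap; return (bit, cycles waited).
        waited = 0
        while self.pos != addr:
            self.tick()
            waited += 1
        return self.bits[addr], waited

    def write(self, addr, value):
        _, waited = self.read(addr)       # wait for the slot to come around
        self.bits[addr] = value
        return waited

mem = ShiftRegisterMemory()
mem.write(100, 1)                 # waits 100 bit times from power-on
bit, waited = mem.read(100)       # bit 100 is still at the tap: no wait
```

On average a random access waits about half the loop (~512 bit times here), and nearly a full loop in the worst case — the price those designs paid for cheap storage.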

Remember that the original MITS Altair had 256, count 'em, 256 bytes of static RAM. The TV Typewriter used six 2101 SRAMs to provide an uppercase-only display--and they ran hot.

The 1101 SRAM and then the 1103 DRAM were the answer, followed by the 2107 (4Kx1) and then the 2116 (16Kx1)...

Memory is what made the personal computer practical.

saundby
December 7th, 2010, 12:11 AM
Memory is what made the personal computer practical.

Right on the nose.

Though I'll say that the PIA was up there in importance--but it was more a part of what made the uP successful than a stand-alone technical achievement. Using general-purpose MSI for I/O has always been my preference over LSI and VLSI, but the programmable I/O chip brought a lot more engineers, and therefore applications, to uPs than would have come otherwise. The uP, whether a single chip like the sixers or a 3-chip solution like the 8080A, had been reduced to a cookbook component for most designs. But they would have remained a curio without the sorts of apps you could get with uP + PIA (of whatever family or, often, mixed families).

At the time, I remember a lot of designers treating I/O as a black art. The programmable I/O chips were pitched as a panacea, and the design solutions offered around them brought uPs into a lot of places they wouldn't have been otherwise.

Another point on semiconductor memory's importance: it would have revolutionized the world even if we'd never developed a single-chip processor (doubtful as that may be, given the demand for microcontrollers--but imagine the 8041 as the apex there, perhaps). There were so many things to do with it that had nothing to do with tying it to a processor. Now we can hardly imagine anything else.

For those who can't imagine, take a look at a memory data/applications book from around 1969-1973. Intel was strongly targeting their memories toward use in mainframes and minicomputers, so it's almost all digital computer apps in their books, but Moto, Nat'l Semi and others had all sorts of other applications in their books. State machines galore, with no CPU.

Chuck(G)
December 7th, 2010, 09:21 AM
My memory is that the Altair 8800 initially used an 8212 for its parallel interface; my OEM Diablo 630 was driven from a 3P board that had no programmable I/O whatsoever--just ports where the direction was set manually. Serial I/O was from a TR1602-type UART, with everything set manually.

So, I tend to discount the PIA or PPI--the only one I ever used was the 8255--and it's got its own quirks that persist to this day.

On the other hand, the UART made connecting computers to each other and ultimately, the first internet access possible.

mark66j
December 7th, 2010, 11:13 AM
If you are talking about influential chips in that era, don't forget the 6502. Not so much technologically, but in the way it was sold for $25 a pop by Chuck Peddle at a trade show. This not only helped bring out the Apple I, but more importantly, I think, helped create the idea of mass-market computers in general. Also, low chip count (including a single-chip CPU and good companion chips) doesn't make a computer better, but it does usually make it cheaper.

saundby
December 7th, 2010, 11:09 PM
Well, MOS Technology definitely broke the industry's pattern of immense margins on CPUs, but that was an effect of a competitive environment rather than technology. Prices had already been falling; CPUs that had been over $300 had dropped to about $100 before the WESCON where the 6502 debuted.

I think what's at issue is the sort of after-the-fact idolization the 4004 and 8008 have gotten. Much more has been attributed to them than was actually the case at the time. In my mind, the 6800, 1801, and 8080 were the real turning point for the use of processors in small/inexpensive system design. The 4004 and 8008 were no better than many other chipsets available at the time; treating them as single-chip bolt-out-of-the-blue wonders is silly in historical context.

Integration and dropping prices were the nature of the business. Neither design had a level of integration that really stood out from other developments at the time--they were just a slightly different mix of processor components relative to other chipsets of the day, one that happened to place processing, control, and a register file on a single chip of the set, technically ticking the boxes for being a microprocessor.

Lest it be thought that I'm a 4004 basher, have a look at the 4004 article (http://wiki.vintage-computer.com/index.php/4004) I put in the wiki months ago. I'm quite familiar with it and the 8008. Both look more important in retrospect than they were contemporaneously. They were significant, but not technically dominant in any way.

When a pair of guys left Fairchild to build the semiconductor memories that Fairchild said weren't worth building, however...the Earth shook. ;)

carlsson
December 7th, 2010, 11:31 PM
To what degree do newer x86 designs (starting from the 8086/8088, I suppose) share addressing modes and instruction-set features with the 4004, and perhaps even more with the 8008 and 8080? In other words, is there any legacy or heritage from the 4004 in newer CPUs? If so, I would assume part of the idolization comes from that fact. Many of the other microprocessors you mention don't live on in modern times, at least not as obviously as the Intel line has.

Chuck(G)
December 8th, 2010, 09:20 AM
A better question would be how much x86 CPUs suffer from the 8008 design. The answer to that is, of course: considerably.

And it's not as if Intel didn't try to get away from the x86--it was, after all, mostly intended as a stopgap until the 432 (a very advanced architecture) was ready. Even the N10 project, started in 1986 and ultimately resulting in the i860, showed that Intel was desperately trying to drop the x86 notion (BillG at Microsoft thought it would revolutionize computing). They tried again with the Itanium.

No soap.

It's sort of like replacing a cart and horse with a jet fighter, only to discover a feedbag for the horse is part of the fighter's standard equipment.

Dwight Elvey
December 9th, 2010, 06:39 AM
Hi

About the only thing one can think of that was brought forward was the decimal adjust.

Still, one has to realize that the 4004 was more like the typical mainframes and minis of the time than later processors. All the I/O and memory watched the instruction stream and knew when there was action to take. The processor had nothing to do during these cycles other than read or write data to the correct register. Later uPs would explicitly send I/O addresses and data to ports (even if memory mapped).

As for the 4004, you have to realize it was a relatively specialized design meant to run calculators, and it had to fit into DRAM chip packages.

I agree about solid state memory as being a significant step. EPROMs were also a window opener. Before that, the only way to test code was a diode matrix or simulation on a mainframe. Making a mask ROM had a high entry fee. I doubt any of the PCs would have even started without the ability to develop code 'on the cheap'.

Dwight
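The decimal adjust Dwight mentions survives as the DAA instruction in the 8080 and x86 lines. A minimal sketch of the correction step, my own illustration rather than anything from the thread (real hardware also consults the carry and auxiliary-carry flags produced by the addition, which are plain parameters here):

```python
# Sketch of an 8080/x86-style DAA (decimal adjust after addition): two
# packed-BCD bytes are added in binary, then each 4-bit digit is nudged
# back into the 0-9 range.
def daa(acc, carry=False, aux_carry=False):
    if (acc & 0x0F) > 9 or aux_carry:   # low digit ran past 9
        acc += 0x06
    if (acc >> 4) > 9 or carry:         # high digit ran past 9
        acc += 0x60
        carry = True
    return acc & 0xFF, carry

raw = 0x38 + 0x45            # binary sum is 0x7D -- not valid BCD
adjusted, cy = daa(raw)      # 0x83: the BCD answer to 38 + 45
```

The same trick is why calculator-oriented chips like the 4004 cared about BCD in the first place: digits stay human-readable in memory.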

saundby
December 9th, 2010, 07:47 PM
I agree about solid state memory as being a significant step. EPROMs were also a window opener. Before that, the only way to test code was a diode matrix or simulation on a mainframe. Making a mask ROM had a high entry fee. I doubt any of the PCs would have even started without the ability to develop code 'on the cheap'.
Dwight

Memory does seem to be it, doesn't it?

And yeah, EROM/EPROM/EEPROM were really important to development.

A funny thought, though. In an alternate history without user programmable nonvolatile memories, does the 1802 (usable for ROMless development thanks to LOAD mode) become a hobbyist favorite over the eighters and sixers in the 1976-78 time frame? :D

...Actually, I expect not. Instead, I'd think a 1K masked monitor ROM would have become a "standard" part for the other processors until user-programmable memories appeared. The 1802 was interesting, but RCA just let too many opportunities slide to ever become a top-tier uP supplier.

Chuck(G)
December 9th, 2010, 09:12 PM
The 1802 was awkward to program (everything funneled through the D register) and a bit spare in the instruction set. The LOAD mode was interesting, but could be emulated on other CPUs (e.g., the Altair and IMSAI machines got along fine without it, and when cheap fusible-link PROMs came along, things got even simpler).

Another thing that hampered the 1802 was the lack of a comprehensive support chip set.

Where the 1802 was useful was where low-power remote telemetry applications demanded battery power supplies. Since it was a fully static CPU, you could turn off the clock and the power consumption would drop to microwatts.

RCA wasn't known for their business acumen outside of radio and TV. After "General" Sarnoff turned over the firm to his kid, Robert, it was all downhill.

They shed their profitable mainframe business (remember the "Spectrola"--the Spectra 70?), got out of microelectronics, and instead went into car rental and TV dinners, both of which turned out to be disasters.

Reminds me a lot of An Wang and his kid Fred, who essentially destroyed Wang Labs.

carlsson
December 10th, 2010, 01:25 AM
I have been eyeing the 1802 for at least five years. I printed Tom Pittman's short introductory course and even managed to squeeze out a few instructions of my own, but didn't really get into it. I found a reference claiming that 1802 programs are usually more compact than the exact same program written for the 8080, 6800, 6502, and Z8, but also that the 1802 uses more clock cycles than the rest.

Chuck(G)
December 10th, 2010, 10:41 AM
I'd seen the claim that, since most of the 1802 instructions are one byte, programs are more compact, but I took the claim with a grain of salt. The one-byte claim comes about because most instructions are one-address, with that address being a 4-bit quantity specifying a single register; the D register as an operand is implied. There are no 16-bit transfer instructions, so address registers have to be loaded 8 bits at a time.

I find the "smaller" claim suspicious because I've never run into a real large-scale application for the 1802--a PL/I compiler, a relational database manager, an accounts-receivable package, and so on. Implementing the data mechanisms to support such things is pretty cumbersome on an 1802. For example, write the code for a C-type calling sequence with on-stack local variables in the routine being called. It gets very messy.
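The 8-bits-at-a-time point can be made concrete with a rough byte-count comparison (my own sketch, using the standard CDP1802 and 8080 mnemonics; sizes are from the published encodings):

```python
# Byte counts for loading a 16-bit address register with a constant.
# On the 1802 there is no 16-bit immediate load, so everything funnels
# through the one-byte-wide D register, a half at a time.
cdp1802 = [
    ("LDI hi", 2),   # D <- high byte  (opcode + immediate)
    ("PHI Rn", 1),   # Rn.1 <- D
    ("LDI lo", 2),   # D <- low byte
    ("PLO Rn", 1),   # Rn.0 <- D
]

# 8080: one three-byte instruction does the whole job.
i8080 = [("LXI rp,nnnn", 3)]

bytes_1802 = sum(size for _, size in cdp1802)   # 6 bytes
bytes_8080 = sum(size for _, size in i8080)     # 3 bytes
```

Twice the bytes for one of the most common operations in address-heavy code, which is part of why the "1802 programs are smaller" claim deserves the grain of salt.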

It might be true for simple operations, but that's not characteristic of serious applications.

saundby
December 15th, 2010, 12:20 AM
If you try to program an 1802 like another processor, it'll be a disaster. If you approach it with the right mindset it will have tighter code than other chips of the era.

It's up to the same tasks as any other processor given the limited memory space of the time. With its register set, it handles sophisticated software well and efficiently. Personally I'd put it above an 8080A or 68A00, and on a par with or slightly better than an 8085 (comparable I/O capabilities, better code density and register set on the 1802, higher speeds on later 8085s); the Z-80 beats it by a nose and the Z-80A clearly trumps it--without considering the 8080 codebase, just on technical merits.

That said, the system implementations that RCA came out with for general purpose computing were pretty limited, giving the chip the appearance of being limited.

But it was a good, solid chip in 1976. ;)

Chuck(G)
December 15th, 2010, 06:36 AM
...and we're back to the "what's so special about a single-chip CPU" question again. By 1976, we had several 16-bit CPUs. Weren't the MicroNova and the Fairchild 9440 a reality in '76? You could run whole database and business-management packages on those.

So are there any benchmarks, say, comparing floating-point BASIC performance on the 1802 with other CPUs of the time?

As far as 16 16-bit registers go, I think the GI CP1600 precedes the 1802 by a few months--and it was a 16-bit CPU.

MikeS
December 15th, 2010, 06:45 AM
...As far as 16 16-bit registers go, I think the GI CP1600 precedes the 1802 by a few months--and it was a 16-bit CPU.

GI is a name that's largely ignored these days, but they were very much a presence way back then with their CPUs, PICs, UARTs, EEROMs, etc., not to mention their game, sound and speech chips, and some of their grandchildren are still around today...

Chuck(G)
December 15th, 2010, 07:19 AM
GI is a name that's largely ignored these days, but they were very much a presence way back then with their CPUs, PICs, UARTs, EEROMs, etc., not to mention their game, sound and speech chips, and some of their grandchildren are still around today...

...amongst which is the PIC MCU. Dates back to 1976 or so--the instruction set is clearly recognizable.

dave_m
December 15th, 2010, 09:35 AM
...amongst which is the PIC MCU. Dates back to 1976 or so--the instruction set is clearly recognizable.

Yes, I remember General Instruments, some good stuff. Are they still around?

Chuck(G)
December 15th, 2010, 09:53 AM
No, it was sliced and diced. The microelectronics division was spun off as Microchip in the 80s; the power semiconductor business became part of Vishay; the cable and antenna business became part of Motorola.

Donald Rumsfeld (yes, him) was CEO between 1990 and 1993.

saundby
December 20th, 2010, 01:47 PM
..and we're back to the "what's so special about a single-chip CPU" question again. By 1976, we had several 16-bit CPUs. Weren't the MicroNova and Fairchild 9440 a reality in 76? You could do whole database and business management software on those.

MicroNOVA (mN601) came in '77, but MicroFlame (9440) was out in '76. I've heard rumors of MicroFlame samples in '75. And, of course, the IMP-16 was first.

They're all interesting as technical achievements, to my mind. As practical processors, though, they all suffered from poor performance for the overall system cost. They were all dog-slow, even compared to other processors of their time, 16 bits or no. Plus you had to pay to put the full 16-bit bus into the system. At the time, that often meant building up the bus with hex (6-bit) parts, since the octal (8-bit) MSI chips were, if available at all, prohibitively expensive compared to the older hex and quad parts.

At any rate, if you were putting money down on apps that required a larger data space or a wider word, the price differential between a micro and a mini wasn't enough to argue for the microcomputer. Minis were quite cheap for the power, especially once the VAX had moved into the top of the market with 32 bits.

16-bit micros weren't cheap enough to compete with the 8-bitters in opening up new markets at the bottom end, and weren't cheap enough to erode the market for 16-bit minis, either.

And the early ones were really slow--partly because the process technology they were implemented in was slow, and partly because other compromises made to get them on a single die hurt their performance. Not so much compared to an 8008, but they were competing with the 8080, 6800, 6502, etc., not the 8008, which was already old and slow by the time they came out.


So are there any benchmarks, say, comparing floating-point BASIC performance on the 1802 with other CPUs of the time?

As far as 16-16 bit registers go, I think the GI CP1600 precedes the 1802 by a few months--and it was a 16-bit CPU.

The point I was trying to make with the 1802 relates to its built-in DMA. Basically I was positing: had we gotten the microprocessor without the simultaneous nonvolatile-memory revolution, would microprocessors with a DMA feature like the 1802's--which lets you load memory and run with a bit of simple logic--have had a marked advantage over processors that are really only practical with a body of nonvolatile memory?

I wasn't trying to make out the 1802 as the be-all and end-all processor. I was referring to its LOAD mode DMA. With LOAD mode you can make the 1802 boot off practically anything--paper tape, cassette tape, artfully arranged bricks, you name it. This is part of what makes it a good chip for things like space probes, of course. The question was whether we would have seen more of this in a world without the EROM/PROM revolution of the mid 70s.
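The LOAD-mode idea can be sketched as a toy model (entirely my own illustration; on the real chip each byte arrives via a DMA-IN cycle with register R0 as the auto-incrementing pointer):

```python
# Toy model of the 1802 LOAD-mode boot: with the CPU held in LOAD, each
# DMA-IN request latches one byte from the input bus into memory at an
# auto-incrementing address, so any dumb byte source -- paper tape,
# cassette, front-panel switches -- can fill RAM before a single
# instruction is ever fetched.
def load_mode_boot(byte_source, memory, start=0):
    addr = start                     # stands in for R0, the DMA pointer
    for byte in byte_source:
        memory[addr] = byte & 0xFF   # one simulated DMA-IN cycle
        addr += 1
    return addr                      # CPU could now be switched to RUN

ram = [0] * 256
tape = [0x7A, 0x7B, 0x30, 0x00]      # arbitrary example bytes
end = load_mode_boot(tape, ram)
```

No monitor ROM, no bootstrap PROM--just a byte source and a strobe, which is the whole appeal in an EPROM-less world.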

As to 1802 performance, I've already gone into that. By taking advantage of the voltage tolerance and high clock rates of the CMOS process, you could outpace anything up to (and often including) a Z-80A with an 1802 system designed for performance. Most implementations of the 1802 traded away performance for simple video interfacing by running the system clock at half the colorburst frequency (about 1.79MHz) and no faster.

Of course, if you wanted real compute performance, you didn't go to a micro if you could afford better. You bought time on a PDP, an IBM, an Eclipse--or a VAX once they appeared.

Getting back to the memory/microprocessor discussion: personally, I'm convinced the real revolution was in memory. From the perspective of the foundries, the microprocessor is practically nothing more than a device designed to sell memory--which is why the low-end, low-cost designs dominated. Once market demand for semiconductor memory in mainframes and minis was growing more slowly than production capacity, the micro opened up new markets at the bottom end, where the costs of a mini weren't justifiable and the advantages of computers were often unknown.

So we got microprocessors because semiconductor memory had already caused a revolution. And 8 bits dominated the new low-end market specifically because it was a lot cheaper to build a reasonably competent system around than 12 or 16 bits. SuperCalc and dBase II (or Spock, if you want to go a little earlier) were enough for that market segment.

What's interesting about the 4004 is how it's perceived today. It looks like more of a technical achievement than it really was. It has a strange, crabbed design that looks like the result of being really out on the cutting edge. In fact, it was a design shaped by the constraints of Intel's packaging abilities and inexperience at designing processors.

The much more sophisticated designs--based on long experience with processor design, and with better packaging options from chip houses that had a broader range of equipment than Intel--don't have that unfinished, odd Victorian appearance, which somehow makes them look less "cutting edge," no matter how technically amazing it was to put a full 16-bit design on a single die (and disregarding the fact that the price/performance ratio of a system built around one turned out to be a poor business investment for the manufacturer).