
View Full Version : Assmebly programming



Darshevo
September 3rd, 2009, 09:59 PM
At one point in my life I was a fairly accomplished Transact-SQL programmer. Later I spent a third of my day within the confines of Visual Studio and spoke VB as fluently as English. Later it was ASP/ASP.NET and finally Java. Now, some years later, after lying under one too many broken cars and swinging one too many framing hammers, I am thinking I want to get back into it. Not as a professional again; my desk life is behind me. I am far too restless to sit still that long now. I picked up a C++ book a few days ago and whipped up some basic game code (guess my number). While I certainly am not going to sit down and write the front end for Halo 37 anytime in the near future, I just don't see any challenge in it. C++ was my last high-level language frontier. I'd used just about everything else at some point or another.

I think maybe I want to learn assembly. I tried many years ago (back in the C64 days) and couldn't seem to grasp it, maybe because I was only 8 or 9 at the time. Sadly, every time I have tried to pick up ASM since then, I seem to hit a mental block.

Woohoo! My challenge :)

Which brings me to my question (finally): what flavor of asm should I go to work on? And how similar are they? I have a C64 and a 128, an Amiga, a 5150 I could get running easily enough, my 286 (my primary vintage machine), my daily user P4, etc... or should I go out on a limb and learn something more obscure like the Dreamcast or NES?

And once I do commit to an asm language, how similar are they from system to system? If I chose an old C=, would it just be a matter of syntax to apply what I learned to microcontroller programming or the 5150 at some point?

-Lance

Chuck(G)
September 3rd, 2009, 10:54 PM
I'm probably not the one to advise a beginner on this, since I've spent most of my professional life working in assembly of one kind or another, but here goes...

(Yeah, I know, folks call it "assembler", but it's properly "assembly language" or just "assembly". I once heard someone say that they programmed "BAL" on a 360/91, but that's another quibble.)

Assembly really isn't a language in the same way that, say, C or BASIC is--it's simply a way to write out machine instructions using text, allowing the assembler (not "compiler") to do bookkeeping for you. Most assemblers have macro processor facilities, which makes things less tedious, but at the root of it all, it's machine instructions.

So, to start with, you need to have an understanding of the way your selected processor works; that is, what the instructions do. After that, differences in assembly dialects are mostly matters of style.

Each instruction has a numeric value and operand encoding which is what the hardware understands and a mnemonic and operand syntax convention for the programmer to understand.

So, in 8080 assembly language, an unconditional transfer of control is called a "jump" and takes the address of the place to transfer control to as an operand. In machine language, if I want to write an instruction to jump to address 1234 (hex), the machine encoding would be C3 34 12 (in hex). In assembly language I can assign a name to location 1234--say, "THERE"--and write the following for the assembler:
JMP THERE
...which would result in the same code being generated.
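To make the bookkeeping concrete, here is a small Python sketch (an editor's illustration, not part of the original post) of what the assembler emits for that jump: the 8080 JMP opcode byte C3, followed by the 16-bit target address with the low byte first, since the 8080 stores addresses little-endian.

```python
# Sketch of the assembler's job for the 8080 JMP example above:
# one opcode byte (C3), then the 16-bit target, low byte first.
def encode_jmp(target: int) -> bytes:
    """Encode an 8080 unconditional JMP to `target` (0..0xFFFF)."""
    return bytes([0xC3, target & 0xFF, (target >> 8) & 0xFF])

# JMP THERE, with THERE assigned the value 1234h:
print(encode_jmp(0x1234).hex(" "))  # c3 34 12
```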

So your question actually becomes "which machine architecture do I want to learn?" because once you learn one assembly language dialect, they resemble each other quite a bit. Sort of like moving between Spanish and Portuguese.

The 8 bit machines are fairly easy; the 6800 or 6502 aren't too complex. The 8080 is another one that I'd recommend before learning the Z80, simply because there are fewer instructions (the Z80 will run 8080 code). I'd steer clear of the x86 processors to start with because the assemblers there try to be "smart", so there's not a one-to-one correspondence between a mnemonic and a machine instruction and that can be confusing. The 68000 and 6809 CPUs are very regular and easy to learn.

For whatever it's worth....

carlsson
September 3rd, 2009, 11:15 PM
Assembly or machine language depends on the CPU in question:

Z80 - found in TRS-80, Kaypro, ZX Spectrum, MSX, N8VEM and tons of other applications. An 8-bit CPU with plenty of development tools, documentation and active programmers.

X86 family - what you'll find in a PC compatible. Your 5150 has an 8088, your 286 an 80286. They differ in the size of the address bus and somewhat in instruction set. Apparently these are also very common (duh, even modern PC CPUs use an x86 instruction set). The x86 is the successor to the Intel 8080, which in its turn is a close cousin of the Z80 mentioned above. People who know one CPU can usually learn the other quite easily. For a beginner, though, the design may seem a bit arbitrary in places: which instruction does what with which register or memory address.

6502 and variants: You will find this one in the C64 and C128. Even the NES has a slightly modified variant of the 6502. It has very few registers unless you count the whole first 256 bytes of RAM as registers too. This is the CPU I know best. I don't know if it is more symmetrical than the Z80, but just like it you will find lots of resources on the Internet.

68000: This is Motorola's 16-bit line, contemporary with Intel's X86. It is what drives the Amiga, Atari ST, early Macintosh, Sega Genesis/Megadrive and lots more. The 68000 (also known as the 68K) is said to be a perfect beginner's CPU due to its orthogonal design: eight 32-bit address registers, eight 32-bit data registers, and so many instructions that sometimes you may have to consider which one to choose instead of how to solve a problem using the fewest instructions.

Dreamcast on the other hand uses a much newer SH4 RISC processor. I don't know how common it is to find documentation, but I get a feeling with newer CPUs you'd rather use a higher level compiler than edit your machine code by hand.

There are plenty of CPUs I intentionally left out from this list. Not because they would be insignificant, but because relative to the number of users they would not have the same support in instruction books, online docs and user groups.

As to your last question, I would say that while you would understand the concept of machine language, the exact instruction set and registers differ a lot between CPU families. You could probably pick up the basics of a foreign CPU, but you would have to closely study the list of instructions and how they interact in terms of processor flags, register and memory use in order to learn something different. I would make an analogy to human languages: once you have studied German and learned everything about its grammar, you could take a course in, say, Spanish or Finnish. Neither of those languages would remind you much of German, but you could identify the grammatical elements of a language, which probably would give you a head start over somebody who can't tell a verb from a noun, a case from a gender, an active tense from a past tense...

hargle
September 4th, 2009, 07:45 AM
As a professional programmer who still works in assembly language, I suggest starting with x86. It may not be the greatest language in the world or the easiest to understand as a beginner, but there are a LOT of other programmers out there who can help with the learning curve. The main persuasion, though, is that there are so many debugging and compiling tools available on PCs, and not so many for console systems/older computer systems. Soft-ICE alone is an invaluable debugging tool for PCs for me.

barythrin
September 4th, 2009, 07:55 AM
I've tinkered with it a few times and probably just had motivational issues after a certain point, which caused a similar mental block. On the bright side, on a PC you can pick up a book on that processor or many old hardware reference books and find the set of "commands" that computer will understand and even (in the better books) a quick example of their usage.

I hear the 6502 and 68K were much easier to write code for because of the memory addressing. The x86 is a bit odd with real-mode memory addressing; then, if you want to do current programming (this is more relevant to writing your own bootable code, not OS-specific code), you have to switch the processor into protected mode and write a memory descriptor table (it's in books, I just found it frustrating not knowing how to do it myself). This enables additional instructions as well as more than 1MB of addressable memory. The irony (but an interesting point) is that if you write your own bootable code and turn on your computer, it's just about as advanced as a 5160/5170: you have a simple instruction set and 1MB of RAM to play with.
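That real-mode oddity is easy to show with a few lines of Python (an editor's sketch, not anything from the post): a 16-bit segment register and a 16-bit offset combine into a 20-bit physical address, which is exactly where the 1MB limit comes from.

```python
# Real-mode 8086 addressing: physical = segment * 16 + offset,
# truncated to 20 bits -- hence the 1MB (2**20 byte) limit.
def real_mode_address(segment: int, offset: int) -> int:
    return ((segment << 4) + offset) & 0xFFFFF

print(hex(real_mode_address(0xB800, 0x0000)))  # 0xb8000 (CGA text buffer)
print(hex(real_mode_address(0xFFFF, 0x0010)))  # 0x0 -- wraps at the 1MB mark
```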

Fun-wise, pick any of them and it's just learning what commands/macros the machine understands. Depending on what your end goal is, if you do learn x86 then you have operating system interrupts you can use (stuff MS or Linux or others added to make life a little easier and macro common stuff you'd do), but if you REALLY took off with it you could learn 32-bit or 64-bit assembly and write current apps as well. There are still books on 32-bit assembly for Windows, etc. out there.

They all have their charm ;-) I never got that far but my original goal was to write some cross compatible code or at least port some sort of testing code for different vintage computers. You know.. do something for the community if I get off my keister.

Chuck(G)
September 4th, 2009, 08:31 AM
Also, let's not forget those platforms where assembly programming is still very much alive--embedded microcontrollers. There are many to choose from, and some, like the PIC and AVR, have very large communities. The great part is that you can actually build something that's not just another desktop computer with them. You don't have to deal with arcane peripherals or operating systems either. Development kits are easy to come by and development software runs on x86 platforms. Instruction sets tend to be very simple and straightforward.

Nowadays, there's very little good reason to program x86 assembly, unless you're doing a project on an old CPU such as an 8088. Windows (NT, 2K, XP, Vista) has virtually no assembly language code in it. I programmed a lot of x86 assembly in the 80s (I wrote my first 8086 product in late 1979) and early 90s, but could not find a compelling reason to do so after about 1995. The last new major assembly project I did was a VxD for Windows 95--and even that could have been done just as effectively in C.

BTW, Was the topic title intended as a slam against us "assmebly" programmers? :)

krebizfan
September 4th, 2009, 09:24 AM
A number of college courses use MIPS assembly. MIPS is a clean design; the introductory books are readable and sometimes free; plus there is a working simulator that is also free.

I know that nowadays a minor mistake won't wreck a $500 disk drive but the simulator still makes catching the initial bugs much simpler.

Darshevo
September 4th, 2009, 09:32 AM
Oh man, nice typo. Sheesh

Thanks to everyone for the insight. I think maybe I will look into an inexpensive microcontroller prototyping setup and start there. I am thinking it will be easier for me to learn if I can throw together little projects as I go - small successes keep me motivated. Start by turning on an LED or making one blink, move on to spinning a small DC motor maybe. Work my way up to a working calculator or something.

-Lance

Chuck(G)
September 4th, 2009, 09:43 AM
If it's a cheap and easy microcontroller that you're interested in, try the TI MSP430. Complete development kits (plugging into a USB port) start at $20. The MSP430 is a bit different from its microcontroller kin in that it's a von Neumann architecture (like conventional desktop CPUs), rather than a Harvard model (separate code and data address spaces and word sizes). Very simple, orthogonal instruction set and very easy to learn.

More info here... (http://www.ti.com/corp/docs/landing/mcu/index.htm?DCMP=MSP430&HQS=Tools+OT+ez430)

You can often find the eZ430 kits on eBay for $10 or less. There's a YouTube video showing the 430 running off of grapes (http://www.youtube.com/watch?v=ZxGZIiyyxrM).

Fallo
September 4th, 2009, 11:01 AM
Nowadays, there's very little good reason to program x86 assembly, unless you're doing a project on an old CPU such as an 8088. Windows (NT, 2K, XP, Vista) has virtually no assembly language code in it. I programmed a lot of x86 assembly in the 80s (I wrote my first 8086 product in late 1979) and early 90s, but could not find a compelling reason to do so after about 1995. The last new major assembly project I did was a VxD for Windows 95--and even that could have been done just as effectively in C.

Modern hardware is too complex to be easily programmed in assembly. As far as PCs are concerned, 8086/8088 software was usually assembly (many applications and most games). With the 286, and especially when protected mode began to be used, developers switched to C. As I said in a post once before, Microsoft always used C for their full-screen applications after 1986. An application like Office would take forever to write in assembly, even with the large programming teams they have working on it. The speed gain from using assembly is not as great today as it was then, and modern compilers can produce code nearly as efficient as anything a human could write (unlike the slow, bloated code that was typical of 16-bit compilers).

Here's an interesting little bit of knowledge. In 8086/8088 programming, you always clear a register with XOR AX,AX (for example) instead of MOV AX,0. Obviously this is because the XOR is faster and uses one less byte. However, compilers always generate MOV AX,0. Eventually (I'm not sure what generation of x86 processor it was), Intel optimized the CPUs so that MOV AX,0 executes faster than XOR AX,AX because most new software was written with a compiler.
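The one-byte saving is easy to verify from the instruction encodings; this little Python sketch (an editor's illustration, with byte values taken from the standard x86 opcode map) just compares the two sequences:

```python
# MOV AX,imm16 encodes as B8 followed by a 16-bit immediate (3 bytes);
# XOR AX,AX encodes as 31 C0 (XOR r/m16,r16) -- one byte shorter.
MOV_AX_0  = bytes([0xB8, 0x00, 0x00])
XOR_AX_AX = bytes([0x31, 0xC0])

print(len(MOV_AX_0) - len(XOR_AX_AX))  # 1 byte saved per register clear
```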

Regardless, Microsoft still updates their assembler to support the current-generation Intel CPUs and includes it free with Visual C++ (it stopped being sold as a separate product years ago). You could write Windows applications in assembly with it, but it wouldn't be practical for anything more complicated than Notepad (although that probably wasn't written in assembly either).

If we look at game consoles, 8-bit ones up through the Gameboy were almost always programmed in assembly, as were the 16-bit systems. C++ took over with 32-bit consoles like the PSX. I had heard that Sony's official development kits advised against programming in assembly. They use RISC processors anyway, which are really designed for a high-level language.

Chuck(G)
September 4th, 2009, 11:35 AM
Here's an interesting little bit of knowledge. In 8086/8088 programming, you always clear a register with XOR AX,AX (for example) instead of MOV AX,0. Obviously this is because the XOR is faster and uses one less byte. However, compilers always generate MOV AX,0. Eventually (I'm not sure what generation of x86 processor it was), Intel optimized the CPUs so that MOV AX,0 executes faster than XOR AX,AX because most new software was written with a compiler.

Oh, I don't think that was the reason for it--and XOR AX,AX and MOV AX,0 don't do the same thing anyway--one clears the carry and overflow flags and the other doesn't. I'm not so sure about the timing difference either. On a 386, both are clocked at 2 cycles. And the MOV AX,0 takes 3 bytes, while the XOR AX,AX takes only two.

Things started changing pretty drastically internally with the 386, so that for example, the LOOP instruction is actually slower than doing a DEC/JNZ. Similar examples abound, where the original "cycle saving" instructions become a liability. A lot of work went into optimizing the execution of the elementary operations, with the result that more complex ones got left behind. Using the elementary instructions has another benefit--they're easier to schedule efficiently.

Things only get more complex as the architecture advances. I compare the x86 to a Prius with a buggy whip. The P4 still has instructions that were introduced in the 8008 and that Intel hasn't been able to shed. OTOH, Motorola took the opposite approach; the 6809 is a clean break from the 6800 and the 68000 is a clean break from the 6809. The 88000, of course, is nothing like any of the 68xx chips.

carlsson
September 4th, 2009, 12:44 PM
Ah, I didn't know the 6800 and 6809 were so different from each other. Honestly, I haven't tried to program either, but looking at 6809 assembly source, it reminds me quite a bit of the 6502. Then again, the MOS 6501 was a rip-off of the Motorola 6800, so they had to change the design somewhat with the 6502 to not get sued over their ears.

I notice this topic is in Vintage Computer Programming. There is nothing wrong with a Pentium 4 or a recent microcontroller, but they are not vintage. :-P I suppose the original poster should consider why he wants to learn some form of assembly language. Apparently there still is a commercial market for new software for vintage computers, although it is very small and usually you spend at least 10-20 times as long as you will get paid for.

Fallo
September 4th, 2009, 12:55 PM
Things only get more complex as the architecture advances. I compare the x86 to a Prius with a buggy whip. The P4 still has instructions that were introduced in the 8008 and that Intel hasn't been able to shed. OTOH, Motorola took the opposite approach; the 6809 is a clean break from the 6800 and the 68000 is a clean break from the 6809. The 88000, of course, is nothing like any of the 68xx chips.

Consider that DOS 1.0 will still run on a modern PC, while Macs haven't been able to run the original Mac OS from 1984 for years. Apple effectively did away with the last vestiges of the original 128k Mac by switching to Intel CPUs, thus eliminating the ability to run 680x0 code.

Regarding the 6809, it had the same instruction set as the 6800 (with enhancements), but used different opcodes. Its higher price meant that Tandy had to skimp on the CoCo's sound and graphics capabilities, whereas Atari and Commodore used cheaper 6502s and were thus able to have better sound and graphics.

Chuck(G)
September 4th, 2009, 01:12 PM
Regarding the 6809, it had the same instruction set as the 6800 (with enhancements), but used different opcodes. Its higher price meant that Tandy had to skimp on the CoCo's sound and graphics capabilities, whereas Atari and Commodore used cheaper 6502s and were thus able to have better sound and graphics.

No, the 6809 was internally very different from the 6800, with a different register structure and instruction set design (e.g. many more addressing modes). Motorola took pains to make sure that 6800 source programs could be assembled/converted and run on the 6809 to preserve the utility of the 6800 software base. (But then, so did Intel with the 8086 and Zilog with the Z8000; much early DOS software was nothing more than auto-translated x80 code).

One interesting aspect was that the 6809 has an instruction set of 59 instructions, while the 6800 has 78--it's smaller. When the 6809 came out (at roughly the same time as the 68K), people commented about the low clock speeds. Whereupon one of the Moto design engineers quipped that if they thought that their CPU was going to be judged solely on the basis of clock speed, they would have put a waveguide on it.

carlsson
September 4th, 2009, 03:25 PM
Interestingly the very first Macintosh prototype seems to have been based on a 6809, but rather soon they switched to the 68000.

dave_m
September 4th, 2009, 04:00 PM
Anders, the personal computing world would have been way ahead had the stars aligned so that IBM could have chosen the Motorola 68000 family rather than the Intel 8086 family for the PC. The 68000's linear address space was a cleaner solution. Do you agree? -Dave

Dr_Acula
September 4th, 2009, 05:55 PM
Dave_m - I'd have to agree with that. I started with 8080 and then Z80 assembly and then got lost with the strange memory mapping of the x86 processors from the mid-80s onwards. So I guess I'm stuck in a time warp with assembly programming. I'm happy to program in new languages like VB.NET, but there was something really nifty in the mid-80s about being able to combine machine code and high-level C/BASIC in the same program. Assembly is blazingly fast, and imagine the power of a Windows PC where Windows itself was written in assembly and could boot in under a second, with program sizes measured in kilobytes rather than megabytes.

I've just finished writing assembly routines for the N8VEM for both strings and for math. There is something very satisfying about writing something that you know is not bloatware.

Addit - cardinal rule of assembly programming broken with the code below: no comments! My excuse is that I don't understand it well enough to be able to put in any meaningful comments. Generally, of course, comments are important in assembly; otherwise it makes no sense at all. All I can say is that a divide routine in assembly works the same as the decimal long division you learn in primary school, except that it is binary instead of decimal.



ld hl,5000
ld de,0
ld bc,500
call Divide

; 32-bit by 16-bit unsigned divide: DEHL/BC = DE, remainder = HL
Divide:  LD A,16
         EX DE,HL
Divide1: ADD HL,HL
         EX DE,HL
         ADD HL,HL
         EX DE,HL
         JP NC,Divide2
         INC HL
Divide2: OR A
         SBC HL,BC
         INC DE
         JP P,Divide3
         ADD HL,BC
         RES 0,E
Divide3: DEC A
         JP NZ,Divide1
         RET
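For anyone puzzling over the routine, here is the same shift-and-subtract long division sketched in Python (an editor's paraphrase of the algorithm, not a line-for-line translation; the Z80 version shifts 16 times because the high word starts out in DE, while this textbook version simply walks all 32 dividend bits):

```python
# Binary long division, as the comment above describes: 32-bit dividend
# over 16-bit divisor, quotient and remainder returned as a pair
# (the Z80 routine leaves them in DE and HL respectively).
def divide_32_by_16(dividend: int, divisor: int) -> tuple[int, int]:
    quotient, remainder = 0, 0
    for bit in range(31, -1, -1):      # bring down one dividend bit per step
        remainder = (remainder << 1) | ((dividend >> bit) & 1)
        quotient <<= 1
        if remainder >= divisor:       # trial subtraction succeeds:
            remainder -= divisor       # keep it and set the quotient bit
            quotient |= 1
    return quotient & 0xFFFF, remainder & 0xFFFF

print(divide_32_by_16(5000, 500))  # (10, 0), matching the sample call
```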


Here is some string code, the same as Left$() in BASIC. The string is stored at the memory address held in register pair DE. Strings end with a $ character.



STRINGS_LEFT: ; copy B bytes of the string at DE into the buffer at HL, return start of new string in HL
PUSH HL
STRINGS_LEFT_1:
LD A,(DE) ; GET CHARACTER
LD (HL),A ; MOVE IT
INC HL ; HL+1
INC DE ; DE+1
DJNZ STRINGS_LEFT_1 ; LOOP UNTIL B=0
LD A,'$' ; PUT A $ AT THE END OF THIS SHORTER STRING
LD (HL),A
POP HL ; RESTORE START
RET
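In Python, the same Left$()-style operation on a '$'-terminated string is nearly a one-liner; this sketch (an editor's illustration, not from the post) mirrors what the loop above does with B, DE and HL:

```python
# Copy the first `count` bytes of a '$'-terminated string and
# terminate the shorter copy with '$', like STRINGS_LEFT above.
def strings_left(source: bytes, count: int) -> bytes:
    return source[:count] + b"$"

print(strings_left(b"HELLO WORLD$", 5))  # b'HELLO$'
```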


I like the 8080 and Z80 as there are only a few registers, each containing a byte: A, B, C, D, E, H and L. Then you can pair up registers, e.g. HL, for a 'word' of 0 to 65535. Memory locations are in brackets, e.g. LD A,(HL) looks in the memory location HL points to and puts the answer in register A. Then you can add 1, e.g. INC A, and store it back where it came from, e.g. LD (HL),A. Then move on to the next memory location, e.g. INC HL. And even though there are hundreds of instructions, most of them are not needed and it is possible to learn assembly with only 30 or so instructions. So the steep bit of the learning curve is not so steep.
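The register-pairing idea can be mimicked in a few lines of Python (an editor's illustration only; real Z80 registers are hardware, not variables):

```python
# H and L each hold a byte; paired as HL they form a 16-bit value
# that can serve as a memory address, as described above.
H, L = 0x12, 0x34
HL = (H << 8) | L                  # the pair: 0x1234
memory = bytearray(0x10000)        # a 64KB address space

A = memory[HL]                     # LD A,(HL) -- fetch the byte at HL
A = (A + 1) & 0xFF                 # INC A     -- add 1, 8-bit wraparound
memory[HL] = A                     # LD (HL),A -- store it back
HL = (HL + 1) & 0xFFFF             # INC HL    -- advance the pointer

print(hex(HL), memory[0x1234])  # 0x1235 1
```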

Fallo
September 4th, 2009, 06:05 PM
Anders, The personal computing world would have been way ahead had the stars aligned so that IBM could have chosen The Motorola 68000 family rather than the Intel 8086 family for the PC. The 68000's linear address space was a cleaner solution. Do you agree?

It was, especially since the 68000 could address 16MB without the complicated mess of protected mode. However, it wasn't well suited to multitasking software, since flat addressing means that applications can run into each other and crash. Macs long suffered from this problem (except on the early single-tasking versions of the Mac OS) until OS X finally introduced paged memory, where applications cannot touch anything outside their own address space.

The fixed 64k segments of the 8086/8088 and 286 have the same problem in that in a multitasking OS, there is no clear boundary between where one application begins and another ends. This caused the 16-bit versions of Windows to be quite unstable, as well as Windows 9x, which although 32-bit had large amounts of segmented 16-bit code. By contrast, Windows NT was completely 32-bit and thus much more stable.

Chuck(G)
September 4th, 2009, 06:31 PM
It was, especially since the 68000 could address 16MB without the complicated mess of protected mode. However, it wasn't well suited to multitasking software, since flat addressing means that applications can run into each other and crash. Macs long suffered from this problem (except on the early single-tasking versions of the Mac OS) until OS X finally introduced paged memory, where applications cannot touch anything outside their own address space.

The 68K was far ahead of any other monolithic microprocessor of the time. It implemented supervisor-vs-user modes, something the x86 didn't have until the 286.

If you wanted memory protection between tasks, you added a 68451 MMU. Some vendors even implemented paged virtual memory by running a second 68K a clock phase ahead of the other one. The Lisa simply enforced a coding standard that avoided the non-restartable instructions.

About the only gripes anyone really had with it were that it came in a huge package (64-pin DIP), it was hard to get in quantity, and many instructions were not restartable after a fault (hence the two-CPU solution). Peripherals were also a bit short in availability. And it was expensive.

But the 8086 inherited a whole bunch of "serious" applications that had been written for the 8 bit CP/M world; something the 6800 world had never enjoyed. Even GEM on the Atari ST looks more like MS-DOS internally than anything else.

I remember that just before the announcement of the 5150, the product announcement of the IBM 68K-based laboratory computer caused a hopeful buzz that maybe the 5150 would be 68K based. I couldn't understand why IBM would use an 8088 and compete with its own DisplayWriter.

Apparently that had been the plan all along...

Fallo
September 4th, 2009, 07:45 PM
The 68K was far ahead of any other monolithic microprocessor of the time. It implemented supervisor-vs-user modes, something the x86 didn't have until the 286.

The lack of memory protection on the 8086 didn't really matter since with only a 1MB addressing space, there was no room for any decent multitasking OS. The 68000 could address 16MB, which did make multitasking feasible. However, the Mac OS didn't ever use the 68000's memory protection features. Even if it is used, you still have the inherent problem that in a flat memory scheme applications can overwrite each other. Windows 3.x and 9x certainly had memory protection, but the use of 64k segments still made for an unstable setup.


But the 8086 inherited a whole bunch of "serious" applications that had been written for the 8 bit CP/M world; something the 6800 world had never enjoyed. Even GEM on the Atari ST looks more like MS-DOS internally than anything else.

The CP/M trio of WordStar, Supercalc, and dBase were ported to the 8086 early on. You also had the very CP/M-like DOS.

On the other hand, the 68000's technical superiority meant that it was used in many different machines: Mac, Lisa, Amiga, Atari ST, workstations, arcade games, Sega Genesis, etc. Aside from PCs, the x86 line was only used in rather obscure PC-lookalikes such as the Tandy 2000 and Sanyo MBC-550.


I remember that just before the announcement of the 5150, the product announcement of the IBM 68K-based laboratory computer caused a hopeful buzz that maybe the 5150 would be 68K based. I couldn't understand why IBM would use an 8088 and compete with its own DisplayWriter.

The x86 line was really the only choice for the IBM PC, since in 1980-1981 the 68000 and Z8000 were too new and not ready for use in a production machine.

Chuck(G)
September 4th, 2009, 09:05 PM
The lack of memory protection on the 8086 didn't really matter since with only a 1MB addressing space, there was no room for any decent multitasking OS.

There were MP/M-86 and Concurrent CP/M/Concurrent DOS.


Aside from PCs, the x86 line was only used in rather obscure PC-lookalikes such as the Tandy 2000 and Sanyo MBC-550.

The NEC APC, which morphed into the PC98 line, and the IBM DisplayWriter were two notable mainstream examples. I don't think anyone ever considered the 8086 for a game machine. The Mitsubishi Multi-16 preceded the 5150 by a few months and sold well in Japan. Columbia, Eagle, Televideo and Xerox all had 8086 systems out during this time.


The x86 line was really the only choice for the IBM PC, since in 1980-1981 the 68000 and Z8000 were too new and not ready for use in a production machine.

Actually, the Z8000 and 8088 came out the same year. One thing that worked in Intel's favor was the availability of second sources (which Motorola was not willing to do) and bundling of existing peripheral chips with CPU sales which Intel was happy to do. I think Intel initially made more money from their sales of peripheral chips to IBM than from the CPUs.

The Z8000 had only AMD as a somewhat confused and unwilling partner, together with Siemens (the venture was called AMC). Zilog was not doing well in 1979-1981; they'd lost Ralph Ungerman in 1979 and Federico Faggin in 1980 and were hemorrhaging money. AMC didn't last long--Siemens became frustrated and pulled out (they ended up using the 8086 in their PLCs).

Similarly, Intel had smoothly provided software support for the 8086--the first coding I did was assembled on an MDS-200 8-bit system. They had a converter in place--the first sample job I submitted at the local sales office took 7 hours to translate about 3000 lines of 8080 assembly. It was slow, but it was there.

One could not say the same thing for Zilog or Motorola.

I agree that if IBM needed to introduce a system in 1981 and use low-cost commodity parts, the 8086 was the only 16-bit system that could practically be deployed. But I had the suspicion that IBM corporate didn't really take the project seriously for quite a while and that the sales, even in the face of the Apple Mac that came along a couple of years later, surprised them.

So the question for me boils down to "Was there a compelling reason for IBM to introduce the 5150 in 1981?"

krebizfan
September 4th, 2009, 09:21 PM
I started off using UCSD Pascal running on a PDP-11. 64k code; 64k data; segments don't scare me. The 8086 did all that at a nice cheap price.

The 16-bit versions of OS/2 were quite stable. I guess having functions to handle cross segment addressing worked better than trusting programmers to do math correctly.

Fallo
September 4th, 2009, 10:43 PM
There were MP/M-86 and Concurrent CP/M/Concurrent DOS.

What I should have said was that there wasn't enough memory to multitask major applications, not that a multitasking OS wasn't possible. You could add Windows 3.0 to the list of multitasking OSes that run on an 8086, but you can't really do much with it unless you have a 286.


The NEC APC, which morphed into the PC98 line, and the IBM DisplayWriter were two notable mainstream examples. I don't think anyone ever considered the 8086 for a game machine. The Mitsubishi Multi-16 preceded the 5150 by a few months and sold well in Japan. Columbia, Eagle, Televideo and Xerox all had 8086 systems out during this time.

As I said, those are all pretty obscure machines compared to the Mac or Amiga. Most would fit into the category of PC workalikes that ran DOS or CP/M-86. Although they usually offered better hardware than the IBM PC, the lack of compatibility killed them. People came to believe that any x86 machine that ran DOS should be IBM-compatible.

I'd think one look at the 8086's segmented addressing was enough to scare anyone away from using it in an arcade machine or a game console. The x86 line did nonetheless see use in that area: on the Xbox long after paged memory had arrived and 64k segments were a thing of the past.


Actually, the Z8000 and 8088 came out the same year. One thing that worked in Intel's favor was the availability of second sources (which Motorola was not willing to do) and bundling of existing peripheral chips with CPU sales which Intel was happy to do. I think Intel initially made more money from their sales of peripheral chips to IBM than from the CPUs.

From what I've heard, the 68000 did not yet have a complete set of peripheral chips in 1980. It would have also necessitated IBM's using all 16-bit components, which were rare and expensive at the time. For that reason, they finally settled on the 8088, which allowed the use of cheap, readily available 8-bit components.


The Z8000 had only AMD as a somewhat confused and unwilling partner, together with Siemens (the venture was called AMC). Zilog was not doing well in 1979-1981; they'd lost Ralph Ungermann in 1979 and Federico Faggin in 1980 and were hemorrhaging money. AMC didn't last long--Siemens became frustrated and pulled out (they ended up using the 8086 in their PLCs).

Unlike the 8086 and 68000, the poor Z8000 was condemned to obscurity. It was used mainly as an embedded processor and in workstations. Some of Namco's arcade games used them as well.


I agree that if IBM needed to introduce a system in 1981 and use low-cost commodity parts, the 8086 was the only 16-bit system that could practically be deployed. But I had the suspicion that IBM corporate didn't really take the project seriously for quite a while and that the sales, even in the face of the Apple Mac that came along a couple of years later, surprised them.

There was supposedly an IBM spokesman who said in 1983 or 1984 that (and I'm paraphrasing) "When we started, we expected to sell at most 100,000 PCs. Now we're selling 100,000 a month."

If that's true, it would work out to a little over a million PCs a year, which is not an unrealistic figure considering nearly that many Commodore 64s were sold per year at their peak in 1984-1986.

Chuck(G)
September 5th, 2009, 10:33 AM
There was supposedly an IBM spokesman who said in 1983 or 1984 that (and I'm paraphrasing) "When we started, we expected to sell at most 100,000 PCs. Now we're selling 100,000 a month."

That would fit. The 5100 from a few years earlier was a huge flop for IBM, and the idea that IBM was going to sell a system built from nothing but commodity components and running third-party software must have been a major culture shock to the suits at IBM. Everything about the 5100 was IBM, right down to the CPU and the peripherals.

The introduction of the PC RT and MCA systems shows that IBM had a hard time shaking the NIH mindset.

Fallo
September 5th, 2009, 05:51 PM
Everything about the 5100 was IBM, right down to the CPU and the peripherals.

The 5100 was intended for the engineering and scientific markets (as evidenced by its $10,000+ price tag) and never meant to be a general-purpose home or office computer. It was made entirely out of custom hardware in traditional IBM style. Lee Felsenstein (the designer of the Sol-20 and Osborne) said, "My experience was that whenever you found IBM parts in a junk box, you threw them away because they were all custom jobs and you couldn't find any information about them."


The introduction of the PC RT and MCA systems shows that IBM had a hard time shaking the NIH mindset.

The MCA was a well-meaning idea, as the 386 had just come out and needed a 32-bit bus for optimum performance. Making companies pay IBM a licensing fee to use it was not a bright idea (Tandy and NEC were the only takers), but then the open-standard EISA bus proved no more successful than MCA. When the PS/2 line came out, IBM also foolishly tried to demand that all clone makers pay them retroactively for using the ISA bus (sound of crickets chirping).

Even the original IBM PC had a nonstandard parallel port and also introduced the cable twist to set drive letters (instead of using jumpers).

Chuck(G)
September 5th, 2009, 10:02 PM
Was IBM the first with the cable twist? I thought that was pretty clever--it allowed the system to control the drive motors individually--you can't do that with a "flat" cable.

Unknown_K
September 6th, 2009, 03:18 AM
I kind of like the MCA and EISA buses; they make setting up cards much easier than ISA does.

ziloo
September 6th, 2009, 04:33 AM
Was IBM the first with the cable twist? I thought that was pretty clever--it allowed the sytem to control the drive motors individually--you can't do that with a "flat" cable.

Alright Chuck...you set yourself up :wink: ! I have heard this story so many times, and now is a good time for a great master to give us a plain explanation as to what the "twisted cable" is all about.

Thank you in advance

ziloo

mbbrutman
September 6th, 2009, 06:18 AM
No, now would be a good time to resist going off topic in a thread labeled 'Assembly programming'.

;-0

Chuck(G)
September 6th, 2009, 09:24 AM
Alright Chuck...you set yourself up :wink: ! I have heard this story so many times, and now is a good time for a great master to give us a plain explanation as to what the "twisted cable" is all about.
Thank you in advance

I'll post in another thread a bit later today.


No, now would be a good time to resist going off topic in a thread labeled 'Assembly programming'.

You didn't read the thread title, did you, Mike? ;)

mbbrutman
September 6th, 2009, 10:23 AM
I did notice the spelling, but that's what contributes to the character ...