Most important landmarks in computing history



carlsson
April 27th, 2005, 08:06 AM
One of my aspirations is to be a symphonic hobby composer, i.e. I am totally self-trained when it comes to composing music and I'm only doing it on an irregular basis.

I've thought that some day I should try to compose a suite of "serious" music, i.e. for symphonic orchestra or band. I know the instruments quite well when it comes to useful range and a little about technical difficulties. But to make a suite, I'm thinking that I need a theme to get inspired by.

Now, I don't know if any modern composer has already written works inspired by computing history, but no matter what, this was one idea that came to my mind last week. I could probably think about the computers themselves, the conditions and environment in which they were made, and much more.

Typically I think a suite should be about 3-5 movements (parts). Of course I should not take on a larger task than I could finish, but given those conditions, exactly what do you think should be covered? My thinking is something like this:

1. Babbage (Difference engine, analytic engine, Ada Lovelace etc)
2. ENIAC (or maybe that German computer that predated it?)
3. Cray (and/or PDP, something around the age of VLSI technology?)
4. Today? (IBM Blue Gene, advanced video games?)

Are there any important landmarks I have overlooked? Maybe something more about video gaming, like Pong / Atari 2600 / C64 / SNES etc?

vic user
April 27th, 2005, 08:51 AM
your project sounds (no pun intended) really interesting!

here is a link with a timeline for computers:

http://www.maxmon.com/history.htm

maybe that will help you a bit.

chris

patscc
April 27th, 2005, 09:39 AM
Definitely, you should use the Zuse Z3.
I think, to add a sense of struggle, something on the interplay between the market force of Microsoft, which did help unify the industry even though they're monopolistic, and the struggle of innovation to break out of the Microsoft mold.
Of course, this sounds very Wagnerian.

patscc

EvanK
April 27th, 2005, 10:44 AM
Neat idea! I really think you should include at least one microcomputer (the Apple II) and, perhaps, some circus (or funeral) music at the end to represent Windows.

Terry Yager
April 27th, 2005, 01:52 PM
Frank Zappa was a pioneer in the electronically synthesized music field. Captain Beefheart once accused him of "writing music for instruments that haven't been invented yet". In another interview some years later, he retracted the comment, noting that it no longer applied since the invention of the Synclavier. On one of his albums (I forget which one), FZ included a disclaimer to the effect that none of the music on that album had been artificially produced, but that all of the sounds had been created by electronically altering the sound of real instruments. I dunno what all this has to do with your project, just thought I'd throw it out there.
BTW, FZ has more web space dedicated to him than to any other composer, modern or otherwise.

--T

mryon
April 27th, 2005, 02:47 PM
oh, you almost have to use the SK-1 in there somewhere ;)

patscc
April 27th, 2005, 03:12 PM
I was always partial to Tangerine Dream (TD) and Laurie Anderson, myself.

Of course, one of the reasons for this is that TD toured with a massive setup of blinking-light equipment, patch cords flying all over the place, and Laurie Anderson was one of the early adopters of the Fairlight CMI and Synclavier, the first-generation samplers. For those of you who have never heard of TD: if you've ever seen the film 'Close Encounters of the 3rd Kind', the scene where they're asking Stevie Wonder something about his modular Moog sort of gives you an idea what TD on tour looked like.

Just out of curiosity, does anyone have any idea what the first production computer was that you could do more than just "beep" with?

patscc

Terry Yager
April 27th, 2005, 03:41 PM
Just out of curiosity, does anyone have any idea what the first production computer was that you could do more than just "beep" with?

Wasn't that an IBM mainframe at Bell Labs? At least, that was the first to do synthesized speech, back around 1963. I think the Moog machine was the first music synthesizer, wasn't it?

--T

carlsson
April 27th, 2005, 03:52 PM
Thanks for all the comments so far, and I suppose more will come. Seen from a 200+ year perspective, does Microsoft really represent such a large milestone? I know what they have done with the IBM PC over the last 20 years to make it into a household item, but would this not have happened even without Bill Gates?

Maybe the Arpanet (now the Internet) as such would also be worth thinking about? Oh well, so far this is just my brainstorming.

patscc
April 27th, 2005, 04:32 PM
It might have happened, anyway, you're right.

I was thinking more along the lines of the crossroads we're now at. Microsoft's predominance in the marketplace has focused a lot of developer attention along similar paths, analogous to the way a laser focuses its energy into an amplified wave.
Sure, it could have happened differently. It's not Microsoft as such, it's the dynamics we now have. A driving force behind development is now suppressing evolution, and now, more than ever, developers are choosing different paths, some of which Microsoft is trying to suppress.
This struggle is what I find interesting. It's nothing new: it's why PCs ran DOS instead of CP/M-86, and it's why RIMM, even though initially endorsed by Intel, was very quietly given the coup de grace when they released the first chipsets supporting DDR and abandoned efforts to bring out server-class RIMM chipsets.
If you want to tell a story, in whatever medium, you need some sort of conflict to keep the whole thing going. And what bigger conflict than the one between the most widely used platform today and the avatars out there?

(Note: Before this turns into a flame-fest, I'm not a particular fan of any particular platform out there, especially Wintel. You just, unfortunately, can't ignore their Triffid-like presence.)

patscc

alltare
April 28th, 2005, 09:57 AM
Have you considered including analog computers/calculators in your opus? Specifically, I'm thinking of a kind of non-programmable computer: the slide rule. What would be more suited to its musical representation than a slide whistle or a theremin? Napier's bones preceded the slide rule, but how would you musically represent them? Wood blocks?

It might be interesting to contrast "analog music" against "digital music". I don't mean music produced by those two kinds of computers- I mean music representing the two types. Smooth and flowing vs. staccato, melodious vs. cacophonous, etc.

But what do I know? I can barely play the flute-O-phone.

alltare

carlsson
April 28th, 2005, 01:55 PM
I'm open to all suggestions. I definitely don't know all the history behind computing. When you say slide rule, I also start to think in terms of the abacus. I have thought about where to set a starting point; the Greeks seem to have been more into the actual maths than into making a machine to do it (which I can understand, if it was impossible back then).

John Napier's logarithms (1614), which led to the slide rule, seem to predate Charles Babbage's difference engine (1823; analytic engine described 1837) by a little more than two centuries, although there was one J.H. Müller who in 1786 designed a calculating machine he never built.

We'll see if I get to do something about it, maybe for the summer vacation.

Micom 2000
April 28th, 2005, 06:41 PM
carlsson wrote:

> One of my aspirations is to be a symphonic hobby composer,
> i.e. I am totally self-trained when it comes to composing
> music [...] Are there any important landmarks I have
> overlooked? Maybe something more about video gaming, like
> Pong / Atari 2600 / C64 / SNES etc?


WOW. Only a Swede or Finn could have conceived such a musical aspiration. I'm an old trumpet player (still have a Committee Custom but few teeth). Went to Berklee in the old building in '61 and married a New England Conservatory starlet who wanted more security than an itinerant Canuck musician could offer, so I left the field, physically at least. She, on the other hand, remained in the field until her recent death.

Now retired in a Canadian backwater, but with a multitude of computers including an ST Mega with the MIDI ports, a Korg EX8000 and a ream of old hi-fi equipment. Alas, no 4-track recorder. I still have some scores I wrote, and although my strength was improvisation, I have aspirations to put my computer and musical background to use, and likely simply need an impetus to get my strong musical juices flowing. While I don't have the classical composition background you appear to have, I would be willing to contribute what I could. I'm overwhelmed by the score of The Triplets of Belleville, for example, including the use of definitely non-musical instruments such as a vacuum cleaner. Of course there is also the use of the Commodore 1541 as a musical instrument, and numerous other equipment which would never be considered a source of musical tones.
I could see a much expanded timeframe compared to what you present here. Perhaps we could take this to private e-mail.

Computer nerds, being what we are, could wind up debating like astrologers would, over the make-up of Gustav Holst's "The Planets".

Lawrence

Micom 2000
April 28th, 2005, 06:59 PM
ISTR an old thread on classiccmp where Allison wrote about her and some other DEC programmers writing some music programs on a "Mark 2?" It was way beyond my comprehension at the time and still is. Perhaps she can elucidate and make more sense than this garbled version does.

Lawrence

Exluddite
April 28th, 2005, 10:17 PM
What about player pianos, music boxes, and that sort of thing? After all, they are essentially programmed instruments.
Come to think of it, it'd be pretty cool if you could make the percussion run off a difference engine.

patscc
April 28th, 2005, 10:40 PM
Or even the Jacquard loom, the granddaddy of anything that uses punched cards? Or the structure of Bach's canons and fugues? Or, going back to the Greeks, who developed algorithms - certainly a milestone: the discovery that different tasks can share a common element, and that you can break that common element down into easily-describable, 'atomic' tasks.

Where do you start? Where do you stop? carlsson, this might take more than a summer or so. J.R.R. Tolkien springs to mind, who ended up taking 20 years or so to write his story. Good luck!

patscc

olddataman
May 1st, 2005, 12:21 PM
Wasn't the machine at Iowa State University built before the first Zuse computer? I think so.
I think you should talk to a few more people about your chronology. In my opinion the computer industry has gone through several distinct phases. (Ignoring the "Boole and Babbage" days.) Phase 1 was the years between 1937 and 1950, the "exploration and definition days". These were the days when no two machines were alike and virtually all were built in university or government labs. Phase 2 was 1950 to 1959. This was the start of the commercial computing era, when IBM, UNIVAC, Burroughs, NCR, Bendix, Royal McBee, ALWAC, Ferranti, and a few more made production machines. None were alike in design or specifications, there were few if any standards, and the most of any single model sold was probably the IBM 650. (Would you believe that even Philco, Westinghouse and General Mills got into that early act?)
The next big event was the maturity of the production of the transistor after 10 years of R&D. At the same time, the availability of the transistor forced the completion of the design and production of the coincident-current core memory, which was absolutely required in order to take advantage of the switching speed of transistors. These two events opened the industry up to many more creative people, because they did not have to work around some memory scheme that was the controlling influence on the design of the rest of the computer. The "stored program" concept came into its own because the program instructions and the data they processed were both written to, and accessed from, memory in parallel "words". Suddenly, almost overnight, good computers could be designed that were inexpensive, fast and reliable. The makers of the traditional huge "big iron" computers could make theirs even bigger, faster and more reliable too, and they understood that their market was the corporate and governmental data processing business. The door for the rest of the computer applications was wide open. Between 1960 and about 1972 more computer companies were formed (and often acquired after a few years) than at any time except MAYBE the first 10 or so years of the microcomputer era. I can think of probably 25 or more companies that came out of the woodwork in those years, and I have seen a list of just about every computer ever made; there are at least two or three times as many as I can think of. Those of us who were working in that part of the industry were having the same kind of fun as we found later when the personal computer era arrived.
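That coincident-current trick is easy to picture: a core switches only where a half-current X line and a half-current Y line cross, so one driver per axis can pick out a single bit in a whole plane. A toy sketch of the idea (hypothetical names throughout, and it glosses over the sense and inhibit wires a real plane also needed):

HALF = 0.5       # each drive line carries half the switching current
THRESHOLD = 1.0  # a core flips only when it sees the full current

class CorePlane:
    def __init__(self, size):
        self.core = [[0] * size for _ in range(size)]
        self.size = size

    def write_one(self, x, y):
        # Energize one X line and one Y line; only the core at the
        # crossing point sees HALF + HALF >= THRESHOLD and flips.
        for row in range(self.size):
            for col in range(self.size):
                current = (HALF if col == x else 0.0) \
                        + (HALF if row == y else 0.0)
                if current >= THRESHOLD:
                    self.core[row][col] = 1

plane = CorePlane(4)
plane.write_one(2, 1)
print(plane.core)  # only row 1, column 2 holds a 1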
Of course the next phase is the era of the microcomputer and, more specifically, the personal computer, which started in about 1973 or '74 and, as far as we know (because we are too deep in the forest to NOT see the trees), is still going on. But I suspect we are at the very beginning of the next phase, which is the "wireless era".

Incidentally, in the spring of 1961 I was working in Washington, DC and we had to set up an exhibit at a national cartographers' convention. As we were setting up our booth we heard music coming from the end of the long aisle we were in. We went to see what it was, since it sounded nice, but strange too. Lo and behold, it was a new machine being shown for the first time in public, called the DEC PDP-1, and it was running a program to play Bach fugues. That is the first machine I ever heard playing a programmed score. I had heard many machines playing music that was developed by the clock and unintentionally broadcast to a radio.

As usual I do run on and on, but I can't seem to help it, and maybe I've given you some food for thought. Good luck with it and I hope you do it!
Ray

alltare
May 1st, 2005, 07:22 PM
Keep running on, Ray- no one's complaining. You oughta write a book.

alltare

carlsson
May 2nd, 2005, 01:16 AM
While I don't have the classical composition background you appear to have
Heh. I don't; I'm self-trained, but I have acquired the taste through playing myself and listening to such music.

My intention may not be to give a 101% accurate representation of computing in terms of music, but more something to inspire me. Recently I found a (probably unpublished) suite of music inspired by paintings by van Gogh; it encompassed four paintings, though maybe not the four that art professors would claim are the most important ones.


Computer nerds, being what we are, could wind up debating like astrologers would, over the make-up of Gustav Holst's "The Planets".

But Holst tried to capture the character of the Roman gods who gave their names to the planets, not the planets themselves. Considering the means to explore each planet back then, there was not much else he could do. :)


Phase 1 between 1937 and 1950 [..] Phase 2 was 1950 to 1959. [..] The next big event was the maturity of the production of the transistor after 10 years of R&D. [..] Of course the next phase is the era of the microcomputer and, more specifically, the personal computer, which started in about 1973 or '74
Good points, many thanks.

Micom 2000
May 2nd, 2005, 07:24 PM
[quote="olddataman"]Wasn't the machine at Iowa State Universty built before the first Zues computer? I think so. quote]

Yeah, John Atanasoff, who had Mauchly's patent thrown out. Mauchly had freely "borrowed" much of Atanasoff's research. But the Atanasoff-Berry Computer wasn't actually functioning till the early '40s, and ISTR a nerd at the University of Saskatchewan had one functioning in the '37-'39 period. I remember seeing a photo of him and his machine but can't remember where I saw it. On the other hand it could be a trick of memory, and the picture was of Atanasoff.


olddataman wrote:

> I think you should talk to a few more people about your
> chronology. In my opinion the computer industry has gone
> through several distinct phases. [...] Phase 2 was 1950 to
> 1959. This was the start of the commercial computing era
> [...]

I agree with you completely on the time periods, but I know that IBM only leased the machines, at least up to 1956, when I stopped working on them. I think it was UNIVAC's competition that forced IBM into outright selling.

Lawrence

carlsson
April 23rd, 2006, 02:14 PM
I haven't put much effort into this, but tonight a musical thought popped into my head: C-H-A-R-L-ES B-A-B-B-A-G-E

You need to know the German note names, which use H for B and B for Bb; Es would be equivalent to Eb. I decided that R means the next white key to the right on the piano, and L the next one to the left.

Here is what could be the opening of my masterpiece, if I ever write it:

http://www.anders.sfks.se/mp3/babbage.mp3 (715 kB)
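The mapping is mechanical enough to script. Here is a minimal sketch of the scheme just described - assuming R and L step to the adjacent white key, and that an E directly followed by an S forms "Es"; those details, and all the names below, are this sketch's assumptions:

WHITE_KEYS = ["C", "D", "E", "F", "G", "A", "B"]

def letters_to_pitches(name):
    # German usage: H = B natural, B = B flat, "Es" = E flat.
    pitches = []
    name = name.upper().replace(" ", "")
    i = 0
    while i < len(name):
        ch = name[i]
        if ch == "E" and i + 1 < len(name) and name[i + 1] == "S":
            pitches.append("Eb")           # "Es" = E flat
            i += 2
            continue
        if ch == "H":
            pitches.append("B")            # German H = B natural
        elif ch == "B":
            pitches.append("Bb")           # German B = B flat
        elif ch in "ACDEFG":
            pitches.append(ch)
        elif ch in "RL" and pitches:
            step = 1 if ch == "R" else -1  # adjacent white key
            prev = WHITE_KEYS.index(pitches[-1][0])
            pitches.append(WHITE_KEYS[(prev + step) % 7])
        i += 1                             # other letters are silent
    return pitches

print(letters_to_pitches("Charles Babbage"))
# ['C', 'B', 'A', 'B', 'A', 'Eb', 'Bb', 'A', 'Bb', 'Bb', 'A', 'G', 'E']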

mbbrutman
April 27th, 2006, 09:11 AM
Sorry about the interruption .. this is back in General Off-topic, and I'm whacking a few of the 'What happened here?' posts to cut down on the clutter.

CP/M User
April 27th, 2006, 10:43 PM
carlsson wrote:

> One of my aspirations is to be a symphonic hobby
> composer, i.e. I am totally self-trained when it comes
> to composing music [...]

> Are there any important landmarks I have overlooked?
> Maybe something more about video gaming, like Pong /
> Atari 2600 / C64 / SNES etc?

Something to do with Spacewar! would be good (in the video
games mould). How 'bout the abacus - one of the best
calculators of its time!

Why's this thread being moved to General Off Topic?

CP/M User.

DOS-Master
May 8th, 2006, 12:08 AM
The IBM PC was a major event in the history of the PC.

alexkerhead
May 8th, 2006, 12:19 AM
I am going to have to say the IBM 5150 with DOS 1.0 was the number one recent computing landmark. It marked the end of CP/M, BASIC, etc. as the most popular OSes and compatibilities, and began uniting software makers around a single platform. When IBM released the 5150 with DOS, the decline of CP/M, BASIC, etc. was accelerated to almost ridiculous levels. It also marked the end of nonconformity. And it marks the climb of the Microsoft empire as a software overlord.

nige the hippy
May 8th, 2006, 01:02 AM
Just a thought - I don't know if Colossus has been mentioned, but its way of breaking the Lorenz teleprinter codes (the Bombes, not Colossus, attacked Enigma) was to modify the original code with a (paper) tape loop, then offset it by one character and run it again. A certain amount of minimalist-music inspiration there, I think.
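That offset-and-rerun loop is easy to picture in code. A toy sketch (the real machine's statistical tests were far more sophisticated, and every name here is made up for illustration):

def best_offset(message, key_loop):
    # Score every alignment of the looped tape against the
    # message; keep the offset with the most agreements.
    best = (-1, -1)
    for offset in range(len(key_loop)):
        hits = sum(m == key_loop[(i + offset) % len(key_loop)]
                   for i, m in enumerate(message))
        best = max(best, (hits, offset))
    return best  # (agreement count, winning offset)

Each rerun shifts the alignment by one character - exactly the stepping described above - until every offset has been scored.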

Nig

CP/M User
May 10th, 2006, 10:28 PM
alexkerhead wrote:

> I am going to have to say the IBM 5150 with DOS 1.0
> was the number one recent computing landmark. [...]

I disagree & I think it is an absolute joke to the whole
computing industry that this should be seen as a landmark.
Of all things it was a rip-off of an OS (CP/M) and was
marketed in such a way that it was guaranteed to take over
from where CP/M started!

CP/M User.

alexkerhead
May 10th, 2006, 10:41 PM
CP/M User wrote:

> I disagree & I think it is an absolute joke to the
> whole computing industry that this should be seen as a
> landmark. [...]
I was wondering when you were going to say that.
A landmark doesn't have to be positive, sir; it can be negative and affect something with equal force, albeit negative force in your opinion.
I see it as: if Microsoft hadn't done what they did so many times, we would still have 12 different operating systems floating around. Microsoft's marketing technique helped unite hardware manufacturers. So, even if someone who uses CP/M doesn't like what MS did, there is still the fact that most people don't care to learn 12 different computing languages. Thanks to Microsoft we have DOS, which is very, very easy to use compared to CP/M for the basic user, and we have Windows, an excellent graphical-interface OS. Xerox may have invented it, and Apple may have first distributed it, but Microsoft marketed it and improved it, making Microsoft the good guy here. I personally would dread learning anything else besides DOS or Windows as a main OS. I am right though: Microsoft obliterated the OS wars; now even Windows' biggest competitor is running Windows..lol
See what lack of initiative got Apple? They are now begging for Microsoft's table scraps. Microsoft's initiative, albeit a wrong initiative, did help unite hardware manufacturers.
Ya gotta lay off teh haterade.

CP/M User
May 10th, 2006, 11:10 PM
alexkerhead wrote:

> A landmark doesn't have to be positive, sir; it can be
> negative and affect something with equal force, albeit
> negative force in your opinion.

Sure - we can all turn into some kind of media figure now
& celebrate - or simply sit around blind to what is really
going on & destroy the world a little bit more - while
humans celebrate it. Sounds clear as mud to me!

> I see it as: if Microsoft hadn't done what they did
> so many times, we would still have 12 different
> operating systems floating around. [...]

Well, this simply proves what I just said - a clear
demonstration of how someone can be so oblivious to the
issue that they don't even see the whole picture (sure,
who would around here?). But CP/M could just as easily
have been used the way DOS was - don't you think CP/M
ever advanced, or ever had a GUI? It has all worked in
Microsoft's favour simply because they won those
important trials & simply paid a small fee for some
known incident involving them many years ago - surely at
a time when discovery would have closed them down.

I simply don't see it as a good thing & where this will end -
gawd only knows.

CP/M User.

carlsson
May 11th, 2006, 06:04 AM
For that matter, IBM may have been the ones who really created the landmark, by issuing a personal computer. Sure, there were similar computers prior to the IBM PC, but I've read that many companies and people disregarded personal computing as an expensive hobby until even IBM entered the stage.

Then whether you ran some DOS version, CP/M, Unix or whatever was available at the time would be secondary. IBM was also in the OS/2 project with Microsoft, before the two later went their separate ways.

Landmarks in computing don't have to be strictly technical innovations. It is just as important to make products user friendly and to have a clever way of marketing, creating a desire. In my point of view, the World Wide Web is one of the most significant landmarks in computing over the last 20 years, because it does something other than just increase technical specs.

But back to the original topic: the biggest issue is to find some way to illustrate (in music, pictures, dance or whatever) whatever message you want to convey. Once one has an idea, I believe it should be expressed even if it doesn't cover the theoretical top 5 moments.

alexkerhead
May 11th, 2006, 09:38 AM
CP/M User wrote:

> Well, this simply proves what I just said - a clear
> demonstration of how someone can be so oblivious to the
> issue that they don't even see the whole picture [...]
> I simply don't see it as a good thing & where this will
> end - gawd only knows.

Sure, CP/M may have had potential, but nobody ever took the initiative, I mean ever took the initiative, to make it so.
I believe you don't see the entire picture; blinded by a fondness for CP/M, you refuse to understand that only initiative gets something going. If Henry Ford hadn't developed a mass-production auto factory, would Ford be a large car company? NO.
So, I still stick to my conclusion that IBM's 5150 is the most important landmark in recent computing history. CP/M is just a blot of ink on history in comparison, so CP/M isn't even any kind of landmark.
Let me put it this way: right now, hardly anyone knows GEM exists, right? What if I rewrote the base code to be compatible with lots of Windows/Unix software, and sold it for $10 a copy? Well, it would then become pretty popular in the custom computer community. So, I took GEM's base code - so what? They never did anything good with it, but I did; I deserve something for that. Maybe I was wrong for not building all the way up, but GEM shouldn't have wasted 20 years doing nothing for its software. If CP/M is so great, why did nobody ever take the initiative to improve upon it? That is basically what Bill Gates did: he made a piece-of-crap OS into something usable with DOS and Windows. You see the potential, I see the facts. Since CP/M is the bomb, why not develop a GUI for it yourself? It will never become popular again if you just do nothing.

carlsson
May 11th, 2006, 03:20 PM
Sure, CP/M may have had potential, but nobody ever took the initiative, I mean ever took the initiative, to make it so.
I'm quite sure that if Digital Research had had the honour of delivering an operating system for the absolutely outstanding - smashing - IBM PC, they would've put in some effort to improve it: a bit more user friendliness, a newer version built for expandability and, as the years passed, graphical shells like GEM. It would still need to be binary compatible with older software to start with.

Now, they lost out (for whatever reason, according to the urban legend) and at best became an alternative distributor of an OS rather than the main one. Speaking in terms of profits, it surely made a difference to how much they could spend on further development.

If QDOS had not been reasonably functional, or if Bill Gates had not known the people working on it, he could not have offered a system to IBM in the first place. Microsoft would've survived anyway, on Basic interpreters and other compilers, perhaps even productivity software like Word, but they would have had a harder time getting market share.

mbbrutman
May 12th, 2006, 04:53 PM
The IBM PC is significant, but for much different reasons.

There were lots of personal computers before the IBM PC. Tandy had a line. Apple was well established already. I can't even begin to name the multitudes of 8080, 6800 and Z80 machines, many of which are highly sought after today.

The PC is significant because IBM decided the market was worth getting into. They took pre-existing parts and made a very polished entry into the field. Note that IBM put very little into the machine:


They chose an existing microprocessor from Intel
They bought their operating system
The BASIC interpreter came from Microsoft
Almost everything except for the BIOS code was sourced elsewhere.


Why? Because IBM was risk averse in this new market. If it flopped, they could walk away with very few sunk costs.

As far as CP/M goes, it was IBM's first choice. I'm not sure how much of the urban legend is true, but apparently Gary Kildall did not meet with IBM to license CP/M to them. Seeing the opportunity, Bill Gates repackaged a very clone of CP/M from Seattle Computer Products and licensed it to IBM. That became DOS 1.0. Only in DOS 2.0 did DOS start to pick up more of the Unix features that it has today.

CP/M was very significant, and DOS borrowed a lot from it. (And later it borrowed even more from Unix.) CP/M ran on many different machines and architectures, and had some design features to make it easily portable across those machines and architectures. (Sound familiar?) At the time, if a machine was to be considered for business use, it pretty much had to run CP/M. (The Apple series was a notable exception - it had enough sales that CP/M capability was not a must.)



That is basically what Bill Gates did: he made a piece-of-crap OS into something usable with DOS and Windows.


As noted above, Bill Gates didn't start with CP/M. If he had, the world would be much different. He did start with a buggy OS, but it wasn't CP/M as you imply.

alexkerhead
May 12th, 2006, 10:07 PM
> Of all things it was a rip-off of an OS (CP/M)

and

> Bill Gates repackaged a very clone of CP/M from Seattle
> Computer Products and licensed it to IBM. That became
> DOS 1.0. Only in DOS 2.0 did DOS start to pick up more
> of the Unix features that it has today.

Then

> As noted above, Bill Gates didn't start with CP/M. If
> he had, the world would be much different. He did start
> with a buggy OS, but it wasn't CP/M as you imply.

Huh?
You got me confused.
That appears extremely hypocritical. I never implied Windows was a rip-off of CP/M; I claimed that DOS was an improved rip-off of CP/M.

mbbrutman
May 13th, 2006, 11:43 AM
I missed the word 'poor', but those of us who remember it would have subconsciously filled it in.

Like we said, Bill Gates did not start with CP/M. He started with a very poor clone of it, which he obtained from Seattle Computer and repackaged as DOS for IBM.

So DOS is actually an improved version of a ripoff of CP/M, not an improved ripoff of CP/M. There is a difference.

Please explain your use of the word hypocritical? Did somebody say to do one thing, then do something else?

carlsson
May 13th, 2006, 11:48 AM
A clone is not the same as the real thing, i.e. it is possible that QDOS mimicked CP/M to a great degree, but maybe not as comprehensively as the real CP/M. But no matter what, an operating system is an almost transparent part of a computer system to most people, both back then and still today. Those using personal computers around 1980 cared about which applications they could run, not the command system, unless they were technicians or programmers. Same thing today: if the operating system works without flaws, few people will reflect on whether it is a new or old version of Windows, some Linux, OS X or any custom-developed system for a special appliance. We who hang out here certainly care, both then and now, which operating system we're using, but I'm sure we belong to the minority quarter, or even less, of people who really place a value on it.

CP/M User
May 13th, 2006, 02:13 PM
To me, DOS & CP/M are two separate operating systems -
prompt-wise they're the same, until you dig down and
discover that CP/M uses user areas - as opposed to
directories in DOS 2.x & up (DOS 1.x didn't even have
those). Programs which came with CP/M work differently from
similar programs found in DOS simply on the command line -
options-wise - which can really set apart the differences
between DOS & CP/M.

On the IBM, DOS & CP/M-86 v1.x were competing with each
other in the beginning - CP/M-86 v1.0 was being sold
through IBM, & IBM were continuing to sell v1.0 when DR
released v1.1. However, v1.0 was being sold for a lot more
than what DOS was. Unlike the hacked 8-bit version of CP/M,
CP/M-86 on an IBM is different in that programs aren't
known as COM files - they have CMD files instead - which
from what I can tell can exceed the 64k limit found in DOS
COM files & CP/M-80 COM files.

So to me I just don't see the connection - an improvement of
CP/M would show compatibility with CP/M - which DOS doesn't
really do.

Terry Yager
May 13th, 2006, 02:15 PM
I don't think we have a clear enough picture as to how Quick&Dirty came about. Was it based on reverse-engineering CP/M-80, or did SCP just write a whole new 16-bit OS based on the known (well-documented) functionality of CP/M?

--T

Terry Yager
May 13th, 2006, 02:23 PM
<snip>

So to me I just don't see the connection - an improvement of
CP/M would show compatibility with CP/M - which DOS doesn't
really do.

Most, if not all, CP/M-80 software can be easily converted to 16-bit with the software tools DR shipped with CP/M-86.

--T

alexkerhead
May 13th, 2006, 03:23 PM
I missed the word 'poor', but those of us who remember it would have subconsciously filled it in.

Like we said, Bill Gates did not start with CP/M. He started with a very poor clone of it, which he obtained from Seattle Computer and repackaged as DOS for IBM.

So DOS is actually an improved version of a ripoff of CP/M, not an improved ripoff of CP/M. There is a difference.

Please explain your use of the word hypocritical? Did somebody say to do one thing, then do something else?
I see what you are saying now: Bill Gates based DOS off of something quite like DOS, but not CP/M itself, though the ideas came from CP/M. Did I get that right?
I meant the statement seemed hypocritical after the previous statement; I should have used a more appropriate word like contradictory instead, but I was very tired..lol No offense was intended.

Edit:

So to me I just don't see the connection - an improvement of
CP/M would show compatibility with CP/M - which DOS doesn't
really do.
You may have taken my statement too literally. I meant he improved on the basic principles (ease of use and such), as opposed to the actual OS... I gotta get more sleep.

mbbrutman
May 13th, 2006, 04:39 PM
No more late night posting ;-)

Terry Yager
May 13th, 2006, 04:51 PM
What about early-evening half-fit-shaced posting?

--T

Micom 2000
May 13th, 2006, 05:17 PM
I do it all the time. Posting while wasted can elicit interesting reactions the day after: crushing embarrassment, or interesting insights to explore.

Lawrence

Terry Yager
May 13th, 2006, 05:21 PM
Now why do I find that so easy to believe, L.?

--T

carlsson
May 13th, 2006, 05:44 PM
So, does anyone know of any technological advances or software breakthroughs originally achieved under the influence, or under lack of sleep? Surely a lot of recent demos and probably games have been developed in such a state, and I've read stories about development of computers in the '80s where the team worked night and day.

Micom 2000
May 13th, 2006, 06:52 PM
Heh, heh, heh. These days do seem quite staid when compared to the wild days of the BBS or free-nets. Possibly some inhibiting factor of big HD posterity.

But back to the thread.

Gates did back then what he has continued to do ever since, like most others, including Jobs: steal. Even Lotus was founded on code from a couple of Canadian Northern Electric programmers, who lost their 10-year suit only on a questionable technicality. Gates was still damaged by the loss of his Basic code (see his famous open letter regarding it) and never recovered psychologically. Quick and Dirty DOS was a rip-off of CP/M, and Gates jumped at IBM's sponsorship.

Big Blue's entry into the personal computer market was the kicker. Kildall recognized the challenge, but CP/M had a vast catalogue of applications and he saw himself as a giant-killer. Gates was a small-time player with only his Basic and a traffic-control program, IIRC.
DEC also entered the fray with its Rainbow but quickly withdrew.

Jobs also saw the threat and retreated into zealously defending his GUI turf. Compaq jumped onto the IBM bandwagon by doing a safe, legal clone. IBM had immense clout with the business community and played the game perfectly for the last time. Their history since then has been mismanagement in the personal computer market - the PS/2 fiasco, for example. MS-DOS became a monster which eventually overwhelmed even Big Blue.

DR still had some entries, including GEM, but Apple managed to cripple that. Microsoft's early entries into the GUI sweepstakes were pretty pitiful when compared to most existing platforms like the ST using GEM, the Amiga, the CoCo's OS-9, not to mention Apple. The difference was IBM's clout with big business and government. Big bucks.

When Jobs was forced out of Apple he developed what was the best computer, Unix-based OS, and GUI ever. The present Mac system is really just an upgraded NeXTSTEP.

IBM and Microsoft joined forces to develop OS/2 but then split, and even IBM's clout couldn't rescue it from the obviously inferior Windows behemoth, which had become comfortable with millions of users. Like the popularity of a Big Mac hamburger despite its questionable nutritional value or quality. Anyone who has ever used Caldera's DR-DOS can recognize its superiority to MS-DOS.

MS-DOS and Windows may have standardized the personal computer, but it is questionable whether that has been a positive thing. Many more promising developments have been savaged by its ravaging power, as illustrated in the court investigations, now buried by the pro-corporate Bush administration and put on the back shelf.

This is why so many of the cogs in the IT world have so enthusiastically embraced Linux.

Lawrence

Terry Yager
May 13th, 2006, 07:04 PM
So, does anyone know of any technological advances or software breakthroughs originally achieved under the influence, or under lack of sleep? Surely a lot of recent demos and probably games have been developed in such a state, and I've read stories about development of computers in the '80s where the team worked night and day.

Have you seen the book Where Wizards Stay Up Late? It's all about the formation of the I-Net, and the people involved. An Amazon search should turn up a reasonably-priced copy (sorry, I borrowed the one I read from the local library).

--T

CP/M User
May 13th, 2006, 08:06 PM
mbbrutman wrote:

> No more late night posting ;-)

Must confess - I've done a little bit of that. Most people
think I write like a drunken geezer - sometimes I wonder what
the h£!! I've written! ;-)

Sorry about the mixup. :-(

CP/M User.

CP/M User
May 13th, 2006, 08:15 PM
carlsson wrote:

> So, does anyone know of any technological advances or
> software breakthroughs originally achieved under
> influence or lack of sleep?

Well, I don't think anyone has researched this properly - but
the main reason I'm drawn to this site (apart from being a
long-time member) is all because of the Internet - for me
it's an addiction - I've been addicted since I first started
using it in 1996! It's all about the atmosphere - you come
here, have a chat, have a good time. Perhaps it's soothing -
a way to do stuff most of the day and just forget about all
your other troubles?

> Surely a lot of recent demos and probably games have
> been developed in such a state, and I've read
> stories about development of computers in the '80s
> where the team worked night and day.

Sounds fascinating - has anyone got a story which confirms that?

CP/M User.

alexkerhead
May 13th, 2006, 10:09 PM
I usually go to bed at 3:00 or so; I believe I have some posts somewhere that are just stupid that I made at 2 or 3...lol
If I look hard, I have made some at 4am... that were bad too. I gotta go to bed earlier..lol, but then I'd have to wake up earlier, when absolutely nobody is posting.
Anyway, back to the topic.
Uh, I also believe the creation of microprocessors spurred unprecedented development in the computer world.

CP/M User
May 13th, 2006, 10:25 PM
alexkerhead wrote:

> I usually go to bed at 3:00 or so; I believe I have
> some posts somewhere that are just stupid that I made
> at 2 or 3...lol
> If I look hard, I have made some at 4am... that were
> bad too. I gotta go to bed earlier..lol, but then I'd
> have to wake up earlier, when absolutely nobody is
> posting.

Sounds like you need to be one of those Night Shift
Workers! ;-)

CP/M User.

alexkerhead
May 13th, 2006, 10:55 PM
CP/M User wrote:

> Sounds like you need to be one of those Night Shift
> Workers! ;-)
LOL, good idea. I wish I could do customer work in the wee hours of the night. Mean customers won't even wake up for me to fix their comps. Errr.

CP/M User
May 14th, 2006, 01:32 AM
alexkerhead wrote:

> LOL, good idea. I wish I could do customer work in
> the wee hours of the night. Mean customers won't even
> wake up for me to fix their comps. Errr.

Theoretically the pay should be good - as long as they pay
overtime! :-)

CP/M User.

carlsson
May 14th, 2006, 03:13 AM
Surely a lot of recent demos and probably games have been developed in such a state, and I've read stories about development of computers in the '80s where the team worked night and day.
Sounds fascinating - has anyone got a story which confirms that?
Bil Herd made a couple of posts on CompuServe in 1993, collected here into one read. It is the amusing story of how the Commodore 128 came to life, and it sounds like some people on the design team didn't sleep much at times.

http://www.ffd2.com/fridge/chacking/c=hacking17.txt (at the very end)

I'm sure that other teams working towards impossible deadlines took similar approaches, but the question is whether a true landmark can be achieved when you work under intense stress.