PDA

View Full Version : The de-evolving computer



Dms12444
February 14th, 2010, 01:15 PM
I've noticed that in the modern world, a lot of computers and their parts are made of very cheap materials. The odd thing is I have a 7-year-old XP machine that is supposedly a "good computer" because it lasted that long. Have computers de-evolved (or a better term would be de-engineered) over the years? We still have 30+ year old computers on this website which are still operational. Another thing I have noticed is that a lot of computer parts don't seem to last as long as they used to. Now granted, I'm only 14 and haven't been around long enough to actually see computers as they are now, but I do know that I have had many HDDs, processors, motherboards, etc. which seem to fail a lot faster in newer computers than in older computers. And that's an accomplishment, since the vintage computers in my house outnumber the recent computers 2 to 1. A lot of recent computers I have seen have odd styling and things that just look plain ugly. Vintage computers often look good and are plain at worst, but not ugly like many computers I see in stores today. Is it just me, or have computers been getting worse since 2000? Have any of you noticed anything similar?

tezza
February 14th, 2010, 05:08 PM
Well, my knee-jerk reaction would be to say "yes", computers (along with many other things) are poorer quality and that's a reflection of our throw-away disposable society. Computer components aren't built to last, as they will be obsolete very soon.

Personally I have no proof of this though, and the opinion above is probably simplistic. In theory, given surface-mount and LSI technologies, computers SHOULD be a lot more reliable nowadays than, say, 30 years ago, and perhaps they are. I haven't owned enough newish computers to know one way or the other.

Someone who has been dealing with computers at the hardware level for 30 years or so should have an idea.

Tez

linuxlove
February 14th, 2010, 06:42 PM
I find that if I just stay away from PC Chips, I'll be good. However I don't buy brand-new, in the box systems. If I'm going to get a brand-new system, I would want to build it myself to make sure I get good quality stuff.

My IBM PS/2 Model 25 has been chugging along for 23 years and still works fine. So far I haven't managed to kill anything in it, and let's hope it stays that way. Yeah, it will most likely fail sometime (<joke>oh look, it just did</joke>), and with how knowledgeable the people here are, I feel I can learn how to repair it.

One thing I have learned is that if the computer is older than you, you should be asking questions about it, not answering them.

Chuck(G)
February 14th, 2010, 07:28 PM
A laptop or desktop is expected to put in, what, at most 2 years of service before it's replaced? So there's no point to over-engineering them.

In the '80s, many computers were upgraded with new motherboards and peripherals, but that's very rare today. And laptops, which now outsell desktops, aren't upgradable anyway.

It's still possible to build a good looking system, but it'll cost ya. I like the look of some of the HTPC cases, for example.

Vint
February 14th, 2010, 07:36 PM
Interesting topic, and I can only speak from my limited experiences.
What I'm finding is that I still have several computers I've bought since 2002. Mine aren't wearing out - they simply become obsolete, or un-upgradeable for one reason or another. My current PC is a Compaq Presario SR1620NX, with a pretty basic 2 gig Sempron CPU. I've upped it to 1.5 gigs of RAM and an eVGA GeForce GS7300 video card since my wife bought me this computer as a retirement gift in January of 2006. I must say, I've been impressed with it since the first day I pulled the side cover.

http://www.vintage-computer.com/vcforum/attachment.php?attachmentid=3125

It's a breeze to work on with its simple slide trays for the DVD and HD. The front pops right off too. I accidentally got tangled in a joystick cord while walking by it one day, and as I tripped, the PC took a head-over-heels tumble from a 3' high desk onto the floor, landing upside down. Didn't hurt it one bit inside. It dented the case a smidge, but nothing to brag about. It takes a lickin' and keeps on tickin' :) My point is it still has all its original stuff, with some add-ons, and I don't figure this machine will wear out anytime soon. I suppose my definition of 'wear out' would be a defective motherboard. Everything else is pretty much an easy household replacement part.
I also have an eMachines T1100 and an eMachines T2200, both running perfectly. They are just somewhat obsolete to me. So all the machines I've had since 2002 are still running fine.
I have many pristine 25-year-old computers and they still run fine too. One reason may be that many of the vintage machines I own, I suspect, were left in closets for much of their lives. The machines I've been using since 2002 have undergone high-traffic use, so I can't compare these apples to those oranges.
I do believe hard drives made today are of much lesser quality. Internal drives seem OK - I have yet to lose an internal - but external drives, for me, are dropping like flies! I've lost 4 externals in the past couple of years. In fact, I won't buy another external!

Chuck(G)
February 14th, 2010, 08:09 PM
Compare your Presario with some of the 90's era Vectras and Deskpros in terms of quality construction. There's a big difference.

pontus
February 14th, 2010, 10:36 PM
Well, there are a few things to consider.

Computers used to be expensive. Today you can buy a really nice machine (performance- and featurewise) for very little money. A lot of money might not get you quality, but a little money gives you crap.

Computers used to be built to last a long time. Today you get a service contract for 2-3 years, and when it runs out you are expected to buy a new one. Quite sensible, when the performance gained from an upgrade is worth it.

Computers used to be serviced! Today you throw it out when it breaks (which is why I often find machines with simple problems in the trash). Manufacturers are only inclined to make serviceable computers if they expect to get them back in for service.

So yes, computers have poor quality today compared to yesteryear.

krebizfan
February 15th, 2010, 05:11 AM
I think the overall quality is about the same now as compared to 30 years ago. Budget systems were meant to be tossed away once a connector broke; no real change from super-cheap Sinclairs to super-cheap Dells. Initial models of any disk technology were unlikely to work well; whether looking at early 5.25" floppies or the latest hard disks, problems should be expected. I think there are fewer systems that just plain won't work than back in the past. No currently sold computer is as unreliable and useless as the Pascal Microengine was.

I don't like the change to easily reconfigured software. Flashing a BIOS is much quicker than waiting for IBM to ship a replacement planar board. But that very ease leads to a wholesale ship-before-testing mentality.

Dms12444
February 15th, 2010, 07:39 AM
I find that if I just stay away from PC Chips, I'll be good. However I don't buy brand-new, in the box systems. If I'm going to get a brand-new system, I would want to build it myself to make sure I get good quality stuff.

I do that a lot, though it is usually quite a bit more expensive than buying an "in the box" system. The parts are of good quality, however (most of the time).


Computers used to be expensive. Today you can buy a really nice machine (performance- and featurewise) for very little money. A lot of money might not get you quality, but a little money gives you crap.

That has just about always been true: in 1983 you could buy a Timex Sinclair 1500 for $79.95 (adjusting for inflation, about $172.71 today). It was not the best computer (by far), but it was cheap. The original IBM PC from 1981 went for about $3000 ($7,080.43 today), which was expensive, and it soon had many cheaper clones which were faster and had more memory at a lower price (e.g., Compaq desktops).
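For what it's worth, inflation figures like those can be reproduced with a simple CPI ratio. A rough sketch - the CPI values below are approximate US annual averages I've assumed for illustration, not numbers from this thread, so the results only roughly match the figures above:

```python
# Approximate US CPI-U annual averages (assumed for illustration).
CPI = {1981: 90.9, 1983: 99.6, 2010: 218.1}

def adjust(price, from_year, to_year):
    """Scale a price by the ratio of CPI values between two years."""
    return price * CPI[to_year] / CPI[from_year]

# Timex Sinclair 1500, $79.95 in 1983, in 2010 dollars:
print(round(adjust(79.95, 1983, 2010), 2))
# Original IBM PC, ~$3000 in 1981, in 2010 dollars:
print(round(adjust(3000, 1981, 2010), 2))
```

The exact output depends on which CPI series and annual averages you plug in, which is why published inflation calculators give slightly different answers.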

vwestlife
February 15th, 2010, 08:59 AM
Some of it has to do with the Restriction of Hazardous Substances directive (RoHS), which went into effect in 2006. The lead, mercury, and other contaminants that were used in electronic components now must be reduced to very low levels. Circuit boards, integrated circuits, resistors, and other components all had to be redesigned to comply with RoHS. And just like removing trans fats from foods, some say the quality isn't as good anymore. Also because of RoHS, some electronic products which were made for years and years are no longer available: rather than redesign them to comply with RoHS, the makers simply ended production in 2006.

MikeS
February 15th, 2010, 10:35 AM
Well, my knee-jerk reaction would be to say "yes", computers (along with many other things) are poorer quality and that's a reflection of our throw-away disposable society. Computer components aren't built to last, as they will be obsolete very soon.

Personally I have no proof of this though, and the opinion above is probably simplistic. In theory, given surface-mount and LSI technologies, computers SHOULD be a lot more reliable nowadays than, say, 30 years ago, and perhaps they are. I haven't owned enough newish computers to know one way or the other.

Someone who has been dealing with computers at the hardware level for 30 years or so should have an idea.

Tez

In my experience and opinion, computers are certainly a lot cheaper these days, both in price and in construction, but I don't think they're necessarily less reliable. Sure, the electrolytic caps issue caused a lot of failures, but that was an anomaly; hard disks fail now and they failed then, but I've had a lot fewer IDE drives fail than the MFM & RLL and early IDE drives of old.

I think the higher level of integration, fewer connections and sockets, technological and engineering advances, etc. all add up to more efficient and more reliable hardware, even considering that to be fair you'd have to compare a 5150 to a top-end $30,000 server of today, and your dual-core laptop to a Rapidman calculator of yesterday. And why is a part made of plastic necessarily less reliable than the same part made of steel or aluminum, as long as they both do the job for which they were designed?

So, my experience says if you leave them alone they tend to just chug along fine; the systems I deal with run 24/7 and years ago I'd be on the road fairly regularly to deal with a hardware problem, but I think in the last five or six years all I've had to do was replace a few fans.

barythrin
February 15th, 2010, 12:15 PM
I'm still miffed that a system I built last year had the motherboard blow a cap within 9 months. That, and somehow stores only allow returns on computer parts for 14 days around here (Fry's, Best Buy). I mean... *REALLY?!* You don't trust it to last more than 14 days? wtf.

I do wonder if the lack of quality is due to consumer demand, compared to the '80s. I'm not sure folks would buy computers at the inflated prices they'd cost today if prices hadn't been reduced, but since 1-2 computers are common in most American households, I can't help but wonder if the commonality, versus the old novelty, has reduced the expectation of quality in these items. Not that I accept the lack of quality on that excuse, but I certainly am curious.

Dms12444
February 15th, 2010, 12:21 PM
I'm still miffed that a system I built last year had the motherboard blow a cap within 9 months. That and somehow stores only allow returns on computer parts for 14 days around here (Fry's, Bestbuy). I mean .. *REALLY?!* you don't trust it to last more than 14 days? wtf.

I do wonder if the lack of quality is due to consumer demand vs the 80's though. I'm not sure folks would buy computers at the inflated price they'd cost today if it hadn't been reduced but since 1-2 computers is common in most American households I can't help but wonder if the commonality vs novelty has reduced the expectation of quality in these items. Not that I accept the lack of quality per that excuse but I certainly am curious.

I've seen many instances where companies don't expect things to last; in some cases large scandals are involved, see here (http://redhill.net.au/w/w-cents.html). As for the cost cuts, they likely play a large part in the quality of the components; computers costing several thousand dollars simply wouldn't be practical anymore.

Chuck(G)
February 26th, 2010, 01:37 PM
It's not your imagination. According to an online article, one in three laptops fails within three years of purchase (http://www.designnews.com/blog/Made_by_Monkeys/31032-One_in_Three_Laptops_Fail_Within_3_Years_.php?nid=4871&rid=1504665).

donutty
February 26th, 2010, 02:10 PM
Everything is designed to a price... that's what the customer wants; cheap. Pay more and it generally lasts longer. It's the universal rule (applies to cars, electronics, buildings, the chair you are sitting on...)

Also, I think when computers were the new big thing, and not yet widespread, more money was spent on them because there were fewer buyers; hence the companies could charge the same price as a new car! Today, every man and his dog wants 'the thing', whatever that is, so it is made cheaply (also read 'quickly') to satisfy the market's appetite.

Chuck(G)
February 26th, 2010, 02:50 PM
Everything is designed to a price... that's what the customer wants; cheap. Pay more and it generally lasts longer. It's the universal rule (applies to cars, electronics, buildings, the chair you are sitting on...).

The chair I'm sitting on is made by Steelcase and is over 20 years old. My rear end will wear out before it does.

I don't know how long you've been around, but for me, to see HP come in after Acer is a real shock--but then, Apple comes in a couple slots after Dell. So much for the argument that "you get what you pay for..."

Fallo
February 27th, 2010, 07:59 PM
Computers used to be expensive. Today you can buy a really nice machine (performance- and featurewise) for very little money. A lot of money might not get you quality, but a little money gives you crap.

The most unreliable computers are the commodity ones you get from Radio Shack and the like. If you assemble your own machine, you can put much better components in it, and it won't come full of adware (a way of keeping prices down).


Computers used to be serviced! Today you throw it out when it breaks (which is why I often find machines with simple problems in the trash). Manufacturers are only inclined to make serviceable computers if they expect to get them back in for service.

Electronics in general. Many people throw out LCD panels (for example) if the backlight goes bad rather than replace it.

Consider also that in the '80s, electronics weren't yet being made in China. When that started in the '90s, quality really went down. Even if you want to repair something, say a DVD player, the company usually won't sell you any replacement parts.

donutty
February 28th, 2010, 02:18 AM
The chair I'm sitting on is made by Steelcase and is over 20 years old. My rear end will wear out before it does.

I don't know how long you've been around, but for me, to see HP come in after Acer is a real shock--but then, Apple comes in a couple slots after Dell. So much for the argument that "you get what you pay for..."

Everything taken into account, the argument still holds true. Granted, in a few cases, the design effort and the drive to make a better product can still produce a value-for-money product that will outlast similar products. Nowadays, most things are rushed out and it shows (especially with software; I can't remember a cellphone I've had in the last 5 years that didn't have an inexcusable software glitch. That covers Sony Ericsson, Nokia, Samsung and even my current BlackBerry).

I used to work for a high-end plasma TV manufacturer. Their products were among the most expensive on the market, and the average person in the street would not have recognised the name unless they were a 'videophile'. That partly explains why the factory eventually closed down and moved abroad, and why they re-designed (down-scaled) their product line. But they were the highest-ranked manufacturer in product reviews, and the return rate was very small compared to other brands.

For example, the capacitors used were rated at (I can't remember exactly... just an illustration) a 5-year lifetime, which was 2 years more than was common in the industry. Those capacitors cost more, hence the end product cost more. Couple that with good design (which makes sure those capacitors aren't overstressed) and the product will 'in theory' last longer.
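The usual rule of thumb for aluminium electrolytic capacitors is that expected life roughly doubles for every 10 degrees C the part runs below its rated temperature, which is why both the rating and the "not overstressed" design matter. A rough sketch - the hour and temperature figures below are illustrative assumptions, not the actual parts from that TV:

```python
def cap_life_hours(rated_hours, rated_temp_c, actual_temp_c):
    """Rule-of-thumb electrolytic cap life: doubles per 10 degC below rated temp."""
    return rated_hours * 2 ** ((rated_temp_c - actual_temp_c) / 10)

# A cheap 2000 h / 85 degC part vs a better 5000 h / 105 degC part,
# both running at 55 degC inside the chassis (illustrative numbers):
cheap = cap_life_hours(2000, 85, 55)     # 2000 * 2^3  = 16,000 h
better = cap_life_hours(5000, 105, 55)   # 5000 * 2^5  = 160,000 h
```

The better-rated part lasts an order of magnitude longer at the same operating temperature, which is exactly the "few pennies more on caps" trade-off discussed elsewhere in this thread.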

Contrast that to a cheap 12V -> 230V power inverter I was recently given, which lasted about 3 months from new. Inside, the bank of FETs wasn't even screwed to the aluminium casing (they were in the right position to be!), which would have acted as a heatsink. Had the (Chinese) manufacturer spent a few more pennies on some damn screws, the parts wouldn't have overheated and killed themselves. I never buy the cheapest of anything, but never the most expensive either. To be honest, rather than choose something in the middle, I tend to side with the 'slightly more than average'. I've never owned the top-of-the-range anything, but I have always steered clear of the cheapest.

Fallo
February 28th, 2010, 02:02 PM
I used to work for a high-end plasma TV manufacturer. Their products were among the most expensive on the market, and the average person in the street would not have recognised the name unless they were a 'videophile'. That partly explains why the factory eventually closed down and moved abroad, and why they re-designed (down-scaled) their product line. But they were the highest-ranked manufacturer in product reviews, and the return rate was very small compared to other brands.

I'm guessing it was Pioneer. They were considered the royalty of plasma TVs, but two years ago quit the business entirely and just make stuff like car stereos now.

Rather too bad that there are few plasma TVs left, as the picture quality is better than LCDs and they have no motion blur.

donutty
March 1st, 2010, 12:20 PM
Yes, Fallo, you are correct. They sold their plasma technology to Panasonic, a more commercial player who thought they could make it pay.
Yes, plasma was the better technology then (I have no bias nowadays), but now I have a Samsung LCD which is as good, for the price.


I'm guessing it was Pioneer. They were considered the royalty of plasma TVs, but two years ago quit the business entirely and just make stuff like car stereos now.

Rather too bad that there are few plasma TVs left, as the picture quality is better than LCDs and they have no motion blur.

Fallo
March 1st, 2010, 01:52 PM
Yes, Fallo, you are correct. They sold their plasma technology to Panasonic, a more commercial player who thought they could make it pay.
Yes, plasma was the better technology then (I have no bias nowadays), but now I have a Samsung LCD which is as good, for the price.

I have been completely unmoved by LCD TVs. They work fine as computer displays, but not for TV viewing. Upscaling and HD lag are a major problem (although plasma has this as well). I wouldn't get one without also having an external scaler box (and those aren't cheap).

What turned a lot of people away from plasma was the fear of burn-in, which is not so much of an issue on newer sets. Also, they banned them in California (but fortunately I don't live there).

Ole Juul
March 1st, 2010, 01:58 PM
There are still high-quality items being manufactured these days. I don't watch TV, but I think the BeoVision series is about as high-end as you would want, and B&O isn't showing any signs of going out of business. Their 60" plasma model is about $40,000 in Canada. B&O has always made quality stuff.

To me, one problem in this "de-evolution" we are talking about here is that we tend to think of the world from our own point of view. What I'm trying to say is that our immediate environment, be it physical or economic, is what we take as the reference point. We are still getting lots of stuff, but it is cheapened. If you step away from that local perspective, you will see that what has really happened is that we have gotten poorer.

strollin
March 7th, 2010, 09:36 AM
... That and somehow stores only allow returns on computer parts for 14 days around here (Fry's, Bestbuy). I mean .. *REALLY?!* you don't trust it to last more than 14 days? wtf. ...
I think you are confusing a store's return policy with a warranty. Why should a store accept a return after 14 days? If the thing fails after 14 days, how is the store responsible? I personally agree that after 14 days you should deal with the manufacturer rather than the store.

I agree that construction quality may not be as good today as in the past, but I also think things were overbuilt in the past. I don't think the computers of today are any less reliable; in fact, I think they are more reliable. I can't remember the last time I had a memory failure, but I had lots of memory failures on my early computers. Modern hard drives are amazing to me. They are larger capacity, faster, use less power, are much cheaper, and are more reliable than drives of the past. This will be even more true when SSDs become more mainstream.

I recently realized that my work laptop, an IBM Thinkpad T40, was 7 years old. It still worked fine (never needed service) but was long out of warranty. I requested a new laptop and was given a new Dell Latitude. The Dell was less expensive than the IBM and blows it away in terms of performance and features, we'll see if it lasts 7 years. I also have 2 IBM towers under my desk at work that are working fine and have never needed any service. (Can you guess that I have been in my present job for 7 years?)

donutty
March 7th, 2010, 12:22 PM
Just got me thinking... maybe when computers were a luxury or an expensive toy, we treated them with more care and attention. Nowadays maybe we toss them around and then bitch when they break down. A bit like a car: if you feed it right (correct fuel, regular oil top-ups and changes) and look after it (don't thrash its ass off), you could be motoring around in it for another 50 years. Look at all the vintage motors still out there.

Fallo
March 7th, 2010, 01:05 PM
I don't really think things have changed that much over time, because (unlike consumer electronics) computers are used to store vital information, so they need to maintain a certain degree of reliability. The adage "you get what you pay for" has always been true, though. As an example, my Dell Pentium (from 1996) looks pretty sorry at first glance. It's been banged around, run hard, had almost everything upgraded, and had cards installed and pulled numerous times, but it still faithfully powers up, with only the occasional problem of memory chips coming loose (the SIMM sockets aren't very tight).

You have to remember too that Dell always has been a better quality brand. A Packard Bell of this vintage would have died years ago.

MikeS
March 7th, 2010, 02:44 PM
I have to agree with strollin; I also think that computers are generally much more reliable today. I'm not sure I even agree that the construction quality is worse today; cheaper, to be sure, and very much different, but within the various applicable constraints I don't think it's necessarily worse when you consider the advances in automation and QA technology, the ISO qualification system etc.

And I certainly don't agree that you necessarily get what you pay for, unless you're shopping for a name label instead of a computer.

digger
March 8th, 2010, 01:47 AM
Regardless of the developments over the years and decades, the fact remains that the most unreliable part of a computer system is generally the software. This was true then, and continues to be true today.

Other than software, also all too often still PEBKAC. ;)

Some things never change.

lotonah
May 8th, 2010, 05:12 AM
Computers are definitely not made to last now.

I've worked in the computer retail industry most of my adult life. Particularly in the past 6-7 years, the industry has been using cheap capacitors. That was probably the number one reason I saw motherboards dying... blown caps. A few pennies more, and you'd have solid caps and no problems.

Another problem is heat. Fans fail; they always do. It was around the end of the 486 era that all CPUs started needing a heatsink/fan combination. Once manufacturers started relying on that, reliability went down significantly.

The third problem is having everything integrated. Back when you had the video, sound, ethernet, hard drive controller, etc. as separate cards in the computer, the computer was generally more reliable. If the sound card failed, you most likely didn't have to replace the motherboard. Now if the sound card fries... well, maybe that chip controlled something else, too...

Although I have to admit, that's hardly a new idea. I think the VIC-20 sound chip controlled the I/O of the floppy drive, and so did the Atari ST. The more you rely on a chip to be multi-function, the more problems you're asking for.

MV75
May 11th, 2010, 03:45 AM
Electronics in general. Many people throw out LCD panels (for example) if the backlight goes bad rather than replace it.

Consider also that in the '80s, electronics weren't being made in China yet. When that started in the '90s, quality really went down. Even if you wanted to repair something, like say a DVD player, the company usually won't sell you any replacement parts.

When's the last time you saw a TV repair place? They just don't exist anymore. It's cheaper to throw away a whole TV than to try to replace a backlight these days.


Computers are definitely not made to last now.

I've worked in the computer retail industry most of my adult life. Particularly in the past 6-7 years, the industry has been using cheap capacitors. That was probably the number one reason I saw motherboards dying... blown caps. A few pennies more, and you'd have solid caps and no problems.

Another problem is heat. Fans fail, they always do. It was around the end of the 486 computers that all CPU's started needing a heatsink/fan combination. Once they started relying on that, the reliability went down significantly.

The third problem is having everything integrated. Back when you had the video, sound, ethernet, hard drive controller, etc. as separate cards in the computer, the computer was generally more reliable. If the sound card failed, you most likely didn't have to replace the motherboard. Now if the sound card fries... well, maybe that chip controlled something else, too...

Although I have to admit, that's hardly a new idea. I think the VIC-20 sound chip controlled the I/O of the floppy drive, and so did the Atari ST. The more you rely on a chip to be multi-function, the more problems you're asking for.

There's no need for computers to last anymore. They become obsolete within 6 months, and are really only usable for up to 2 years, at least from the perspective of keeping up with the latest software.

Fans went from sleeve to ball to fluid bearings that now last 5+ years. If they do fail, the CPU/VPU has built-in thermistors and will either clock back or shut the system down if it overheats.

Have a look at motherboards now. You'll be hard pressed not to find solid caps. That trend started about 1-2 years ago. Video cards are the same.

Integration, mmmm, yeah, you could just buy a new PCIe sound card. But integration is happening big time. Motherboard chipsets alone integrate a lot. Even CPU dies are gaining integrated memory controllers and VPUs. And VPUs are integrated with sound codecs so you get sound over HDMI. The only real problems encountered these days are outdated drivers or failed soldering, so people then bake their components in the oven to fix them.

mark66j
May 11th, 2010, 07:28 AM
I think corporate IT departments used to do a lot more repair on machines, but with the declining cost of the hardware the labor costs of even swapping a board may make it cheaper to simply trash the machine and start over.

And I think the projected lifetime is actually lower for higher-end gear. Particularly with Apple: I just don't think they expect the buyer of a MacBook Pro to keep it for 5 years. People buying on the high end of the performance curve usually want to stay up there, not be 2 or 3 years behind. Apple's OS upgrades generally cut off older hardware pretty fast, and I'm sure that's intentional. The same thing applies to a lot of gaming systems. They are more upgradable, but only to a certain extent. Spending more money on longer-lasting hardware locks you into a certain generation of it, and many people don't want that.

Raven
May 11th, 2010, 11:23 AM
I think corporate IT departments used to do a lot more repair on machines, but with the declining cost of the hardware the labor costs of even swapping a board may make it cheaper to simply trash the machine and start over.

And I think the projected lifetime is actually lower for higher-end gear. Particularly with Apple: I just don't think they expect the buyer of a MacBook Pro to keep it for 5 years. People buying on the high end of the performance curve usually want to stay up there, not be 2 or 3 years behind. Apple's OS upgrades generally cut off older hardware pretty fast, and I'm sure that's intentional. The same thing applies to a lot of gaming systems. They are more upgradable, but only to a certain extent. Spending more money on longer-lasting hardware locks you into a certain generation of it, and many people don't want that.

I recently learned that lesson, and that's why my next board will be an industrial one: the long-life boards that are rated to easily last 20-30 years. I'll pay a bunch more for it, granted, but it will have ISA on a modern C2Q, and it will also last long enough to be one of the flawless vintage systems I own in a dozen or more years. None of the computers I owned as a child have survived to now (or they were sold by my parents), so this will be a good change.