Year In Review: 2008 Computer Hardware Industry Failure
Written by Olin Coles
Wednesday, 31 December 2008
2008: The Computer Hardware Industry Failure

Every technology starts with a motivation, and throughout the past year Benchmark Reviews has watched the computer hardware industry follow trends that seem all too familiar from yesteryear. In this article by Executive Editor Olin Coles, the 2008 calendar year is summarized for its accomplishments, but the real focus is on the year ahead. Will 2009 be the year that component computer hardware becomes important again, or will it be relegated to a niche hobby for enthusiasts?
For most of the 2008 calendar year, headlines about powerful video cards and ultra-extreme motherboards have filled the tech blog news. If it wasn't the new third-generation (3G) Apple iPhone stealing the limelight, it was a new HDTV technology or a faster Solid State Drive. If you were to consume the industry buzz on a daily basis, you might become oblivious to the economic struggles everywhere else in the world. This past year has opened my eyes to a number of problems inside the consumer electronics industry, and there were several moments of clarity when I read 'the writing on the wall' telling me a storm was coming. I don't profess to have any more knowledge about the course of our economy than the next person, but whenever a clue appears I don't ignore it. This is why I am prepared to take the heat for making a bold assertion: component computer hardware is a smoldering industry, and will soon become a niche kept alive only by enthusiasts. Let me explain why I've come to this determination.

Back in May of 2008, I witnessed a major breakthrough for discrete computer graphics at NVIDIA's Editors Day event. What happened shortly thereafter? The traded value of NVIDIA stock (NVDA) sank 33% almost overnight. The two aren't exactly connected, but in some relevant ways they are. NVIDIA offered a new level of graphics never before seen, and made software more efficient with its CUDA programming language, but because fabrication flaws affected a series of chips, all of those other innovations were rendered meaningless. The lesson learned: money is what drives consumer electronics, not innovation.

This lesson extends to other areas, too. Take for example the HD DVD and Blu-ray Disc format war. HD DVD conceded defeat because of expired funding, not for lack of consumer acceptance. By most accounts, Blu-ray Disc was not the favored format. But again, money decided that fate long before innovation or consumer favoritism could save it. The past year is dotted with evidence, which I will outline in the next section.

2008: The Reason PC Lost

There are several key factors which have combined to create a 'perfect storm' for computer hardware, but if I were to put things into a simple sentence, it would be that PC hardware lost because of PC software. When Windows 3 was launched a lifetime ago, it was heralded as a major breakthrough in computing for its 16-bit interface and a structure less reliant on DOS. The world was in awe when Microsoft later released the 32-bit Windows 95, which marked the beginning of a very long and exciting era for computing. But what changed between Windows 98, 98 Second Edition, Millennium Edition, 2000, XP, and Vista? The truthful answer is: not much. Even though each release added small usability improvements and enhanced security, the persistence of 32-bit architecture meant that 64-bit desktop processors were ahead of the consumer software curve by more than four years. This is where my tale of woe begins: 2008 was a make-or-break year for many, and instead of improving technology or offering real innovation, software writers kept their eyes on mainstream money and ignored 64-bit computing.
This doesn't affect very many enthusiasts, at least not yet, but the triple-channel platforms recently introduced by Intel have made 6 GB or more of system memory the common amount for new systems. There's that writing again, right there on the wall: you can keep your old operating system and software, but if you try using them on new hardware there will only be heartache. Still, this is only a narrow view of how component PC hardware, namely motherboards, system memory, and processors, has progressed beyond software to the point of becoming a meaningless enhancement. There's still more damage to be done, and this time it's video games that are killing off the discrete graphics market.

NVIDIA and ATI have been fighting the good fight for as long as most can remember. But when was the last time you really needed a new video card to play your favorite game? For me, it was back when Battlefield 2 came out. My GeForce 4 MX played older games just fine, but this new title required something like the Radeon X800 GT. To this day, that three-year-old technology can still push most of the newest games at high settings. Which leads to the real problem: software is not pushing the need for better hardware. If there's no reason to upgrade, there's no reason to buy. If there's nobody buying, then the manufacturers have no reason to sell. Combine the lot, and you've got our present-day economic disaster. But wait, it gets better. While weak software development has stymied hardware sales, the biggest problems are just ahead... in the next section.

2009: Console Makes Gaming

In the last section I explained how status-quo software development over the past few years has led to the decline of hardware sales. In this section, I will detail why component hardware may have a difficult time returning to the mainstream, even if software comes around. Heading into 2008 there were three major console gaming platforms: Nintendo Wii, Microsoft Xbox 360, and Sony PlayStation 3. The reason most often given to me for console purchases was "because I'm tired of upgrading my hardware every few months to play a game". In all honesty, that was probably true up until about 2007, at which point PC video games stopped demanding an upgrade (as evidenced by Battlefield 2142, Call of Duty 4, and later Crysis Warhead).
So it would seem that gaming consoles were gaining ground because of an old tradition of costly computer hardware upgrades. In reality, I can see why this is happening. When a new high-performance video card is released, the same $400 that buys just one computer hardware component could also buy an entire console gaming system and a few games. It makes perfect sense: if you like to play games, you can do it on a bigger screen for less money than if you played them on a PC. But this is only the tip of the console argument.

It used to be that PC games held an advantage because of the heavy customization and control configurations available. Flight simulation and racing games all had PC-only peripherals that kept enthusiasts attached to their computer. But even those days have been erased, since most consoles now offer at least two thumb pads and eight buttons on the bundled controllers, and the same flight yokes and racing wheels are now available for consoles. Somehow, the PC industry, which has been restrained by lackluster software, has also let those same software writers sell it out in favor of console development. It used to be that a game was written for PC and ported over to console. Now it's the other way around. NVIDIA still has enough muscle to impose itself on a few writers, which results in at least a handful of PC-only video games, but everyone else wants their paycheck. AMD/ATI, which has never done much to keep gaming tied to the PC, has helped accelerate the situation and cut off the hand that feeds it. So now, there's no reason to be loyal to PC gaming when a console offers the same experience (or better). With gamers quickly turning to consoles for their entertainment, what role does the desktop computer play anymore? Soon, it will be none, as the age of notebook computing has come upon us.

Notebooks End Desktop Era

Console gaming systems are winning over gamers in large numbers, and often supply 3-5 years of top-end performance before a new system arrives. Desktop computers, on the other hand, see a new video card offered almost monthly. The games remain the same, more or less, but the hardware keeps getting bigger and more expensive. So it's no surprise that desktop computers have left little reason for consumers to occupy an entire corner of the room with a collection of costly components. This section may not contain the historical trends that the previous sections have, but the statistics show a trend. Desktop computers, which for years have been a lifeline for web browsing, e-mail, and video games, are no longer necessary. Notebook computers, and to a lesser degree ultra-mobile PC (UMPC) devices and netbooks like the MSI Wind and ASUS Eee PC, have assumed these roles at less cost and with a smaller footprint.
Added for good measure are the enhancements made to the compact computing segment. Small LCD screens have better resolution and clarity than before, wireless cellular Internet access can be built into a device, and DVD burners are standard equipment. In some cases, even higher-end graphics are available for running CAD software. More than anything else, though, Solid State Drive technology has equalized the performance between platforms. Desktop sales have been in decline for the past four years, with notebook sales finally outnumbering them as of Q4 2008. For the same price as a low-end desktop PC, you can get a decent notebook computer capable of meeting all of the same needs while being portable and compact.

The writing, once again, has been on the wall for many years now. Back in 1998, I began my first job in a technology-related industry with a company named 1-800 Batteries. That company would later chase the quick-money carrot that caused so many dot-coms to bust, but not before changing its name to iGo and using notebook computers exclusively for almost 200 employees. That was ten years ago, before consoles took gaming away from the PC, and before wireless Internet. Skip ahead to this past Christmas. I have two friends who have each been early adopters of computer technology for many years. Neither of them works in a cutting-edge industry, and neither has anything more than a middle-class income. Yet, somehow, this year they each went and bought people in their family a laptop computer instead of a replacement desktop. This is the writing I'm talking about. It's there, and all you have to do is read it. Desktop computers are not dead, but they're going to be as useful as leaded gasoline in the very near future.

Final Thoughts and Conclusion

If at some point in this article you asked yourself why the editor of a computer hardware review site was taking the time to explain why his industry was wasting away, the answer is here in front of you: awareness. If you don't know where things are moving, you don't have any idea where you might land. Benchmark Reviews is going to be 21 months old on New Year's Day, and for nearly two years I (and my staff) have enjoyed testing the hardware that fills the pages of our Featured Content section. But for every product I receive, every motherboard and video card, I always find myself asking "why?". Have you wondered why we keep seeing a new video card every month from NVIDIA or ATI? The games haven't demanded more for almost three years now. How about Intel and AMD, which keep producing more cores for a processor that needs to do less? The system disk drive (either HDD or SSD) is usually the biggest performance bottleneck, so a faster CPU with more cores doesn't mean a better experience.
But look beyond the manufacturers, and ask yourself the questions. One question that keeps bothering me is: why do we still overclock? Back when I bought my first computer with a 200 MHz Cyrix MII processor, it made sense to get an extra 25% performance for faster video game frame rates. Even when I had an Intel Pentium 4, the Folding@Home project kept me interested in overclocking for higher work unit completion. But why do I still overclock, knowing that video cards don't need it and performance is hardly improved by it? That's the real question, and it's also the last stand for desktop computing. Or is it?

As this article goes to publication tonight, New Year's Eve, I have to prepare for the 2009 Consumer Electronics Show only a week away. CES always seems to lift spirits, and this year the industry needs a motivational boost like never before. But the real news doesn't come until January 9th, when I can reveal a product that could swing attention back onto the desktop computing platform. Even if 2008 turns out to be the apex of development for the computer hardware industry, we've still got a lot to be enthusiastic about. All of the big names, Intel, NVIDIA, and AMD, will survive. Maybe they won't keep placing so many eggs into one basket, and maybe they will start paying closer attention to the writing on the wall. Heading into 2009, I know that Benchmark Reviews will stay the course for as long as possible. But being the planner that I am, we've already started devoting some time toward a new direction and a new industry. I really don't want to do something different, at least not yet. So let's see if we can push back that launch for another year or two.

Questions? Comments? Want to just flame me for sharing my opinion or industry insight? Benchmark Reviews really wants your feedback. We invite you to leave your remarks in our Discussion Forum.