AMD Radeon vs NVIDIA GeForce: Graphics Last Stand
Articles - Opinion & Editorials
Written by Olin Coles
Wednesday, 15 December 2010
NVIDIA's Fermi GF110-powered GeForce GTX 580 sets a high standard, so can AMD's Cayman GPU with VLIW4 architecture save the Radeon HD 6970?

Over the past several years of testing desktop graphics hardware, I've enjoyed a unique perspective on the internal happenings that remain hidden from the public. Much like politics, there is the truth, and then there's what you've been convinced into believing. I've watched both companies' tactics, and witnessed their desperate attempts to sway consumer opinion. Fierce competition for sales within the desktop graphics segment raises the stakes for the companies involved. In this editorial, I'll share my opinion of what appears to me to be the last stand for desktop graphics between AMD Radeon and NVIDIA GeForce video cards.

On 15 December 2010, AMD will launch its latest flagship graphics processor, codenamed Cayman. This new GPU features dual graphics engines with asynchronous dispatch and off-chip geometry buffering for its 8th-generation tessellation units. Equipped with a 2GB GDDR5 256-bit video buffer, the Cayman GPU offers up to 24 SIMD engines and 96 texture units. Additionally, the AMD Radeon HD 6900 series will introduce several new MSAA modes, including Enhanced Quality Anti-Aliasing (EQAA).

But will this be enough for the upcoming AMD Radeon HD 6970 to compete with the NVIDIA GeForce GTX 580? It seems unlikely. Based on Catalyst driver 8.790.6.2000 (8.79.6.2 RC2) given to the press, the Radeon HD 6970 delivers approximately the same performance as NVIDIA's GeForce GTX 570. Recently launched at the $350 price point, the GeForce GTX 570 and Radeon HD 6970 trade blows between tests, but at no point does the Radeon HD 6970 approach GeForce GTX 580 performance levels. According to AMD, this won't occur until Q1 2011, when the company unveils the Radeon HD 6990 X2 video card.
Featuring dual Cayman GPUs, the codenamed 'Antilles' product line will replace AMD's aging Radeon HD 5970 video card. Unfortunately, missing the holiday shopping season could prove to be a deadly mistake for the AMD graphics team. NVIDIA was fortunate to launch its GeForce GTX 580 at the beginning of November 2010, allowing plenty of time to reach store shelves and holiday wish lists. The GeForce GTX 570 came a month later, and barely makes its way onto the market in time for holiday gift shopping. Launching only one week before Christmas and with inventory in scarce supply, the AMD Radeon HD 6970 and Radeon HD 6950 video cards might not see any gaming action over the long holiday break.

This brings me to the point of this editorial: could this be the last stand for significant desktop graphics matchups? AMD enjoyed a decisive head start in the DirectX 11 desktop graphics market back in September 2009, but it didn't take long for NVIDIA to catch up. Holiday sales of the Radeon HD 5000 series helped propel Advanced Micro Devices, Inc. (NYSE:AMD) share prices by nearly 100% just three months later, compared to roughly 20% for NVIDIA Corporation (NASDAQ:NVDA). One year later, both companies have maintained slow but healthy earnings growth, yet the effects of any short-term competitive disadvantage could prove harrowing. Adding stress to this scenario, sales of discrete desktop graphics (video cards) are expected to decline as this sector is slowly displaced by compact and mobile computing devices. It's a good thing, then, that top-end graphics sales make up only a small portion of overall revenue, while entry-level and mainstream graphics outsell top-end components all year long. Although AMD's Cayman GPU isn't going to beat the Fermi GF110 in frame-rate performance, there's plenty of opportunity to compete with NVIDIA on product price and value.
The Radeon HD 6850 and 6870 are still solid products that gamers can trust for good performance, but NVIDIA's GeForce GTX 460 has a stranglehold on the mainstream gaming market. Depending on the actual retail price of AMD's Radeon HD 6970, the newly introduced GeForce GTX 570 could be left to fill a large void in the $350 price segment. Price matters more than performance, more so now than ever before.

This takes us to the conclusion: what becomes of AMD graphics after two fruitless attempts at winning over the top-end market? The AMD Radeon HD 6870 failed to reach the top-level performance many expected, and was relegated to fighting off an army of factory-overclocked GeForce GTX 460s that sell at a better price point. Now the Radeon HD 6970 proves that AMD doesn't have a single-GPU answer to NVIDIA's GeForce GTX 580. The DX11 discrete graphics market is already saturated with AMD Radeon HD 5000-series cards, diced into portions so thin that a mere $10 separates some products, leaving little room for additional growth. With the convincing likelihood that AMD's Radeon HD 6900 series will miss the fast-approaching holiday season and the sales opportunities it brings, could AMD be forced into once again competing everywhere but the top?

UPDATE: Benchmark Reviews has published our test results for the AMD Radeon HD 6970 Video Card. Can AMD afford to miss this upcoming holiday season? Please leave your comments below.
Comments
So, ATi might lose this round, but that's OK as long as they stay competitive...
1. CrossFire scaling is better. Check the reviews of the 6800 series in CrossFire; it's miles better than SLI, often getting 100% scaling in some games. Many reviews are out there; a quick Google search turned up this one: ##tweaktown.com/reviews/3602/his_radeon_hd_6850_1gb_video_card_in_crossfire/index6.html
Crossfire didn't historically work better, but it sure does now.
2. Nvidia has other "cards" up its sleeve. While those might matter short term, DirectCompute and OpenCL will render them obsolete in the hopefully near future.
They both make good cards, and I don't think either company is better; fanboys are fools. Buy whatever offers better performance per dollar. Having the fastest card means nothing if it's not affordable. I'd be purchasing in the sub-$250 range, so anything above that means nothing to me. As for ATI not being competitive, that's nonsense: their cards offer excellent frames per dollar.
I've posted a fair bit regarding what I see as some pretty obvious bias on this site, and while Olin has always replied with what he felt, I'm not sure this makes any sense at all. I do see you say that they can compete on price, but I'm not convinced that the 460 has a "stranglehold" on the GPU market. I don't know anyone who'd buy one, and I know quite a number of people who have ordered 6000-series cards. Obviously, that's anecdotal and may not represent the mainstream, but ATI is making good chips, has been making good chips, and will continue making good chips. If not for AMD, we'd still be paying $800+ for a high-end GPU.
As for your claim that this site is "obviously biased", I'd love some proof. Whenever we publish AMD reviews, the NVIDIA fanboys claim we're paid to post the review. Whenever we post an NVIDIA article, the AMD fanboys cry the same thing. At the end of the day, we test and report our results. If they favor AMD or NVIDIA, so be it.
It may not be a knockout, but a stranglehold means the breath slowly winding out.
PhysX allowing different GPUs to be paired in a near-SLI setup was a great move for NVIDIA, and many people who already had an NVIDIA card stayed with NVIDIA. This kind of brand lock-in doesn't happen on the ATI side, and people want the fastest they can get; overall, NVIDIA is kicking butt or matching.
I really find it sad that ATI is getting a spanking; it's bad for us consumers.
He also stated that it didn't take long for Nvidia to catch up to AMD during the first round of DX11 cards. I'd say a six-month head start in any technology field is a lifetime, and the product Nvidia released was inefficient, loud, and hot. The GTX 480 is a fast card, no doubt about it, but it wasn't really a "success", especially when they released the cleaned-up version another five months later.
Last stand for desktop graphics? Don't be so dramatic
To say that AMD hasn't been competing in the top end market is plainly false. Ever since the 4000 series AMD has been competitive with Nvidia in every segment. The 4870X2 was faster than the GTX280, the 4890 and GTX285 traded blows in certain games, and as stated before the 5970 has been the fastest graphics card you can buy since its release and before the release of the 580.
To also say that the 6870 "failed" to reach top-end performance is another misleading hyperbole. The 6870 was never MEANT to reach top-end performance; it was stated many times before its release that it was replacing the 5770. Looking at current performance, I'd say it's done its job quite well.
All in all, it seems you have a misunderstanding of AMD's current strategy and are simply writing a dramatic opinion piece, with talk of "doomsday" scenarios for discrete graphics cards, simply to increase your hit count.
As for the GTX 460, it is a lost cause for nVidia. The card is now selling at $150, which is no longer very profitable for the company. This happened solely due to the Radeon 6800 series, which plain and simple demolishes both the GTX 460 and GTX 470. I hope that Radeon 6900 series brings the price of nVidia cards down, for this is what AMD is known for, pricing competitively and giving competition a hard time.
I have a 470 overclocked at 800/1600/1800 (25% easy gains) and it gives me stock 570 performance; the 6870 has little to NO overclocking headroom, since AMD pushed the clocks so far to make it competitive. I have seen others get their 470s to 875 core for close to 40%-50% gains over stock 470 clocks. Can the 6870 say the same thing?
Can a 6870 provide 570 stock performance?
##hardwarecanucks.com/forum/hardware-canucks-reviews/36477-gigabyte-geforce-gtx-470-super-overclock-review-9.html
Before you moan about heat with the 470: I have an MSI Twin Frozr II; it idles at 33C and hits 65C MAX at full load, and it's very quiet, as it only needs 60% fan speed to maintain those temps. And oh yeah, I got it for $270 SHIPPED!
Not to mention that CF is a pain in the rear to run, vs SLI.
The war rages on!
I have one in my possession, and it's already been tested. I've compared the results, and in four more days you'll see that I was completely accurate with my statement.
The rest of your comment is worthless as well, since you claim the "Radeon 6800 series demolishes both the GTX 460 and GTX 470". You should try reading a review every now and then for a real sense of the facts. Here's one to begin with: /index.php?option=com_content&task=view&id=633&Itemid=72
/index.php?option=com_content&task=view&id=606
Based on Futuremark's 3DMark 11 DX11 benchmark, enthusiasts are eager to bench and patiently waiting for new drivers with SLI support (the GTX 570 scores around 5600 in Performance mode). If you haven't benchmarked your DX11 cards, check out this pretty and taxing benchmark showing off some nice DX11 features and what DX11 is capable of! We'll see how Cayman performs in this benchmark as well.
Not sure about this "last stand" stuff; it must be there to get people to read. This is a war that is far from over.
By your logic, AMD has been 'losing' this ridiculous 'war of the GPUs', but somehow they still keep making GPUs. Yes, having almost the same performance with significantly less heat, power, and die size obviously counts as 'losing'. They must just magically make money out of nowhere to support their GPU business.
Wheeeeee.
##hardwarecanucks.com/forum/hardware-canucks-reviews/37499-hd-6870-hd-6850-vs-gtx-460-1gb-overclocking-study-5.html
Also, keep in mind that these cards were substantially more expensive before the 6870's release.
Yes, I believe that $30-$50 is considerably less when the market is diced into $10 increments.
You need to keep in mind that the Radeon HD 6870's prices all went UP after the launch. How about that?
Now for the aforementioned 460s:
That MSI card is $214. The Palit is $250 (LOL). The Gigabyte is $207 after rebate, $226 regular price. The Galaxy is $191 after rebate and $212 regular. The EVGA is sold out on Newegg, and the cheapest I could find it was $230 on Amazon.
Several of the aforementioned cards (I can't recall which) don't beat the 6870 at stock. Also, the 6870 is what brought most of these cards down in price.
I forgot to add the Sparkle X460G ($200), for $30 less. The Galaxy 60XGH6HS3GMW is $185, a difference of $45. The others are only about $10-$20 less. Still, less expensive is less expensive.
Why are the EVGA cards sold out while there's plenty of ATI left on the counter?
Doesn't that ring a bell for you?
Anyway, I don't know what your point is, but meh......
It's not optimized, but gone are the days when sims didn't need CPU power.
FPS and RTS games are not the most demanding anymore.
See Rise of Flight, and the upcoming SOW: BOB too.
The market exists; you just don't know where it is.
I am tickled to think DAAMIT sat around complacent while watching Nvidia resurge in this segment.
Don't forget the lessons of AMD when it got its clock cleaned by Intel's Conroe.
DAAMIT!!!
Cayman is nearly a third smaller, yet it can compete with the GTX 570 and possibly the 580 too, with more chips per wafer leading to better yields, leading to better profits. Elegant, efficient engineering.
I know who'll be getting my money.
Resurgence? What resurgence? nVidia has only designed a better, more costly cooler and a power-limiting device to compensate for Fermi's shortcomings, and again... more expense.
What a mess...
As a consumer, what do you personally care about the size and complexity of NVIDIA's parts?
AMD is launching a major redesign of their product line that apparently falls 20% short of its competition, and your response is "Hurray! AMD can make a smaller chip that does less!"? AMD needed to win on performance, because they lose on features (e.g. PhysX, 3D Vision, ambient occlusion, CUDA).
Launching a third-place part late as your flagship GPU, after your flagship CrossFire card has been trumped, is hardly an engineering triumph.
NVIDIA FOCUS GROUP MEMBER
I receive free hardware and software from NVIDIA for evaluation, my opinions are my own.
Physx, 3D vision & CUDA do not interest me one jot, but eyefinity does.
The GTX 465-480 are cards that should have stayed in the (chilled) R&D labs.
As for die size etc.: yes, I do care about the fortunes of the company in which I invest when I buy my next gaming part.
The only recent mistake AMD has made is delaying the 69xx, because a lot of people will, like me, be wondering whether 2x68xx or 1x69xx / 2x69xx is the way to go. I don't even consider nVidia.
PhysX, 3D Vision, and CUDA don't interest you one "jot", only Eyefinity matters? I use "Eyefinity" fine on my NVIDIA cards, but I have the option of doing it in 3D, and game developers code their games to make them look better in 3D. If you don't think 3D is the hot item right now, you weren't at CES last year and haven't been reading or watching the news. I'd also note that ATi is scrambling to get someone to make 3D gaming solutions for their products, because, per usual, they couldn't afford to do it on their own.
As far as PhysX goes, ATi is scrambling on that one too. They promised their users hardware accelerated physics 3 years ago, then again 1.5 years ago, and still no games. Go figure.
You must not care about the "fortunes" too much. This morning NVDA has a market cap of $8.63B, and AMD + ATi is $5.5B. So riddle me this, savvy investor: if ATi is doing so well, why is AMD (a much larger company than NVIDIA) now in a position where NVIDIA could almost purchase them, after the acquisition of ATi? And why does NVIDIA maintain two-thirds of the desktop discrete market share if their engineering is so "poor"?
Not too shabby for a company you think is selling all its products at a loss (of course, you have no proof of this).
NVIDIA FOCUS GROUP MEMBER
I receive free hardware and software from NVIDIA, my opinions are my own.
##guru3d.com/news/nvidia-posts-third-quarter-profit-of-849-million/#
Q3 2010: On a GAAP basis, the company recorded net income of $84.9 million, or $0.15 per diluted share. GAAP gross margin was 46.5 percent.
If you did even two minutes of Googling, you would know that in Q2 2010 NV incurred $476 million in non-recurring charges related to weak die/packaging material in mobile parts and pending lawsuit charges.
#news.cnet.com/8301-13924_3-20013543-64.html
Only AMD fanboys continue to claim that NV is losing $$ on GTX460/470/480 or any other GPU they sell. The reality is that from December 15, 2000 - December 13, 2010, AMD's stock performance is - 46% to the investor....##google.com/finance?q=NYSE:AMD
NV is simply a far better managed company in every way imaginable.
Care to post a link to those so called "rumours"? And please make sure they are credible.
"Cayman is nearly 1/3rd smaller yet it can compete with gtx 570 possibly 580 too, with more chips per wafer leading to better yields leading to better profits. Elegant efficient engineering."
"Elegant efficient engineering"? This reminds me of the DAAMIT fanboys who parroted the notion of "true quad-cores" in Barcelona and other similar nonsense. Turned out that Intel made a killing in profits while AMD stumbled with its "elegant" and "true quad-cores", leaving what slivers of credibility it had in shambles.
"I know who'll be getting my money." Oooooooooh... I'm really scared! Nvidia has made embarrassingly more revenue in graphics than DAAMIT! I doubt they need a couple of dollars from you.
"Resurgence?, what resurgence?, nVidia has only designed a better more costly cooler and a power limiting device to compensate for Fermi's short comings and again... more expense."
More expense...probably yes, DAAMIT's going to have to make concessions in this area.
"What a mess..."
I agree 100%. DAAMIT's got its work cut out for it!
Read more: AMD India develops new fusion chip - The Times of India #timesofindia.indiatimes.com/tech/enterprise-it/services-apps/AMD-India-develops-new-fusion-chip/articleshow/7082618.cms#ixzz17t2n6kNC
To make money in this game your designs have to be small (many dies per wafer), efficient and powerful.
Cayman is 30% smaller than GF110 (an extra 20-30% chips per wafer, perhaps?). More than enough to make up for any defects?
nVidia, on the other hand, designs its chips as if it gets its wafers for nothing, which results in huge, overly complex dies, a poor chip-per-wafer count, and naturally lower yields, which in turn lead to shrinking profit margins.
There are a lot of rumours on the net saying that nVidia is not making ANY money on its cards and has to subsidise them from other parts of its business, whereas ATi is keeping AMD afloat until Bulldozer/Bobcat appears. Not bad for a graphics company you think is FAILING!!!!
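The dies-per-wafer arithmetic in this thread (a roughly 30% smaller die giving 20-30%+ more chips per wafer) can be rough-checked with a standard gross-die approximation. Note the specific die areas (Cayman ≈ 389 mm², GF110 ≈ 520 mm²) and the 300 mm wafer are my assumptions for illustration, not figures from the thread, and the formula ignores scribe lines and defect yield:

```python
import math

def gross_dies(die_area_mm2, wafer_diameter_mm=300):
    """Approximate whole dies per wafer: wafer area divided by die area,
    minus an edge-loss term proportional to the wafer circumference."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

cayman = gross_dies(389)  # assumed Cayman die area
gf110 = gross_dies(520)   # assumed GF110 die area
print(cayman, gf110, round(cayman / gf110, 2))  # ~147 vs ~106 gross dies
```

With these assumed areas, the smaller die yields roughly 40% more gross candidates per wafer before any per-die yield advantage is counted, which is directionally consistent with the claim above.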
It's true, though: Nvidia seems to use the "brute force" approach of building the biggest dies to get performance. I'd bet that ATI/AMD has the upper hand in performance per unit area, an advantage compounded by Nvidia's expected lower die yields. Remember, though, that roughly 80% of Nvidia's profits come from scientific computing and HPC, not from selling high-end gaming GPUs.
There are a lot of rumors on the net that the world will end in 2012.
So what?
As to
"Ati keeping AMD afloat until Bulldozer / Bobcat appears. Not bad for a graphics company you think is FAILING!!!!."
I hate to break it to you, but Nvidia made $84.9 Million net income in Q3 2010
##sec.gov/Archives/edgar/data/1045810/000104581010000038/q311earningsrelease.htm
While DAAMIT's graphics segment made a paltry $1 million in operating income in Q3 2010 (look for Graphics operating income in the quarter ended Sep 25, 2010):
##sec.gov/Archives/edgar/data/2488/000119312510229536/dex991.htm
You are right there...Ati is keeping AMD afloat....in your dreams!
##noticias24h.eu/home/informatica/item/paginas_106410.html
Let's pull up a chair and have this conversation again in a year or two. It will be interesting.
The 6970 GPU is 30% smaller and has 15% fewer transistors, and those numbers indicate two things:
1. a cheap GPU
2. a powerful GPU
An interesting thing: at low resolution the 6870 is very close to the Fermi GTX 570.
From this I can draw another conclusion:
the 6xxx GPU architecture is very, very good, and the 6870 is a very small GPU (engine),
but the 6970 is a big GPU. So my estimate is this:
the Cayman GPU will be slightly weaker than the GTX 580 in DX9/DX10 (by 5-10%), and in DX11 games very close to or better than the GTX 580.
If Olin makes a performance summary per DirectX version, he'll draw a nice conclusion.
My question is "520mm2 die size??? What architecture will save NVIDIA?".
This series goes up to the GeForce GTX 460M, all based on Fermi and utilize NVIDIA Optimus technology.
For me, NVIDIA seriously needs a change in architecture, as their die size is too big. That's why I'm asking, "What architecture will save NVIDIA?" Making less profit is always a BIG deal; sooner or later, profit has an incredible impact on future development.
And that's also why I'm curious what their flagship performance will be (packed into less than 520mm²).
Never mind! A lot of people are looking no further than their noses.
And just to make it clear: no fanboyism in this comment! Just common sense (and worries).
The world is full of better solutions. RIP.
So babble on, you gizmo; at the end of the day, the bean counters win.
Sorry.
##techreport.com/articles.x/20088/9
You still don't get what I have said, do you?! Fanboy...
Ah! Useless...
Ati/AMD's future is bright, small & powerful!
nVidia has the performance crown? Yes, but at the expense of their own company; where is the sense in that?
Let's hope Bulldozer/Bobcat is another example of elegant, efficient engineering that will give Intel a fright.
And knowing that the refresh is coming in a few weeks delays the purchase or perhaps justifies purchasing "second" best for a few dollars less.
I think that it is smart marketing to "spoil" Nvidia's 570 launch by announcing a hot product just a few weeks away.
570 is all that Nvidia has on the table. In Q1 2011 they will have to release something to try and regain top spot. All the while knowing that every one of those expensive designs have nowhere to go in the mid-range market. The Sandy Bridge motherboards will not accept Nvidia boards. Now the same argument can be made against AMD/ATI with one major exception. Every high performance GPU core design CAN be used for on-die APU fusion designs at some later date.
I think the author expected the ATI 6xxx series to be much faster than the ATI 5xxx series.
But first, the drivers are completely new (that always gives slower results, for GeForce too, when they ship new hardware).
Second, the ATI 6xxx series is designed to reduce power consumption and to support 3D monitors, as I understand it.
Funny that the latest GeForce tests I have seen were not compared with the competing ATI graphics cards; most of them were actually compared mostly with other GeForce cards.
Seriously, some of you guys act like they've actually come up with something new here. All they've done is fix the original Fermi. If they'd done it right the first time, they would have 'overtaken' AMD with the original 480.
A dual-GPU 5xx card is likely to be a big bag of doo-doo, for the reason that they already have next to no thermal headroom, even with their mega-cooler. They'd have to throttle the GPUs so much it would barely be worth it. The 6990, on the other hand, with the new improved CrossFire scaling, is likely to be a beast. So as long as AMD competes on value lower down the scale (which they should be able to do thanks to better yields and less silicon), AMD should end up with better bang for buck AND the top performer this series.
So would somebody like to tell me the way in which AMD are not going to be the winners for this series?
As to my loyalty towards one company or another, it doesn't exist and never did.
If you want loyalty, then buy a Dog.
##tomshardware.com/news/macbook-pro-radeon-mcp-gpu,11781.html
You're right, the GTX580 isn't new product, it's a tweaked full version GF100. Quite possibly the GPU NVIDIA hoped to launch a year ago, a revision of the product they did launch 8 months ago.
What does that say about ATi's engineering when they're just now catching up to NVIDIA's performance from 8 months ago with a total re-design of their chip?
NVIDIA gained 15-20% by improving their Z-cull, using a better cooler, and enabling a cluster.
ATi gained 15% by totally redesigning their chip. They redid the shaders, redid the tessellators, upped the clocks, and created a dual-core design. All for a 15% gain. I don't think I've ever seen so little return on so much engineering change.
NVIDIA Focus Group Member
NVIDIA Focus Group Members receive free software and/or hardware from NVIDIA from time to time to
facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the Members.
No clue why you guys keep bringing up stock prices and profit margins (I guess other than that the tech picture is looking bleak for ATi). You may not have noticed, but tech stocks in general have been down lately; we're in a recession. I do know NVDA is a much healthier company than AMD: no debt, lots of cash in the bank, worth almost twice what AMD is worth in the market.
This "the future is Fusion" stuff has a whole lot of Christmas wishes wrapped up in it. First, there would have to be a total redefinition of the computer gaming market, where people all of a sudden said, "We are tired of these great graphics! Please AMD, take us all down to console-level graphics." Second, they seem to forget AMD has only about 19% of the CPU market, not exactly the lion's share to plunder. Third, any sale of Fusion as a single gaming solution cannibalizes AMD's far more profitable discrete market.
AMD should be careful what they wish for and concentrate on engineering some better hardware.
NVIDIA Focus Group Member
NVIDIA Focus Group Members receive free software and/or hardware from NVIDIA from time to time to
facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the Members.
It's very easy to throw a square foot of silicon at a GPU and claim the 'single GPU performance crown'. But we all know the problems with that (cost, heat etc). AMDs GPUs have better performance per square mm up and down the range, considerably so in most cases.
AMD will have the fastest graphics card this generation, in the 6990. Why? Well, let's say that the best cooler that can be designed in a long double-slot envelope can dissipate 400W of heat (the exact figure doesn't matter, since it applies to both companies). The 6970 generates 250 watts of heat, and the 580 generates around 300W (unless you fancy being lied to; if it were really 244W, it wouldn't need to throttle with that monster cooler). Therefore, assuming that performance corresponds to heat generation, AMD has about 60% more performance available over the 6970 before hitting the heat limit, while nvidia has about 33%. The 6970 is around 90% the speed of the 580, so the 6990 will have around 1.44x the performance of the GTX 580, while a GTX 595 would be limited to about 1.33x the 580 (which is hardly worth it, and possibly why it doesn't look likely that they're going to build one).
So AMD will have the fastest graphics card, and since they use less silicon for their performance, will also be able to offer lower prices per frame on the rest of the range.
Please offer a counterargument if there is one.
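The cooler-budget argument above reduces to a few lines of arithmetic. The 400 W envelope, the 250 W and 300 W TDP figures, and the 0.9 relative-performance factor are the commenter's assumptions, and treating performance as linear in dissipated power is itself a major simplification:

```python
def dual_gpu_ceiling(cooler_limit_w, single_tdp_w, perf_vs_gtx580):
    """Upper bound on a dual-GPU card's performance relative to a GTX 580,
    assuming performance scales linearly with power dissipated."""
    headroom = cooler_limit_w / single_tdp_w  # power (and thus perf) multiplier
    return headroom * perf_vs_gtx580

# Assumed numbers: 400 W cooler budget; HD 6970 at 250 W and 0.9x a GTX 580;
# GTX 580 at 300 W and 1.0x itself.
hd6990 = dual_gpu_ceiling(400, 250, 0.9)  # ceiling for a dual-Cayman card
gtx595 = dual_gpu_ceiling(400, 300, 1.0)  # ceiling for a dual-GF110 card
print(round(hd6990, 2), round(gtx595, 2))
```

Under those assumptions, the dual-Cayman ceiling works out to about 1.44x a GTX 580, versus about 1.33x for a hypothetical dual-GF110 card.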
Either way, the fact that you try to make this so dramatic about being the last stand is bull, and we all know it. However you put it, NVIDIA was late to the game. Fact is, they caught up (for the most part, single-GPU wise) and now AMD is going to play catch-up.
Let's also not forget that there's only one GTX 580 card I can get on Newegg.
As for the holiday season, I don't think it affects top-end graphics card sales. If anything, I have less money to shell out for a graphics card during the holidays. Let's face it: how many people are going to buy me a $500 graphics card?
But to answer your question: none. That is how many people who will buy you a $500 graphics card. That's because you clearly dislike one brand over the other, and AMD doesn't sell anything at the $500 price point. If you had said $700, I might suggest the 5970, but I doubt anyone will buy that for you either.
Why don't you take another look at Newegg. There's a 5970 for $499.99. $469.99 with a rebate.
Why don't you try creating a realistic title next time; trying to draw in viewers with an overly dramatic title clearly doesn't make them happy.
Right now it has nothing to do with the title, but with you. You keep going around the point and calling others' comments untrue; here you are trying to tell me that $500 won't get me my 5970. Talk about saying something untrue.
But you are correct, it is only one model, and several of the other 5970 models are sold out, with the exception of that ridiculous XFX special edition model that comes with some kind of toy gun XD
Once you're done with that, try to be a little more open-minded. I could care less which company wins or loses, because my job will remain the same: test the product, report the results.
And for your information, a simple Google search inspired by your off-the-cuff remark has proven you wrong. The MSRP for the EVGA GTX 580 FTW Hydro Copper 2 is $700. Most of the Radeon 5970s above $500 are similar OC editions, Black Editions, Toxic Editions, etc. So don't cry foul that I quoted the price of an FTW model.
4800 series vs GTX 200 series: the 4870 beat the GTX 260 and 275, while the GTX 280/285 beat the 4870.
4890 vs GTX 285: the 4890 had near-285 performance at a much lower price, making it a much better buy; however, if money were no issue, the GTX 285 was better.
4870X2 vs GTX 295: the 295 beats the 4870X2, but the 4870X2 was competitive enough to call it a great bang-for-buck card.
5800 series vs GTX 400 series: OK, at the beginning the 5870 pretty much owned this section; it beat the 470, and with the hot and power-hungry 480, a lot of people still decided to stay with the 5870. For raw performance, though, the 480 is still the best card in this category.
However, when the GTX 460s came in at a really low price, a lot of people realized their capabilities in SLI, which in my opinion makes the 460 the best card to pick up, though of course only if you were going to buy two and SLI them.
With that, AMD was losing sales and needed a good bang-for-buck card to compete, hence the new 6800 series. The 6800 series still, in my opinion, doesn't really match up. The 6800 series in CrossFire, although the scaling is great, doesn't match up to 460s overclocked to 900MHz+, which can be done with ease, not to mention the 460s are still cheaper than both the 6850 and 6870.
Now the 580 and 570 are out, with the 560 to come shortly. Not sure if you have noticed the pattern here: Nvidia is pushing AMD into a corner, and if AMD's dual card doesn't respond well, it's going to hurt.
AMD does have one thing going for them, and that's their upcoming 6990 in the first quarter of 2011; however, I do wonder if Nvidia will respond with a dual card of their own. 2010 has been a great year for competition, and I think 2011 will be even more fun.
In any case, besides all the hypothetical aspects of your argument, you're forgetting this review said the 570 is slightly faster than the 6970.
If the 6970 has a tdp of 250W, and the 570 has a tdp of 219W, which has greater headroom and starts out faster anyway? Even if the performance were exactly equal, the NVIDIA chave card would be the better buy.
Some GTX 295s had 3 DVI outputs; if NVIDIA did this with a dual-GPU solution, theirs would have PhysX, NVIDIA Surround, 3D Surround, 3D Vision, CUDA, and forced ambient occlusion as a feature set. Not to mention superior multi-GPU drivers.
The ATi card would have Eyefinity and be hobbled by ATi's troublesome CF drivers.
NVIDIA Focus Group Member
NVIDIA Focus Group Members receive free software and/or hardware from NVIDIA from time to time to
facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the Members.
Second stage (the Dream): 7GHz GDDR5!!! gonna retail at GTX 460 prices, gonna bury nv ha! i sincerely *s#* hope it don't beat up on nvidia 2 much *s#* cos u no competition is good *s#* GO AMD!!! GO AMD!!!
Third stage (GF110 launch): Who cares? REFRESH!!! Itz a paper launch, I cant buy 1 at my local store SO ITZ NOT REAL! Cayman gonna kick yo butt anywayz!
Fourth stage (the Leaks): Hey!... this doesn't make sense! It's FUD! FUD! FUD! AMD woodnt sit on its hands looking to beat a GF100 without thinking nv would refine their lineup... wud they?
Fifth stage (the Reviews): nvidia got ALL the sites in their pocket*, BIASED!!! they used a twimtbp game/old drivers/new drivers/no drivers/MSAA/Far Cry 2. I'm calling an AMD Jihad, whose with me?!!!! WE STILL WIN on pwr usage, I'M STOKED!!!
Sixth stage (the Delusion): We (AMD) was never competing against GF110 anyway... so there!, evry1 nose big die stratgee is ded, evry1 knows HD 6990 will be the GTX 580 KILLAH!!! cos 2 x Cayman die iz smallr than GF110 so there haterz!
Seventh stage (the Hope): Southern Islands is gonna kill nvidia. U herd it here 1st!!!
* no mean feat for a company on its deathbed
The GTX570 having a maximum thermal generation of 219W is about as likely as the GTX580 having one of 244W - unless nvidia has found some magical way of making substantially more silicon generate less heat at the same voltage than AMD that I've not heard about.
Clearly AMD has the GPU size and thermal advantage. This is why their dual-GPU solution will be more potent than nvidia's, if indeed nvidia decides to do one at all. This is also why they will be able to offer better price/performance. What exactly do you suppose nvidia is going to come up with in the next couple of months whilst being limited to a 40nm process? More magic perhaps?
Of course what we're debating are hypothetical performance figures, but it would border on irrationality to dispute that AMD will have the die size and thermal advantage per frame.
Until a couple of months ago I was playing games on an old single-core DDR1 system with an Nvidia 8800.
On this system, any games that implemented game physics on the CPU (e.g. Havok) would run glitchy and look terrible, whereas ALL the PhysX games I tried ran smooth as silk.
There might be a reason Nvidia makes their cards more complex.
Basically I think anything that makes your games more realistic is awesome.
I've upgraded the rest of my system, but I think I'll wait for the GTX 560 to get a new graphics card.
And yes I'm cheap when it comes to spending on PC parts.
The issue at hand here for AMD is that if this reviewer is correct, and we have no reason to believe he isn't, AMD is about to launch a flagship GPU that is barely competitive with NVIDIA's 2nd- and 3rd-fastest available GPUs.
Not that long ago, AMD was calling the 6970 the "new R300":
##fudzilla.com/graphics/item/20663-amd-promotes-cayman-as-the-new-r300
Now it's looking more like every other AMD launch. I feel bad for everyone who waited for this card, and I get the feeling they'll buy GTX 570s instead.
No Christmas at AMD.
NVIDIA Focus Group Member
I think both companies are brilliant at doing what they do: providing graphics for mid- to high-end users. Nvidia has laid claim to the single-GPU crown for some time (480 vs. 5870), but the 4xx series has been plagued by heat and noise, making the 5870 overall a better choice despite a slight performance lag. Now Nvidia has the 5xx series, which is what the 4xx SHOULD have been. I'm impressed with it, and the GTX 570 looks really good... but a little out of my price range. I think it's true that AMD's upcoming HD 69xx won't be able to stack up against Nvidia's high-end parts, but as someone mentioned before... who cares?? It's interesting from a performance perspective, but the big sales come from the mid-range market (relatively speaking). Right NOW, the HD 6850 and 6870 offer a LOAD of performance for under $250. The 6870 beats the GTX 460 and trades blows with the GTX 470, and does it while sipping power. Not bad. I'm very curious to see the upcoming GTX 560 and what it can do. It could well be the "next 8800 GT" in terms of price vs. performance, and that would be awesome.
Also, to Rollo, who seems to think AMD is so fragile: let's be realistic here. Yes, Intel is the dominant force in the CPU market. That much is obvious. But AMD is a strong competitor, especially on value. AMD's Fusion tech could make a big splash. Only time will tell, of course. But don't forget that in the past, AMD has caught Intel with their pants down. They were the first to bring 64-bit x86 CPUs to the consumer market, and initially bested Intel in bringing out dual-core processors. I won't argue that Intel's Core i series is cutting edge, but it all comes at a price. I have high hopes for the Fusion tech. And let's keep in mind both AMD and Nvidia are anticipating the phasing-out of traditional low-profile graphics cards. Both Fusion and Sandy Bridge will integrate nice graphics for the masses, but I severely doubt either will offer performance on par with, say, an HD 5770 or GTX 460 if you wanna blow away aliens at high resolutions with some nice eye candy and AA cranked up. The mid- and high-end GPU market will be safe for some time to come.
Last quarter NVIDIA shipped 59% of all desktop discrete GPUs. They haven't "lost the whole graphics market" Techman.
NVIDIA Focus Group Member
source: ####semiaccurate.com/2010/09/06/what-amds-northern-islands/
If Cayman ends up "hot and loud" because it was designed for a 32nm node, grab your coffee cup when the NDA lifts. There will be thousands of ATi fans' bodies hitting the floor simultaneously as they fall off their chairs.
I also predict that, mysteriously, "hot and loud" will cease being relevant in the ATi universe, and the new key feature will be either "single-card Eyefinity" or "cheaper!".
I, Nostradamus Rollo, have foreseen it!
NVIDIA Focus Group Member
If I had to say who is in the better position at the moment from a technological standpoint, I would have to say amd/ati. Nvidia is doing great also, e.g. the 580 and 570.
amd/ati has smaller and more efficient chips. That means that at any time they can increase the size of the design and add performance; they've got some breathing room. Nvidia is running on larger chips and flirting with the line of being too big, and this worries me. I ask myself how much more Nvidia can do, and if amd/ati released a bigger design, how would Nvidia respond?
amd/ati is able to compete with Nvidia while their focus is on their Fusion APUs. If many of ATI's brightest are working on the APUs and amd/ati is trading blows with Nvidia, what does that say? I would expect Nvidia to be wiping the floor with them, given most of their engineers are focused on GPUs. I am a little disappointed with Nvidia lately. They should be wiping the floor with amd/ati; instead they are trading blows.
The main reason my next desktop build will probably be amd/ati: my favorite company now produces amd/ati cards and not nvidia anymore, and they have always been the best and served me well (losing them was nvidia's biggest mistake, imho).
XFX FTW!!!!!!!
AMD needs to know when to throw in the towel and go home. Let Nvidia take the high end and have AMD fight for scraps.
And Rollo, first you say AMD has only the majority of the DX11 market, and then you send us a link that shows they have the majority of the market. Stop being a fanboy and make sure facts are facts.
Rollo, I keep bringing up stocks and profit margins because these are the only things that matter in business. AMD sold more cards when they were the only ones who had DX11 cards on the market. Stop giving false info.
And to Coles: what a company does in previous quarters affects them greatly in the future. So the GTX 480 and all of NVIDIA's mistakes and financial losses in the recent past are extremely relevant.
Whatever mistakes a corporation makes take a while to catch up with them.
The link I posted showed last quarter NVIDIA sold 59% of the desktop cards. It's not a hard concept Techman.
How about "when they were the only ones that had DX11"?
#hothardware.com/News/Despite-Yield-Problems-GPU-Sales-Surged-in-Q4-2009/
"NVIDIA's share of the discrete desktop market grew in the fourth quarter..."
LOL @ the comment about losing $200 million "catching up with NVIDIA". I can't remember the last time NVIDIA didn't have a billion in the bank and no debt to speak of. If you want to talk about "the mistakes of the past", you should do some research and take a look at the last five years of quarterly profits for ATi and NVIDIA. You'll see ATi very rarely makes money, and NVIDIA very rarely loses it.
You seem to be one of those who think that ATi making a good card equates to some glory days for their business. Unfortunately this isn't true. Like the guys who purchased them when they finally gave up (AMD) they rarely make significant money even when they have competitive products. (I say "unfortunately" because a monopoly in any market serves only the remaining company)
NVIDIA Focus Group Member
$170 for a card that can easily be overclocked to 5870 levels. Keep in mind that this is the same price point as a 5770 a few months ago and half the cost of most 5870s. Even without overclocking you get 80% of the 5870's performance out of the gate. I have this card and it's amazing; mildly overclocked in ATi Overdrive, I get 80 fps in Bad Company 2 with very little jitter at all-high settings, 4x AA and 8x AF, DX11. I have the fan set to 45% and it has NEVER gone above 64°C, even overclocked at 100% usage. The iCooler V is actually tested to work better than Sapphire's Vapor-X.
First, these cards will see users who will likely not use resolutions above 1080p. As such, all three cards deliver a great gaming experience. In this area AMD has held a solid lead for some time. While people are quoting the sales figures of the current cards, they forget that for most of this year that segment was DOMINATED by the 5770.
The reduced sales numbers of the 6800 series, I feel, show less a dominance of the GTX 460 than the fact that 5770 sales had saturated the market; the GTX 460 picked up the slack, with the 6800 series hitting late enough to get the last leftovers. This segment of the market does not jump at each new generation, so the sales figures as posted here are meaningless.
As for which is best, it is a toss-up. The 6870 clearly has the most horsepower in the group, but in the targeted gaming experiences it is not as clear a winner. All three cards can deliver, so the real winners are based not on speed as much as on experience relative to price. THAT is where the 6850 was winning. Prices of late see the 6850 matching the price of the GTX 460 1GB, which makes the GTX 460 a better buy at the moment.
What does all this mean at the high end? NOTHING. AMD has pretty much stated their focus is not on crowns but rather on value, and this makes sense. Crowns may be nice, but the value market is where you make money.
An article like this, in all honesty, is a waste at this time. Wait until actual benchmarks and real-world usage can be seen, and THEN make a determination.
After speaking with Olin and explaining my position on this he agreed and explained that the article was published by mistake ahead of time. Now with his numbers to back his conclusions the article makes more sense.
Also, on the drivers, ATi has come a LONG way from where they used to be in that department. Now every single driver update (every month) adds 3-5% in fps on both my 5850 and my laptop 4570.
##brightsideofnews.com/news/2010/12/13/amd-radeon-hd-6970-unveiled.aspx
I see this as confirming what Olin's testing has found, because ATi's press slides are always going to be targeted at making ATi products look best.
The ATi benches definitely show the 6970 beneath the 580, and in many cases not much higher than a 480.
So ATi put out an entirely new design that is really only 20-25% faster than their last chip. This is definitely the lowest ROI I've ever seen in the graphics world; usually they get 50% and up with next-gen parts.
NVIDIA Focus Group Member
Master Rod
I ain't no fanboy. Currently I am using an HD 5770 and I am quite happy with it. But that does not necessarily mean that if I had enough money for a GTX 580, I wouldn't have bought it.
I thank ATI for giving me a good card for my budget, and I thank NVIDIA for the dreams I have at night of buying a GTX 580 someday ;)
Oh, and one more thing... lithography, hypothetical legroom for extra performance, and stock quotes are the last things on my mind when I go shopping for GPUs.
Kitguru is also reporting "slower/cheaper" for the 6970. Of course, we'll have to take benchmarks with a grain of salt now that AMD has admitted a (possibly hardware) flaw creates lower image quality at their default settings:
##tomshardware.com/reviews/geforce-gtx-570-gf110-performance,2806-5.html
I think Olin is on the right track.
NVIDIA Focus Group Member
Now, that last line was not entirely true. ATI makes some great GPUs too, but somehow they have always managed to stay in second place. It's not hard for us to see why, but what intrigues me most is that the top honchos at ATI never seem to get it.
Now, I am not going to criticize ATI performance-wise, because I am no engineer, but I will talk about what I can see in plain sight. They invest so much energy in developing crap and shamelessly try to promote useless things to cover their shortcomings, it's almost painful to watch. NVIDIA has PhysX in their pocket, so ATI tries to market Eyefinity. Don't get me wrong, Eyefinity is a beautiful tech, but unless I am a crazy gamer or a 3D developer or Hugh Jackman from SWORDFISH, I simply don't need it. And what the hell is wrong with their driver developers! Each time I switch from Nvidia to ATI, I almost want to kill myself after handling their control panels. It's inefficient, buggy and grotesque. They even tried to integrate some crappy video encoder in their CP, but it's so buggy that half the people can't even open it. And the ones who can open it regret forever coming in contact with such a horrible video encoder. And then there is that useless media explorer and a certain FUSION UTILITY. It seems as if by releasing the FUSION utility they are admitting that their products do not offer optimum performance. The people behind these projects should be executed military-style, because they are a threat to humanity. --CNTD
Once again AMD launches a second/third place card, this time without the advantages of lower power, heat, and noise:
#techreport.com/articles.x/20126/15
This may be the most lackluster new-architecture launch in history: over a year in the making, the chip redesigned, and all it manages to do is match a performance level the competition achieved nine months ago.
If this design is what AMD hopes to use to compete with Kepler, 2011 looks like a bad year for AMD. Currently, I can't think of any reason a customer would want to spend $20 more for one of these cards.
NVIDIA Focus Group Member
It should have been priced much (wayyyyy much) less. But then again, who knows how the optimized driver is gonna work out for it. I doubt it will be of much help other than curing the horrible inconsistencies. Right now, it looks as if AMD has a high-priced loser of a card as their flagship.
However, I'm not entirely sure where all the extra silicon in the 6970 has gone, especially since this new VLIW4 (or whatever it's called) is supposed to be more efficient. Possibly AMD has splooged it on making big improvements to tessellation capacity (up to 2.5x), anticipating a huge fail in games of the future if they don't address that.
AMD clearly has made more progress than nvidia in the last couple of years: they have introduced two completely new generations (5xxx and 69xx), whereas nvidia really has only introduced one (the crap Fermi and the good Fermi). But AMD hasn't got much return on the second of its two generations because it was supposed to be for 32nm, and good old TSMC ballsed that up. The end result is that AMD has done more work to end up with no advantage over nvidia.
However, since AMD has now had an opportunity to trial the architecture for use on the new manufacturing node (whenever it arrives), and nvidia hasn't, we might expect them to have an advantage in the next generation, just as the 4770 trial allowed them to best nvidia in the 5xxx vs crap-Fermi race (by which I mean getting to market 6 months earlier, etc.).
Hopefully for AMD their cards will turn out to be good DX11 game performers, and since they do still use less silicon per frame, they should also have room to remain price competitive with nvidia.
##pcworld.com/article/213623-4/amd_radeon_hd_6900_series_right_performance_right_features_right_price.html
I presume you're referring to 2560x1600, which less than 0.001% of the gaming population uses as a standard resolution. Even then, it was ahead of the 570 twice, even once, and trailing twice.
They also say: "The GTX 570 probably offers slightly better performance in most games, unless you're using one of those big, 2560 by 1600, 30-inch monitors."
Let's please try to keep the comments on-topic.
No PhysX games? No problem! AMD has a demo of some Bullet physics!
No 3d Vision? No problem! AMD has a partnership with a company that used to make 3d monitors!
No forcing ambient occlusion? No problem! AMD will let you use it on games that have it built in!
No CUDA apps? No problem! AMD has STREAM, and someday someone will take that seriously.
Second-place performance even though the chip is next-gen? No problem! It's "good enough for anyone," by golly.
You need to get used to the mindset "whatever AMD doesn't have, they're really better off without!", Olin.
NVIDIA Focus Group Member
They use only AMD Radeon. Even Apple's iMac uses only AMD Radeon, not GeForce. I think AMD is the best.