NVIDIA APEX PhysX: CPU vs GPU Efficiency
Articles - Featured Guides
Written by Olin Coles
Friday, 17 September 2010
Benchmark Reviews tests NVIDIA APEX PhysX efficiency using Mafia II, comparing CPU against GPU performance.

According to the August 2010 Steam hardware survey, PC gamers use NVIDIA GeForce desktop video cards nearly 80% more than their AMD/ATI counterparts. Great products have come from both the GeForce and Radeon brands, yet based on this survey NVIDIA owns almost 60% of the entire graphics market compared to AMD's 33%. Gamers might rely on NVIDIA's hardware for its graphical processing power and affordable price point, but it is the company's gaming technologies that have helped deliver this market dominance (among Steam users, at least). NVIDIA's trademarked slogan "The Way It's Meant to be Played" denotes a direct involvement in software development to match the company's focus on hardware. When NVIDIA purchased the Ageia PhysX software physics technology in early 2008, that commitment sharpened its growing double-edged sword; adding 3D Vision only consummated the effort.

In this article, Benchmark Reviews demonstrates how far PhysX technology has come, using the recently released Mafia II by 2K Games. In this single-player third-person action shooter, developed by 2K Czech for 2K Games, players assume the life of World War II veteran Vito Scaletta, the son of a small Sicilian family that immigrates to Empire Bay. Mafia II makes use of DirectX 11 extensions on 2K Czech's proprietary Illusion game engine, which introduces NVIDIA APEX PhysX and GeForce 3D Vision technology enhancements. NVIDIA's APEX PhysX modeling engine adds new Destruction, Clothing, Vegetation, and Turbulence physics to games such as Mafia II. While adding PhysX support to a video game is nothing new for NVIDIA, allowing APEX PhysX features to be computed by the computer's central processor is new territory. For this NVIDIA APEX PhysX: CPU vs GPU Efficiency demonstration, our tests compare GeForce and Radeon GPUs against the Intel Core i7 CPU.
This article isn't intended to become an NVIDIA vs AMD topic, but that becomes impossible to avoid since ATI does not license PhysX. NVIDIA offers a free software development kit so CUDA drivers can be built for AMD products, yet all ATI Radeon graphics cards (up to the HD 5000 series) still cannot compute PhysX commands without modified drivers. As a result, PhysX hardware acceleration is presently available only on GeForce GPUs, unless gamers research unsupported options for their Radeon products. NVIDIA has opened its PhysX platform to AMD and Intel processors in Mafia II, allowing hardware acceleration to be calculated by the system's central processor. The narrative of this article is how well PhysX is processed by the CPU and GPU, and where the different GeForce Fermi graphics processors (GF100, GF104, GF106) stack up in regard to PhysX efficiency.
NVIDIA APEX PhysX Destruction, Clothing, and Particles in Mafia II

Mafia II is the sequel to Mafia: The City of Lost Heaven, released in 2002. Growing up in the slums of Empire Bay teaches Vito about crime, and he's forced to join the Army in lieu of jail time. After sustaining wounds in the war, Vito returns home and quickly finds trouble as he again partners with his childhood friend and accomplice Joe Barbaro. Vito and Joe combine their passion for fame and riches to take on the city, working their way to the top in Mafia II. While this premise makes for an interesting storyline, it's the graphical effects that keep players immersed in very realistic settings. NVIDIA's APEX PhysX is the glue that binds Mafia II together, giving the game lifelike physics to create a virtual reality.

Mafia II was developed with NVIDIA's PhysX version 2.8.3, the software development kit available at the time. This version supports only single-core, single-threaded PhysX CPU processing, which is minimal compared to the hardware available in most PCs. The current PhysX SDK (version 2.8.4) supports SSE2 instructions, but this feature must be enabled by the developer. According to NVIDIA, the forthcoming PhysX SDK 3.0 will introduce multi-threaded CPU support for PhysX extensions, with SSE enabled by default.

NVIDIA APEX PhysX Enhancements

PhysX helps make object movement more fluid and lifelike, such as cloth and debris. Mafia II is the first PC video game title to include the new NVIDIA APEX PhysX framework, a powerful set of graphical special effects that only GeForce video cards can currently deliver. Only the PC version of Mafia II supports NVIDIA's APEX PhysX physics modeling engine, which adds the following features: APEX Destruction, APEX Clothing, APEX Vegetation, and APEX Turbulence. This section demonstrates the differences between playing Mafia II with and without APEX PhysX enabled.
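The gap between scalar, single-threaded CPU code and vectorized code can be illustrated with a toy particle-position update. This is purely an illustrative sketch in Python/NumPy, not PhysX SDK code, and the NumPy-to-SSE analogy is loose: NumPy dispatches whole-array operations to compiled, SIMD-capable loops, much as an SSE2 code path would process several values per instruction.

```python
import numpy as np

def step_scalar(pos, vel, dt):
    # One-element-at-a-time update, loosely analogous to the
    # scalar (non-SSE) CPU path in PhysX SDK 2.8.3.
    out = pos.copy()
    for i in range(len(pos)):
        out[i] = pos[i] + vel[i] * dt
    return out

def step_vectorized(pos, vel, dt):
    # Whole-array update; NumPy hands this to compiled,
    # SIMD-capable loops, loosely analogous to an SSE2 path.
    return pos + vel * dt

pos = np.zeros(100_000)
vel = np.ones(100_000)
dt = 1.0 / 60.0  # one 60 FPS frame

# Both produce identical results; the vectorized version is
# dramatically faster on large particle counts.
assert np.allclose(step_scalar(pos, vel, dt),
                   step_vectorized(pos, vel, dt))
```

Timing the two functions (for example with `timeit`) shows the vectorized path running orders of magnitude faster on a modern CPU, which is the kind of headroom the article says PhysX 2.8.3 leaves unused.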
Many of the latest video games are being developed with new graphical enhancement technologies, but Mafia II is the first to offer both NVIDIA APEX PhysX and 3D-Vision Surround features together. Each of these NVIDIA technologies are designed to work their best on GeForce desktop graphics solutions, but only the most powerful GPUs can make these high-quality special effects stand out in full glory. We begin with a scene from the Mafia II benchmark test, which has the player pinned down behind a brick column as the enemy shoots at him. Examine the image below, which was taken with a Radeon HD 5870 configured with all settings turned to their highest and APEX PhysX support disabled:
No PhysX = Cloth Blending and Missing Debris

Notice in the image above that when PhysX is disabled there is no broken stone debris on the ground. Cloth mesh from the foreground character's trench coat blends into his leg and remains in a static position relative to his body, as does the clothing on other (AI) characters. Now inspect the image below, which uses the GeForce GTX 480 with APEX PhysX enabled on high:
PhysX Enabled = Realistic Cloth Mesh and Falling Debris

With APEX PhysX enabled, the cloth vertices sway neatly with the contour of a character's body and don't bleed into solid objects such as body parts. Additionally, APEX Clothing improves realism by adding gravity and wind effects to clothing, allowing characters to look as they would in similar real-world environments. This added realism makes a noticeable difference during gameplay.
Burning Destruction Smoke and Vapor Realism

Flames aren't exactly new to video games, but smoke plumes and heat vapor that mimic realistic movement have never looked as lifelike as they do with APEX Turbulence. Fire and explosions added to a destructible environment are a potent combination for virtual-world mayhem, showcasing the new APEX Destruction feature.
Exploding Glass Shards and Bursting Flames

NVIDIA PhysX has changed video game explosions into something worthy of cinema-level special effects. Bursting windows explode into several unique shards of glass, and destroyed crates burst into splintered kindling. Smoke swirls and moves as if an actual air current were blowing, and flames reach out toward open space on their own. Surprisingly, there is very little impact on FPS performance with APEX PhysX enabled on GeForce video cards, and very little penalty for changing from medium (normal) to high settings. We prove this in the next section.

Testing and Initial Results

VGA Testing Methodology

The Microsoft DirectX 11 graphics API is native to the Microsoft Windows 7 operating system, which serves as the primary OS for our test platform. DX11 is also available as a Microsoft update for Windows Vista, so our test results apply to both versions of the operating system. The majority of benchmark tests used in this article compare DX11 performance, though some high-demand DX10 tests have also been included.

According to the Steam Hardware Survey published for the month ending August 2010, the most popular gaming resolution is 1280x1024 (17-19" standard LCD monitors). However, because this 1.31MP resolution is considered 'low' by most standards, our benchmark performance tests concentrate on a higher-demand resolution: 2.30MP 1920x1200 (24-28" widescreen LCD monitors). This resolution is more likely to be paired with high-end graphics solutions, such as those tested in this article.

In each benchmark test, one 'cache run' is conducted, followed by five recorded test runs. Results are collected at each setting, with the highest and lowest results discarded. The remaining three results are averaged and displayed in the performance charts.

Intel X58-Express Test System
DirectX-11 Benchmark Applications
Video Card Test Products
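The run-aggregation rule described in the methodology above (five recorded runs, highest and lowest discarded, remaining three averaged) amounts to a trimmed mean. A minimal sketch:

```python
def average_fps(recorded_runs):
    """Average five recorded FPS results after discarding the
    highest and the lowest result (a trimmed mean). The initial
    'cache run' is excluded before results are passed in."""
    if len(recorded_runs) != 5:
        raise ValueError("expected exactly five recorded runs")
    trimmed = sorted(recorded_runs)[1:-1]  # drop min and max
    return sum(trimmed) / len(trimmed)

print(average_fps([58.2, 60.1, 59.5, 61.0, 57.9]))  # prints roughly 59.27
```

Discarding the extremes this way reduces the influence of one-off stutters or unusually clean runs on the reported figure.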
PhysX Level Impact

Mafia II offers comprehensive video settings, with everything from antialiasing (on or off) and anisotropic filtering (1/2/4/8/16x) to ambient occlusion (on or off) and APEX PhysX (off/medium/high). Our benchmark tests configure all settings to their highest quality, then add medium or high APEX PhysX. Measured at 1920x1200 resolution, Mafia II placed ample demand on all four graphics cards tested. The purpose of this test is to illustrate how much impact PhysX has at three levels: off, medium, and high.
Rather than grading one product against another as we usually do in our video card reviews, these results help us focus on individual impact. Here's an initial reaction to how each video card is affected by PhysX performance (more tests in the next section):
What we can surmise from these initial results is that AMD/ATI Radeon HD 5000-series video cards suffer obscene frame rate reductions when APEX PhysX is computed by the CPU, while GeForce products manage the physics models more efficiently. The NVIDIA Fermi GF104 does a much better job of computing PhysX in the GeForce GTX 460 than the GF100 does in the GTX 480. Now let's look at the finer details.

APEX PhysX: GPU vs CPU Tests

At Benchmark Reviews, we test video cards using only the mutually available common settings for each product tested. If our goal were to compare video frame rate performance, we would ensure that anti-aliasing does not exceed 8x MSAA (the maximum supported by Radeon GPUs) and that PhysX capabilities were disabled (currently supported only on GeForce GPUs). This section isn't about comparing relative graphical processing power between competing products; instead, it focuses on how relevant GPU processing of physics models has become. Intel and AMD have battled for processor supremacy for years, but when it comes to video games NVIDIA has proven that the GPU trumps all. In this second series of tests, six different video card combinations are tested in Mafia II using disabled, medium, and high APEX PhysX settings. Pay careful attention to how CPU-bound PhysX performance declines in reverse order of performance:
Similar to the first set of test results, Radeon video cards suffer poor frame rates when PhysX is enabled. Our Intel Core i7-920 quad-core CPU just doesn't compare to the hundreds of cores available in a graphics processor. Both AMD and NVIDIA products suffer heavily reduced performance when APEX PhysX is processed by the computer's CPU, although there appears to be an unexpected trend: the most powerful GPUs show the inverse in CPU-processed PhysX performance. Mafia II is clearly dependent on PhysX technology to deliver the realistic scenery and physics gamers enjoy, and future DX11 video games may follow suit. This could force ATI Radeon products to rely on the system's central processor to compensate, thereby reducing overall performance when the settings include PhysX. For best results in Mafia II, players are advised to use NVIDIA GeForce products to properly process PhysX features and produce frame rates high enough to enjoy 3D Vision. You can still play Mafia II and future APEX PhysX-enabled PC video games with your Radeon video card, but as our results show, there will be hell to pay.

2K Games designed Mafia II using NVIDIA's PhysX 2.8.3 SDK, which supports only single-threaded PhysX CPU processing. PhysX SDK version 2.8.4 supports SSE2 instructions (not enabled by default, for backwards compatibility), allowing updated games to compute PhysX more efficiently if developers enable this function. Finally, the forthcoming PhysX SDK 3.0 is said by NVIDIA to introduce multi-threaded CPU support to PhysX with SSE enabled by default, which could really change the game for everyone.

READER FEEDBACK: Do you play Mafia II on an NVIDIA GeForce or AMD Radeon DirectX 11 video card with APEX PhysX enabled? Tell us how it's working for you in the comment area below, and please share your quality settings.
Comments
I might also add that you're clearly picking a fight without knowing the subject matter. Those are factual numbers I'm quoting from the Steam Hardware Survey. Look it up for yourself. Also, if you don't think ATI users are playing Mafia II with PhysX enabled perhaps you should read the comments in our review of that game: /index.php?option=com_content&task=view&id=582&Itemid=64
I apologize for missing the CPU-driven PhysX statement. You are totally right there, and although PhysX is good, it's hardly revolutionary or even practical FPS-wise.
And there IS "actual PhysX" occurring, even with an ATI card, since the game allows you to use the CPU to run PhysX. You did read the article, right? Or are you just a frustrated ATI fanboi?
Try slowly reading the first six words of the above sentence.
A survey by "Steam" not Benchmarks.
Learn to read properly then you can have intelligent conversations.
The above statement is pure "statistics", in the Mark Twain sense of the word.
The Steam market does not equal the entire market, nor can such an extrapolation be made.
Again, you're almost making an extrapolation, saying that because Steam says it, that's how the entire market is, and then plugging your ears saying 'lalalala can't hear you'.
Yep, it's totally biased... towards Steam. I'm waiting on you to suggest another survey. Reply when you have one.
Steam doesn't ask gamers what they prefer; it just does an automated check on what's actually being used.
Therefore it would be much more correct to state that gamers (currently) "use" more NVIDIA than ATI cards.
In regards to this post:
# RE: Haha ? Olin Coles ? 2010-09-16 17:40
Look it up for yourself. Also, if you don't think ATI users are playing Mafia II with PhysX enabled perhaps you should read the comments in our review of that game: /index.php?option=com_content&task=view&id=582&Itemid=64
===================
Olin I already apologized about not reading the part of cpu physx. It was my mistake, sorry.
Although I do find it intriguing that your other users share my sentiment about the nVidia sponsorship nature of your article writing. :)
If you believe that quoting the results of Steam's Hardware Survey indicate some level of bias, you're clearly fishing for something that doesn't exist. Point me to another large-scale survey, and I'll be happy to add that perspective.
I won't say this again, please read it as many times as it takes you to understand.
Thank you.
Satisfied now?
Just stay neutral and objective and you will be fine. :)
You're neither neutral nor objective yourself.
You're mad at the Benchmark editors?
Get over it.
This is one of the best if not the best electronics review sites on the web.
You don't agree? That is fine.
Why come to a site you don't like?
Have a nice life.
Mike
We slammed NVIDIA for saying that DirectX 11 wouldn't drive game development, and for crippling anti-aliasing for non-NVIDIA cards in games like "Batman: Arkham Asylum".
That was then, and this is now. NVIDIA now has a significant advantage over ATI, especially with mid-range cards like the 460. Will this all change with the introduction of the rumored ATI 6800 series? Who knows? But we'll test it, and if it beats NVIDIA's offerings, someone else with an axe will be back accusing us of being in ATI's pocket.
They were kind of right about DX11 so far. The reason? DX doesn't drive development... consoles do. For the most part, as consoles advance so do PC ports and games (which is not much, of course).
When the next-gen consoles arrive, tech like DX11 will be all the rage if it's shared by them. Before that, not so much.
The Xbox 360 was released in 2005 and the PS3 a year later. Around that time NVIDIA was making the 6800 Ultra (NV45 core) and 7800 GTX (G70 core), while ATI's parts were the Radeon X850 XT PE (R481 core) and Radeon X1800 XT (R520 core), both of which use DirectX 9.0.
The PS3 uses an NVIDIA 7800 core for its GPU and the 360 uses an ATI Xenos (R500 core) for its GPU. That's why games always look better on a PC, which is where PhysX comes in.
Google translate : I hope that something will change (2.66 ----> 4 GHz)
1) No dedicated PhysX GPU was used -> APEX Clothing module is running on CPU in this case
2) Why no CPU scaling (cores/clocks) graphs ? Would be interesting to see if CPU APEX is multi-threaded at all.
3) Same with APEX tweaks, would be nice to see what effects should be removed, to make Mafia II playable on Radeon with APEX On.
and yes this is a fanboy statement :)
NVIDIA forgets this: they are a GPU manufacturing company first and foremost, not a software-bribery company. For me it's simple: they are now a weak company, and they play dirty. Shame again...
PhysX should be free software, because physics belongs to nature :) Ageia and NVIDIA didn't invent anything.
Please keep comments and discussion on-topic with this article.
Yes, they are opening it up to other manufacturers' chipsets, and they really don't have to. But what I think the author is trying to express is that it is a long way from being ready for use with CPUs. To get the full benefit of any technology, you are stuck with what the developer has built it around.
If it's a matter of getting the hardware please let me know. -Lee
Nvidia doesn't sell CPUs so I understand this from their position. However I would hope to see pressure from objective reviewers, as well as game developers, to fix this.
"when it comes to video games NVIDIA has proven that the GPU trumps all"
"Our Intel Core i7-920 quad-core CPU just doesn't compare to the hundreds of cores available in a graphics processor"
Also, I'm not giving anyone a 'pass', as I've already stated in this article (and these comments) that Mafia II APEX PhysX was the focus, not NVIDIA PhysX vs AMD vs Intel. I also mention how PhysX 3.0 will include multi-threaded processing and SSE instructions by default.
When PhysX 3.0 is used, your argument about CPU speed will matter much more than it does now.
The quotes I mentioned are still not accurate as they're comments about the GPU vs. the CPU, while the real reason for the difference in frame rates is due to software, not hardware.
The cute thing in reviews is that ATI and NVIDIA go neck and neck in frame rates, with a GTX 480 pulling ahead in games, but once you go into GPGPU and game programming... you see a different story... an HD 5870 loses vs. a GTX 285...
(I'm trying to avoid more "lively" ATI vs NV "discussion" LOL)
##thinq.co.uk/2010/7/7/cpu-physx-deliberately-crippled/
#arstechnica.com/gaming/news/2010/07/did-nvidia-cripple-its-cpu-gaming-physics-library-to-spite-intel.ars
There is actually no reason that a GPU would run physics code better than a CPU other than intentionally crippling the CPU implementation, especially in the age of multi-core processing. One could argue that a CPU is actually more adept for these kinds of calculations, whereas a GPU is more tuned for rendering the graphics.
And yes, NVidia did in fact block out PhysX when an ATI GPU was present. There was a workaround where you could run an ATI GPU as your main board, and some cheap geforce just to do PhysX, but NVidia blocked that in a driver update. They even made a public statement regarding it.
I'd like to understand what you mean when you say PhysX supports better graphics?
(the "NVidia own PhysX" and "ATI is being blackballed" crapola was inevitable LOL)
But not to support (or racketeer for) an article like this one, as it is a 100% swindle to compare a top-notch GPU implementation against a last-century CPU version.
"Unfair competition" is still a criminal matter, and this article is a bright example of it.
If your article revealed that NVIDIA is not supporting modern CPUs and is forcing end-users to use NVIDIA products only, then it would be a correct one. Otherwise it is (or at least looks like) a sponsored article, highly incorrect in its conclusions, continuing NVIDIA's swindles: the demand to pay for SLI motherboards, blocking NVIDIA cards from working as a secondary card for PhysX or in an AMD/NVIDIA set with a Hydra chip, the demand for an x16 slot because they cannot make the internal interface work even at PCIe v1.1, etc.
Unless I'm wrong (again), APEX PhysX is new to PC video games and Mafia II was the first time it's been used. You might know more about all of this than I do, so please tell me how PhysX and APEX PhysX are 100% the same thing, and how a statement like "how far PhysX technology has come using the recently-released Mafia-II" doesn't apply to the introduction of APEX PhysX.
Seriously though, if this wasn't about APEX PhysX in Mafia II, then why did I spend four pages using the game as my central discussion? The only people who think that this article was about NVIDIA vs AMD are those readers who really want it to be. That being the case, it's a lot easier just to ask me for an article that focuses on that topic.
:)
As far as APEX PhysX vs. PhysX: correct me if I'm wrong, but APEX is just a set of tools that gives graphic artists access to PhysX capabilities without having to do low-level code work (vs. the low-level API in use before), and it has zero to do with performance on a GPU vs. CPU, since it's still the standard PhysX code base (albeit a newer version than previously used) telling the hardware what to do. APEX is simply about PhysX being easier to adopt by game makers, as it saves them money hiring developers.
Yes, they left some CPU capability, but again, intentionally very out of date: one more example of an unfair-competition swindle, this time with CPU solutions.
Your article title is "NVIDIA APEX PhysX Efficiency: CPU vs GPU", not "Mafia II is much better on NVIDIA". The subtitle also says that Mafia II is only an example, and I'd agree: an example of an NVIDIA-paid demand not to use other technologies.
On another note, there's a ridiculous number of ATI fanboys replying to this fairly innocent article; grow up, chaps. They're two companies who both do a decent job of supplying graphical technologies to us. Stop complaining about the one you did not buy into.
NVIDIA is not an honest company... why? It's very simple:
NVIDIA is censoring a lot of things... sincerely, I don't want to be an NVIDIA partner, because in hard times it wants only itself to survive.
I think NVIDIA doesn't have a scalable GPU architecture in terms of efficiency, and they are trying to stay afloat with pieces of wood from their last ships, aka PhysX, CUDA, and other paid software.
I say again: to remain competitive you have to discover new stuff, not a new scheme.
There's a phrase in my country: "same lady, different clothes".
So CUDA and PhysX are the only things keeping NVIDIA going, even though they currently hold the best value-for-money cards and the most powerful single GPU?
"I say again: to remain competitive you have to discover new stuff, not a new scheme"
Eh, that makes little to no sense, no idea what you're getting at.
I've owned cards from both manufacturers in the past and although currently own a GTX460 would not hesitate in the future to buy an ATI if they have the right card for the right price.
Stop being a fanboy.
Think: if NVIDIA threw away the (nearly half of the) code "protecting" their hardware from use in "not supported" configurations, it would become even better.
2. Also, traditionally NVIDIA was (and is) behind AMD in hardware implementation. This is not the topic of the article in question, but it can be proved easily using Benchmark Reviews materials.
And without big changes in hardware architecture, the next generation of chips will be their "swan song".
This is not your personal sounding-board.
but as for the other towering infernos of power-hungry monsters, no thanks.
Also, since when is PhysX an open standard? NVIDIA owns it:
##google.com.au/search?sourceid=navclient&ie=UTF-8&rlz=1T4ADSA_enAU395AU396&q=physx+open+standard
Anyway, either way you look at it, NVIDIA needs to get off its ass and make a dedicated PhysX card.
1) Do you realize there's a 'Reply' button? You could avoid all of the new threads.
2) Do you understand what 'bribe' means? It means to give money in exchange for something.
First you claimed that NVIDIA bribed me to write an article about PhysX (which is disrespectful and very insulting), and then you claim NVIDIA optimized their GPU with 'bribe'.
I did not say Olin Coles takes bribes; sorry for you to understand it that way.
But I can say that you insult our intelligence with this kind of review. We (a few billion people) know how a global company sells its products.
And of course I don't understand exactly what "bribe" means, because I am not a native English speaker... but I saw this word used often in this kind of debate.
Anyway, you, me, other people, we have a chance to communicate the facts as they are... and this is enough, don't you think? ...and this is the real mass media.
...how can you advise millions of people? After a review?
A GPU is a unit that can do a lot of things... we don't buy cars (GPUs) for just a few roads, right?
Your advice points in one and only one direction.
Well, let me see: how about an i7-2600K, Asus WS Revolution, one Quadro 5000, three Tesla C2050s, two Mushkin 60 GB SSDs in RAID 0, four WD 640s in RAID 10, and 16 GB of 1600 CAS7 quad-channel memory. Sound fast enough for ya? When you get an AMD/ATI setup to match, let's duel...
I'm not saying AMD/ATI missed the bus; what I am saying is they took the short bus, not the one the rest of us took to school!
:0)
Danny
Danny, is the winner!
NVIDIA is the world leader in visual computing technologies; there is no doubt. Whether you guys are fanboys or not makes no difference. I ran my 4750 CrossFire cards for nearly 3 years, then boom, Fermi launched and I jumped on that train. Then it seems that ATI is sort of making the 5970+ by just slapping 2GB of video memory on it to flex around with.
Even at the highest benchmarking resolutions, through OCN, Guru3D... I mean, here is an old example: #img109.imageshack.us/img109/8202/vtgpuscp.jpg
We have one GPU chip on a card almost outrunning ATI's monster with two chips slapped on the card. The rumors of NVIDIA's 590 almost scare me, and what is ATI coming out with? I might even change! As well as getting a free clothes-dryer when buying either of the monsters.
And as Danny said, Tesla C2050s... in supercomputing so far, I haven't seen one machine using ATI/AMD; it's just IBM/Intel/NVIDIA chaos, and when opening any GPU programming software, your card just chokes without CUDA. But then again, the limitation is due to low texture rendering on early cards and so forth.
But check out ##geeks3d.com/20100606/gpu-computing-nvidia-cuda-compute-capability-comparative-table/ and perhaps you'll learn a bit about your favorite subject, or whatever you're discussing; it's like fanboy echoes, or some sort of sci-fi-speak stealth war, NV vs ATI, where no one wants to yell out their argument.
It's ATI's fault even back from 2006, putting out cards just to sell on the market, unlike NVIDIA, which lost revenue due to focusing on developing next-gen chips. AMD CEO Jen-Hsun Huang just left and joined NVIDIA for a pretty obvious reason...
Blah blah blah. Read the whole story about it wherever you like... and this comment was totally off-topic, but I've been reading the article and all the posts, and an article is an article; thumbs up for the contribution. What's up with people being enemies over some hardware? Just think: "Great, now we have that; soon we're on the moon."
I grew up with a black-and-white TV, and your thoughts of games? They'd have been stamped as sort of crazy.
Appreciate it! More educating, less arguing; it's 2011, enjoy it. Thinking back, this was impossible space technology. (Star Trek fan!)