NVIDIA nTeresting 18 December 2009
Written by NVIDIA - Brian Burke
Friday, 18 December 2009
Best. Peripheral. Of The Year.

Year end means "Best of" awards season is here. IGN thinks "best of" means NVIDIA 3D Vision: "While the promise of 3D-capable HDTVs resides on a distant horizon for console gamers and cinephiles, PC gamers can step into the third dimension now with the NVIDIA GeForce 3D Vision. With advanced shutter technology and processing, the 3D Vision turns hundreds of popular PC titles into immersive 3D experiences."

Yahoo picked its 10 favorite gadgets. 3D Vision is one of them: "2009 has been a breakthrough year for 3D, with most manufacturers showing off plans for new 3D TV sets, but most won't arrive on the high street until next year at the earliest. If you have a PC, though, you can get a 3D gaming experience today."

Benchmark Reviews...you get the idea: "...it is the only product on the market that truly transforms the ordinary gaming experience into a realistic virtual experience."

If you have not tried NVIDIA 3D Vision, you should. It is awesome for games...and movies: "Watching the 3D Blu-ray movie using NVIDIA 3D Vision was just like watching a 3D movie at the theater."

NVIDIA's active shutter technology is the right technology for 3D and has many advantages over the passive polarized solutions offered by others. "The Blu-ray specification is expected to include the active shutter technology as the preferred method for achieving 3-D viewing in order to achieve high-definition in the high-end 1080p resolution."

NVIDIA is working with our partners to drive 3D technology into homes. "On Wednesday, NVIDIA is expected to announce that it has partnered with ArcSoft, Corel, CyberLink and Sonic Solutions, all makers of software used for playing DVD movies on PCs. The partners will work together to be sure PCs - and later TVs - can play 3-D movies with Blu-ray using NVIDIA technology."

And we are also finding more and more areas where 3D Vision works for business: "NVIDIA Corp. and Siemens Healthcare today announced that they have demonstrated an immersive 3D ultrasound viewing experience that enables expecting parents and their medical caregivers to view the fetus with incredible detail using 3D glasses."

"Agilent Technologies Inc. today announced a stereo 3D viewer capability for the Momentum G2 Element and FEM Element electromagnetic (EM) simulators in its Advanced Design System (ADS) EDA platform. The new stereo 3D viewer leverages NVIDIA Quadro graphics processing units (GPUs) and NVIDIA's unique 3D Vision quad buffered stereo technology, originally designed for immersive mechanical design, digital content creation and video games, to render electric and magnetic fields and currents for scientists and engineers in crisp, vivid, high-resolution 3D."

You can't do 3D right if you do it as an afterthought.
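For the curious, the "quad buffered stereo" in the Agilent quote above refers to a rendering mode with four buffers: front and back for each eye. Here is a minimal OpenGL/GLUT sketch of the idea, assuming a driver and display that expose stereo pixel formats; the scene function and eye offset are hypothetical placeholders, not NVIDIA code:

// stereo_sketch.cpp - minimal quad-buffered stereo rendering sketch
#include <GL/glut.h>

// Hypothetical stand-in for a real scene; the eye offset supplies parallax.
static void drawSceneFromEye(float eyeOffset) {
    glLoadIdentity();
    glTranslatef(-eyeOffset, 0.0f, 0.0f);  // crude horizontal camera shift
    glBegin(GL_TRIANGLES);
    glVertex3f(-0.5f, -0.5f, 0.0f);
    glVertex3f( 0.5f, -0.5f, 0.0f);
    glVertex3f( 0.0f,  0.5f, 0.0f);
    glEnd();
}

static void display(void) {
    // Left-eye pass: render into the left back buffer.
    glDrawBuffer(GL_BACK_LEFT);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    drawSceneFromEye(-0.03f);

    // Right-eye pass: same scene from a slightly shifted camera.
    glDrawBuffer(GL_BACK_RIGHT);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    drawSceneFromEye(0.03f);

    // Swaps both eyes' back buffers; the driver then alternates them on
    // screen in sync with the active shutter glasses.
    glutSwapBuffers();
}

int main(int argc, char **argv) {
    glutInit(&argc, argv);
    // GLUT_STEREO requests a quad-buffered visual from the driver.
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_STEREO);
    glutCreateWindow("quad-buffered stereo sketch");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}

Rendering every frame twice from offset cameras is why stereo needs to be designed in from the start: the GPU, driver, display and glasses all have to stay in lockstep.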
NVIDIA = Winner

Laptop Magazine has made a list of winners and losers for 2009. The winners took risks and out-innovated the competition. The losers either played it safe or failed to execute. NVIDIA is a winner: "NVIDIA proved to the world in 2009 that graphics aren't just for gaming. Its ION platform for netbooks transformed these once-underpowered devices into multimedia machines capable of playing high-def video. It's no coincidence that the Ion-powered HP Mini 311 has been the most popular review on our site for months, and ASUS, Lenovo, and Samsung are rolling out their own Ion systems. NVIDIA's CUDA technology has also impressed, dramatically improving such tasks as video editing. Meanwhile, NVIDIA 3D Vision technology has potential for a new breed of entertainment notebooks. Although wearing the required glasses is still geeky, I like the immersive gaming experience, and 3D Blu-ray playback is up next."

Others did not fare so well, and landed on the losers list: "AMD also missed the netbook boat, and its Athlon Neo platform for ultraportables has, thus far, fallen flat. The company needs to provide some healthy competition for Intel's next-gen Core and Atom processors."

There is a global movement underway to enhance the visual computing experience in every type of device. We're excited about the era of visual computing and all the new products and platforms that will come out of it.

SuperComputers Get More Super with GPU Computing

News out of the land down under is that Australia's national science agency has fired up a massive GPU supercomputer capable of delivering 256 teraflops of peak performance. "The CSIRO supercomputer - which is powered by 64 NVIDIA Tesla S1070 GPUs - includes 28 Dual Xeon E5462 compute nodes (or 1024 2.8GHz compute cores), 500 GB of SATA storage, a 144 port DDR InfiniBand Switch and an 80 Terabyte Hitachi NAS file system."

If 64 GPUs seem too gaudy, perhaps 13 in a desktop supercomputer is more your speed? "Dubbed the Fastra II, you get a regular Intel 2.66GHz Core i7 processor, but what makes it different is the number of video cards within. You will find half a dozen dual-GPU GeForce GTX 295 cards working alongside a single GeForce GTX 275, theoretically allowing it to handle up to 12 teraflops of general work and 3D graphics without breaking a sweat." That is thirteen GPUs in all, each good for roughly a teraflop of single-precision peak - hence the 12-teraflop figure.

So what does this move to parallel computing mean? More and more performance...stupid performance leaps, really. "It's always impressive to see what graphics chips can be used for, outside of just pumping out polygons and bump maps. Anti-virus maker Kaspersky Lab in Russia is now claiming that its security software is a whopping three hundred and sixty times faster when it's using an NVIDIA Tesla S1070 GPU than when using an Intel Core 2 Duo processor."

CPUs are no longer increasing in clock speed, yet consumers are demanding more from their PCs today than ever before. To deliver the performance these expectations require, the only path available is to go multi-core, or parallel - that is, add more cores and split demanding workloads across them. Due to the very nature of computer graphics, GPUs excel at doing many things at once, and as such are ideally suited to this new computing environment. What GPUs bring is a massively parallel approach to the problem, with hundreds of cores (a minimal kernel sketch follows at the end of the next item).

CUDA, It Is Not Just For GeForce Anymore

CUDA is the name of NVIDIA's parallel computing hardware architecture. The NVIDIA CUDA architecture is what allows ION, GeForce and Quadro products to perform so well at parallel tasks. Liliputing knows this should not be overlooked in netbooks. They have the numbers to prove it: "See those enormous blue bars? That's how long it took to transcode a 4 and a half minute video to H.264 without using the CUDA encoder. The green bars show how long it took using MediaCoder's CUDA encoder. The dual core Atom processor certainly helps speed up the task, but the GPU-enabled software made a much bigger difference. In other words, if you have an NVIDIA ION powered system and a choice of using GPU-enabled software, use it."
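To make the "split demanding workloads across them" idea concrete, here is a minimal CUDA kernel sketch - illustrative only, not code from any of the products above. Each GPU thread handles one element of an array, so the work spreads across however many cores the GPU has:

// saxpy.cu - minimal sketch of the data-parallel model CUDA exposes
#include <cuda_runtime.h>

// Every thread computes one element; the grid supplies n threads in total.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main(void) {
    const int n = 1 << 20;               // one million elements
    const size_t bytes = n * sizeof(float);

    float *x, *y;
    cudaMalloc(&x, bytes);
    cudaMalloc(&y, bytes);
    // (real code would copy initialized data in with cudaMemcpy)

    // Launch one thread per element, 256 threads per block.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();

    cudaFree(x);
    cudaFree(y);
    return 0;
}

The same pattern - one thread per pixel, per macroblock, per file signature - is what lets a CUDA video encoder or antivirus scanner outrun a dual-core CPU on these workloads.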
CUDA is a big advantage for NVIDIA and is the reason we are the undisputed leaders in GPU computing: "NVIDIA may gain an even larger advantage, given the maturity and breadth of its CUDA offerings and Nexus development tools that specifically target GPU computing."

Parallel programming is THE next big thing for the world of computing - it has started already. The balance of power between the CPU and GPU is changing to adapt to a fundamental shift in the demands placed upon today's PCs and workstations.

More Intel Graphics FAIL(s)

On the heels of Intel's Larrabee woes comes news of more graphics letdowns. Geek.com points out that while it was thought that the upcoming Pineview Atom processor would be better for netbooks, it too will under-deliver on graphics: "Unfortunately, it looks like that was an optimistic assumption. The new Atom Pineview D410 and D510 processors have been given some preliminary benchmarking tests, and the results are actually fairly disappointing, with the NVIDIA ION configuration actually beating Pineview in most results."

Surely that was only a few selected benchmarks, right? Wrong. "...the ION soundly trounced the D410 and D510 in most real-world applications."

NVIDIA's focus is on the consumer experience and delivering compelling solutions people want. That is why ION redefined the netbook category. Intel focuses on processing technology, not the consumer experience: "Intel's big move in 2010 will be to take a 45nm graphics die and integrate it with a 32nm dual-core CPU onto a single chip, both for desktops ('Clarkdale') and laptops ('Arrandale'), while otherwise sticking with its 5-series integrated graphics." And they continue to disappoint on graphics.

AMD Continues to Struggle With OpenCL

Despite what competitors want you to believe, NVIDIA loves open standards. We support open standards, and we support standards that allow us to innovate in a timely fashion. Meanwhile, AMD struggles with its GPU computing strategy, and with OpenCL in particular: "If you are wondering what is the real deal with GPGPU APIs, there is a telling tale of why Adobe opted to base its Mercury Engine on NVIDIA's CUDA language. While AMD will tell you that they're all for open standards and push OpenCL, the sad truth is that the company representatives will remain shut when you ask them about the real status of their OpenCL API - especially if you quote them a lead developer from a AAA software company with 10x more employees than AMD themselves that goes something like this: 'I struggled to even get ATI's beta drivers installed and working, it was just problem after problem. Maybe once ATI gets their drivers out of beta and actually allows you to install them, then I will have some performance numbers. I mean, at this point AMD is so far behind in development tools they are not even worth pursuing right now.'"

NVIDIA backs OpenCL with strong support for developers. OpenCL was developed on NVIDIA GPUs, and NVIDIA was the first to demonstrate an OpenCL app running on a GPU, at SIGGRAPH Asia in December. NVIDIA's Neil Trevett is the president of the Khronos Group. NVIDIA gave the industry's first OpenCL performance profiler for the GPU to thousands of registered developers. We did the first best practices guide. We were first with drivers for developers and for consumers. NVIDIA has a long history of embracing and supporting standards, since a wider choice of languages improves the number and scope of applications that can exploit parallel computing on the GPU.
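For readers who have not seen OpenCL, here is what a minimal host program looks like - a sketch in C against the standard OpenCL 1.0 API, with error handling omitted for brevity; the kernel name and values are hypothetical. Note that the kernel is compiled by the driver at run time, which is exactly why driver quality matters so much to the developers quoted above:

/* opencl_sketch.c - minimal OpenCL host sketch (OpenCL 1.0 era API) */
#include <stdio.h>
#include <CL/cl.h>

/* The kernel is plain source text handed to the driver for compilation. */
static const char *src =
    "__kernel void vscale(__global float *y, float a) {\n"
    "    int i = get_global_id(0);\n"
    "    y[i] = a * y[i];\n"
    "}\n";

int main(void) {
    cl_int err;
    cl_platform_id platform;
    cl_device_id device;
    size_t n = 1024, i;
    float data[1024];

    for (i = 0; i < n; i++) data[i] = 1.0f;

    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, &err);
    cl_command_queue q = clCreateCommandQueue(ctx, device, 0, &err);

    /* Build the kernel at run time through the vendor's driver. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, &err);
    clBuildProgram(prog, 1, &device, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "vscale", &err);

    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                n * sizeof(float), data, &err);

    float a = 2.0f;
    clSetKernelArg(k, 0, sizeof(cl_mem), &buf);
    clSetKernelArg(k, 1, sizeof(float), &a);

    /* One work-item per element, just like one CUDA thread per element. */
    clEnqueueNDRangeKernel(q, k, 1, NULL, &n, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, buf, CL_TRUE, 0, n * sizeof(float), data,
                        0, NULL, NULL);

    printf("data[0] = %f\n", data[0]);  /* expect 2.0 */
    return 0;
}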