How NVIDIA GeForce GRID Will Change Video Games
Articles - Opinion & Editorials
Written by Olin Coles
Thursday, 17 May 2012
NVIDIA GeForce GRID and NVIDIA VGX Cloud Computing Technology Introduce Change to Video Games and the PC Platform

NVIDIA recently unveiled two new graphics technologies at the 2012 GPU Technology Conference: NVIDIA GeForce GRID and NVIDIA VGX. You're probably not familiar with either of these 'Cloud Computing' technologies, but in the next few years they'll develop to become quite commonplace. One day in the not-so-distant future, you could find yourself choosing between a NVIDIA GeForce GRID subscription or a NVIDIA GeForce graphics card to play the latest high-end game. In this editorial I explore what these two new technologies offer, and how they will change the graphics industry.

Not everyone owns a computing device powerful enough to do whatever they want with it. If you're reading this then you're one of the lucky few, but for many people in other nations a modern computer is an unknown luxury. Most desktop PC systems can be upgraded with new hardware, but notebook computers lack this ability and cannot run many of the latest video game titles or play back high-definition multimedia content. Tablet and smartphone devices suffer a similar fate, with limited capability. Traditionally, video games have been played on the most capable system, usually a PC or console. Now NVIDIA GeForce GRID technology removes the need for an expensive system, and gives gamers the ability to play video games from any connected device with a display screen (think: HDTV, monitor, notebook, tablet, smartphone, etc.).

Back in 2002 I built a Citrix MetaFrame server to conduct an experiment: connect several very old PCs to one mainframe computer system and enable friends to play video games on each using the high-end GPU installed on the Terminal Server. The experiment was successful with low-end 2D multiplayer games (this was before the era of countless browser-based games that now cover every genre), but multiplayer 3D games such as Quake III lacked OpenGL support on our server platform. Using a Local Area Network connection to join several friends together via Ethernet cables into a multiplayer game worked very well, up until high-speed Internet became the household standard across the planet and made LAN parties virtually obsolete. Ten years later NVIDIA has made my experiment into reality, and one remotely-located high-end system can offer the performance that other connected devices lack.
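Conceptually, the split is simple even if the engineering is not: the server does all of the rendering and video encoding, while the device in your hands only forwards input and displays compressed video. Here is a toy sketch of that data flow in Python; every function and name in it is an illustrative placeholder of mine, not NVIDIA's actual API:

    # Toy sketch of the cloud-gaming split. Every name here is a
    # hypothetical placeholder for illustration, not a real GRID API.

    def server_tick(player_input):
        # Stands in for GPU rendering plus hardware video encoding,
        # both of which happen on the remote server in the GRID model.
        return f"compressed video frame reflecting input {player_input!r}"

    def client_tick(session):
        # The thin client's whole job: forward input, then display the
        # video frame that comes back. No local GPU work at all.
        frame = session("jump")
        print("displaying:", frame)

    # Wire the two halves together locally to demonstrate the loop;
    # in reality the call in the middle crosses the Internet.
    client_tick(server_tick)

The appeal is that this loop looks identical whether the "client" is an HDTV, tablet, or phone; only the server needs real graphics horsepower.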
The Internet is Everything!

Some of us can remember back to a time when video games only offered one mode: arcade. It wasn't until many years later, when dial-up Internet service was common, that video games offered a multiplayer mode, and then many more years before broadband Internet helped make multiplayer gaming the standard (suggested reading: Before PC and Console Games: The Video Arcade). Now everything is connected, and it's difficult to find a modern video game title that doesn't feature an online multiplayer mode. Multiplayer games helped fuel the home entertainment market, while wiping out an entire arcade industry.

Although my Terminal Server experiment back in 2002 was uninspired, cloud-based gaming isn't new. First-generation cloud gaming platforms suffered from high network latency (ping), referred to as lag, had low-quality graphics, and were expensive. NVIDIA GeForce GRID technology promises to overcome these issues and make game streaming as common as renting a movie online. Cloud-based NVIDIA GeForce GRID servers will be outfitted with several of the latest GeForce GPUs to provide the best graphics processing power available, while keeping server latency low enough that cloud-based video games play as if they were installed locally.

America is one of the greatest developed nations in the world, but it's also one of the worst for network infrastructure. NVIDIA GeForce GRID technology requires fast Internet connections in order to work well with real-time multiplayer games, especially first-person shooters where response and reaction times mean life or death. GRID networks need to offer a server cluster near your Internet Service Provider to reduce packet transfer hops and travel time, or the network latency will get your character killed (a rough round-trip budget is sketched below). NVIDIA VGX technology, which focuses on the Enterprise segment and relates to software applications, visual presentations, and multimedia playback, will require only a standard broadband Internet connection.

Cloud Computing Kills the Desktop PC... and Console

Would you spend money on a new gaming console or desktop PC system upgrade if you could simply power up any Internet-connected HDTV and start playing a library of video games using the best graphics settings possible? Most people would make the obvious choice to move with technology, but there will always be a few die-hards out there who refuse to keep up with the times and cling to their old devices. Some readers may recall that several of my past editorials foretell a future where the desktop computer platform no longer exists, and this section ties into the others. If you're not up to speed on this topic, Benchmark Reviews has published related articles that go into detail on each one: Fears and Predictions, Statistical Obituary, and How Video Games Killed Desktop PC Computing. If you haven't already read these articles, you should, and might also consider Killed By Overclocking and Saved By Overclocking as well. I might even convert you to my school of thought after you've read them all.

So, getting back to this matter of killing off PCs and game consoles... NVIDIA GeForce GRID and NVIDIA VGX technology, once made available and widely adopted, will mean that users won't need to buy a high-end computer system to play games or run complex design applications. They won't need a powerful processor and graphics card, extreme amounts of system memory, constant updates and patches, or expensive network servers... they'll merely need an Internet-connected device.
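Whether that Internet-connected device feels like a real gaming machine comes down to the round-trip latency discussed above. Here is the rough budget promised earlier; every number below is my own illustrative assumption, not an NVIDIA specification:

    # Rough cloud-gaming round-trip budget, in milliseconds.
    # All figures are illustrative assumptions, not NVIDIA specs.
    budget_ms = {
        "client input -> server (network)": 15,
        "server render + video encode":     15,
        "server -> client video (network)": 15,
        "client decode + display":          10,
    }
    print(sum(budget_ms.values()), "ms total")  # 55 ms

For comparison, a local PC running at 60 FPS shows the result of your input within roughly 17-33 ms, which is why server clusters must sit close to your ISP: the two network legs are the only items in that budget geography can shrink.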
Because some businesses may fear downtime and resist complete dependence on their Internet connection, the desktop computer platform could survive longer in the Enterprise segment than everywhere else, but the landscape for enthusiast hardware is certain to change.

Use Your Imagination

It's still too early to rate the value of NVIDIA GeForce GRID technology since there are competing products already available (such as OnLive and Evolve), but if we take these announcements at face value there's plenty of potential for NVIDIA VGX server-side graphics to become the world standard for connected businesses. Websites are already server-side, as are many online games and productivity programs, so it's really not much of a stretch to imagine a day when most complex tasks are handled by an offsite super-computer. Most companies dread mandatory hardware upgrades made for the sake of operating the latest version of essential software, and the cost savings of Cloud Computing could make sense for cash-strapped businesses.

Of course, this would mean steep declines in consumer revenue for everyone involved in selling computer hardware. Intel, AMD, NVIDIA, and everyone else in the supply chain would be hard-pressed to continue selling desktop hardware in a world where only servers need powerful components and everyone else uses a simple Internet-connected device to access them. All this time I've been worried that mobile phones, laptops, and game consoles might be killing off the desktop platform, but in reality it could be the server that does us in. Only time will tell how accurate my predictions are, so weigh in with your comment below.
Comments
And even though digital speeds through wires or fiber approach the speed of light, the routers and switches, etc., are what give us the lag. If we could get rid of all of the hardware between two points, we'd have virtually no lag at all, and that problem would be solved.
I'm thinking, based on those two problems, that this is going to require some sort of technology breakthrough of alien proportions. Bio-routers?
The only problem with what you say is that the signal is still carried by electrons traveling at only a fraction of the speed of light, or by light that then has to be converted back into an electrical signal, and either way the travel takes time. So there is no such thing as "no lag at all," even with no equipment in between, which is science fiction anyway. It's not instantaneous, and even aliens on other worlds are still bound by physics.
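To put rough numbers on that: even over a perfect fiber run with every router removed, propagation delay alone is measurable, since light in fiber travels at roughly two-thirds of c. A quick back-of-the-envelope calculation (the distances are just examples):

    # Propagation delay over ideal fiber, ignoring all switching hardware.
    C_VACUUM_KM_PER_S = 299_792
    C_FIBER_KM_PER_S = C_VACUUM_KM_PER_S * 2 / 3  # light slows down in glass

    def round_trip_ms(one_way_km):
        return 2 * one_way_km / C_FIBER_KM_PER_S * 1000

    for km in (50, 500, 4000):  # nearby metro, regional, cross-country
        print(f"{km} km one-way: {round_trip_ms(km):.1f} ms round trip")

So a cross-country server costs you about 40 ms before a single switch, encoder, or GPU is involved: never zero, only smaller.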
The article title should read "How NVIDIA marketing says NVIDIA GeForce GRID Might Change Video Games"
You're right though, someday... this will be reality.
I am one of those people who live in a very rural area, but am lucky in that a local entrepreneur decided to set up a radio network with reasonable speeds. The biggest drawback to the system is the cost. Both the installation of the radio - I live in dense forest and the radios must be mounted in the tops of the 200-foot trees - and the monthly fee make the system very expensive.
For example, to get 768k downloads (and 256k uploads) I pay $80 a month. The radio cost $200 and I had to pay a tree climber $150 to put it up in a tree. I also had to pay for almost 300 feet of cable (200 feet up the tree, 70 more to the house) and, as most tech-savvy people know, that is close to the limit of Cat5 capability.
Most of my neighbors do not have high-speed access to the Internet. Instead, they still rely on antiquated dial-up (yes, 28.8k modems) to check email - they cannot surf the web, or play even simple Flash games. The NVIDIA GRID will not help people who cannot get or afford high-speed Internet connections. Those people still need good GPUs and fast computers to play games.
It would require that the "Cloud" rendering farm have at least the combined power of all the players' machines on the net, such that no player would suffer significant/noticeable lag or video degradation.
For example: it would mean an average of 25,000 to 100,000 times the processing power of a single high-end GeForce or ATI card, with enough Internet upload capability to deliver at least 1080p at 30 fps or better to the average gamer.
It's simply over-reaching the existing infrastructure, which cannot handle that much upload, and over-reaching existing rendering-farm technology, which could not keep up with the pace of the technology and the total number of users.
While I admire the "vision" for such a technology, it will be relegated to a very select few who can afford it.
This type of technology will also have to overcome gamers' constant desire to increase their screen resolution and FPS.
Ultimately, it will take at least 10 to 20 years before the internet can handle this amount of upload.
To be clear, this is not just a traditional upload model. Each individual player must have their own frames rendered. The bandwidth requirement would be huge.
Of course, once the internet is capable of such a massive upload, the "gamer" will have upped the ante, and started demanding 10 times the resolution that exists today.
I just don't see this model being adopted by the masses.
Yes, I should have specified "server upload".
I was referring to the upload that the "rendering service" will have to perform.
As you know, a traditional desktop client receives information from a server (downloads) and sends it to the server (uploads).
However... the same is true for a server.
A server must "upload", or serve, the actual data; in this case the frames must be served, and this uses upload bandwidth from the server's point of view.
My apologies for not specifying that I was talking about the massive amount of server upload that would be needed to handle the frames.
The client will have little or no client side upload, but their download will be quite significant.
We are talking about a different frame render for each gamer, not a frame that is shared and broadcast or multicast.
I look at my current screen: 1920x1200 for gaming, and my requirement is at least 30 FPS (I prefer >40 FPS).
720p streaming is 1.5 to 2 Mbps compressed, and 1080p is around 5 Mbps compressed (roughly).
A typical game like Call of Duty, World of Tanks, EVE Online, or any other popular multiplayer game might have 10,000 to 30,000 people actively playing (or more).
At the finest level of granularity, each cloud service would have to minimally serve the current game zone for multiplayer.
For MMO games this could be hundreds of players.
You would need a service for each set of players...
No matter how it's done, the server bandwidth requirement is staggering.
With just 30 players, that's 150 Mbps in rendered video.
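Spelling out that arithmetic with the bitrates quoted above (the per-stream figure and the player counts are this thread's rough numbers, not measured values):

    # Server upload needed when every player receives an individually
    # rendered, compressed video stream (no shared or multicast frames).
    MBPS_PER_1080P_STREAM = 5  # rough compressed figure from this thread

    def server_upload_mbps(players):
        return players * MBPS_PER_1080P_STREAM

    print(server_upload_mbps(30))      # 150 Mbps for one 30-player zone
    print(server_upload_mbps(30_000))  # 150,000 Mbps (~150 Gbps) for 30k players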
Also, I am going to side with the other posters here, and say...
Most gamers will not appreciate having to log in to a video render service to play their games.
I am not sure we need these now. The truth is the rush forward in hardware has slowed quite a bit, and high-end components are now more than ever a pure niche market.
Let's begin with CPUs: at this moment the budget gamer is in CPU nirvana, because we have reached the point, from a gaming perspective - in fact from the total home user perspective - where anything over $200 for a CPU is a waste. Intel's i5 and i3 provide more than enough horsepower for an amazing gaming experience. AMD might lag behind in benchmarks, but even they have shown the FX is a very solid gaming CPU.
The GPU is no different, because the mad rush for higher resolutions is over. Oh sure, we have the multi-monitor push, but again, a very niche market. Most gamers use a single monitor, and 1080p seems to be the base we work from. Here we see essentially the same thing as with the CPU: the $200 price point is where things change from a great gaming experience to paying more for less, as it were. The GTX 560 and GTX 560 Ti deliver an amazing gaming experience at 1080p, as do the HD 6870 and even the HD 6850. Plus, as the newer higher-end models have hit, prices have dropped; I saw an HD 6950 for $200 on Newegg today.
Then there's RAM: 4GB is more than enough for a great gaming experience, and GOOD 4GB RAM kits can be had for under $30.
Finally, realize this is not a new idea; we have had a working service doing this same thing for a while now. OnLive actually delivers a very solid service. In our look at the Sapphire Edge, we were able to use OnLive to play, with ease, some very modern games that the system could not play itself.
Blu-ray will become the standard disc drive, and the TB will become the GB, and the GB the MB, and so on.
4K resolution is about to come out. Just to reproduce that, it currently takes an ENTIRE Thunderbolt port on a brand-new motherboard with decent video support.
Can you imagine sites that would STREAM VIDEO in 4K? Or sites that would download a movie to you in 4K?
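For a sense of scale, here is the raw arithmetic on uncompressed 4K video (the frame rates and bit depth are assumed values; first-generation Thunderbolt carries 10 Gbps per channel):

    # Uncompressed bandwidth for a 4K video stream.
    width, height = 3840, 2160
    bits_per_pixel = 24  # assumed: 8 bits per RGB channel
    for fps in (30, 60):
        gbps = width * height * bits_per_pixel * fps / 1e9
        print(f"{fps} fps: {gbps:.1f} Gbps uncompressed")
    # ~6.0 Gbps at 30 fps already consumes most of a 10 Gbps
    # Thunderbolt channel, and ~11.9 Gbps at 60 fps exceeds it.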
And the system you named would be HORRIBLE for running Crysis 2.
Also, 4GB of memory is NOT enough even for today. I have 16GB in my system and have run out of memory.
Programs like CATIA V5 will take all the memory you can give them and keep running faster.
Look how Diablo 3 is getting kicked in the groin by DRM, server numbers, and unpredictable Internet connection quality around the world.
Not to mention those who live in places with no Internet service, which still make up 70% of the planet.
There are also the psychological aspects and the security issues: Internet traffic is easily cracked, and heavily monitored by all kinds of agencies and spying organizations from around the world.
720p resolution, latency, bandwidth caps, broadband availability, etc.... too many reasons why it won't be the de facto gaming solution anytime soon.
What I think might be really interesting is if a gaming network like Xbox Live or PSN used cloud streaming in-game, rather than for the entire game. Remember in Minority Report where advertisements were geared to individuals thanks to eye scanners? We could have just that when playing a racing game: billboards with advertisements streamed straight into the game, constantly changing and geared to individual interests.
In-game levels and items could be created on the fly based on your choices in the game, with everyone having a different experience. The building parts of the levels, like textures, audio, etc., could be on the disc while the geometry data could be streamed on the fly while in a game, allowing for a very custom, personal experience. It's the crossover stuff that interests me, rather than a replacement for a PC or console. If developers could figure out what to stream via the cloud quickly, having latency-free imported geometry like background cities could be really interesting.
Imagine a Mass Effect scene where the level you are on is in the game, but the extremely detailed backgrounds and animations are in the cloud.
I think I'll write something about this on my blog now lol :)