nV News Forums > Software Forums > Gaming Central > Console World


Closed Thread
 
Old 05-18-12, 02:46 AM   #1
MikeC
Administrator
 
Join Date: Jan 1997
Location: Virginia
Posts: 5,364
Question Thoughts from console owners on NVIDIA's GEFORCE GRID

Simple, yet compelling, question. Is GEFORCE GRID going to make consoles obsolete?
MikeC is offline  
Old 05-18-12, 03:14 AM   #2
FlakMagnet
Registered User
 
Join Date: Mar 2005
Location: Farnborough, UK
Posts: 335
Default Re: Thoughts from console owners on NVIDIA's GEFORCE GRID

I still don't see how they can eliminate the latency involved. A button press on your controller or keyboard still needs to be sent to game servers before it can be processed, while on a console or PC that same button press is registered immediately.

I tried OnLive a little while ago and the experience was not good - the game was always behind whatever I did on the keyboard. The streamed image was also nowhere near as sharp as an image generated on the local PC. It simply didn't work for me. I don't see how these issues can ever be resolved in a 'cloud gaming' environment. Yes, the servers can be placed 'close to major cities', which would improve the latency, but it would still be there. The ONLY way you could get a console-like experience using the cloud would be to reduce to zero the time it takes to get user input to the server, and then take zero time to compress and send the rendered images back to the gamer.

This solution is also no good for gamers who have capped broadband (I'm lucky that mine isn't capped, but many are not so lucky).

So in answer to your question, I would say that it's not going to make consoles obsolete.

The following is my simplistic view of what happens on a console compared to what needs to happen in the cloud when the user presses a button - that is, what needs to happen before you see the result on your screen.

Console :

1. User presses 'Fire'.
2. Button registered by the game - very little delay.
3. Game processes button and updates game state.
4. Game renders next frame based on result of new game state.
5. Game displays completed frame to the user.

'Cloud' :

1. User presses 'Fire'.
1b. User input sent across the internet to the Cloud - this can take in excess of 30-40ms.
2. Button registered by the game.
3. Game processes button and updates game state.
4. Game renders next frame based on result of new game state.
4b. Server compresses frame and sends across the internet. Will be a 'significant' amount of data if quality is not to be compromised too much.
4c. Frame is decompressed locally.
5. Completed frame is displayed to the user.

In the above very simplistic view, 1b, 4b and 4c are all additional tasks that MUST happen in the order presented above. Yes, multiple frames can be prepared, as in double or triple buffering, but this does not help the user 'see' the result of his/her actions any sooner.
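
To put rough numbers on those extra steps (1b, 4b, 4c), here's a minimal sketch of the two latency budgets. Every per-stage figure below is an illustrative assumption, not a measurement:

```python
# Rough, illustrative latency budget for one button press -> visible frame.
# All per-stage numbers are assumptions for the sake of comparison.

CONSOLE_MS = {
    "input registered": 1,
    "game logic": 16,        # one 60 Hz tick
    "render frame": 16,
    "display": 16,           # scanout / vsync
}

CLOUD_MS = dict(CONSOLE_MS)
CLOUD_MS.update({
    "1b: input -> server": 35,      # one-way internet trip
    "4b: encode + send frame": 25,  # compression plus transfer
    "4c: decode frame": 5,
})

console_total = sum(CONSOLE_MS.values())
cloud_total = sum(CLOUD_MS.values())

print(f"console: {console_total} ms")                        # 49 ms
print(f"cloud:   {cloud_total} ms")                          # 114 ms
print(f"added by the cloud: {cloud_total - console_total} ms")  # 65 ms
```

Even with optimistic figures, the cloud path more than doubles the input-to-photon time, which matches the OnLive experience described above.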
__________________
ASUS P6T Deluxe V2 | i7 920 @ 3.6GHz
POV GTX480 | 6GB Patriot DDR3 @1443
X-Fi Fatal1ty Pro | 120GB OCZ Vertex SSD
2xSamsung 7200 750GB RAID 0| Akasa Eclipse 62
FlakMagnet is offline  
Old 05-18-12, 08:08 AM   #3
MikeC
Administrator
 
Join Date: Jan 1997
Location: Virginia
Posts: 5,364
Lightbulb Re: Thoughts from console owners on NVIDIA's GEFORCE GRID

Quote:
Originally Posted by FlakMagnet View Post
I still don't see how they can eliminate the latency involved. A button press on your controller or keyboard still needs to be sent to game servers before it can be processed, while on a console or PC that same button press is registered immediately.
Exactly. Latency may be the biggest hurdle that NVIDIA needs to overcome. A lot depends on how much data is going to be sent, and NVIDIA could even offer a tiered pricing system to customers.

One way to reduce the amount of data being sent from the cloud to a gaming device would be to compress it. NVIDIA is very good at developing complex algorithms to compress different types of data - 3D, video, HPC workloads, etc.

A second method could offer tremendous savings, but is a little harder to code for, as it involves calculating the net change from frame to frame. Instead of sending all of the data in the frame buffer (pixels) for each rendered frame, only send the pixels that changed. Some developers already code using a net-change technique. It's similar in spirit to occlusion culling: there's no need to process objects that are occluded, or hidden, behind other objects.
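
As a toy illustration of the "only send what changed" idea (this is not NVIDIA's actual scheme - real streaming services use motion-compensated video codecs, but they exploit the same frame-to-frame redundancy):

```python
# Toy sketch: transmit only the pixels that differ between two frames.

def frame_delta(prev, curr):
    """Return a list of (index, new_pixel) for pixels that changed."""
    return [(i, p) for i, (q, p) in enumerate(zip(prev, curr)) if p != q]

def apply_delta(prev, delta):
    """Rebuild the current frame from the previous one plus the delta."""
    frame = list(prev)
    for i, p in delta:
        frame[i] = p
    return frame

prev = [0] * 100          # previous frame: 100 "pixels"
curr = list(prev)
curr[40:45] = [255] * 5   # only 5 pixels changed (e.g. a muzzle flash)

delta = frame_delta(prev, curr)
print(len(delta), "of", len(curr), "pixels sent")  # 5 of 100
assert apply_delta(prev, delta) == curr
```

When most of the screen is static between frames, the delta is a tiny fraction of the full frame buffer, which is where the savings come from.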

This type of programming is perfect for processors with a lot of cores like a GPU. It's still very difficult to develop applications that use all of the cores on a CPU. Most of today's apps are still single threaded, but Intel has been bringing new tools to market to ease the burden for developers.

But the bandwidth needed to process 3D data, with all the driver and in-game graphics bells and whistles enabled on a PC, is staggering - hundreds of gigabytes of data per second, even in parallel. With GRID, though, that processing will be done on NVIDIA's high-performance supercomputers in the cloud.
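
For a sense of scale, some back-of-the-envelope arithmetic on an uncompressed 1080p60 stream versus an assumed, typical compressed bitrate (the 10 Mbit/s figure is a plausible H.264-class number, not an NVIDIA spec):

```python
# Back-of-the-envelope bandwidth arithmetic for streaming 1080p at 60 fps.

width, height, bytes_per_pixel, fps = 1920, 1080, 3, 60

raw_bytes_per_sec = width * height * bytes_per_pixel * fps
raw_mbit = raw_bytes_per_sec * 8 / 1e6
print(f"uncompressed: {raw_mbit:.0f} Mbit/s")       # 2986 Mbit/s

assumed_stream_mbit = 10   # assumed bitrate for a 1080p60 game stream
ratio = raw_mbit / assumed_stream_mbit
print(f"compression needed: roughly {ratio:.0f}x")  # roughly 299x
```

In other words, to fit a raw 1080p60 frame stream into a consumer broadband connection, the encoder has to shave off roughly 99.7% of the data - which is exactly why compression quality and speed matter so much here.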

I'm only throwing out ideas since, to my knowledge, NVIDIA hasn't briefed web sites on the technology behind the grid. But I'd love to test such a system when it becomes available!
MikeC is offline  
Old 05-18-12, 08:59 AM   #4
Yaboze
Registered User
 
Join Date: Oct 2005
Location: United States
Posts: 2,057
Default Re: Thoughts from console owners on NVIDIA's GEFORCE GRID

I don't think it will anytime soon. Not everyone has fiber optic internet. Latency is the problem here.

If there comes a time where everyone's internet speed is almost instant, then I can see something like this working.
Yaboze is offline  
Old 05-18-12, 12:35 PM   #5
|MaguS|
Guest
 
Posts: n/a
Default Re: Thoughts from console owners on NVIDIA's GEFORCE GRID

No. Not anytime soon. The US is still FAR behind the rest of the world in offering broadband to the population, and with metered bandwidth becoming more common, cloud computing looks like it will never really be plausible unless these companies strike a deal with the ISPs.

Sorry, but if anything, all this cloud computing talk and remote gaming is dead... Streaming is endangered for the same reason: the ISPs are also content providers, and if they don't want a service on their pipes they can cut it off. It's not like there's anyone to enforce fair play.
 
Old 05-18-12, 03:23 PM   #6
ViN86
 
Join Date: Mar 2004
Posts: 15,486
Default Re: Thoughts from console owners on NVIDIA's GEFORCE GRID

I agree with the others. This won't happen any time soon.

But I am not completely opposed to a subscription based service, as long as the graphic quality and responsiveness is comparable to dedicated hardware. Currently, that's impossible. But in the next 15-20 years, it may be feasible.
ViN86 is offline  
Old 05-19-12, 06:44 AM   #7
crainger
 
Join Date: Aug 2004
Location: Coffs Harbour, NSW, Australia
Posts: 29,559
Default Re: Thoughts from console owners on NVIDIA's GEFORCE GRID

Quote:
Originally Posted by |MaguS| View Post
No. Not anytime soon. The US is still FAR behind the rest of the world in offering broadband to the population, and with metered bandwidth becoming more common, cloud computing looks like it will never really be plausible unless these companies strike a deal with the ISPs.
That's what will happen. I only get 400GB per month but never come close to using it, because so much is unmetered: iTunes, movie streaming, Xbox Live, Steam. Once fibre becomes more common I can see cloud computing becoming more popular, though I can't see the US Government paying for fibre to the home like they are here.
crainger is offline  
Old 05-21-12, 11:18 AM   #8
Logical
Registered User
 
Join Date: Apr 2007
Location: UK
Posts: 2,523
Default Re: Thoughts from console owners on NVIDIA's GEFORCE GRID

Quote:
Originally Posted by |MaguS| View Post
No. Not anytime soon. The US is still FAR behind the rest of the world in offering broadband to the population, and with metered bandwidth becoming more common, cloud computing looks like it will never really be plausible unless these companies strike a deal with the ISPs.

Sorry, but if anything, all this cloud computing talk and remote gaming is dead... Streaming is endangered for the same reason: the ISPs are also content providers, and if they don't want a service on their pipes they can cut it off. It's not like there's anyone to enforce fair play.
This..

Also, broadband in the UK isn't up to speed compared to much of Europe, such as the Netherlands. I myself am on the best available home broadband at 100/10. However, ADSL is still quite popular, especially in areas where optical broadband is not available. Most ADSL ISPs offer 'up to' 8Mb, but this depends on the quality of the phone line and the distance from the exchange. My parents are currently on an 'up to' 8Mb ADSL line, and in real-world terms they are lucky to achieve anything above 2Mb, with their connection constantly dropping and severe lag... Even web browsing can sometimes be a task.

This is not the only reason cloud gaming will not take off. I have been a gamer since I was able to pick up a joypad and have collected many games on many platforms... I have never cared for trading games either, and have thousands of pounds' worth of retro hardware and games stuffed in the attic. I am not just a gamer; like many others, I am a collector and cherish what I buy.

Cloud gaming will be aimed at casual gamers who are fortunate enough to have decent internet to play it on, and I see it being about as successful as the free games that are available on some TVs with apps. Cloud gaming taking over consoles? Not in this lifetime...
Logical is offline  

Old 05-22-12, 08:49 PM   #9
MikeC
Administrator
 
Join Date: Jan 1997
Location: Virginia
Posts: 5,364
Default Re: Thoughts from console owners on NVIDIA's GEFORCE GRID

The original forward-thinking and somewhat positive thread on this subject is located here:

http://www.nvnews.net/vbulletin/showthread.php?t=181148

This one will probably be closed. Absolutely no constructive criticism here.
MikeC is offline  
Old 05-23-12, 08:03 AM   #10
MikeC
Administrator
 
Join Date: Jan 1997
Location: Virginia
Posts: 5,364
Default Re: Thoughts from console owners on NVIDIA's GEFORCE GRID

A final relevant post from me on the subject in this forum. Feel free to reply before this thread is closed.

Here's a story from a guy at Seeking Alpha. He's an investor. What the heck does he know about computers and NVIDIA?

http://seekingalpha.com/article/6110...rated-graphics

Quote:
There's been a lot of jazz about the integrated graphics solutions present on Intel's (INTC) Ivy Bridge CPUs and AMD's (AMD) Trinity chips and how they're going to relegate NVIDIA (NVDA) to a niche corner of the market. Interestingly enough, most of the hype is from the investor community and the general media. But if you talk to folks who are involved with designing and building PCs (as I do for myself and for others as a hobby), one thing quickly becomes clear: they're not enough to displace a discrete card in a situation where a discrete card would be purchased anyway.

It seems that Intel and AMD would like everyone to believe that having integrated graphics solutions is something that's "new" and "revolutionary". It's not. Integrated solutions have been shipping for ages and, yes, Intel has, for the longest time, shipped the most graphics chips year after year via these integrated onto motherboard solutions! The difference now is that these integrated solutions are directly on the CPU, meaning that overall platform costs decrease. Further, such integration allows for power savings for systems that actually use the on-die integrated graphics since power management can be done much more effectively and at idle, there are fewer chips in the system drawing idle power.

The next issue that comes up is that integrated graphics are becoming "good enough" and that they will eventually cannibalize the need for discrete graphics cards. To compound this, there have been rumblings in the PC gaming community that high budget games are increasingly targeted towards the relatively outdated hardware in Microsoft's (MSFT) Xbox 360 and Sony's (SNE) Playstation 3, supposedly removing the need for high end graphics chips on the PC counterparts of the games.

Nothing could be further from the truth.

While integrated graphics solutions are now transitioning from "absolute trash" to "competent" for modern 3D games, they're still not enough for PC games. The graphics cores in AMD's Trinity and Intel's Ivy Bridge, competent at low detail settings, are no match for an entry level discrete graphics chip from NVIDIA, let alone the higher end offerings from both NVIDIA and AMD. So for gamers looking to play at high frame rates at the higher quality settings, the current crop of integrated graphics processors are simply not enough. NVIDIA and, to a lesser extent, AMD also invest in technologies that allow -- generally in the laptop/Ultrabook areas -- their graphics chips to be turned off with minimal power waste when non-graphics intensive applications are running, so that the user gets the best of both worlds.

Now that I've made the case that NVIDIA's discrete GPU business is not particularly threatened by integrated solutions on the PC side, it's important to see how NVIDIA can actually benefit from integrated graphics. One word: Tegra.

That's right! On a desktop PC that will be used for gaming or even a laptop that will be used for gaming, a discrete GPU will likely be used, since NVIDIA has shown that their discrete solutions can even be used in Ultrabooks. But what about a mobile device such as a smartphone or a tablet where the key is to sip as little power as possible? That's exactly where the SoC style chip with integrated graphics becomes particularly valuable. Each year, phones and tablets can do more and more amazing things, especially when it comes to 3D graphics, and having competitive integrated GPU solutions here is key. NVIDIA's Tegra 3 is a good chip, but its graphics capabilities are not quite up to par with the graphics performance in Apple's (AAPL) A5X SoC, whose graphics technology is licensed from Imagination Technologies.

As puzzling as this may seem, it is imperative to note that NVIDIA certainly has the strongest technological background when it comes to efficient, high performance GPUs. If NVIDIA can bring this expertise to their Tegra lineup going forward, it will become extremely difficult for the phone vendors to ignore NVIDIA's offerings, especially as graphics and gaming see the same explosive growth on the mobile side as they did on the PC in the 1990s and early 2000s.

NVIDIA shouldn't fear integrated graphics because they're perfectly positioned to deploy it where it counts -- mobile SoCs.

Disclosure: I am long NVDA.

MikeC is offline  
Old 05-23-12, 09:34 AM   #11
Logical
Registered User
 
Join Date: Apr 2007
Location: UK
Posts: 2,523
Default Re: Thoughts from console owners on NVIDIA's GEFORCE GRID

Could have sworn I read that twice, but only the once, lol.

Also, I don't quite understand what that has to do with cloud gaming; have I missed something?
Logical is offline  
Old 05-27-12, 08:43 AM   #12
MikeC
Administrator
 
Join Date: Jan 1997
Location: Virginia
Posts: 5,364
Default Re: Thoughts from console owners on NVIDIA's GEFORCE GRID

PlayStation 'poised for cloud gaming push'

Quote:
Reports suggest Sony has struck a deal with Gaikai or OnLive


Sony will strike a deal with OnLive or Gaikai at E3 next month for PlayStation cloud gaming, according to reports.

The platform-holder will announce a new game streaming service for 'PlayStation hardware' at the annual trade show in Los Angeles, according to VG247.


It's not clear whether this service could be for PS3 or PS Vita, both, or the next PlayStation home console.

Both OnLive and Gaikai offer cloud gaming streaming services. These allow users to stream full games or demos instantly online without the need for discs or hardware installations. Gamers can currently use devices such as PCs, TVs, smartphones, laptops and tablets to stream games online.

Last year OnLive said it had talked with Sony and Microsoft regarding a potential console collaboration.
MikeC is offline  
