10-23-09, 02:57 PM   #3
ChrisRay
Registered User
 
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default Re: Would you like to ask Nvidia a question?

First round of questions for October 23rd answered. We will continue fielding questions every week, with the hope of answering 3 or more each week.


1. Is NVIDIA moving away from gaming and focusing more on GPGPU? We have heard a lot about Fermi's compute capability, but nothing about how good it is for gamers.




Jason Paul, GeForce Product Manager: Absolutely not. We are all gamers here! But, like G80 and GT200 before it, Fermi has two personalities: graphics and compute. We chose to introduce Fermi’s compute capability at our GTC conference, which was very compute-focused and attended by developers, researchers, and companies using our GPUs and CUDA for compute-intensive applications. Such attendees require fairly long lead times to evaluate new technologies, so we felt it was the right time to unveil Fermi’s compute architecture. Fermi also has a very innovative graphics architecture that we have yet to unveil.



Also, it’s important to note that our reason for focusing on compute isn’t all about HPC. We believe next generation games will exploit compute as heavily as graphics. For example:

· Physical simulation – whether using PhysX, Bullet, or DirectCompute, GPU computing can add incredible dynamic realism to games through physical simulation of the environment (see the sketch after this list).

· Advanced graphical effects – compute shaders can be used to speed up advanced post-processing effects such as blurs, soft shadows, and depth of field, helping games look more realistic.

· Artificial intelligence – compute shaders can be used for artificial intelligence algorithms in games.

· Ray tracing – this is a little more forward-looking, but we believe ray tracing will eventually be used in games for incredibly photo-realistic graphics. NVIDIA’s ray tracing engine uses CUDA.
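
To make the physical simulation bullet concrete, here is a minimal, hypothetical CUDA sketch of the kind of data-parallel work involved – a toy particle integrator, not PhysX or any shipping engine (the kernel shape, names, and constants are all illustrative):

// Toy CUDA particle integrator: one thread advances one particle per step.
#include <cuda_runtime.h>

struct Particle {
    float3 pos;   // world-space position
    float3 vel;   // velocity
};

__global__ void integrate(Particle *p, int n, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    // Apply gravity, then advance position with explicit Euler integration.
    p[i].vel.y -= 9.81f * dt;
    p[i].pos.x += p[i].vel.x * dt;
    p[i].pos.y += p[i].vel.y * dt;
    p[i].pos.z += p[i].vel.z * dt;

    // Crude ground plane: bounce with some energy loss.
    if (p[i].pos.y < 0.0f) {
        p[i].pos.y = 0.0f;
        p[i].vel.y *= -0.5f;
    }
}

// Host-side launch: one thread per particle, 256 threads per block.
void step(Particle *d_particles, int n, float dt)
{
    int blocks = (n + 255) / 256;
    integrate<<<blocks, 256>>>(d_particles, n, dt);
}

Thousands of these threads run concurrently, which is exactly the kind of workload the bullets above have in mind.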



Compute is important for all of the above. That’s why Fermi is built the way it is, with a strong emphasis on compute features and performance.



In addition, we wouldn’t be investing so heavily in gaming technologies if we were really moving away from gaming. Here are a few of the substantial investments NVIDIA is currently making in PC gaming:

· PhysX and 3D Vision technologies

· The Way It’s Meant to Be Played program, including technical support, game compatibility testing, developer tools, antialiasing profiles, ambient occlusion profiles, etc.

· LAN parties and gaming events (including PAX, PDX LAN, Fragapalooza, Million Man LAN, BlizzCon, and QuakeCon, to name a few recent ones). Below are some links to videos from those events.

http://www.slizone.com/object/slizon...ery_aug09.html

http://www.nzone.com/object/nzone_qu..._trenches.html

http://www.nzone.com/object/nzone_bl..._trenches.html

http://www.nzone.com/page/nzone_section_trenches.html



We put our money where our mouth is here.



Finally, Fermi has plenty of “traditional” graphics goodness that we haven’t talked about yet. Fermi’s graphics architecture is going to blow you guys away! Stay tuned.



2. Why has NVIDIA continued to refresh the G92? Why didn't NVIDIA create an entry-level GT200 piece of hardware? The constant G92 renames and reuse of this aging part have caused a lot of discontent amongst the 3D enthusiast community.



Jason Paul, GeForce Product Manager: We hear you. We realize we are behind with GT200-derivative parts, and we are doing our best to get them out the door as soon as possible. We invested our engineering resources in transitioning our G9x-class products from 65nm to 55nm manufacturing technology, as well as in adding several new video and display features to GT 220/210, which pushed these GT200-derivative products out later than usual. Also, 40nm capacity has been limited, which has made the transition more difficult.



Since its introduction, G92 has remained a strong price/performance product in our line-up. So why did we rebrand it? While hardware enthusiasts often look at GPUs in terms of the silicon core (e.g. G92) and architecture (e.g. GT2xx), many of our less techie customers instead think about GPUs simply in terms of performance, price, and feature set, summarized via the product name. The product name is an easy way to communicate how products with the same base feature set (e.g. DirectX 10 support) compare to each other in terms of price and performance. Let’s take an example – which is the higher-performance product, an 8800 GT or a 9600 GT? The average Joe looking at an OEM web configurator or a Best Buy retail shelf probably won’t know the answer. But if they saw a 9800 GT and a 9600 GT, they would know that the 9800 GT provides better performance. By keeping G92 branding current with the rest of our DirectX 10 product line-up, we were able to more effectively communicate to customers where the product fit in terms of price and performance. At the same time, we tried to make it clear to the technical press that these new brands were based on the G92 core, so enthusiasts would know this information up front.



3. Is it true that NVIDIA has offered to open up PhysX to ATI without stipulation so long as ATI offers its own support and codes its own driver, or is ATI correct in asserting that NVIDIA has stated it will never allow PhysX on ATI GPUs? What is NVIDIA’s official stance on allowing ATI to create a driver at no cost for PhysX to run on their GPUs via OpenCL?



Jason Paul, GeForce Product Manager: We are open to licensing PhysX, and have done so on a variety of platforms (PS3, Xbox, Nintendo Wii, and iPhone, to name a few). We would be willing to work with AMD if they approached us. We can’t really give PhysX away for “free” for the same reason a Havok license or an x86 license isn’t free: the technology is very costly to develop and support. In short, we are open to licensing PhysX to any company that approaches us with a serious proposal.



4. Is NVIDIA fully committed to supporting 3D Vision for the foreseeable future with consistent driver updates, or will we see the decrease in support that many 3D Vision users currently perceive as a trend? For example, a lot of games have major issues with shadows while running 3D Vision. Can profiles fix these issues, or are we going to have to rely on developers to implement 3D Vision-compatible shadows? What role do developers play in delivering a good 3D Vision experience at launch?


Andrew Fear, 3D Vision Product Manager: NVIDIA is fully committed to 3D Vision. In the past four driver releases, we have added more than 50 game profiles to our driver, and we have seeded over 150 3D Vision test setups to developers worldwide. Our devrel team works hard to evangelize the technology to game developers, and you will see more developers ensuring their games work great with 3D Vision. Like any new technology, it takes time, and not every developer is able to fit changes for 3D Vision into their development and release cycles. In the specific example of shadows, these effects are sometimes rendered with techniques that need to be modified to be compatible with stereoscopic 3D, which means we have to recommend that users disable them. Some developers are making the necessary updates, and some are waiting to fix it in their next games.
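
For readers wondering why shadows in particular break: 3D Vision renders each eye from a horizontally shifted viewpoint by offsetting positions in clip space, so any effect computed once from the mono camera lands at the wrong apparent depth. A rough, hypothetical sketch of the idea (the shift formula follows NVIDIA's published description of 3D Vision Automatic; the function and names here are illustrative, not driver code):

// Per-eye horizontal shift applied to a clip-space x coordinate.
// sign is -1.0f for the left eye, +1.0f for the right eye.
float stereoShiftX(float separation, float convergence, float sign, float clipW)
{
    // Points at depth w == convergence get zero shift (zero parallax);
    // nearer or farther points separate between the two eyes. A mono
    // screen-space shadow or blur pass knows nothing about this shift,
    // which is why it can look broken in stereo.
    return sign * separation * (clipW - convergence);
}

A shadow technique that works in screen space has to account for this per-eye offset (or be computed per eye), which is the kind of modification described above.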



In the past few months, we have seen our developer relations team work with developers to make Batman: Arkham Asylum and Resident Evil 5 look incredible in 3D. And we are excited to see the new titles that are coming – such as Borderlands, BioShock 2, and Avatar – which should all look incredible in 3D as well.

Game profiles can help configure many games, but the experience is best when game developers spend time optimizing for 3D Vision. To help facilitate that, we have provided new SDKs for our core 3D Vision driver architecture that let developers take almost complete control over how their game is rendered in 3D. We believe these changes, combined with tremendous interest from developers, will result in a large growth of 3D Vision-Ready titles in the coming months and years.
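
As a taste of what that control looks like, NVIDIA's public NVAPI interface exposes stereo parameters to applications. A minimal, hypothetical host-side sketch (it assumes Windows, the NVAPI headers, and an existing Direct3D device; the values are placeholders and error handling is omitted):

#include "nvapi.h"   // NVAPI stereo interface (assumed available)

// Hypothetical sketch: a game adjusting stereo parameters itself
// rather than relying on driver defaults.
void tuneStereo(IUnknown *d3dDevice)
{
    StereoHandle stereo = 0;

    NvAPI_Initialize();
    NvAPI_Stereo_CreateHandleFromIUnknown(d3dDevice, &stereo);

    // Placeholder values; a real game would tune these per scene/camera.
    NvAPI_Stereo_SetConvergence(stereo, 1.5f);
    NvAPI_Stereo_SetSeparation(stereo, 25.0f);   // percent of max separation
}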

In addition to making gaming better, we are also working on expanding our ecosystem to support better picture, movie, and Web experiences in 3D. A great example is our support for the Fujifilm FinePix REAL 3D W1 camera. We were the first 3D technology provider to recognize the camera's new 3D picture file format and provide software for our users. In upcoming drivers, you will also see even more enhancements for the 3D Web experience.


5. Could Favre really lead the Vikings to a Super Bowl?



Ujesh Desai, Vice President of GeForce GPU Business: We are glad that the community looks to us to tackle the tough questions, so we put our GPU computing horsepower to work on this one! After simulating the entire 2009-2010 NFL season on a Tesla supercomputing cluster running a CUDA simulation program, we determined there is a 23.468% chance of Favre leading the Vikings to a Super Bowl this season.* But Tesla supercomputers aside, anyone with half a brain knows the Eagles are gonna finally win it all this year! :)



*Disclaimer: NVIDIA is not liable for any gambling debts incurred based on this data.
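
For fun, here's a completely made-up toy of the flavor of Monte Carlo kernel such a simulation might use – one thread per simulated season, with an invented 60% per-game win probability (nothing here resembles the actual program):

#include <cuda_runtime.h>
#include <cstdio>

// Tiny per-thread xorshift RNG; good enough for a joke simulation.
__device__ unsigned int xorshift(unsigned int &s)
{
    s ^= s << 13; s ^= s >> 17; s ^= s << 5;
    return s;
}

__global__ void simulateSeasons(unsigned int seed, int *titles, int nSeasons)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= nSeasons) return;

    unsigned int s = seed ^ (i * 2654435761u);
    if (s == 0) s = 1;   // xorshift must not start at zero

    int wins = 0;
    for (int game = 0; game < 19; ++game)      // 16 regular season + 3 playoff
        if (xorshift(s) % 100 < 60) ++wins;    // made-up 60% per-game odds

    if (wins >= 17)                            // made-up "won it all" threshold
        atomicAdd(titles, 1);
}

int main()
{
    const int n = 1 << 20;   // about a million simulated seasons
    int *d_titles, h_titles = 0;
    cudaMalloc(&d_titles, sizeof(int));
    cudaMemcpy(d_titles, &h_titles, sizeof(int), cudaMemcpyHostToDevice);

    simulateSeasons<<<(n + 255) / 256, 256>>>(12345u, d_titles, n);
    cudaMemcpy(&h_titles, d_titles, sizeof(int), cudaMemcpyDeviceToHost);

    printf("Championship odds: %.3f%%\n", 100.0 * h_titles / n);
    cudaFree(d_titles);
    return 0;
}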
__________________
|CPU: Intel I7 Lynnfield @ 3.0 Ghz|Mobo:Asus P7P55 WS Supercomputer |Memory:8 Gigs DDR3 1333|Video:Geforce GTX 295 Quad SLI|Monitor:Samsung Syncmaster 1680x1080 3D Vision/Olevia 27 Inch Widescreen HDTV 1920x1080

|CPU: AMD Phenom 9600 Black Edition @ 2.5 Ghz|Mobo:Asus M3n HT Deluxe Nforce 780A|Memory: 4 gigs DDR2 800| Video: Geforce GTX 280x2 SLI

Nzone
SLI Forum Administrator

NVIDIA User Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the members.