
View Full Version : Windows Vista and Gaming



jAkUp
03-24-06, 11:47 PM
Check it out, interesting article from GDC:

http://www.extremetech.com/article2/0,1697,1941407,00.asp

DirectX 10 Features:

Consistency across hardware. Capability bits (cap bits) are gone forever. Cards are either Direct3D compliant or not. So you won't have the situation with Direct3D 9, where one vendor might support FP16 blending while the other vendor doesn't. The only differentiation between hardware will be performance.
Geometry shaders. Geometry shaders fit into the 3D pipeline between vertex shaders and pixel shaders, taking the spot occupied by fixed-function primitive setup in DirectX 9. With DirectX 10, a geometry shader can actually generate new geometry (primitives) from incoming vertices.
A single, unified shader engine that offers consistent behavior across vertex, geometry, and pixel shaders.
More sophisticated flow control, including switch statements.
Texture arrays. Using texture arrays, with indices into those arrays, will increase the overall efficiency of texture transfers.
Overall efficiency has been increased. According to Sam Glassenberg, DirectX 10 program manager, state changes are much less costly, batch overhead is substantially lower, and some of the new features, like texture arrays and geometry shaders, mean that fewer calls are needed.

One key aspect of Direct3D 10 is that it's almost certainly going to be a Vista-only feature. Direct3D 10 depends on the new Windows Display Driver Model, plus some of the virtualization features built into Vista. Retrofitting DirectX 10 to Windows XP is possible, but would take significant engineering and QA effort. On the other hand, if Vista's schedule keeps slipping, Microsoft may need to reevaluate this stance.
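For readers wondering what the geometry shader stage actually buys you: it can emit more primitives than it receives. A rough sketch of that idea in Python rather than HLSL (everything here is made up for illustration; a real geometry shader runs on the GPU):

```python
# Toy model of geometry amplification: one input primitive (a 2D line
# segment) goes in, four vertices (a quad extruded around it) come out.
# This mimics what a DX10 geometry shader can do that fixed-function
# primitive setup in DX9 cannot: generate new geometry on the fly.

def extrude_line_to_quad(p0, p1, width=1.0):
    """Emit four vertices forming a quad centered on the segment p0-p1."""
    (x0, y0), (x1, y1) = p0, p1
    dx, dy = x1 - x0, y1 - y0
    length = (dx * dx + dy * dy) ** 0.5
    # Perpendicular direction, scaled to half the quad width.
    nx, ny = -dy / length * width / 2, dx / length * width / 2
    return [(x0 + nx, y0 + ny), (x0 - nx, y0 - ny),
            (x1 + nx, y1 + ny), (x1 - nx, y1 - ny)]

# One segment in, a whole quad out -- e.g. for rendering thick lines.
quad = extrude_line_to_quad((0.0, 0.0), (2.0, 0.0), width=0.5)
```

On DX9-class hardware this kind of expansion has to happen on the CPU (or be pre-baked into the vertex buffer); moving it onto the GPU is the point of the new pipeline stage.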

nV`andrew
03-24-06, 11:50 PM
nice find and good info jakup

Edge
03-25-06, 12:29 AM
You know, I have to wonder if this whole enforced compliance thing is really going to backfire like it has before. I remember that one of the things they talked about for DX9 was that it was supposed to have "better" standards than DX8 did; they thought they were too lenient with it and that cards called DX8 compatible weren't good enough. Then DX9 comes out with its supposedly stricter standards, and what did we get? The FX series. And about the only difference between the cards being performance... does that mean we won't see improvements to it over time like we did with the other DirectX models (8.0 to 8.1, 9.0 to SM3.0, etc.)? Or are they even going as far as keeping graphics cards from having exclusive features (like the Nvidia- or ATI-specific features their past cards have had)? I can understand creating a standard of minimum requirements, but it sounds like they might be going overboard and trying to make every card the same. It just seems like they're limiting it too much in features, but then will they even have any requirements for performance? I'm just worried that we'll end up with cards that perform under DX10 as well as the FX5200 handled DX9.

But at least it sounds more efficient. I wonder if any of these improvements will be backwards compatible with current DX9 cards.

SLippe
03-25-06, 12:42 AM
Interesting read. Thanks for posting. I haven't kept up with the whole Vista thing, but I would like to see a game run on Vista with DX10 and see how it truly performs and looks. We'll see, I guess.

Mr_LoL
03-25-06, 01:21 AM
I wish they would just put statements like these in plain English :thumbdwn: Fewer cells???

Overall efficiency has been increased. According to Sam Glassenberg, DirectX 10 program manager, state changes are much less costly, batch overhead is substantially lower, and some of the new features, like texture arrays and geometry shaders, mean that fewer calls are needed.

Rakeesh
03-25-06, 01:58 AM
That is fewer calls, not cells. In simple and more generic terms, that means you can accomplish the same thing in fewer steps. Fewer steps means less time required to accomplish the task. Less time required means higher speed, higher frame rate, etc.
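To put rough numbers on that (purely hypothetical figures, just to show the shape of the argument): each API call carries a fixed overhead, so doing the same total work in fewer, larger calls finishes sooner.

```python
# Hypothetical cost model of draw-call overhead. The numbers are made
# up; the point is that each call pays a fixed cost, so the same work
# batched into fewer calls takes less total time.

CALL_OVERHEAD_US = 50  # fixed cost per API call (illustrative)
PER_ITEM_US = 1        # cost per item actually processed

def frame_time_us(num_calls, items_per_call):
    return num_calls * (CALL_OVERHEAD_US + items_per_call * PER_ITEM_US)

many_small = frame_time_us(1000, 1)  # 1000 calls, 1 item each -> 51000 us
few_large = frame_time_us(10, 100)   # 10 calls, 100 items each -> 1500 us
```

Features like texture arrays and geometry shaders push things toward the `few_large` case: more work per call, fewer calls per frame.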

Vik1dk
03-25-06, 08:40 AM
I'm a little confused. The box for my 7900 says Vista ready, but that ain't a DX10 card, right? Does that mean it can run Vista but no DX10 features? :o

Rakeesh
03-25-06, 08:42 AM
Pretty much anything is "Vista ready," but you need at least DX9 if you want to use that Aero Glass thing.

Vik1dk
03-25-06, 08:59 AM
Thanks, so it's the same crap as with HD ready and whatever-ready.

Brimstone7
03-25-06, 09:15 AM
The biggest problem that I see is that developers are still going to have to support "older" cards (including today's brand spanking new 7900, etc.), so this standardization thing has already gone out the window. Why write a DX10 game when 80% of your user base doesn't have cards that support it? I'm sure this will all benefit us 2 years down the line, but for now it's just talk.

agentkay
03-25-06, 09:25 AM
Have you seen the Crisis GDC trailer? No way this was DX9.

Mr_LoL
03-25-06, 12:53 PM
Have you seen the Crisis GDC trailer? No way this was DX9.
They say it will be playable at E3, so I am assuming it would be running on DirectX 9 hardware, as no DX10 cards are out yet.

Demanufakture
03-25-06, 02:27 PM
Have you seen the Crisis GDC trailer? No way this was DX9.

Actually, Crytek have already stated that Crysis (not Crisis) will support DirectX 9.0c, and DirectX 10 if you have Windows Vista, much in the way Far Cry could switch from Direct3D 9 mode to Direct3D 8 with the lighting setting. Crytek also said that the minimum GPU spec for Crysis will be a Shader Model 2.0 card. So technically, people with high-end GeForce FX cards will be able to enjoy Crysis as well; obviously not with high settings, but it will run if they have a good CPU and at least 2 GB of RAM, for example. I'm sure the game will be very tweakable, just like Far Cry is.

jAkUp
03-25-06, 02:56 PM
Yup, Crysis will support DX10. As for GeforceFX cards, the SM2.0 shader path in FX cards is so horrendously slow that it might be better to actually not even support them... hehe :D

Demanufakture
03-25-06, 03:10 PM
Yup, Crysis will support DX10. As for GeforceFX cards, the SM2.0 shader path in FX cards is so horrendously slow that it might be better to actually not even support them... hehe :D

True, they are slow in SM2.0, but they can run it. With low details, an FX 5950 Ultra may be able to pull off playable framerates. Of course, I guess we'll just have to wait and see when the game gets closer to release.

$n][pErMan
03-25-06, 04:40 PM
I am so torn on whether to do my big upgrade this summer (as planned) or just wait till Vista is out along with DX10 video cards (as no game yet is really choking me down anyway). :|

WimpMiester
03-25-06, 05:51 PM
I think everyone is missing the big picture. MS is indirectly telling you that if you want to continue using your PC for gaming, you will have to have Vista, period. There will be no DX10 upgrade for any previous OS. This means that a year or two after Vista's release, if you don't have Vista, you're not gaming. So, if I was thinking of buying a new system or doing a major upgrade, I may as well wait till next year.

Nv40
03-25-06, 11:02 PM
You can quote me on this...
But DirectX 10 will not take off and will not be important for many years, at least until the Xbox 360 and PS3 are obsolete. PC games will take a secondary role, for business reasons: the market is not as big as the console market. Just visit any GameStop or EB store; the PC games section almost doesn't exist. It's all console games, hardware, mags, and any other console ****!

DirectX 10 is all about faster programming with more elegance, nothing to do with better graphics. A pure DirectX 10 game could be done 100% identically, graphics-wise, using SM3.0. The hype around the new API is not surprising: Microsoft needs to sell its new Vista OS, and developers can also market their games better by saying they were made using future code.

As others have pointed out, DirectX 9 was said to have the ultimate strictness to force standards: you either are DirectX 9 compliant or you are not. But look how ATI markets its X1000 lineup as being SM3.0 ready, thanks to a hole they found in the specs. Really, guys, there is nothing in DirectX 10 that will make your games look better. I'm not a game developer, but flow control or unified hardware will not make your games look even 1% better; they make development easier and in theory will run faster. But that performance improvement will not be seen much if the game is CPU limited anyway; just notice how SLI users need to play at insane resolutions with the highest AA/AF settings to notice that SLI is working. And things will get worse in the future as developers use more advanced AI in their games, physics, and better-looking outdoors using thousands of polys. That is because GPUs aren't very good at intensive math operations and prediction code the way CPUs are. In other words, your games will not look any better than the slowest hardware part in your machine allows, and that is your CPU. The future now is multiprocessors, multi-threaded applications, and hardware-accelerated physics: anything that helps to have more physics and more detailed world shapes in your games.

Generally, APIs exist not to invent new graphics, but to fake, to some small degree in real time, the ones that already exist in the professional market. If we had gaming PCs now with 50GHz CPUs and 3GB of RAM as standard, Nvidia, ATI, and Microsoft's DirectX would go out of business, because the graphics could be done entirely on the CPU, and with much higher quality, using more advanced rendering techniques like radiosity for lighting. Just look at the PS2: it doesn't support DirectX 7, 8, or 9, and its graphics are comparable to the Xbox 1 in many games.

DirectX 10, or any DirectX 20 in the future, will not be supported widely, whether it is good or not, for at least four years. That is because developers will not develop two games: one for next-generation DirectX 9+ consoles and another path for a very tiny market of very high-end PCs using DX10, especially when the future of PC games is in doubt versus the next-gen console competition.

jAkUp
03-25-06, 11:25 PM
You can quote me on this...
But DirectX 10 will not take off and will not be important for many years, at least until the Xbox 360 and PS3 are obsolete. PC games will take a secondary role, for business reasons: the market is not as big as the console market. Just visit any GameStop or EB store; the PC games section almost doesn't exist. It's all console games, hardware, mags, and any other console ****!

That's one of the things Microsoft is poised to change.

And DirectX 10 being more efficient should yield more impressive graphical effects.

Personally, I think after Vista, PC gaming is going to take off, especially with online distribution, where we can cut the publisher out of the picture. Gabe Newell said that a specific game distributed on Steam has already sold more copies than it did in its entire lifespan on the shelves.

Smaller developers usually start out on PCs first, since it's cheaper and they do not need to pay royalties of any sort. Now that we can cut the publisher out of the picture, this is a huge step forward for PC gaming.

Morrow
03-26-06, 05:03 AM
Generally, APIs exist not to invent new graphics, but to fake, to some small degree in real time, the ones that already exist in the professional market. If we had gaming PCs now with 50GHz CPUs and 3GB of RAM as standard, Nvidia, ATI, and Microsoft's DirectX would go out of business, because the graphics could be done entirely on the CPU, and with much higher quality, using more advanced rendering techniques like radiosity for lighting. Just look at the PS2: it doesn't support DirectX 7, 8, or 9, and its graphics are comparable to the Xbox 1 in many games.

This paragraph is so wrong in so many aspects that I will not comment on them all.

Your logic is flawed, because if we had 50GHz CPUs now, we would also have 30GHz GPUs at the same time. Those 30GHz GPUs would be able to output better graphics than two dual-core 50GHz CPUs are ever likely to render in real time.

Software and hardware progress work hand in hand. You cannot say that if current CPUs were 5 times faster you would no longer need GPUs, because in that case software development would also be 5 times more complex and graphics 5 times more advanced than they are now.

Intel made that prediction mistake 8 years ago, and they noticed quite quickly how wrong they were at the time. CPUs and GPUs are, from now on, meant to coexist.

Your "PS2 doesn't support DX7,8,9" comment is as wrong as the rest. OpenGL, for example, is not DX9, but it can nevertheless produce the same graphics as DX9c if desired. The PS2 does not use DX as its 3D API, but that doesn't mean it cannot output graphics comparable to DX7 or DX8. I really don't understand your point there.

mullet
03-26-06, 07:33 AM
I'm not a huge DX fan at all, really; I really like OpenGL. A buddy of mine graduated from Full Sail and he said that a lot of the features in DX were done years ago in OpenGL; he started to explain, and after 3 minutes I was lost on what he was saying. My question is: how is Vista going to deal with OpenGL? I heard that M$ is trying to dump it.

$n][pErMan
03-26-06, 11:52 AM
My question is how is Vista going to deal with OpenGL? I heard that M$ is trying to dump it.
Ha... that would be utterly stupid on their part.

Nv40
03-26-06, 12:00 PM
This paragraph is so wrong in many aspects that I will not comment them all.

Your logic is flawed because if we had 50GHz CPUs now, we would also have 30GHz GPUs at the same time. These 30GHz GPUs would be able to output better graphics than 2 dual-core 50GHz CPUs ever are likely to render in real-time.



My CPU prediction goes in line with what some developers have been saying for some time, unless you think Tim Sweeney has no clue about computer graphics. Let me explain it again: CPUs aren't limited to what the API supports. There is no limit to what you can do there. It is that simple. GPUs exist to help; once you have CPUs fast enough to handle radiosity in real time, there will be no need for "graphics accelerators".


Intel made that prediction mistake 8 years ago and they notice quite fast, how wrong they were at that time. CPUs and GPUs are from now on always meant to coexist.


So if Intel made that prediction, that makes my argument a little more valid than your flawed arguments and nonsense.


Your "PS2 doesn't support DX7,8,9" comment is also as wrong as the rest, OpenGL for example is not DX9 but it nevertheless can produce the same graphics as DX9c if desired. The PS2 does not use DX as its 3D API but that doesn't mean it cannot output comparable graphics as DX7 or DX8 for example. I really don't understand your point there.

Whether DirectX is supported or not by the PS2 is irrelevant to this conversation. Again you show your ignorance and lack of logic and reasoning. What I'm saying is that the PS2 doesn't have DirectX 6, 7, or 8 level graphics hardware (it doesn't have an Nvidia or ATI GPU) like the Xbox 1 has, yet it makes comparable graphics using CPU power. Is that clearer to you? It shows that you don't need Microsoft or their API to make killer games with outstanding graphics within the possibilities of a system with very low hardware specs by today's standards.
My main point is that everything that is done by GPUs can be done by CPUs (and a lot more). If you have fast enough CPU power, you can use it for graphics too, not just for AI or physics. This will be the case for many PS3 games, which will be using Cell for graphics. Sony and IBM have demonstrated very impressive graphics demos done 100% entirely on the CPU. The only reason GPUs and CPUs go "hand in hand" is that neither one is fast enough to replace the other. But now, with multiprocessor, multicore, multi-threaded technology officially embraced by the next-generation console industry, the days of the GPU as the most important hardware for graphics are numbered. Once CPUs are fast enough (and the code is optimal), GPUs will not be needed anymore. Please stay off this topic and don't waste my time.

jolle
03-26-06, 12:04 PM
I wish they would just put statements like these in plain English :thumbdwn: Fewer cells???
Like alphawolf said, calls.
Basically the API talking to the driver via the CPU, I think. Fewer calls, less CPU utilization. The driver model, API, and hardware are supposed to be a little more self-sufficient and less dependent on the CPU, or some such.

Sweeney made some comment in relation to dual-core CPUs a while back, saying how API calls to the driver (or some such) could reach upwards of 50% CPU usage. My memory is a bit foggy on that one, though.

My CPU prediction goes in line with what some developers have been saying for some time, unless you think Tim Sweeney has no clue about computer graphics. Let me explain it again: CPUs aren't limited to what the API supports. There is no limit to what you can do there. It is that simple. GPUs exist to help; once you have CPUs fast enough to handle radiosity in real time, there will be no need for "graphics accelerators".
Yeah, but GPUs are still going to get there a LOT earlier than CPUs.
Graphics is handled in a very parallel fashion now; even a quad-core CPU couldn't come close to a 24-pipeline GPU. Modern GPUs can handle 16 pixels per cycle and have room to spend more than one cycle on them without slowing down.

As GPUs develop, the gap between software and hardware renderers has become so big that even Epic is considering skipping software rendering for UE3.0; dunno what the final word on that is, though. And GPUs will keep developing faster than CPUs, almost doubling in power every 6 months. Dual core and the upcoming quad cores are giving the CPU some upswing, but it's a lot harder to fill that new performance void than it is to keep the GPU fed, since graphics has been parallel for ages now.

It sounds more reasonable to me that you end up with a number of free CPU cores doing the radiosity solution, gathering all the light data and then shipping it off to the GPU for rendering, unless they create methods of doing radiosity on the GPU before the render. The passes would have to be pretty fast, though, since you need a new one whenever anything moves or changes, so it would be pretty much real-time, frame-by-frame calculation.
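The parallelism point can be put into back-of-envelope numbers. These figures are hypothetical guesses for the era, not benchmarks; the sketch only shows why many slow parallel pipelines beat a few fast serial cores at pixel work:

```python
# Back-of-envelope pixel throughput (all figures hypothetical). A GPU's
# parallel pipelines retire roughly a pixel per cycle each, while a
# general-purpose core spends many instructions shading one pixel in
# software -- which is why a quad-core CPU doesn't catch a 16-pipe GPU.

def pixels_per_second(clock_hz, parallel_units, cycles_per_pixel):
    return clock_hz * parallel_units / cycles_per_pixel

gpu = pixels_per_second(650e6, 16, 1)    # ~650MHz GPU, 16 pixel pipes
cpu = pixels_per_second(3.0e9, 4, 100)   # 3GHz quad core, ~100 cycles/pixel
ratio = gpu / cpu                        # GPU comes out far ahead
```

Even granting the CPU a big clock-speed advantage, the per-pixel instruction cost of software shading leaves it well behind, which matches the point about filling the performance void.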

mullet
03-26-06, 12:20 PM
[pErMan']Ha... that would be utterly stupid on their part.

http://www.opengl.org/discussion_boards/ubb/ultimatebb.php?ubb=get_topic;f=12;t=000001#000000