Got a 6800GT and DX9.0c - Now what?


ekotan
07-14-04, 05:00 PM
Running a 3 GHz Prescott and an XFX 6800GT overclocked to Ultra speeds. All watercooled, all very stable. I've installed WinXP SP2 (RC2) and DXDIAG shows I have DX9.0c installed. I'm using NVIDIA's ForceWare 61.72 WHQL drivers. Performance is impressive (I've upgraded from an ATI 9700 Pro), stability is top notch, Nalu looks amazing, and all my games run well. I'm happy, but:

1) Are there any programs/demos I can use to check out Shader Model 3.0 functionality? Neither ShaderMark 2.1 nor the Far Cry 1.2 patch is available to the public yet, and PowerVR's demos don't run on the NV40 due to spec changes between when they were coded and when DX9.0c was finalized. Oh, and 3DMark03's Info page shows my card supporting only VS 2.0 and PS 2.0. Shouldn't it list at least VS 3.0 and PS 3.0 now that I have DX9.0c installed? I have build 340; is there a patch I need to apply? (There's a small caps-check sketch at the end of this post showing what I'd expect an app to query.)

2) Some older demos (notably NVIDIA's Cg Browser and the Meshuggah demo) say that my card does not support trilinear filtering and refuse to run! That has to be a driver bug, right? (The sketch below also prints the trilinear filter caps these demos presumably check.)

3) How useful is the auto-overclocking feature in the ForceWare drivers? Is it intelligent enough to adjust VRAM speeds as well as the GPU core? Should I use it, or just continue to overclock manually? I'd like to use it since it would help reduce heat (there's no need to run the GPU at overclocked speeds when it's only drawing my 2D desktop), but I don't know whether I should trust it; there's no info on the web.

4) How come there is no official driver on NVIDIA.com that supports the 6800 family? Cards are available to buy now, so what's the hold-up on an official driver from the mothership?

5) Most of my ATI tech demos run well on the NV40; even Ruby runs with a little hack (although it shows some artefacts). There are a few that won't run, though, giving errors like this:

D3DAw Error: AwCreateRenderableTexture - Unable to create color texture object. Error creating color buffer!

Can this be fixed with a future driver update? I thought the NV40 was supposed to have all the features of ATI's R3x0 series (and more).

6) Does anyone know of any 6800 Ultra cards that feature two dual-link DVI outputs? I know all Ultra cards have two DVI ports, but they tend to be single-link. Dual-link DVI is required to drive the new 30-inch Apple Cinema Display, and I want two of them for 60 inches of dual-head goodness. :kill:
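
For reference on questions 1 and 2, here's roughly the sort of caps query I'd expect a D3D9 app to make. It's only a minimal sketch; I don't actually know what ShaderMark, 3DMark03 or those old demos do internally:

// Minimal Direct3D 9 caps query (build as C++, link with d3d9.lib).
#include <d3d9.h>
#include <stdio.h>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { printf("Direct3D 9 not available\n"); return 1; }

    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
    { d3d->Release(); return 1; }

    // Shader versions as reported by the driver/runtime.
    printf("VS %u.%u, PS %u.%u\n",
           (unsigned)D3DSHADER_VERSION_MAJOR(caps.VertexShaderVersion),
           (unsigned)D3DSHADER_VERSION_MINOR(caps.VertexShaderVersion),
           (unsigned)D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
           (unsigned)D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));

    if (caps.VertexShaderVersion >= D3DVS_VERSION(3, 0) &&
        caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0))
        printf("Shader Model 3.0 exposed\n");

    // The trilinear check an old demo might be doing: linear min/mag
    // filtering plus linear filtering between mip levels.
    DWORD tf = caps.TextureFilterCaps;
    int trilinear = (tf & D3DPTFILTERCAPS_MINFLINEAR) &&
                    (tf & D3DPTFILTERCAPS_MAGFLINEAR) &&
                    (tf & D3DPTFILTERCAPS_MIPFLINEAR);
    printf("Trilinear filter caps: %s\n", trilinear ? "yes" : "no");

    d3d->Release();
    return 0;
}

If a test like this reports 3.0 shaders while 3DMark03 still says 2.0, I'd guess the problem is in its SysInfo module rather than in the driver, but that's only a guess.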

Your assistance would be appreciated,

E@

saturnotaku
07-14-04, 05:25 PM

1) I think there is a ShaderMark 3.0 demo available. Check the ForceWare driver forum; there was some discussion about it in there.

2) It might just be that the demos are too old, but it could also be a driver thing. Try turning Trilinear and Anisotropic Optimizations off in the driver control panel and see if that helps.

3) Auto-detect is useful for people who don't want to mess around, but you'll almost certainly get better results by clocking manually. The auto-detect result for my GT was 400/1.07, but manually I've reached 431/1120 - much better.

4) Drivers allegedly will be available early next week, according to sources that have posted in the drivers forum.

5) There's probably some non-standard DirectX coding in those demos. After all, it wouldn't look very good for ATI if NVIDIA cards could run their demos just as well, if not better, and the same would be true if the shoe were on the other foot. Don't count on any driver update being able to run demos from NVIDIA's direct competitor.

6) Probably not, though I'm sure you'll be able to buy an adapter to get that display running. You'd be one of only a handful of people shelling out that kind of money on that monitor, and it doesn't really make sense for an IHV to invest in a special board just to drive one specific display.

But now that you've brought it up, dual-link DVI is probably a big reason why Mac versions of video cards tend to cost much more than their Windows counterparts. </shrug>

ekotan
07-14-04, 07:24 PM
Thank you for the helpful response. I forgot to mention that I'd already tried turning Trilinear and Anisotropic Optimizations off in the driver control panel; it didn't work. Oh well, these are old demos, so it's no big deal. I was just curious.

So do you have any idea why 3DMark03 still refuses to show that my card supports VS 3.0 and PS 3.0, even with DX9.0c installed? Do they need to release a patch to their SysInfo module to correct this, perhaps?

As for the ATI demos, I don't think there is any non-standard DirectX coding in them. They check for hardware and driver caps, not for a specific GPU brand, so my NV40 can now run most of the ATI demos, including ATI Chimp, Rachel, Nature, Car Paint and Animusic (my favourite tech demo ever), at great frame rates. My old GeForce FX couldn't run them simply because it didn't expose the necessary caps, such as multiple render targets; now that the NV40 family supports those features, these demos work great.

This is in direct contrast to NVIDIA's tech demos, which often use OpenGL (where it's easier to access the new and advanced features of a new card) and check for NV-specific GL extensions. That's why you can't run even an old demo like NV Squid or Wolfman on a modern ATI card like the X800 series, even though the hardware is certainly more than capable of doing so (those demos were written for the GF4 Ti chips, so they ought to work on an X800!).
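
To illustrate what I mean by checking caps rather than brands, this is roughly the kind of query a vendor-neutral D3D9 demo can make. Again, just a sketch; which render-target format the failing ATI demos actually ask for is my guess (a floating-point format would be a plausible culprit for that AwCreateRenderableTexture error):

// Caps-based feature checks, no vendor ID involved (C++, link with d3d9.lib).
#include <d3d9.h>
#include <stdio.h>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // Multiple render targets: my old GF FX reported 1 here,
    // while the NV40 and R3x0 report 4.
    printf("Simultaneous render targets: %lu\n", caps.NumSimultaneousRTs);

    // "Can I create a renderable texture in this format?" - the kind of
    // check that fails before a demo gives up on creating its colour buffer.
    HRESULT hr = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                        D3DFMT_X8R8G8B8,         // display format
                                        D3DUSAGE_RENDERTARGET,
                                        D3DRTYPE_TEXTURE,
                                        D3DFMT_A16B16G16R16F);   // FP16 surface (my guess)
    printf("FP16 render-target textures: %s\n",
           SUCCEEDED(hr) ? "supported" : "not supported");

    d3d->Release();
    return 0;
}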

Dual-link DVI is required not because it's an Apple monitor but because it runs at an insanely high resolution (2560x1600). That's over 4 million pixels on a single display, and I want two of them for a dual-head config (you can never have too much screen real estate when video editing). The bandwidth of a single-link DVI port is simply not enough to push that many pixels. Since the trend in TFT monitors is towards even bigger panels with higher resolutions, I think it's in every IHV's interest to make sure their flagship cards can drive these displays.
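
Rough numbers, as I understand them (the ~10% blanking overhead is only an estimate):

// Back-of-the-envelope check: can a single-link DVI port (165 MHz max
// TMDS pixel clock) drive 2560x1600 at 60 Hz?
#include <stdio.h>

int main()
{
    const double active        = 2560.0 * 1600.0;  // visible pixels per frame
    const double refresh       = 60.0;             // Hz
    const double blanking      = 1.10;             // ~10% overhead with reduced blanking (estimate)
    const double singleLinkMHz = 165.0;            // DVI single-link limit

    double requiredMHz = active * refresh * blanking / 1e6;  // ~270 MHz
    printf("Required pixel clock: ~%.0f MHz\n", requiredMHz);
    printf("Single link (165 MHz): %s\n",
           requiredMHz <= singleLinkMHz ? "enough" : "not enough");
    printf("Dual link (330 MHz): %s\n",
           requiredMHz <= 2 * singleLinkMHz ? "enough" : "not enough");
    return 0;
}

So a single 165 MHz link falls well short, while two links together cover it.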

And I hadn't noticed that Mac graphics cards cost that much more, at least not the high-end ones. I've just ordered the Mac version of the GF 6800 Ultra with two dual-link DVI ports for my Power Mac G5; it was $599, which is similar to the PC version. I only use my PC for gaming now and prefer doing all my actual work on the Mac. Gone are the days when Macs were slower and incompatible with Windows: my dual 2 GHz Power Mac G5 is noticeably faster than my ninja PC, and Mac OS X is intentionally based on cross-platform standards, so sharing my work between platforms is a no-brainer. I even prefer the Mac version of Office to the Windows version (cleaner UI). And I love my aluminium PowerBook - it uses a Mobility Radeon 9700 GPU and runs Halo at playable frame rates, which impressed me on such a slim and quiet notebook.

Regards,

E@