View Full Version : GeForce 6800 First Look


06-11-04, 05:12 PM
:thumbsup: GeForce 6800 First Look

NVIDIA sent nV News a reference GeForce 6800 with 128MB of memory. The GeForce 6800 will have a suggested retail price of $299.


NVIDIA also released a version 61.34 WHQL candidate driver.


The GeForce 6800 contains 12 pixel pipelines and operates at a core frequency of 325MHz and an effective memory frequency of 700MHz. This particular sample is reporting a 335MHz core :)


Like the 6800 Ultra and 6800 GT models, which have 16 pixel pipelines, the 6800 supports Shader Model 3.0 (Vertex Shader 3.0 and Pixel Shader 3.0).

Brian Burke of NVIDIA mentioned that a new Far Cry update with a Shader Model 3.0 path is expected soon. The new path will demonstrate the performance benefits of Shader Model 3.0 over 2.0.

Brian also furnished reviewers with a new ForceWare 61.34 WHQL candidate driver and an updated GeForce 6 Series reviewers guide. Texture filtering optimizations have been a hot topic of late and I'm glad to report that NVIDIA is giving the user total control over texture filtering options in the 61.34 driver. In addition to trilinear filtering optimizations, NVIDIA also added optimizations for anisotropic filtering.

The following section contains excerpts from the reviewers guide in regards to texture filtering:

Image Settings

With the launch of the NVIDIA ForceWare Release 60 graphics driver, NVIDIA has modified the Performance & Quality control panel to more accurately represent the driver settings. Users now have full control over image quality, trilinear optimizations, and anisotropic optimizations. NVIDIA now offers High Performance, Performance, Quality, and High Quality modes for Image Settings.

- The High Performance mode offers users the highest frame rate possible.

- Performance mode offers users an optimal blend of image quality and performance.

- Quality mode offers users the highest image quality while still delivering exceptional performance.

- High Quality mode is designed to give discriminating users images that do not take advantage of the programmable nature of the texture filtering hardware, and is overkill for everyday gaming.


Trilinear and Anisotropic Optimizations

NVIDIA implements intelligent algorithms for trilinear and anisotropic optimizations. These optimizations are enabled by default. NVIDIA’s anisotropic optimization enables the NVIDIA display driver to take advantage of its programmability to substitute point-mipmap (bilinear) filtering for linear-mipmap (trilinear) filtering on some texture stages. The option the user specifies for "Image settings" determines which texture stages will be affected. The "Quality" image setting enables the use of point-mipmap filtering on all but the first texture stage.

NVIDIA’s trilinear optimization allows better texture filtering performance with no perceived loss of image quality. Users can view the areas of the image that are affected by the trilinear optimization by enabling textures that contain colored mipmap chains, which are used in typical diagnostic applications.

Note: Colored mipmaps are not the sole determinant of the quality of the filtering. Comparing colored mipmaps with the optimization on and off only gives you a good idea of where to look in the "in-game" image for areas where the optimization is applied (i.e. where image quality may change). If the optimization is working correctly, you should not be able to see any difference in the in-game screenshot whether the optimization is on or off. Viewing objects in motion gives a much better illustration of filtering quality, because trilinear filtering is designed to reduce the appearance of the bands caused by mipmap transitions, which are more noticeable when the image is in motion.

Turning trilinear optimization off disables trilinear optimizations and will result in the best image quality. NVIDIA understands that some users may want to disable these features, so we’ve provided controls to do this. To disable trilinear and anisotropic optimizations, click on the Show Advanced Settings checkbox and then set the value to Off for Trilinear Optimizations and Anisotropic Optimizations. For OpenGL applications, regardless of whether the Control Panel reports On or Off, the resulting behavior will be Off, because no anisotropic optimizations are implemented for OpenGL.
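As an aside, here is what that substitution means concretely. Full trilinear filtering blends bilinear samples from the two nearest mipmap levels, while the point-mipmap optimization samples only the single nearest level. A minimal sketch of the two paths (my own illustration, not NVIDIA's driver code; textures are plain 2D lists of grayscale floats):

```python
def bilinear_sample(mip, u, v):
    """Bilinear sample from one square mip level; u and v are in [0, 1]."""
    n = len(mip)
    x, y = u * (n - 1), v * (n - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, n - 1), min(y0 + 1, n - 1)
    fx, fy = x - x0, y - y0
    top = mip[y0][x0] * (1 - fx) + mip[y0][x1] * fx
    bot = mip[y1][x0] * (1 - fx) + mip[y1][x1] * fx
    return top * (1 - fy) + bot * fy

def trilinear_sample(mips, u, v, lod):
    """Full trilinear: blend bilinear samples from the two nearest mip levels."""
    lo = min(int(lod), len(mips) - 1)
    hi = min(lo + 1, len(mips) - 1)
    frac = lod - int(lod)
    return (1 - frac) * bilinear_sample(mips[lo], u, v) + \
           frac * bilinear_sample(mips[hi], u, v)

def point_mipmap_sample(mips, u, v, lod):
    """The optimization: bilinear from the single nearest mip level only
    (cheaper, but can show banding at mip transitions)."""
    level = min(round(lod), len(mips) - 1)
    return bilinear_sample(mips[level], u, v)
```

At exact mip levels the two paths agree; mid-way between levels they diverge, which is exactly where colored mipmap chains make the optimization visible.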

Anisotropic Filtering

NVIDIA's Quality, Performance, and High Performance modes feature adaptive texture filtering, a technology that takes advantage of the adaptive/programmable hardware to make more intelligent choices about bandwidth usage and allow the hardware to work more efficiently without making quality tradeoffs. Selecting High Quality mode will disable adaptive texture filtering.

NVIDIA’s High Quality mode gives true anisotropic filtering without leveraging the adaptive and programmable nature of the texture filtering hardware. There is virtually no perceivable image quality difference between High Quality and Quality modes; side-by-side image inspections will find them virtually indistinguishable.
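For context on what "adaptive" means here: anisotropic filtering takes extra texture probes along the long axis of a pixel's footprint in texture space, and an adaptive scheme varies the probe count per pixel instead of always using the maximum. A textbook-style sketch of how a probe count could be chosen from the footprint's screen-space derivatives (real hardware logic is proprietary; this is an assumption for illustration only):

```python
import math

def aniso_samples(dudx, dvdx, dudy, dvdy, max_aniso=8):
    """Pick a per-pixel anisotropic probe count from the ratio of the major
    to minor axis of the pixel's projected footprint in texture space,
    clamped to the user's maximum anisotropy level."""
    major = math.hypot(dudx, dvdx)  # footprint extent along screen x
    minor = math.hypot(dudy, dvdy)  # footprint extent along screen y
    if minor > major:
        major, minor = minor, major
    if minor == 0:
        return max_aniso
    ratio = major / minor
    return max(1, min(max_aniso, round(ratio)))
```

A head-on surface (ratio near 1) gets a single probe, while a steeply angled floor gets up to the maximum; that per-pixel variation is where the bandwidth savings come from.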


The goal of my first series of tests was to measure the impact of the various texture filtering optimizations using 3DMark03. The table below contains results from each of the game tests in 3DMark03 and the overall 3DMark score. The first column, "TRI Opt", indicates whether trilinear optimizations were on or off, while the "AF Opt" column does the same for anisotropic optimizations. The driver control panel was set to application-controlled antialiasing and anisotropic texture filtering.

The texture filtering headings "Bilinear", "Optimal", "Trilinear" and "4X Anisotropic" are the results from the corresponding texture filtering option in 3DMark03. The default setting is "Optimal". The professional version of 3DMark03 allows the remaining texture filtering options to be configured.


And the results, which are impressive:


Notice the odd result in Game 1 (211.2) under the 4X anisotropic tests where I turned trilinear optimizations off and left anisotropic optimizations on.

I'll be adding other benchmark results and image quality comparisons to the post during the weekend.

Halo Performance


AquaMark3 Performance


Update: Enemy Territory Performance

FRAPS was used to determine minimum and average frame rates while playing back a demo at normal game speed from a 6 vs. 6 player clan match that took place on the Railgun map. Performance was based on the first three minutes of the demo.

Note that anisotropic optimizations are not implemented for OpenGL.






Results from a GeForce 6800 Ultra and GeForce FX 5950 at 1600x1200 with 4X AA and 8X AF:

- GeForce 6800 Ultra - Avg: 95, Min: 42

- GeForce FX 5950 Ultra - Avg: 58, Min: 24
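For anyone wanting to reproduce min/avg numbers from their own demos, the arithmetic is simple once you have per-frame timestamps. A small sketch (taking "minimum" as the reciprocal of the single longest frame time is one common convention, not necessarily the exact method FRAPS uses):

```python
def fps_stats(timestamps_ms):
    """Average fps over the whole run, plus an instantaneous minimum fps
    computed from the single longest frame time in the run."""
    deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    avg_fps = 1000.0 * len(deltas) / (timestamps_ms[-1] - timestamps_ms[0])
    min_fps = 1000.0 / max(deltas)
    return avg_fps, min_fps

# Example: four frames at 10ms apiece, then one 50ms hitch
avg, low = fps_stats([0, 10, 20, 30, 80])  # avg 50.0 fps, min 20.0 fps
```

Averaging over a fixed window (three minutes here) rather than the whole demo is just a matter of slicing the timestamp list first.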



Dungeon Siege Gameplay

Gameplay instructions can be found in this article and are based on a GeForce 6800 Ultra:


Although 1600x1200 is not officially supported, there was a glitch using 4X AA and 8X AF:


Notice the sword cursor, which also contains a rectangular texture. Both objects move together. The mouse is used to control player movement, but the player moves in the opposite of the chosen direction. The cursor doesn't trigger an action when clicking a menu item.

Fillrate Benchmark


Quality Image Setting - Trilinear and Anisotropic Optimizations Off


Note the low score in the customized pixel shader test compared to the 6800 Ultra results, which were ~3000 M-Pixel/s in this thread:


Here is the custom pixel shader code:


06-11-04, 05:14 PM
Awesome Mike. Thanks for the heads up.

06-11-04, 05:31 PM
Yep, thanks for the info. It's nice to see an optimization and option guide as well.

06-11-04, 06:07 PM
gj mike...

still one hell of a massive card though :eek:

hope they can bring that baby out on a smaller pcb...

06-11-04, 06:08 PM
Great numbers. :)

06-11-04, 06:09 PM
Thanks for the pics and info.

06-11-04, 06:16 PM
The only thing I can complain about is the huge size of the card :o

06-11-04, 06:33 PM
The card looks like it's about the size of a 5700U or GF4 Ti 4600. Nice numbers; even at 335MHz it beats a 5950U!

06-11-04, 06:33 PM
it looks about the same size as the ti 4600 or fx5900

06-11-04, 06:37 PM
That's about the best post I've read on this board (although I haven't been here very long).

06-11-04, 06:38 PM
BTW these benches look much more like I expected from a 6800 nu compared to this


06-11-04, 06:40 PM
Bah, I could care less about the size... me likey :drooling:

This is the price point I'm gonna target... I just don't have the flow for ultra-high end, and judging by the numbers at stock speeds on early rev boards, this positively seems like a worthy kick-in the pants upgrade even from a 5900XT owner.

I will own one of these soon enough somehow...

06-11-04, 08:11 PM
MikeC, are you sure that you have flashed to the updated BIOS? I have heard that samples will be 8 pipelines and 335MHz before the flash, and 325MHz and 12 pipelines after the flash.

06-11-04, 08:15 PM
What more can I say? MikeC's 6800NU first look ROCKS HARD!!!

"The way it's meant to be filtered" :)

06-11-04, 08:15 PM
jimmy, this is a reference card from NV, so probably no chance of a BIOS screw-up. Look at the 3DMark scores compared to a GeForce 6800 Ultra. All 3 quads of the GPU are enabled, all right.

06-11-04, 08:17 PM
A guy at B3D who has a 6800NU initially had a 335MHz core clock and 8 pipelines, and after the flash it became a 325MHz core.

Wouldn't a fillrate tester be a good way to verify this?

06-11-04, 08:21 PM
Jimmy wait till he gets a chance to test it some more. I'm sure all questions will be answered in time. But give the guy a break ;p

MikeC is pretty busy, so he probably hasn't had a huge opportunity to test the fillrate results etc.

06-11-04, 08:26 PM
I'm just pointing it out, because the BIOS issue has come up for at least two other people at B3D. Especially considering NV's recent stance on sending cards out to reviewers with different core and mem clocks. ;)

06-11-04, 08:31 PM
weren't the cards from those folks in b3d's forums from nv's AIB partners?? I was under the impression they were. MikeC has a reference card from NVIDIA!

06-11-04, 08:39 PM
Great work Mike . time for some benchies :)

06-11-04, 08:41 PM
SH64: patience grasshopper, Mike is working on it and maybe a O/C report ;)

06-11-04, 08:42 PM
Wouldn't a fillrate tester be a good way to verify this?

3DMark03 Fillrate Test (Default Settings):

Single-Texturing = 2390.8 MTexels/s
Multi-Texturing = 3906.5 MTexels/s

06-11-04, 08:46 PM
single texturing is a bit low??

06-11-04, 08:52 PM
Great work Mike . time for some benchies :)

Thanks. I just added results from Halo.

06-11-04, 08:54 PM
Yes, by the looks of it. The single-texturing result is extremely bandwidth limited.

335 x 8 = 2680 MTexels/s theoretical, minus bandwidth losses

335 x 12 = 4020 MTexels/s theoretical, minus bandwidth losses

But the multi-texturing number looks in line. Guess the card just appears to be very bandwidth limited.
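That back-of-the-envelope math actually settles the 8-vs-12 pipeline question on its own: theoretical fillrate is core clock times pipelines, and the measured multi-texturing number already exceeds anything 8 pipes could deliver. A quick sketch using the figures quoted in this thread:

```python
def theoretical_fillrate(core_mhz, pipelines):
    """Theoretical peak fillrate in MTexels/s: one texel per pipeline per clock."""
    return core_mhz * pipelines

# Figures quoted earlier in the thread
eight_pipe = theoretical_fillrate(335, 8)     # 2680 MTexels/s
twelve_pipe = theoretical_fillrate(335, 12)   # 4020 MTexels/s

measured_multi = 3906.5  # 3DMark03 multi-texturing result posted above

# 3906.5 > 2680, so the card cannot be running 8 pipes; the ~3% shortfall
# from 4020 is consistent with a 12-pipe card under a mild bandwidth limit.
efficiency = measured_multi / twelve_pipe
```

The single-texturing number is much further below its theoretical peak because with one texture per pass the card issues far more pixels per clock and runs out of memory bandwidth first.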