Old 07-11-08, 02:49 PM   #86
ChrisRay
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default Re: The right price for the GTX200 family...

I apologize for the double post, but I wanted to show something. This thread made me put the 9800GTX+ SLI cards back in. Forgive the JPEG compression, but I didn't think it mattered much in this case.

Unreal Tournament 3

16xAA/16xAF @ 1920x1080P ((4xAA memory footprint))

16xQAA/16xAF ((8xAA Memory footprint))

I know they are not 100% exact. But try moving around at 2 FPS; you want to hurt someone by the time you reach the map point. This is a perfect illustration of what commonly happens in NWN 2, UT3, and FEAR at 16xQ/8xMS on the 9800GTX, 9800GX2, and 8800GT cards. I'll be back later to post a single GTX 280 or a single GTX 260 at the same settings.
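For scale, the stored-sample difference behind those two screenshots can be sketched with a quick back-of-the-envelope calculation. The numbers and simplifications here are my own, not from the post: 16x CSAA stores 4 color/depth samples per pixel (hence the "4xAA memory footprint"), 16xQ stores 8, and I'm assuming 32-bit color plus 32-bit depth/stencil per sample while ignoring the smaller coverage-sample and resolve buffers.

```python
# Rough estimate of the VRAM a multisampled render target needs.
# Assumptions (mine, not from the post): RGBA8 color (4 bytes) plus a
# 32-bit depth/stencil sample (4 bytes) per stored sample, and CSAA's
# extra coverage samples treated as negligible.

def msaa_target_mb(width, height, stored_samples,
                   bytes_color=4, bytes_depth=4):
    pixels = width * height
    return pixels * stored_samples * (bytes_color + bytes_depth) / (1024 ** 2)

for label, samples in [("16xAA  (4 stored samples)", 4),
                       ("16xQAA (8 stored samples)", 8)]:
    print(f"{label}: ~{msaa_target_mb(1920, 1080, samples):.0f} MB at 1920x1080")
# 16xAA  (4 stored samples): ~63 MB at 1920x1080
# 16xQAA (8 stored samples): ~127 MB at 1920x1080
```

Roughly 60 MB of extra render-target storage sounds small, but on a 512MB card it comes straight out of the budget that textures and other buffers are already fighting over, which is consistent with the cliff shown above.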



Quote: Originally Posted by Cvearl
I know you have forgotten more about video cards than I will ever know. I am asking the following with complete respect, and I hope you can explain.

I totally get what you are saying. But in real-world situations, where 1GB looks like a good insurance policy against hitting bad 512MB performance, I have some questions.

In the following review you have a 768MB 8800GTX up against a 512MB 9800GTX. This is only one example of such reviews, but as you look at all of the heaviest tests, the 512MB card is doing better than the 768MB card. I know there are clock differences and other factors, but in one case where my brain THOUGHT the 768MB card would win, it did not.

COD4 2560x1600 4xAA Obviously maxed out.

The 512MB 9800GTX gets 40.7 fps
The 768MB 8800GTX gets 38.7 fps

Then I thought: frig that! 768MB has GOTTA beat 512MB somewhere. I know — Oblivion!

2560x1600 4xAA, 16xAF Maxed

The 512MB 9800GTX gets 24 fps
The 768MB 8800GTX gets 20 fps


In all other tests, regardless of settings and resolution, the 9800GTX appears to match or beat the 8800GTX.

Can you help me understand why? And at what point will the extra 256MB change this result?

And I am just looking for an honest answer here. Will a person notice more microstutter in that COD4 example even though the framerate is about the same?

I am off in search of situations where the 8800GTX beats the 9800GTX...

Thanks Chris.
Cvearl, just guy to guy: I do appreciate your respect, and I don't want you to think otherwise. Let's just be guys on the forum here and forget the other nonsense. One thing should be noted, and I'm fully aware of it: I'm extremely biased toward multi-GPU. I admit it right off, so when I look at anything, I take a multi-GPU perspective on it. That may not be immediately obvious to those here who don't know me as well.

I am very glad that you asked that question. The Oblivion results don't surprise me; in my experience with the title, it is not as memory-bound as people think. If you turned on 8x multisampling, though, things would look very different, although probably not at 1080P, where I can use 16xQ fine on the 9800GTX and GTX 260/280 cards. The same goes for the texture mod pack, which uses 4096x4096 texture maps. I haven't done much work with COD4, so I am a little uncertain there. Look above at my UT3 comparison of the 9800GTX running 16xQ, which stores color data at 8x multisample density. The 9800GTX should and will always be faster when it doesn't run out of memory.
|CPU: Intel I7 Lynnfield @ 3.0 Ghz|Mobo: Asus P7P55 WS Supercomputer|Memory: 8 Gigs DDR3 1333|Video: Geforce GTX 295 Quad SLI|Monitor: Samsung Syncmaster 1680x1080 3D Vision / Olevia 27 Inch Widescreen HDTV 1920x1080

|CPU: AMD Phenom 9600 Black Edition @ 2.5 Ghz|Mobo:Asus M3n HT Deluxe Nforce 780A|Memory: 4 gigs DDR2 800| Video: Geforce GTX 280x2 SLI

SLI Forum Administrator

NVIDIA User Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the members.