Old 07-11-08, 12:32 AM   #61
ChrisRay
Registered User
 
ChrisRay's Avatar
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default Re: The right price for the GTX200 family...

AA Z cycles are a function of the ROPs. They're related to bandwidth, but not entirely. Bandwidth helps push the maximum amount of fillrate/texel rate through the pipeline as fast as possible, and when you use fewer cycles you actually need less bandwidth to do the same task. For example, the GTX 280 has a huge texel fillrate for bilinear and AF, and to achieve that bilinear texel fillrate it needs bandwidth to feed the texture units. That's why, despite the 9800 GTX having superb texturing performance relative to the GTX 260, it can't match the GTX 260's performance: it lacks the bandwidth. Of course that's a pretty simplistic explanation and there are obviously other factors. Just having more bandwidth doesn't necessarily mean you're going to use it all; all kinds of things can affect how much of a bandwidth advantage or disadvantage actually matters.
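(A rough back-of-envelope sketch of the point above. The peak texel rates and memory bandwidths below are approximate published figures for the two cards; the calculation ignores everything else competing for bandwidth, so treat the output as illustrative only.)

[CODE]
# Illustrative sketch: bytes of memory bandwidth available per bilinear texel
# at each card's peak texel rate. Approximate published peak figures;
# color/Z writes, vertex fetch, etc. competing for the same bandwidth are ignored.

cards = {
    "9800 GTX": {"texel_rate_gt_s": 43.2, "bandwidth_gb_s": 70.4},
    "GTX 260":  {"texel_rate_gt_s": 36.9, "bandwidth_gb_s": 111.9},
}

for name, c in cards.items():
    bytes_per_texel = c["bandwidth_gb_s"] / c["texel_rate_gt_s"]
    print(f"{name}: {c['texel_rate_gt_s']:.1f} GTexels/s peak, "
          f"{c['bandwidth_gb_s']:.1f} GB/s -> ~{bytes_per_texel:.2f} bytes of "
          f"bandwidth per texel")
[/CODE]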

Running out of memory is completely unrelated to the two, though. The 8800 GTX had a large bandwidth advantage over the 9800 GTX, but that didn't always translate into better performance despite its superior pixel fillrate and bandwidth. Typically, when the 9800 GTX fell behind the 8800 GTX, it was actually memory capacity bottlenecking the card.

AA passes relate to how many times the AA data must cycle through the ROPs to reach the final rendered result. In the old days it went like this:

2x AA: 1 cycle
4x AA: 2 cycles
6x AA: 3 cycles (for ATI)

The GeForce 8800 GTX did both 2x and 4x in one cycle, so 2x and 4x AA performance were largely identical when transparency supersampling, memory bottlenecks, or Z-fill/bandwidth bottlenecks didn't come into play. Prior NVIDIA cards did 2x in one cycle and 4x in two. That's why the G80 was so good at running AA.

I'm not sure how many cycles ATI is doing with their 8xAA, but this could easily be tested with a fillrate tester that measures Z fillrate: the more the fillrate halves, the more cycles it takes. Over at Beyond3D I saw some tests of the HD 3850's Z fillrate, and it was taking twice as many cycles at 4xAA as the X1900 XT. That wasn't broken, just by design. Now I can't/won't confirm this with 100% certainty, but it seems obvious to me that ATI has improved the AA cycle routine of their ROPs.
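(A minimal sketch of the inference described above, assuming a synthetic Z-fillrate tester. The readings below are made up purely for illustration; only the ratio logic reflects the post.)

[CODE]
# Estimate ROP cycles per AA pass from how far the measured Z fillrate drops
# relative to the no-AA baseline. Readings are fictional, in GSamples/s.

def estimated_cycles(z_fill_no_aa, z_fill_with_aa):
    return round(z_fill_no_aa / z_fill_with_aa)

measurements = {"no AA": 36.0, "2xAA": 36.0, "4xAA": 35.5, "8xAA": 18.1}
baseline = measurements["no AA"]

for mode, z_fill in measurements.items():
    if mode != "no AA":
        print(f"{mode}: ~{estimated_cycles(baseline, z_fill)} ROP cycle(s) per pass")
[/CODE]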
__________________
|CPU: Intel I7 Lynnfield @ 3.0 Ghz|Mobo:Asus P7P55 WS Supercomputer |Memory:8 Gigs DDR3 1333|Video:Geforce GTX 295 Quad SLI|Monitor:Samsung Syncmaster 1680x1080 3D Vision / Olevia 27 Inch Widescreen HDTV 1920x1080

|CPU: AMD Phenom 9600 Black Edition @ 2.5 Ghz|Mobo:Asus M3n HT Deluxe Nforce 780A|Memory: 4 gigs DDR2 800| Video: Geforce GTX 280x2 SLI

Nzone
SLI Forum Administrator

NVIDIA User Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the members
ChrisRay is offline   Reply With Quote
Old 07-11-08, 12:33 AM   #62
cvearl
Radeon 9700 Pro
 
Join Date: Nov 2002
Posts: 475
Default Re: The right price for the GTX200 family...

Quote:
Originally Posted by ChrisRay View Post
I didn't use the timedemo, btw. This was taken from in-game. I really don't know how the cards perform in a timedemo, cvearl.

Chris

OK, I'll FRAPS it. Where would be good?

C.
cvearl is offline   Reply With Quote
Old 07-11-08, 12:34 AM   #63
cvearl
Radeon 9700 Pro
 
Join Date: Nov 2002
Posts: 475
Default Re: The right price for the GTX200 family...

Does anyone know the name of that utility that shows how much video RAM you are using?

C.
cvearl is offline   Reply With Quote
Old 07-11-08, 12:35 AM   #64
ChrisRay
Registered User
 
ChrisRay's Avatar
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default Re: The right price for the GTX200 family...

You can try if you like. I obviously wouldn't call the comparison an ideal testing environment due to the different testing conditions, and it's very unlikely you'll be able to repeat my benchmark exactly. The area I use is the landing site and up through the forest, cvearl.


As for your other question: Video Memory Watcher was what it was called, but it doesn't work in Windows Vista due to the way Vista manages memory. I used it a long time ago in my CSAA analysis to show how 16x CSAA uses virtually identical memory to 4xAA.
ChrisRay is offline   Reply With Quote
Old 07-11-08, 01:11 AM   #65
lopri
Registered User
 
Join Date: Apr 2006
Posts: 14
Default Re: The right price for the GTX200 family...

Thank you so much, Chris. It's a lot more than I expected, and you explained it simply enough for me to understand. Let me confirm this, with regard to G80 vs G92.

Quote:
Originally Posted by ChrisRay View Post
Running out of memory is completely unrelated to the two, though. The 8800 GTX had a large bandwidth advantage over the 9800 GTX, but that didn't always translate into better performance despite its superior pixel fillrate and bandwidth. Typically, when the 9800 GTX fell behind the 8800 GTX, it was actually memory capacity bottlenecking the card.
So, the reason G92 sometimes shows inconsistent performance vs G80 (seemingly memory-related) is the size of the frame buffer, rather than its bandwidth? Relatively speaking.

Thank you again.
lopri is offline   Reply With Quote
Old 07-11-08, 01:14 AM   #66
lopri
Registered User
 
Join Date: Apr 2006
Posts: 14
Default Re: The right price for the GTX200 family...

Then again, the 1 GB G92 doesn't seem to do any better than the 512 MB G92..?

Edit: Or is it bandwidth that more severely impacts G92, rather than the frame buffer? I understand that frame buffer size has nothing to do with bandwidth; I just want to know which one limits G92's performance more. In the quoted paragraph above, I couldn't separate bandwidth from frame buffer.
lopri is offline   Reply With Quote
Old 07-11-08, 01:33 AM   #67
ChrisRay
Registered User
 
ChrisRay's Avatar
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default Re: The right price for the GTX200 family...

Quote:
Originally Posted by lopri View Post
Thank you so much, Chris. It's a lot more than I expected, and you explained it simply enough for me to understand. Let me confirm this, with regard to G80 vs G92.



So, the reason G92 sometimes shows inconsistent performance vs G80 (seemingly memory-related) is the size of the frame buffer, rather than its bandwidth? Relatively speaking.

Thank you again.

"typically" yes. The G92 has better compression for high res AA. Specially with 4xAA and 16xCSAA. Color compression is purely a bandwith thing. So its more efficient with less. When you say 1 Gigabyte isnt doing any better. What tests and settings are you describing. If you dont have the memory it will be immediately noticable. If you do then both cards would perform the same.
ChrisRay is offline   Reply With Quote
Old 07-11-08, 04:14 AM   #68
lopri
Registered User
 
Join Date: Apr 2006
Posts: 14
Default Re: The right price for the GTX200 family...

Thank you much!
lopri is offline   Reply With Quote

Old 07-11-08, 07:07 AM   #69
MikeC
Administrator
 
MikeC's Avatar
 
Join Date: Jan 1997
Location: Virginia
Posts: 7,514
Default Re: The right price for the GTX200 family...

Quote:
Originally Posted by lopri View Post
I do understand what you're trying to say with regard to the absolute amount of frame buffer. But the problem with NV cards is that, for whatever reason, they seem to hit their own memory limit before they reach that absolute amount of memory required. Could you explain, for example, why I can play Oblivion @2560x1600/8xAA with a HD 4870, but the 8800 GTX has a hard time handling (plus stuttering) 2560x1600/4xAA even with 1.5 times the RAM?
It's possible that this type of situation could also be related to the differences in compression technology that ATI and NVIDIA employ. It would be interesting to conduct memory usage tests between the 4870 and GTX 280 using a program that reports video memory usage like VidMemWatch or RivaTuner.
MikeC is offline   Reply With Quote
Old 07-11-08, 08:37 AM   #70
Medion
 
Medion's Avatar
 
Join Date: Dec 2005
Location: Barksdale AFB, La
Posts: 1,238
Default Re: The right price for the GTX200 family...

Quote:
Originally Posted by Xion X2 View Post
Agreed. That's the point I've been trying to drive home with him the entire time (and I always mentioned the 4870 so he knew I wasn't just referencing NVIDIA cards). Different architectures, different memory management. The best control test would be the exact same card with more VRAM on one of them.

My guess is the benches will come out much like the 2900 XT's did when it was released. Not much of a difference, if any.
Reading his comments, he just doesn't "get it." A controlled test is the only way to truly highlight the issue. His tests simply are not valid to me. And he's doing the same crap he pulls at R3D, where if you don't share his opinions, he gets all testy and defensive.
Medion is offline   Reply With Quote
Old 07-11-08, 09:23 AM   #71
ChrisRay
Registered User
 
ChrisRay's Avatar
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default Re: The right price for the GTX200 family...

Yet it's not an opinion. It's math. It's clearly obvious. And you've done nothing but make snide comments and remarks at me and argue semantics with me. Heck, even NVIDIA says the problems I've encountered in these tests are memory-related. I wish I had the luxury of one-liner straw-man semantic arguments. The data is clearly in front of you, but instead of talking about the data in question, you attack the person presenting it because you have nothing else to go on. Good game, Medion. Good game. I'll just run around on forums and start making things up like "high bandwidth mitigates the need for memory footprint!" It seems to work for some people.


Quote:
Originally Posted by MikeC View Post
It's possible that this type of situation could also be related to the differences in compression technology that ATI and NVIDIA employ. It would be interesting to conduct memory usage tests between the 4870 and GTX 280 using a program that reports video memory usage like VidMemWatch or RivaTuner.
Sadly, these types of things are extremely hard to monitor under Windows Vista because of how Vista manages/uses memory (it eats up every drop and uses it all, even video memory, and only frees resources as they are requested). This is a test that simply must be run in Windows XP to fully explore something's memory usage.
ChrisRay is offline   Reply With Quote
Old 07-11-08, 09:29 AM   #72
saturnotaku
Apple user. Deal with it.
 
Join Date: Jul 2001
Location: The 'burbs, IL USA
Posts: 12,502
Default Re: The right price for the GTX200 family...

Quote:
Originally Posted by Medion View Post
Reading his comments, he just doesn't "get it." A controlled test is the only way to truly highlight the issue. His tests simply are not valid to me. And he's doing the same crap he pulls at R3D, where if you don't share his opinions, he gets all testy and defensive.
BAHAHAHAHA!!

CR has more knowledge of this stuff in his left testicle than 3/4 of the people on this forum (myself included).
saturnotaku is offline   Reply With Quote