Go Back   nV News Forums > Graphics Card Forums > NVIDIA GeForce 200 Series

Old 07-10-08, 03:29 PM   #25
ChrisRay
Registered User
 
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default Re: The right price for the GTX200 family...

Quote:
The question, though, is if this "512 meg" ceiling is overadvertised by Nvidia and its promoters in order to redirect consumers toward the GTX line.
This "ceiling" is not an advertisement, and it's very real. AA plus resolution eats framebuffer memory, lots of it, especially beyond 4xAA. And games like Crysis, with heavy use of uncompressed normal map textures, will eat up memory like there's no tomorrow. The 512MB problem existed long before the HD4000 series, and it could be seen on 9800GTX and 8800GT cards compared to 8800GTX cards in specific cases.

As time has moved on, these cases have increased. Memory amount can be a hard bottleneck, especially once it's exceeded: when you run past the framebuffer, your system memory starts acting as framebuffer. How ATI/Nvidia allocate that system memory may well differ, but the point is you can't get past the Z-storage of AA data, and it's expensive.

Just some framebuffer numbers to chew on.

4xAA/16xCSAA @ 2560x1600: 250 MB
4xAA/16xCSAA @ 1920x1200: 140 MB
8xAA/16xQ CSAA @ 2560x1600: 370 MB
8xAA/16xQ CSAA @ 1920x1200: 280 MB

That's just memory dedicated to Z-storage alone. It does not include HDR, which also increases storage, and it doesn't include textures or uncompressed (or compressed) normal maps either. It's a pretty inescapable fact that once you run out of onboard memory, you start touching system memory, and I'd rather not do that in any event. Using system memory to store framebuffer data is one of the largest causes of micro-stuttering there is.
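As a rough back-of-the-envelope check on numbers like these, a multisampled color+Z surface costs roughly pixels × samples × bytes-per-sample. The sketch below assumes 4 bytes of color and 4 bytes of depth/stencil per stored sample; real drivers add coverage samples, compression, and extra surfaces, so it will undershoot the vendor figures quoted above.

```python
# Rough multisample framebuffer estimator (back-of-the-envelope only).
# Assumes 4 bytes color + 4 bytes depth/stencil per stored sample; real
# drivers add CSAA coverage data, compression, and extra surfaces, so
# actual usage (like the numbers quoted above) will be higher.

def msaa_buffer_mb(width, height, samples, bytes_per_sample=8):
    """Approximate memory for one multisampled color+Z surface, in MB."""
    return width * height * samples * bytes_per_sample / (1024 ** 2)

if __name__ == "__main__":
    for w, h in [(1920, 1200), (2560, 1600)]:
        for s in (4, 8):
            print(f"{s}xAA @ {w}x{h}: ~{msaa_buffer_mb(w, h, s):.0f} MB")
```

Even this lower-bound estimate lands in the same ballpark (about 125 MB for 4xAA at 2560x1600, about 141 MB for 8xAA at 1920x1200), which is why AA at high resolution chews through a 512MB card so quickly.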

I just finished testing Crysis on 9800GTX+ SLI cards, and they can't even produce 15 FPS @ 1680x1050 with 16xCSAA and Very High settings due to running out of memory.
__________________
|CPU: Intel I7 Lynnfield @ 3.0 Ghz|Mobo:Asus P7P55 WS Supercomputer |Memory:8 Gigs DDR3 1333|Video:Geforce GTX 295 Quad SLI|Monitor:Samsung Syncmaster 1680x1080 3D Vision\/Olevia 27 Inch Widescreen HDTV 1920x1080

|CPU: AMD Phenom 9600 Black Edition @ 2.5 Ghz|Mobo:Asus M3n HT Deluxe Nforce 780A|Memory: 4 gigs DDR2 800| Video: Geforce GTX 280x2 SLI

Nzone
SLI Forum Administrator

NVIDIA User Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the members
Old 07-10-08, 03:42 PM   #26
cvearl
Radeon 9700 Pro
 
Join Date: Nov 2002
Posts: 475
Default Re: The right price for the GTX200 family...

Quote:
Originally Posted by particleman View Post
As the owner of both a 280GTX and two 4870s in Crossfire, I disagree. The 280GTX is worth as much as two 4870s. With the 280GTX you get the fastest single-chip solution available, meaning you avoid all the SLI/Crossfire issues. Also, two 4870s generate an incredible amount of heat and require an even bigger power supply than a 280GTX does. I thought my 280GTX ran hot when I first bought it, but it is nothing compared to two 4870s. The two 4870s idle at 80 degrees all the time!

4870's in Crossfire has too many issues... I have had to totally rework the airflow in my case, it warms my entire room, it eats up a crazy amount of electricity, and doesn't work properly with some games.

I am much more satisfied with the $650 I spent on my 280GTX than the $600 I spent on my 2 4870s (the total cost is even more considering I had issues with my 975x board in Crossfire and had to get a x48 board).

With the 280GTX you just play what you want to play (assuming you don't get one with overheating issues). With 4870 Crossfire, you spend more time working on getting stuff working than playing.
Ya. Um. I never said Crossfire or SLI was a good value, and this does not change my mind. I am talking about a single GTX280 versus a single GTX260/4870. Everything I see says the GTX280 needs to be $399 based on the current price of the GTX260 (which is at parity with its equal, the 4870).

$449 MAX price for the GTX280 IMO.

Obviously nVidia agrees that the GTX260 had to be $299 in light of the 4870's performance numbers. eViaGrA is already there, as they are often first to the starting line with new prices.

C.
Old 07-10-08, 03:42 PM   #27
Mr Bigman
Strongest Man On Earth
 
 
Join Date: Nov 2006
Location: Chicago IL
Posts: 4,966
Default Re: The right price for the GTX200 family...

There are a few who don't have to pay those high prices, like Jakup, who gets them for free or for little, maybe taken out of his pay.

There are a few who work for the man and, with proper negotiations, can have stuff paid for by deductions from their weekly paychecks.

I worked for a recycler who got 22-inch CRTs and was selling them for $250 at the time, and I was getting one for $25 a week out of my pay. But back to the topic.

This is getting to be like it should be, and how it used to be.

The FX 5950 was $399 and so was the 9800 XT at the time, and they had lesser versions of those cards for much less.

Since the 6800 Ultra and even the 7800s, Nvidia got greedy, charging 600 bucks for something that's worth half that.

There's too much markup involved here, and even online prices are too high.

Go straight to the horse's mouth and get them at factory cost, probably a third less than Newegg.
__________________
EVGA X58 Tri SLI
Core I 7 3.7Ghz
12 gb corsair DDR3 1600
Three EVGA 285 SLI
Antec 1200
Corsiar 1000 HX Power Supply
2 OCZ SSD Hard Drives
4 wd black edition raid 2 setups for 4TB


_____________________________
having good looks and a good body is hard work.
Old 07-10-08, 03:48 PM   #28
cvearl
Radeon 9700 Pro
 
Join Date: Nov 2002
Posts: 475
Default Re: The right price for the GTX200 family...

Quote:
Originally Posted by ChrisRay View Post
The GTX 280 offers 1 gigabyte of memory, which is a very sound and future-proof investment. I am already seeing a lot of games suffering on 512MB cards. Both the GTX 260 and 280 have a lot of legroom in regards to memory.

I just don't see the 280 going for 400 dollars with 1 gig of memory under its belt. That's the kind of thing you pay premiums for. Personally, the card that continues to impress me is the 260. It basically has that and a bag of chips going for it at its current price range. It simply confounds me that people haven't seen the value of an 896MB, 448-bit card for 299 dollars, especially when games like Crysis are already exceeding 512MB of memory at 1600x1200 with VH and AA enabled.

Chris
Crysis, though, is an example of an engine where, when you DO go to a level that needs more than a 512MB framebuffer, the GPU of a GTX260 can't deliver playable framerates.

I guess someone here who has a GTX260 can bench Crysis? Please bench 1600x1200 Very High DX10 in Crysis with 4xAA 16xAF and let us know the result. I can run the same test on my 4870 to throw that in. I can't overclock my C2D E8400, though, so no crazy-clocked rigs please.

I am not challenging anyone; I am just curious. I know my 4870 will tank. I'm curious how the extra memory on the GTX260 will affect the outcome.

I can and will get a 1GB card next year, I'm sure. Not hard to do when I only paid $289 for this one.

As GPUs reach the point where they CAN deliver 1GB worth of rendering at 40 fps or higher, 1GB will obviously be a necessity.

Also, discounting the 4870 as ONLY having 512MB of RAM, at the performance levels it has demonstrated, would make all 8800 and 9800 series cards junk and unplayable by comparison? I don't think so. I would have no problem recommending the $200 9800GTX 512MB to anyone. Powerful card for sure.

But I think there is one area where I agree with you. IF you are a 1920x1200 gamer and like AA and AF with maxed-out settings in the games to come throughout 2009? You'd better buy all the card you can afford. I would not expect today's $300 cards to do well in that arena, GTX260 or 4870, regardless of RAM. I really think those two cards are for the 22-inch-and-below sector (1680x1050 or 1600x1200 and lower).

24" LCDs for new-gen engines? Better get a GTX280 and beyond as the months go by, unless they make Warhead, Clear Sky, and Alan Wake really, really efficient somehow.

24" LCDs and bigger have ALWAYS needed a $500+ video card behind them for current gaming.

C.
Old 07-10-08, 03:53 PM   #29
ChrisRay
Registered User
 
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default Re: The right price for the GTX200 family...

Quote:
Originally Posted by cvearl View Post
Crysis, though, is an example of an engine where, when you DO go to a level that needs more than a 512MB framebuffer, the GPU of a GTX260 can't deliver playable framerates.

I guess someone here who has a GTX260 can bench Crysis? Please bench 1600x1200 Very High DX10 in Crysis with 2xAA 16xAF and let us know the result. I can run the same test on my 4870 to throw that in. I can't overclock my C2D E8400, though, so no crazy-clocked rigs please.

I can and will get a 1GB card next year, I'm sure. As GPUs reach the point where they CAN deliver 1GB worth of rendering at 40 fps or higher, 1GB will obviously be a necessity.

C.
A single GPU might not be able to, but a multi-GPU setup can. Once you start looking at multi-GPU, the situation becomes even more problematic, because you hamstring the hardware's potential with the limits of its memory.

These are benchmarks I took today while preparing my 9800GTX+ preview. This graph is actually part of that preview, but it's interesting because it specifically illustrates the problem.



P.S. Ignore the 16xAF. It doesn't work with Very High; it's just forced through the control panel.
Old 07-10-08, 04:02 PM   #30
Medion
 
 
Join Date: Dec 2005
Location: Barksdale AFB, La
Posts: 1,238
Default Re: The right price for the GTX200 family...

Quote:
16xCSAA @ 1920x1200 140 Megs
I game at 1680x1050, with 16xCSAA, so I'm content with my memory usage. Also, I don't care for Crysis. Since I don't plan to upgrade my monitor anytime soon, I can safely say that I can stay in the middle class as far as VRAM goes, and remain quite content.

I do agree that, for high-end gaming, 1GB cards offer a tangible benefit in some cases. But I think Xion X2 offered an excellent example of why memory bandwidth can sometimes be as important as, or more important than, the total amount of memory.
Old 07-10-08, 04:09 PM   #31
ChrisRay
Registered User
 
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default Re: The right price for the GTX200 family...

Bandwidth is important as well, but so is memory. I'm not suggesting someone go out and buy a GeForce 9600 GT with 1GB of memory, but there's a point where just multi-GPUing a bunch of 512MB cards becomes a bit silly, especially when their potential ends up hamstrung by the framebuffer allowed.

Bandwidth isn't the reason the HD4000 series does so well with 8xAA compared to Nvidia (percentage-wise); ATI does its Z passes in fewer cycles than Nvidia at 8xAA, much like when the G80 came out and Nvidia made 2xAA/4xAA use the same number of cycles. But that doesn't change the memory consumption.

Chris
Old 07-10-08, 04:13 PM   #32
cvearl
Radeon 9700 Pro
 
Join Date: Nov 2002
Posts: 475
Default Re: The right price for the GTX200 family...

OT: can someone please point me to that little utility you can run in the background that logs how much VRAM you just used?

C.

Old 07-10-08, 04:17 PM   #33
cvearl
Radeon 9700 Pro
 
Join Date: Nov 2002
Posts: 475
Default Re: The right price for the GTX200 family...

Quote:
Originally Posted by ChrisRay View Post
A single GPU might not be able to, but a multi-GPU setup can. Once you start looking at multi-GPU, the situation becomes even more problematic, because you hamstring the hardware's potential with the limits of its memory.

These are benchmarks I took today while preparing my 9800GTX+ preview. This graph is actually part of that preview, but it's interesting because it specifically illustrates the problem.



P.S. Ignore the 16xAF. It doesn't work with Very High; it's just forced through the control panel.
Thank you Chris. I will run the exact same test at home. I love comparisons regardless of the outcome. I like truth and facts.

I'm such a friggin nerd.

So on my 4870 it's edge-detect 4xAA, right?

Set everything to Very High in Crysis? That's it? Did you want 16xAF enabled as well, or is that moot, as you say?

C.
Old 07-10-08, 04:24 PM   #34
Xion X2
Registered User
 
 
Join Date: May 2006
Location: U.S.
Posts: 6,701
Default Re: The right price for the GTX200 family...

Quote:
Originally Posted by ChrisRay View Post
This "Ceiling" is not an advertisement. And its very real.
I'm not interested in playing the semantics game. I never said that it was an advertisement; I said that it was advertised. There's a difference.

I don't believe I ever said it wasn't "real," either.

Now that we have that out of the way...

Quote:
The 512 problem has been there "Long" before the HD4000 series ever existed. And could be seen on 9800GTX 8800GT cards in comparison to 8800GTX cards in specific cases.
And "long" before we ever had cards with over 115GB/s memory bandwidth at their disposal.

Quote:
As time has moved on, these cases have increased. Memory amount can be a hard bottleneck, especially once it's exceeded: when you run past the framebuffer, your system memory starts acting as framebuffer. How ATI/Nvidia allocate that system memory may well differ, but the point is you can't get past the Z-storage of AA data, and it's expensive.

Just some framebuffer numbers to chew on.

4xAA/16xCSAA @ 2560x1600: 250 MB
4xAA/16xCSAA @ 1920x1200: 140 MB
8xAA/16xQ CSAA @ 2560x1600: 370 MB
8xAA/16xQ CSAA @ 1920x1200: 280 MB

That's just memory dedicated to Z-storage alone. It does not include HDR, which also increases storage, and it doesn't include textures or uncompressed (or compressed) normal maps either. It's a pretty inescapable fact that once you run out of onboard memory, you start touching system memory, and I'd rather not do that in any event. Using system memory to store framebuffer data is one of the largest causes of micro-stuttering there is.

I just finished testing Crysis on 9800GTX+ SLI cards, and they can't even produce 15 FPS @ 1680x1050 with 16xCSAA and Very High settings due to running out of memory.
I'm aware of how it works, but thanks for explaining. In your example of the 9800GTX, I would say that you're overlooking memory bandwidth rates yet again. There's not a big difference there between the 8800 and 9800 series, but on the flip side of the coin you have the 4870, which has over a 45 GB/s advantage by comparison.
__________________

i7-2700k @ 5.0 GHz
Nvidia GeForce 570 2.5GB Tri-SLI
Asus P67 WS Revolution (Tri-SLI)
OCZ Vertex SSD x 4 (Raid 5)
G.Skill 8GB DDR3 @ 1600MHz
PC Power & Cooling 950W PSU
Old 07-10-08, 04:54 PM   #35
ChrisRay
Registered User
 
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default Re: The right price for the GTX200 family...

Memory bandwidth has nothing to do with it. One resolution lower, 9800GTX+ SLI produces more raw FPS than the GTX 280, and the 9800GTX is only 6 FPS behind the GTX 260.




Memory bandwidth does NOT account for running out of memory. Once you run out of memory, your onboard GPU's bandwidth becomes almost meaningless, as you are then limited by the PCIe interface's bandwidth. You can attempt to allocate that memory differently, as I'm sure ATI does, but you're still using system memory.
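The gap being described here is easy to put in numbers. The sketch below compares nominal peak figures for the era's local memory buses against PCIe 2.0 x16 (~8 GB/s per direction); the bandwidth constants are assumptions taken from published spec-sheet peaks, not measurements.

```python
# Why spilling framebuffer into system RAM hurts: the working set now moves
# over the PCIe bus instead of the card's local memory bus. Figures are
# nominal spec-sheet peaks for the era (assumptions, not measurements).

PCIE2_X16_GBPS = 8.0      # PCIe 2.0 x16, per direction
GTX280_GBPS    = 141.7    # GTX 280, 512-bit GDDR3
HD4870_GBPS    = 115.2    # HD 4870, 256-bit GDDR5

def slowdown(local_gbps, bus_gbps=PCIE2_X16_GBPS):
    """Factor by which peak bandwidth drops for data pushed off-board."""
    return local_gbps / bus_gbps

for name, bw in [("GTX 280", GTX280_GBPS), ("HD 4870", HD4870_GBPS)]:
    print(f"{name}: local memory ~{slowdown(bw):.0f}x faster than PCIe 2.0 x16")
```

An order-of-magnitude drop in peak bandwidth for any spilled surfaces is why a card's onboard GDDR speed stops mattering once the framebuffer overflows.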


Quote:
Originally Posted by cvearl View Post
Thank you Chris. I will run the exact same test at home. I love comparisons regardless of the outcome. I like truth and facts.

I'm such a friggin nerd.

So on my 4870 it's Edge detect 4xAA right?

Set everything to Very High in Crysis? That's it? Did you want 16xAF enabled as well or is that moot as you say.

C.
I really don't think you can compare CSAA directly to anything ATI has, same as CFAA or edge detect to anything Nvidia has. The special AA modes Nvidia/ATI offer are not comparable. AF can't/won't be usable with the form of parallax mapping Crysis uses at Very High.
Old 07-10-08, 07:02 PM   #36
Xion X2
Registered User
 
 
Join Date: May 2006
Location: U.S.
Posts: 6,701
Default Re: The right price for the GTX200 family...

Quote:
Originally Posted by ChrisRay View Post
Memory bandwidth does NOT account for running out of memory. Once you run out of memory, your onboard GPU's bandwidth becomes almost meaningless, as you are then limited by the PCIe interface's bandwidth. You can attempt to allocate that memory differently, as I'm sure ATI does, but you're still using system memory.
I'm curious, then. Why do you think ATI chose to ignore VRAM and instead focus solely on bandwidth by tossing GDDR5 on the PCB?

The only tests where I'm really seeing this "512 meg" thing possibly playing a hand are those run at 2560x1600 with AA applied on a single GPU. I'd say probably 0.01% of the population owns those displays or will own them in the near future, and those who do usually run multi-GPU. And this problem is remedied by going Crossfire, which often scales over 100% at that resolution with a second card enabled.

Here you have Call of Duty 4 running faster on the GTX260 at 2560x with 4xAA than on the 4870, but once you put them in Crossfire they're 9 FPS faster than 280 SLI at 2560x1600 with 4xAA applied. Crossfire scales 113% at these settings despite the 512MB framebuffer:



Again at 2560x, the 4870s outrun GTX280 SLI in The Witcher. No AA on this one, though:



Again at 2560x with 4xAA in Oblivion, the 512MB 4870s in Crossfire are outgunning the 1GB GTX280 by over 30%:



I know every game is different and things can vary; it just seems that, given the benchmarks, this "deficiency" is a bit overstated at this point. I don't see games of the near future dropping from over 100 FPS down to 30 or less.
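For reference, "scaling" percentages like the 113% quoted above are just the relative FPS gain of the dual-card result over a single card. A throwaway helper makes the arithmetic explicit; the sample FPS values here are made up for illustration, not taken from the linked charts.

```python
# Multi-GPU "scaling" as quoted in benchmark write-ups: the relative gain
# of the dual-card result over a single card. Sample FPS values below are
# hypothetical, purely to show the arithmetic.

def scaling_pct(single_fps, dual_fps):
    """Percentage FPS gain from adding the second GPU."""
    return (dual_fps / single_fps - 1.0) * 100.0

print(f"{scaling_pct(30.0, 63.9):.0f}% scaling")  # second card more than doubles FPS
```

Anything over 100% means the second card more than doubled the framerate, which usually points to the single-card run being held back by something other than raw GPU throughput.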