PDA

View Full Version : Geforce GTX 280 Performance, Price and Pictures!



Soetdjuret
06-03-08, 08:06 AM
Here's what the beast looks like:

http://resources.vr-zone.com/computex08/gt200/IMG_0435.jpg

http://resources.vr-zone.com/computex08/gt200/IMG_0429.jpg

http://resources.vr-zone.com/computex08/gt200/IMG_0436.jpg



Now here is how the beast performs in games compared to 3870x2:

http://www.nvnews.net/vbulletin/attachment.php?attachmentid=31744&stc=1&d=1212511488


Woohoo, looks very tasty! The price is said to be a whopping 649 dollars for the GTX 280 and 449 dollars for the GTX 260! Dios mío, lol..
This, my friend, is one hell of a beast! :captnkill:

Elegy
06-03-08, 08:08 AM
I'd just like to mention the scale on the performance graph. In Crysis, the GTX 280 is just above 1.8 which means that it performs ~1.8x the speed of the 3870x2. That would be ~36fps for the 280 then if the 3870x2 runs at 20fps.

It seems like the average performance increase is 80% compared with the 3870x2 (ie, not even double the performance), certainly nowhere near four times. This is of course assuming that nVidia haven't chosen specific resolutions and quality settings that make the GTX seem better than it really is, which wouldn't surprise me.
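The scale argument is trivial arithmetic, so here is a minimal sketch of it in Python; the 20 fps baseline for the 3870x2 is a hypothetical figure for illustration, not a measured result:

```python
# The chart normalizes the 3870x2 to 1.0, so a GTX 280 bar at ~1.8
# implies fps = baseline * ratio. Both inputs here are illustrative.
baseline_fps = 20.0   # assumed 3870x2 result in Crysis
chart_ratio = 1.8     # GTX 280 bar height on the normalized scale
gtx280_fps = baseline_fps * chart_ratio
print(gtx280_fps)     # 36.0
```

Reading the bar height as a multiplier of the 1.0 baseline, rather than as "4x because the bar is 4x taller than the axis origin", is the crux of the disagreement later in this thread.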

nekrosoft13
06-03-08, 08:09 AM
performance has been posted before
http://www.nvnews.net/vbulletin/showthread.php?t=114209

I would still call it a rumor

mailman2
06-03-08, 08:24 AM
This is just a G92 with more shaders; I doubt we are going to see double the performance. And if this cooler is anything like the 9800 GX2 I had, it will be an internal oven, a space heater. That card was awful: it got so hot you couldn't even touch it, and although the internal core temps were only in the mid 70s, it gave off a ton of heat inside my case. Bad design.

I don't know why vendors don't go with the Arctic Cooling design: a large 80mm fan at the back pushing hot air out. These little piddly 60mm and 70mm fans just don't push enough air and are loud at full speed. For me, water cooling FTW!

brady
06-03-08, 08:29 AM
Whoever put those fps numbers on the Crysis bars doesn't understand maths.

Soetdjuret
06-03-08, 08:35 AM
I'd just like to mention the scale on the performance graph. In Crysis, the GTX 280 is just above 1.8 which means that it performs ~1.8x the speed of the 3870x2. That would be ~36fps for the 280 then if the 3870x2 runs at 20fps.

It seems like the average performance increase is 80% compared with the 3870x2 (ie, not even double the performance), certainly nowhere near four times. This is of course assuming that nVidia haven't chosen specific resolutions and quality settings that make the GTX seem better than it really is, which wouldn't surprise me.

Nah, you're wrong, the bar is more than 4 times as high, which means about 230% faster! You can't know for sure what those 1.0 and 2.0 numbers really are, because if it was 2.0 times faster then it's double the speed, and that doesn't correspond to the bar. I.e. 2.0x the speed = double the height of the bar, which isn't the case here. Logically speaking.

Whoever put those fps numbers on the Crysis bars doesn't understand maths.

Whoever made this comment doesn't understand the principle and takes stuff for granted...

hunta2097
06-03-08, 08:43 AM
We live in exciting times.

Let's hope we get these details confirmed. Some (reliable) comparisons with the Radeon 4850 and 4870 would be great too.

I'm not sure I'll ever be able to afford them though.

SR

Elegy
06-03-08, 08:50 AM
I may be wrong but I highly doubt it. Why would they normalize the graph to 1 for the 3870x2 if what I said wasn't true? If it really was four times faster, wouldn't they make it clear on the graph so people wouldn't assume that it's only 1.8x faster?

Plus the GTX 280 has 240 shaders compared with the Ultra's 128. That's 240/128 = 1.875x the shader count. The GTX 280 has better performance per shader, but it also has lower clocks all round, so let's just assume the lower clocks cancel out the improved per-shader performance. The 280 would then be ~1.875x faster than the Ultra, assuming we're talking about shader-intensive games of course. That seems roughly in line with the GTX 280 being ~1.8x faster than the 3870x2. If I remember correctly, the 3870x2 is maybe 4 or 5% faster than the 8800 Ultra on average?

Let's assume I'm right and the GTX280 *is* 1.8x the speed of the 3870x2. If the 3870x2 is on average 4% faster than the 8800 Ultra, then the Ultra would be at ~0.96 (0.9615...) on that graph for each game. 1.8/0.96 = exactly 1.875.

If the GTX 280 is 1.875x faster than the 8800 Ultra from two independent sources, it's a pretty safe bet to assume that's the truth. Where did you get your figure of four times faster anyway? I don't think any generation has beaten the previous generation by that significant a margin, ever.
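Elegy's two cross-checks reduce to a few lines of arithmetic. A minimal sketch in Python, taking the 240 and 128 shader counts from the rumored specs quoted in this thread, and treating the ~4% Ultra-vs-3870x2 gap as Elegy's recollection rather than a measured average:

```python
# Cross-check 1: raw shader-count ratio, GTX 280 vs 8800 Ultra.
shader_ratio = 240 / 128       # = 1.875

# Cross-check 2: read the Ultra off the normalized chart. If the
# 3870x2 (= 1.0 on the chart) is ~4% faster than the Ultra, the
# Ultra sits at roughly 1/1.04 ~ 0.96, so a GTX 280 bar at 1.8 implies:
implied_ratio = 1.8 / 0.96     # = 1.875

print(shader_ratio, implied_ratio)
```

Both estimates landing on the same ~1.875 figure is the whole of Elegy's argument; neither input is a confirmed spec.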

Soetdjuret
06-03-08, 08:53 AM
Your mathematical thinking doesn't work here, mate. You can't just take the shaders or the clocks and say it's this or that much faster in game performance; that's just stupid. The shaders are only one of the factors in how a graphics card performs.

Elegy
06-03-08, 08:59 AM
Your mathematical thinking doesn't work here, mate. You can't just take the shaders or the clocks and say it's this or that much faster in game performance; that's just stupid. The shaders are only one of the factors in how a graphics card performs.

I totally agree that shaders and clocks aren't everything but they are a rough indicator. Let's say a card came out that was exactly the same as the 8800 Ultra except it had twice as many shaders. Surely what you'd infer from that is that it's roughly twice as fast. Of course that's just an estimate but it's a better estimate than four times, or 1.2x.

The specs of a card obviously give indicators for performance. Efficiency, drivers, etc... all play a role too so I'm not saying that the GTX 280 will be exactly 87.5% faster than an 8800 Ultra. What I am saying is that it will be roughly 87.5% faster, based on what information we have. I'm guessing maybe... 87.5% +/- 50%? Not exactly precise but it seems fair. Nothing we've seen indicates that it'll be four times faster.

ihatedumbpeople
06-03-08, 09:04 AM
Your mathematical thinking doesn't work here, mate. You can't just take the shaders or the clocks and say it's this or that much faster in game performance; that's just stupid. The shaders are only one of the factors in how a graphics card performs.



Soetdjuret, you are a complete and utter moron if you really think that the GTX 280 is going to be anywhere close to 4X as fast as an 8800 Ultra. For that to be the case we would have to see a totally new architecture instead of just increased shader counts etc.


Elegy's figures are much more believable than yours and pretty much fall in line with what everyone else thinks. I really do not think you should be the one to question anybody else's mathematical skills either. LMAO.

Soetdjuret
06-03-08, 09:19 AM
I totally agree that shaders and clocks aren't everything but they are a rough indicator. Let's say a card came out that was exactly the same as the 8800 Ultra except it had twice as many shaders. Surely what you'd infer from that is that it's roughly twice as fast. Of course that's just an estimate but it's a better estimate than four times, or 1.2x.

The specs of a card obviously give indicators for performance. Efficiency, drivers, etc... all play a role too so I'm not saying that the GTX 280 will be exactly 87.5% faster than an 8800 Ultra. What I am saying is that it will be roughly 87.5% faster, based on what information we have. I'm guessing maybe... 87.5% +/- 50%? Not exactly precise but it seems fair. Nothing we've seen indicates that it'll be four times faster.

No man, you still can't use it as a "rough" comparison for real-world performance in games, as it's only one of the factors in how a card performs. And "based on the information we have" is a graph that shows performance in bars. One bar is this high, and the other bar is 4 times higher; why would nvidia make the bar 4 times as high if it was less than double the performance? Then they would make the bar 80% higher instead.


Soetdjuret, you are a complete and utter moron if you really think that the GTX 280 is going to be anywhere close to 4X as fast as an 8800 Ultra. For that to be the case we would have to see a totally new architecture instead of just increased shader counts etc.


Elegy's figures are much more believable than yours and pretty much fall in line with what everyone else thinks. I really do not think you should be the one to question anybody else's mathematical skills either. LMAO.


I don't think anything, you "moron"; I speak from the graph nvidia released and am using logical thinking, I'm not assuming anything here! Mathematics or logical thinking, words against words, theories against theories. What can I say? The mathematical theory says it's 1.8, aka 80% faster, if you guess that the numbers on the far left side really mean "times faster", but that wouldn't correspond to the bars. The logical theory is that if the bar is 4 times as high in a performance chart, then it's 4 times as fast, logically. So no one can really say which one is right, as we don't know what the numbers on the left mean; we can only guess here. I'd like to see things from the logical side.

PS, there's no need to flame and behave like a kid. You're welcome to take part in this discussion if you feel like it. But keep doing what you did in your first post and it's gonna make your time on nvnews short, believe me.

ragejg
06-03-08, 09:22 AM
Do not escalate this thread into a flamefest by name-calling etc.

Useless members do that crap. Don't be a useless member here.

Soetdjuret, it seems that you like to get into these kinds of arguments. Take that kind of behavior somewhere else.

It stops now. Thank you in advance.

ihatedumbpeople
06-03-08, 09:26 AM
No man, you still can't use it as a "rough" comparison for real-world performance in games, as it's only one of the factors in how a card performs. And "based on the information we have" is a graph that shows performance in bars. One bar is this high, and the other bar is 4 times higher; why would nvidia make the bar 4 times as high if it was less than double the performance? Then they would make the bar 80% higher instead.





I don't think anything, you "moron"; I speak from the graph nvidia released and am using logical thinking, I'm not assuming anything here! Mathematics or logical thinking, words against words, theories against theories. What can I say? The mathematical theory says it's 1.8, aka 80% faster, if you guess that the numbers on the far left side really mean "times faster", but that wouldn't correspond to the bars. The logical theory is that if the bar is 4 times as high in a performance chart, then it's 4 times as fast, logically. So no one can really say which one is right, as we don't know what the numbers on the left mean; we can only guess here. I'd like to see things from the logical side.

PS, there's no need to flame and behave like a kid. You're welcome to take part in this discussion if you feel like it. But keep doing what you did in your first post and it's gonna make your time on nvnews short, believe me.



I am calling you a MORON because that is exactly what you are. When these cards are out we will all see just how much faster they really are compared to an 8800 Ultra, and I will be laughing my ass off at you, just like I am right now, and so will everyone else that has seen your stupidity. :lol:

ragejg
06-03-08, 09:28 AM
STOP IT.

Get back to topic in a civil fashion.

Your asses WILL be kicked out of here if you don't civilize.

xbob
06-03-08, 09:29 AM
Soetdjuret, you are a complete and utter moron if you really think that the GTX 280 is going to be anywhere close to 4X as fast as an 8800 Ultra. For that to be the case we would have to see a totally new architecture instead of just increased shader counts etc.


Elegy's figures are much more believable than yours and pretty much fall in line with what everyone else thinks. I really do not think you should be the one to question anybody else's mathematical skills either. LMAO.

You have 4 posts and they are all negative or inflammatory; simmer down there, fella!

ihatedumbpeople
06-03-08, 09:29 AM
STOP IT.

NO

ihatedumbpeople
06-03-08, 09:29 AM
You have 4 posts and they are all negative or inflammatory; simmer down there, fella!

What can I say.............I just hatedumbpeople. LOL.

Elegy
06-03-08, 09:29 AM
No man, you still can't use it as a "rough" comparison for real-world performance in games, as it's only one of the factors in how a card performs. And "based on the information we have" is a graph that shows performance in bars. One bar is this high, and the other bar is 4 times higher; why would nvidia make the bar 4 times as high if it was less than double the performance? Then they would make the bar 80% higher instead.

Why would they make the bar four times as high when it's only 80% faster? Because it makes their product look faster and better, it's as simple as that. That's what companies do, day in day out. If the GTX280 really was 4x faster, we would know about it for sure and nVidia would make it perfectly clear on that graph. I've shown you two independent sources - the graph and the specifications of the Ultra and the GTX - to back up my theory. What backing do you have for yours?

ragejg
06-03-08, 09:31 AM
lol is right. You just stopped this thread from functioning.

People, until a mod takes care of business here, let's just not post in this thread... wait for the mess to be cleaned up a bit :p and then get back to discussing the matter at hand.

ihatedumbpeople
06-03-08, 09:32 AM
Why would they make the bar four times as high when it's only 80% faster? Because it makes their product look faster and better, it's as simple as that. That's what companies do, day in day out. If the GTX280 really was 4x faster, we would know about it for sure and nVidia would make it perfectly clear on that graph. I've shown you two independent sources - the graph and the specifications of the Ultra and the GTX - to back up my theory. What backing do you have for yours?


Elegy, give it up. The guy you are arguing with is the biggest dumbass ever to grace an internet forum. Just do a search and read his other posts and topics to see for yourself.

You have backed up your figures, and all he has done is spew FUD from his stupid pie hole with no backing of his own, which is what I find hilarious.

ihatedumbpeople
06-03-08, 09:34 AM
lol is right. You just stopped this thread from functioning.

People, until a mod takes care of business here, let's just not post in this thread... wait for the mess to be cleaned up a bit :p and then get back to discussing the matter at hand.

:wonder:

DHP
06-03-08, 09:41 AM
This is just a G92 with more shaders, I doubt we are going to see double the performance.

+ 512-bit memory interface
+ 2200MHz GDDR3
+ 1GB GDDR3
+ 112 more shaders
+ 50% better shader efficiency/performance
+ only 25W idle!!!:

http://i31.tinypic.com/24e6rti.jpg

:D

It will rock!!!
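The 512-bit interface and 2200MHz GDDR3 in the list above are what drive the bandwidth bragging rights. A quick sketch of the arithmetic, assuming (as the rumor implies) that 2200MHz is the effective GDDR3 data rate, with the 8800 Ultra's 384-bit / 2160MHz effective added for comparison:

```python
def bandwidth_gb_s(bus_bits, effective_mhz):
    # bytes transferred per effective clock, times effective clocks
    # per second, expressed in decimal GB/s
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

gtx280 = bandwidth_gb_s(512, 2200)   # rumored GTX 280 spec
ultra = bandwidth_gb_s(384, 2160)    # 8800 Ultra, for reference
print(gtx280, ultra)                 # 140.8 103.68
```

So if the rumored numbers hold, the GTX 280 would have roughly 36% more raw memory bandwidth than the Ultra, which is where the expectation of strong high-res/AA performance comes from.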

ihatedumbpeople
06-03-08, 09:43 AM
+ 512-bit memory interface
+ 2200MHz GDDR3
+ 1GB GDDR3
+ 112 more shaders
+ 50% better shader efficiency/performance
+ only 25W idle!!!:

http://i31.tinypic.com/24e6rti.jpg

:D

It will rock!!!



It will rock for sure, and I cannot wait to get one, but I am also realistic about it, as I know better than to think it will be 4X faster than the last-gen cards.

DHP
06-03-08, 09:47 AM
Well, if it's as fast as two 8800 Ultras in SLI (at 100% scaling), it will rock.

I'm sure the high-res and AA performance will be very, very good due to the bandwidth and 512-bit interface. It will crush 8800 Ultras, and every other card on this planet. :) :cool: