Originally Posted by Xion X2
The logic of this entire thing is pretty simple.
Can the GTX480 be cooled to the temps of the 5870 and under? Sure. Slap a massive heatsink or a waterblock on it and watch the temps nose dive.
So what? Trying to accurately gauge temps of a GPU based on whatever heatsink it has is useless. The heatsink, case, fan speed and ambient temps will differ a hundred times over between users. This is why you should base how warm a GPU runs on something concrete like its power usage, because power draw is the one independent variable that stays constant.
These last cards from Nvidia are good performers, but I, like Madpistol, am sick of seeing people pull any argument that they can out of their &#^ trying to justify the temps. It is the highest power-drawing single-GPU card ever; therefore it is the hottest-running single GPU ever. It's as simple as that. Any arguments against this are simply rhetoric and aren't based in fact.
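The quoted power-draw point can be sketched with a toy steady-state thermal model. Everything below is made up for illustration (the wattage and the thermal resistances are hypothetical, not measured figures for any of these cards): at a fixed power draw, the cooler's thermal resistance alone decides the reported temp, which is exactly why power draw is the more meaningful number to compare.

```python
# Hypothetical illustration: a simple steady-state thermal model,
# T_gpu = T_ambient + P * R_total, where R_total is the cooler's
# overall thermal resistance in degrees C per watt.

def gpu_temp(power_w, r_cooler_c_per_w, t_ambient_c=25.0):
    """Steady-state GPU temperature for a given power draw and cooler."""
    return t_ambient_c + power_w * r_cooler_c_per_w

# Same card, same assumed 250 W draw, three different (made-up) coolers:
stock_air = gpu_temp(250, 0.28)   # mediocre stock cooler
good_air  = gpu_temp(250, 0.20)   # large aftermarket heatsink
water     = gpu_temp(250, 0.12)   # waterblock

print(f"stock air: {stock_air:.0f} C")  # 95 C
print(f"good air:  {good_air:.0f} C")   # 75 C
print(f"water:     {water:.0f} C")      # 55 C
```

Same heat to dump either way; the cooler just moves the needle on where it settles.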
This is the main reason I replied.
I wasn't trying to pull an argument out of my a$$ or "justify" anything.
What Bjorn3d said, and what I echoed, is that it's misleading to judge a card's thermals based on Furmark, as it's not representative of normal use.
Let's say I'm trying to decide between a Fenwick and a St. Croix freshwater spinning rod, and all the reviews say, "OMG! We bolted each rod to the edge of the roof, tied a 300-pound weight to the tip, and threw it off. Only the Fenwick rod broke, so clearly they're inferior." It's a test, but of what? Conditions that never happen?
Same with Furmark: with today's console ports, you mostly won't come anywhere near Furmark-level GPU loads.
The 5870 and 5850 will still run cooler, of course, but there's a big difference between going into a purchase expecting your card to run at 94C and expecting it to run in the mid-80s C. The latter is far more appealing and would take temperature out of the decision for most people.
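To put a number on why a synthetic stress test overstates everyday temps, here's the same kind of toy steady-state model (T = ambient + power * cooler resistance) with a Furmark-style worst-case draw versus a typical gaming draw on the same cooler. All the figures are assumptions for illustration, not measurements of any actual card:

```python
# Made-up numbers for illustration: same cooler, two different loads.
T_AMBIENT = 25.0   # C, assumed room temp
R_COOLER = 0.22    # C per watt, assumed stock-cooler thermal resistance

def temp(power_w):
    """Steady-state temp under the assumed cooler and ambient."""
    return T_AMBIENT + power_w * R_COOLER

furmark_w = 320.0  # hypothetical synthetic worst-case draw
gaming_w = 230.0   # hypothetical typical in-game draw

print(f"Furmark: {temp(furmark_w):.0f} C")  # 95 C
print(f"Gaming:  {temp(gaming_w):.0f} C")   # 76 C
```

Same card, same cooler; the Furmark number is real but it's a ceiling, not what you'd sit at in a game.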