The logic of this entire thing is pretty simple.
Can the GTX480 be cooled to the temps of the 5870 and under? Sure. Slap a massive heatsink or a waterblock on it and watch the temps nose dive.
So what? Trying to gauge how hot a GPU runs based on whatever heatsink happens to be on it is useless. Heatsink, case airflow, fan speed, and ambient temps vary wildly from one user to the next. That's why you should judge how hot a GPU runs by something concrete like its power draw, because that's an independent variable that stays constant no matter what cooler you bolt on.
These latest cards from Nvidia are good performers, but I, like Madpistol, am sick of seeing people pull any argument they can out of their &#^ trying to justify the temps. Essentially all the electrical power a GPU draws ends up as heat, so the highest power-drawing single-GPU card ever is, by definition, the hottest-running single GPU ever. It's as simple as that. Any argument against this is pure rhetoric with no basis in fact.
On top of that, take a look at Nvidia's stock. It has nosedived since Fermi's release, because investors know that AMD's latest line of GPUs is more power efficient, better priced, and better selling. Are AMD's cards better? No, not for everyone (and not for me, because I think their drivers suck), but for the mainstream, yes, for exactly those reasons: power usage and price.
Nvidia's stock has lost $6 a share (50%) since Fermi's release in April.
Anyone who cares to look at the 5870 vs. the 480 logically (I'm not talking to you Nvidia fanboys who never see anything Nvidia does wrong) just needs to accept that AMD offered the better product this time around. That's how it's viewed on Wall Street, and that's how it is in reality. Nvidia's saving grace at this point is its drivers and feature set, but unfortunately for them, only the true enthusiast crowd seems to care about those. Not the mainstream.