Originally Posted by Rollo
This is the main reason I replied.
I wasn't trying to pull an argument out of my a$$ or "justify" anything.
Why is it that when I reply on a thread, without mentioning you anywhere in my post or debating any of your points, that you automatically assume that I am referring to you and take a defensive stance?
That was a general statement reaching all the way back to Fermi's launch--it had nothing to do with you. There have been endless arguments/excuses made on here for its temps. That is what I was referring to. To be blunt, I've said repeatedly that I really don't care what your opinion is, as you're self-admittedly biased toward Nvidia and rarely going to consider an opposing viewpoint seriously. So unless you've turned over a new leaf since you were last banned from here, my position remains unchanged.
And again, it doesn't really matter what you, I, or any of the review sites out there bench the cards at. Power-to-heat conversion is baked into the laws of physics. If a card consumes more power than another card, that GPU is putting out more heat. Now there are cases where your little temperature gauge in Furmark (or whatever tool you're measuring GPU temps with) may not show it, but that's simply down to other variables--a better case, a better HSF, a faster GPU fan speed, lower ambient temps, etc.--that always differ between users and cards. The power running from your wall socket into your PSU/GPU cannot be destroyed; it just converts to another form: heat.
Heat is heat. It doesn't magically disappear. Given equal factors (same HSF, same ambient temps, same case, same GPU fan speed, etc.), the GTX 480 (Fermi) will always, always, ALWAYS run hotter than any other single-GPU card out there, because it consumes more power than the rest of them. Period.
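To put some numbers on it, here's a toy steady-state thermal model. The wattages and thermal resistance below are made up purely for illustration (not measured figures for any real card), but the point holds: with the cooling variables fixed, die temperature above ambient scales with power draw.

```python
# Toy steady-state thermal model: all electrical power the GPU draws ends up
# as heat, so for a fixed cooling solution (constant thermal resistance) the
# die temperature above ambient is proportional to power.
# NOTE: all numbers here are hypothetical, chosen only to illustrate the idea.

def die_temp(power_w: float, ambient_c: float = 25.0,
             r_theta_c_per_w: float = 0.25) -> float:
    """Steady-state die temperature: T = T_ambient + P * R_theta."""
    return ambient_c + power_w * r_theta_c_per_w

# Two hypothetical cards sharing identical cooling (same R_theta, same ambient):
hot_card = die_temp(250.0)   # higher-power card -> 25 + 250*0.25 = 87.5 C
cool_card = die_temp(180.0)  # lower-power card  -> 25 + 180*0.25 = 70.0 C
print(hot_card, cool_card)   # prints: 87.5 70.0
```

Better coolers or lower ambients shrink `r_theta_c_per_w` or `ambient_c` for everyone, but as long as those are held equal, the higher-power card comes out hotter every time.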