Old 06-20-10, 10:37 PM   #16
Xion X2
Registered User
 
 
Join Date: May 2006
Location: U.S.
Posts: 6,701
Default Re: B3D GTX-480 Thermal Study

Quote:
Originally Posted by Rollo
What rhetoric?
Practically everything that you posted above is a pretty good example.

The point is not what the GPUs run at in your system. You clearly did not comprehend any of my last post.

Attempting to gauge how hot a GPU runs is pointless if you're basing it on dependent variables such as the heatsink, fan speed, case, and ambient temps, which is what you're doing. These will always differ between users, which is why some people here report different temps than what you see in benchmarks from various review sites. Users own different cases, run their thermostats at different settings, run different drivers that control the cards' fan speeds differently, etc. These dependent variables all change the ultimate outcome and make it impossible to build or sustain an argument about how hot a GPU runs, because there is no standardization involved.

To judge how hot a GPU runs, you simply look to a standardized measure which we have in power consumption and heat conversion:

Quote:
In physics and thermodynamics, heat is the process of energy transfer from one body or system due to thermal contact...
http://en.wikipedia.org/wiki/Heat

Quote:
Power consumption is the "correct" unit of heat dissipation.

...We shall take power consumption for their heat dissipation measure (see above). Moreover, power consumption turns out a much more flexible characteristic in this respect, because it allows to quickly obtain precise data on heat dissipation...
http://ixbtlabs.com/articles2/storage/hddpower.html

Where do you think all that power goes that runs from your wall socket, into your PSU, and into your Fermis? It's converted into heat. That's a constant, scientific relationship that does not depend on your case or your fan curve. Measure the power consumption and you have the heat output.
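To put that relationship in the simplest possible terms: at steady state, essentially every watt a card draws is dissipated as heat. A minimal sketch (the 250 W figure below is purely illustrative, not a measured draw for any particular card):

```python
def heat_output_watts(power_draw_watts: float) -> float:
    """At steady state, heat dissipated equals electrical power consumed.

    Energy conservation: the negligible fraction of input power that leaves
    the card as electrical signals or light is ignored here.
    """
    return power_draw_watts

# Hypothetical example: a card drawing 250 W at the 12V rails
# dissipates roughly 250 W of heat into the case, regardless of
# heatsink, fan speed, or ambient temperature.
print(heat_output_watts(250.0))
```

This is why power consumption works as the standardized measure: it fixes the total heat output, while heatsinks and fans only change where that heat ends up and how fast.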

Quote:
You mean the stock that sells for 50% higher price than AMD stock, that makes NVIDIA itself worth more than AMD and ATi combined?
Firstly, it's not 50% higher than AMD. AMD is at nearly $9 a share at the moment, and Nvidia is at $12. 3/9 is 33%. Go back to kindergarten and learn your math.
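The arithmetic is a one-liner; a quick sketch using the rough $12 and $9 figures quoted above (approximate prices at the time, not exact quotes):

```python
def premium_pct(price_a: float, price_b: float) -> float:
    """Percentage by which price_a exceeds price_b."""
    return (price_a - price_b) / price_b * 100

# Nvidia at ~$12 vs. AMD at ~$9:
print(round(premium_pct(12.0, 9.0)))  # 33, not 50
```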

Secondly, AMD has been in debt for the last few years which always tanks a stock. What's Nvidia's excuse if not the disappointment from the investor crowd in Fermi? There is no debt on Nvidia's balance sheet and the drop in their stock price coincides perfectly with Fermi's launch.

Quote:
I have to respectfully disagree- the GTX480 offers much better DX11 and AA performance, and better performance in general. Someday ATi may equal them, but until then, performance oriented individuals only have one choice.
Go argue with Wall Street, pal. Clearly you know better than the pros do.
__________________

i7-2700k @ 5.0 GHz
Nvidia GeForce 570 2.5GB Tri-SLI
Asus P67 WS Revolution (Tri-SLI)
OCZ Vertex SSD x 4 (Raid 5)
G.Skill 8GB DDR3 @ 1600MHz
PC Power & Cooling 950W PSU