Originally Posted by jyavenard
My 9400GT shows the maximum temperature setting at 120 degrees... so 71 degrees is pretty cool
I know quite a lot about chips, see my site:
but I'm not an overclocking expert, so I start to worry when a chip reaches a temperature of over 70 degrees (especially when that's measured under Linux, without using any of its 3D capabilities).
Normally I don't overclock parts (CPUs and GPUs) unless they are cheap to replace. But why overclock them at all? Just buy a new generation of hardware!
BTW. When your GPU reaches a temperature of 120 degrees, you can boil water on it or instantly fry an egg on it!
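Since we're talking about temperatures read under Linux: here's a minimal sketch of polling them through the kernel's hwmon sysfs interface, assuming your GPU driver (e.g. nouveau) exposes a `temp*_input` file there. Values in those files are in millidegrees Celsius; the proprietary nVidia driver reports temperature through its own tools (nvidia-settings / nvidia-smi) instead, so this won't cover every setup.

```python
# Sketch: list temperatures exposed via the Linux hwmon sysfs interface.
# Assumes the driver publishes temp*_input files under /sys/class/hwmon;
# those files hold millidegrees Celsius.
from pathlib import Path


def millidegrees_to_celsius(raw: str) -> float:
    """hwmon temp*_input files report millidegrees Celsius."""
    return int(raw.strip()) / 1000.0


def read_hwmon_temps(base: str = "/sys/class/hwmon") -> dict:
    """Return {'chip-name/tempN': degrees C} for every sensor found."""
    temps = {}
    base_path = Path(base)
    if not base_path.is_dir():          # no hwmon support on this system
        return temps
    for hwmon in base_path.glob("hwmon*"):
        name_file = hwmon / "name"      # driver name, e.g. 'nouveau'
        name = name_file.read_text().strip() if name_file.exists() else hwmon.name
        for temp_file in hwmon.glob("temp*_input"):
            temps[f"{name}/{temp_file.stem}"] = millidegrees_to_celsius(
                temp_file.read_text())
    return temps


if __name__ == "__main__":
    for sensor, celsius in read_hwmon_temps().items():
        print(f"{sensor}: {celsius:.1f} C")
```

Running it on a box with a nouveau-driven card should show a line like `nouveau/temp1: 71.0 C`; which sensors actually appear depends entirely on your kernel and driver.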
In principle I prefer a modern video card with a fan that only
spins up fast enough to make noise when the card really starts
working on heavy 3D loads.
By the way, only the game 'Quantum of Solace' was demanding
enough to make me upgrade my hardware. The demo was very
shaky and almost unplayable.
I once had a discussion with a salesperson in a PC shop about
Crysis and how demanding it was on hardware, but I could play it
quite well on my old CPU (an AMD Athlon 64 3500+) and my old
7600GT, at modest settings of course (1024x768).
But never underestimate the power of the old nVidia cards!
Over the years I have gained experience with the TNT1/TNT2 cards.
The TNT1 is no longer supported by nVidia, but the TNT2 is
still capable of delivering quite impressive 3D power (for such
old hardware, anyway). Of course I wouldn't recommend it for modern PCs...