nV News Forums > Graphics Card Forums > NVIDIA GeForce 400/500 Series

Old 06-21-10, 08:55 AM   #25
Yaboze
Registered User
 
Join Date: Oct 2005
Location: United States
Posts: 2,057
Default Re: B3D GTX-480 Thermal Study

I like the ATI cards, but I really just hate their drivers. If you remember, the 4870 ran hot and had weird fan issues in the beginning; people used a tweak to force the fan to 20 or 25% or whatever it was.

Like I said, I like both companies and their cards, but for me it all comes down to Nvidia's drivers. Not CUDA, PhysX, heat, or anything else. If the story were reversed and the 5870 ran a few degrees warmer but faster than the 480, I'd probably still get the 480 because of the drivers. My friend and I both had 4870s; I sold him mine so he could run CF, and I went with a 260-216 and then a 285 on RMA due to a bad fan.
Yaboze is offline
Old 06-21-10, 09:40 AM   #26
AntaresDC
Registered User
 
Join Date: Jan 2009
Posts: 27
Default Re: B3D GTX-480 Thermal Study

That's 'nuff said for me:

B3D quotes:

"So for what it is worth we could actually stop here and say stop whining about the GTX-480 temperatures. The HD 5970 was 1C lower than the GTX-480 and the same temperature as the GTX-470. The HD 5870 was 4C lower than the GTX-480. The GTX-285 was 5C lower than the GTX-480. The HD 5870 will throttle at 100C so it has 8C left; the GTX-480 throttles at 105C so it has 9C left, yet the thermals on the GTX-480 were ripped apart. I don't remember anyone screaming about the 5870 or the 5970, yet they have less overhead before throttling than the GTX-480 and far fewer transistors to service per core. Each core on the HD 5870 and HD 5970 has 2.154 billion transistors and comes within 4C of the GTX-480, which carries 0.846 billion (846 million) more transistors.

(...)

All that being said, it's still Furmark, which presents unrealistic temperatures; but since most of the review sites were reporting GTX-480 temperatures with Furmark, we thought we'd set the record straight. If you want to scream about temperatures, do so in a fair manner. All modern GPUs run hot in Furmark; Furmark is designed to run them hot and isn't representative of real-life temperatures."
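For what it's worth, the quoted figures are internally consistent and can be tabulated in a few lines of Python. The load temperatures below are back-calculated from the quote's throttle points and headroom (and the GTX-470 is assumed to share the 480's 105C throttle point, which the quote doesn't state); this is a sketch of the quoted numbers, not measured data:

```python
# Throttle points and Furmark load temps implied by the B3D quote:
# GTX-480 throttles at 105C with 9C left -> 96C load; HD 5870 throttles
# at 100C with 8C left -> 92C load; the HD 5970 ran 1C below the 480,
# and the GTX-470 matched the 5970.
throttle = {"GTX-480": 105, "GTX-470": 105, "HD 5870": 100, "HD 5970": 100}
load = {"GTX-480": 96, "GTX-470": 95, "HD 5870": 92, "HD 5970": 95}

for card in throttle:
    headroom = throttle[card] - load[card]
    print(f"{card}: load {load[card]}C, throttle {throttle[card]}C, headroom {headroom}C")
```

The point of the exercise: despite the screaming, the 480 actually has slightly more headroom before throttling than the 5870 does.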

Because of all those "reviews" that screamed about Furmark temps, I almost decided not to buy the card. I followed lots of user reviews, posted temps, etc. I know that experiences vary from case to case (both users and computer cases). So I bit the bullet.

Guess what? I'm glad I did.

I find the B3D article spot on; it presents the reality as it is. I also happen to have a well-ventilated case (CM690 with six high-quality 120 and 140 mm fans). So who cares about Furmark temps? I've never seen such temperatures during intense gaming.

In the most demanding games I've never seen the temps rise over 85 degrees. In Metro 2033, the temps are 80 to 82 degrees with the fan at 70% maximum. My older GTX 280 usually ran at around 85 degrees (Stalker: COP, Crysis), and it also idled at higher temps.

That is why most of the reviews must be taken with a grain of salt.
AntaresDC is offline
Old 06-21-10, 10:32 AM   #27
grey_1
Guest
 
Posts: n/a
Default Re: B3D GTX-480 Thermal Study

fwiw my 470 runs within 4-6c (approx) of what my 4870 did, and with much, much less annoying noise from the fan.

The hottest I've seen it get yet in a regular game is, oddly enough, during a dice mini game in The Witcher. For some reason that little mini game gets the gpu cooking to about 88-89c.

Using the default fan profile btw.
Old 06-21-10, 10:36 AM   #28
DiscipleDOC
 
DiscipleDOC's Avatar
 
Join Date: Dec 2002
Location: Alabama, Planet Earth
Posts: 5,993
Default Re: B3D GTX-480 Thermal Study

Quote:
Originally Posted by grey_1 View Post
fwiw my 470 runs within 4-6c (approx) of what my 4870 did, and with much, much less annoying noise from the fan.

The hottest I've seen it get yet in a regular game is, oddly enough, during a dice mini game in The Witcher. For some reason that little mini game gets the gpu cooking to about 88-89c.

Using the default fan profile btw.
I've never owned a 4870+ card. I used to have an AIW 9800 card back in the day...but since then, I've always used nVidia cards.
DiscipleDOC is offline
Old 06-21-10, 11:16 AM   #29
grey_1
Guest
 
Posts: n/a
Default Re: B3D GTX-480 Thermal Study

Quote:
Originally Posted by DiscipleDOC View Post
I've never owned a 4870+ card. I used to have an AIW 9800 card back in the day...but since then, I've always used nVidia cards.
The 4870 ran pretty warm, but no warmer than most other high end cards at the time. I can say though the 470 is a pleasant surprise.

I will be turning on the A/C soon though, ambient is warm enough now that heat output from the pc really raises temps in the office.


Almost enough to make me consider going water again.
Old 06-21-10, 12:24 PM   #30
Rollo
 
Join Date: Jul 2003
Posts: 1,719
Default Re: B3D GTX-480 Thermal Study

Quote:
Originally Posted by Xion X2 View Post
The logic of this entire thing is pretty simple.

Can the GTX480 be cooled to the temps of the 5870 and under? Sure. Slap a massive heatsink or a waterblock on it and watch the temps nose dive.

So what? Trying to accurately gauge temps of a GPU based on whatever heatsink it has is useless. The heatsink, case, fan speed and ambient temps will differ a hundred times over between users. This is why you should base how warm a GPU runs on something concrete like its power usage because that is an independent variable that is constant.

These last cards from Nvidia are good performers, but I, like Madpistol, am sick of seeing people pull any argument that they can out of their &#^ trying to justify the temps. It is the highest power-drawing single-GPU card ever; therefore it is the hottest running single GPU ever. It's as simple as that. Any arguments against this are simply rhetoric and aren't based in fact.
This is the main reason I replied.

I wasn't trying to pull an argument out of my a$$ or "justify" anything.

What Bjorn3d said, and what I echoed, is that it's misleading to judge a card's thermals based on Furmark, as it's not representative of normal use.

Let's say I'm trying to decide between a Fenwick and a St. Croix freshwater spinning rod, and all the reviews say "OMG! We bolted each rod to the edge of the roof, tied a 300-pound weight to the tip, and threw it off. Only the Fenwick rod broke; clearly they are inferior." It's a test, but of what? Conditions that don't happen?

Same with Furmark: with today's console ports, you mostly won't come near Furmark GPU loads.

The 5870 and 5850 will still run cooler, of course, but there's a big difference between going into a purchase thinking your card will run at 94C and thinking it will run in the mid 80s C. The latter is far more appealing and would take temperature out of the decision for most people.
__________________
Rig1:
intel 990X + 2 X EVGA 3GB GTX580 + 3 X Acer GD235Hz
3D Vision Surround

Rig 2:
intel 2500K + NVIDIA GTX590 + Dell 3007 WFPHC

NVIDIA Focus Group Member
NVIDIA Focus Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the Members.
Rollo is offline
Old 06-22-10, 01:17 PM   #31
Xion X2
Registered User
 
Xion X2's Avatar
 
Join Date: May 2006
Location: U.S.
Posts: 6,701
Default Re: B3D GTX-480 Thermal Study

Quote:
Originally Posted by Rollo View Post
This is the main reason I replied.

I wasn't trying to pull an argument out of my a$$ or "justify" anything.
Why is it that when I reply on a thread, without mentioning you anywhere in my post or debating any of your points, you automatically assume that I am referring to you and take a defensive stance?

That was a general statement going all the way back to Fermi's launch--it had nothing to do with you. There have been endless arguments/excuses made on here for its temps. That is what I was referring to. To be blunt, I've said repeatedly that I really don't care what your opinion is, as you're self-admittedly biased toward Nvidia and are rarely going to consider an opposing viewpoint seriously. So unless you've turned over a new leaf since you were last banned from here, my position remains unchanged.

And again, it doesn't really matter what you, I, or any of these review sites bench the cards at. Power-to-heat conversion is embedded in the laws of physics, so if a card consumes more power than another card, it is going to put out more heat. Now, there are cases where the little temperature gauge in Furmark or whatever tool you're measuring GPU temps with may not say so, but that is simply due to a variable such as a better case, a better HSF, a faster GPU fan speed, or lower ambient temps, which always differ between users and cards. The power running from your wall socket into your PSU/GPU cannot be destroyed; it just converts to another form: heat.

Heat is heat. It doesn't magically disappear. Given equal factors (same HSF, same ambient temps, same case, same GPU fan speed, etc.), the Fermi 480 will always, always, ALWAYS run hotter than any other single-GPU card out there, because it consumes more power than the rest of them. Period.
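The "equal factors" argument amounts to a simple steady-state thermal model: die temperature equals ambient plus power draw times the thermal resistance of the cooling path. A minimal sketch (the power and thermal-resistance figures here are illustrative, not measured values):

```python
# Steady-state temperature model: T = T_ambient + P * R_th
# R_th (degC per watt) lumps together the heatsink, case airflow, and
# fan speed -- the "equal factors" held constant in the argument above.
def gpu_temp(power_w, r_th_c_per_w, ambient_c=25.0):
    """Equilibrium die temperature for a given power draw and cooling path."""
    return ambient_c + power_w * r_th_c_per_w

R_TH = 0.28  # identical cooling for both cards (illustrative value)
t_480 = gpu_temp(250, R_TH)   # higher-power card (wattage illustrative)
t_5870 = gpu_temp(188, R_TH)  # lower-power card (wattage illustrative)
assert t_480 > t_5870         # same cooling + more power in = hotter die
print(f"480-class card: {t_480:.0f}C, 5870-class card: {t_5870:.0f}C")
```

With the cooling term pinned, temperature ordering follows power draw directly, which is the whole point being made.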
__________________

i7-2700k @ 5.0 GHz
Nvidia GeForce 570 2.5GB Tri-SLI
Asus P67 WS Revolution (Tri-SLI)
OCZ Vertex SSD x 4 (Raid 5)
G.Skill 8GB DDR3 @ 1600MHz
PC Power & Cooling 950W PSU
Xion X2 is offline
Old 06-22-10, 02:43 PM   #32
Vorgus
Registered User
 
Vorgus's Avatar
 
Join Date: Dec 2008
Posts: 107
Default Re: B3D GTX-480 Thermal Study

I don't like to see chips pushing 100F, much less 100C, and 90C is getting mighty close. What's more, we shouldn't need a separate circuit just to run a computer! For a while now we have been seeing power supplies with TWO power cords, because a typical house outlet is only rated to 20 amps. Using two cords isn't going to help unless they are each on a different circuit, because most breakers are 15-20 amps. And if you trip one, the other is overloaded.
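The breaker math roughly checks out. A quick sketch, assuming standard 120 V North American circuits and the common 80% continuous-load derating (the voltage and derating are assumptions, not from the post):

```python
# Usable continuous wattage of a household circuit vs. breaker rating.
# Assumes 120 V circuits; continuous loads are commonly derated to 80%
# of the breaker rating. Figures are a rough illustration only.
def circuit_capacity_w(breaker_amps, volts=120, derate=0.80):
    return breaker_amps * volts * derate

for amps in (15, 20):
    print(f"{amps} A breaker: ~{circuit_capacity_w(amps):.0f} W continuous")
# A single high-wattage gaming PSU plus monitors can already crowd a
# 15 A circuit that is shared with other devices in the room.
```

So a 15 A breaker leaves roughly 1440 W of continuous headroom, and a 20 A breaker roughly 1920 W, which is why two cords on the same circuit buy you nothing.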

Just about every other part in the computer is getting its power requirements reduced; video card makers need to catch up. A modern high-end card pulls 2-3 times more power than all the rest of the computer combined, and that includes the displays, which have moved to LCD. My new quad-core 3GHz AMD computer with 4 hard drives ranging from 160 to 650GB and 8GB of RAM pulls far less power than my old AthlonXP running at 1.9GHz with 2 80GB drives and 1GB of RAM, not including the monitor, which is the same. The main difference is likely that the old computer has a Ti4600 and the new one is using integrated graphics, because I couldn't afford a new video card. Video cards pull too much power.
Vorgus is offline



Powered by vBulletin® Version 3.7.1
Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.
Copyright 1998 - 2014, nV News.