Official GeForce GTX 680 Review and Discussion thread

Maverick123w
03-23-12, 12:13 PM
I think that's only one piece of the puzzle.



The cards seem to be in a clock race atm. GPU Boost is really a big part of the 680's dominance.

http://i.imgur.com/1PCLn.jpg

http://vr-zone.com/articles/asus-gtx-680-2gb-overclocking-review-win-some-lose-some/15322-4.html

Also, it seems likely that Nvidia cherry-picked the review cards. Kyle at [H] saw his card hit 1300MHz! I'm willing to bet no one sees their card do that. Add to that the fact that many people run poorly ventilated cases. :o

Don't get me wrong, the 680 is a nice card, but the game this round is different than before. The two architectures are similar now, so for a comparable number of shaders, we are going to start getting into a clock race.

I'd venture that most people that buy $500 video cards have proper cases :p

ViN86
03-23-12, 12:20 PM
I'd venture that most people that buy $500 video cards have proper cases :p

Agreed, but proper spacing between cards isn't guaranteed. Even with good air flow in the case, if the two cards are close together, the inner card can get pretty warm.

Logical
03-23-12, 12:49 PM
Not sure about the cost since the 4GB FTW card is not yet available.
Most 3GB 580s are more than CHF 100.00 cheaper than the 680s. The 680 started at around CHF 600.00 (660.00 USD) in Switzerland.

For me, the performance gain could be significant because it would be 3 vs 2 cards. Guru3D will publish an article regarding Tri-SLI. That could be interesting.
And 4GB VRAM would be sweet as well. I simply can't use 2GB cards at 7680x1600. 3GB is the absolute minimum.

Yeah, my point is that the 4GB FTW is like 3 months away or so I read; wouldn't you just rather wait for the 'higher' end Kepler to be released?

i SPY
03-23-12, 05:39 PM
LOL, apparently someone already killed one with just a minor 100MHz OC... but it's just a fake Photoshop, especially the way the GPU stands out from the background :lol:


Hi guys!
I was a little bit happy...
I changed my HD7970 for an EVGA GTX680 card, because I need the extra FPS in BF3.
I bought one, played a little at stock clocks, but I wanted to overclock a little, so I pushed the power target to max and added 100MHz.
I played BF3 with no problems for about an hour, but then suddenly my PC shut down and my card was smoked!!
Just watch this, guys...
from g3d & overclock.net

http://www.overclock.net/t/1232473/official-nvidia-gtx680-owners-thread/240_30#post_16791753


http://i765.photobucket.com/albums/xx294/82globe/deadgtx680.jpg

I made a closeup to see if it's really photoshopped:

http://i765.photobucket.com/albums/xx294/82globe/closeup.png

It's hard to tell, but yeah, if you look at the whole picture you can see the poor Photoshop skillz xD

Edit:

Anyway, some are having a hard time OC'ing past 140MHz; setting the fan to a fixed 60-65% fixed it. Maybe a driver throttling/temperature bug?
http://www.overclock.net/t/1232473/official-nvidia-gtx680-owners-thread/270_30#post_16792875

Muppet
03-23-12, 10:29 PM
When you adjust the power slider, it increases the voltage. What a goose, no wonder he killed the card. :headexplode:

Bman212121
03-23-12, 11:02 PM
Some more Quad SLI AND Quad CFX benchmarks!

http://nl.hardware.info/reviews/2641/nvidia-geforce-gtx-680-quad-sli-review-english-version

Once again, scaling is hit and miss with 3- and 4-card configurations, but when it works, it works pretty well.

It would be interesting to see if they could also have done 3D Surround testing. AvP could probably be run on normal settings at full resolution with 60 FPS per eye, assuming the software didn't have issues, since 4 cards, Surround, and 3D together is incredibly complex. They might also hit VRAM limitations with that type of setup. In Skyrim it might even be possible to use AA with that kind of setup!

GitDat
03-23-12, 11:13 PM
These latest GTX 680 drivers don't support my GTX 260 SC PhysX card :( Batman will have to suffer for a while, I guess, lol.

Maverick123w
03-23-12, 11:47 PM
Agreed, but proper spacing between cards isn't guaranteed. Even with good air flow in the case, if the two cards are close together, the inner card can get pretty warm.

Fair point

Ninja Prime
03-24-12, 12:05 AM
When you adjust the power slider, it increases the voltage. What a goose, no wonder he killed the card. :headexplode:

Downside of the way they've done the boost clock...
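
To make that downside a bit more concrete, here is a toy model of a power-target driven boost loop. Everything except the 1006MHz base clock is a made-up number for illustration, and the loop is only a simplification of the idea, not Nvidia's actual GPU Boost logic: the clock (and the voltage that has to rise with it) floats up until estimated board power hits the power target, so pushing the target slider higher also pushes voltage higher.

```python
# Toy model of a power-target driven boost loop -- illustrative only,
# NOT Nvidia's actual GPU Boost algorithm. All figures except the 1006MHz
# base clock are assumptions made up for this sketch.

BASE_CLOCK_MHZ = 1006      # GTX 680 base clock
CLOCK_STEP_MHZ = 13        # assumed boost step size
BASE_POWER_W = 170.0       # assumed board power at base clock/voltage
WATTS_PER_STEP = 2.5       # assumed extra watts per boost step

def boost_clock(power_target_w: float) -> tuple[int, float]:
    """Return (clock_mhz, est_power_w) after boosting up to the power target."""
    clock = BASE_CLOCK_MHZ
    power = BASE_POWER_W
    while power + WATTS_PER_STEP <= power_target_w:
        clock += CLOCK_STEP_MHZ   # each step also bumps voltage on real hardware
        power += WATTS_PER_STEP
    return clock, power

for target in (195.0, 225.0):     # stock power target vs. slider maxed out
    clock, power = boost_clock(target)
    print(f"target {target:.0f} W -> ~{clock} MHz at ~{power:.0f} W")
```

The point of the sketch is just that a higher power target lets the loop take more clock/voltage steps before it stops, which is why maxing the slider stresses the card even before you touch the offset.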

i SPY
03-24-12, 12:15 AM
When you adjust the power slider, it increases the voltage. What a goose, no wonder he killed the card. :headexplode:

Nah, it's faked. Look closely at the whole card, especially the edges; he got it from this pic and then changed the background...

http://i765.photobucket.com/albums/xx294/82globe/GeForce_GTX_680_F_No_Therma.jpg
I got this from my earlier post; the pic is from the alienbabeltech site...
http://www.nvnews.net/vbulletin/showthread.php?t=176749&page=9


vs. the faked one:
http://i765.photobucket.com/albums/xx294/82globe/deadgtx680.jpg

Plus it has the same PCIe power connector shade marks on it, fail lol (lee)

IMO it was probably some big Nvidia hater or an AMD PR guy trying to freak out potential buyers :lol:

Roadhog
03-24-12, 12:16 AM
When you adjust the power slider, it increases the voltage. What a goose, no wonder he killed the card. :headexplode:

It's a photoshop. Obvious troll was obvious.

Muppet
03-24-12, 12:48 AM
It's a photoshop. Obvious troll was obvious.

There's always one. :headexplode: Guess I'm a bit naive. Can't for the life of me understand why people do crap like this.

Treason
03-24-12, 01:46 AM
It's a photoshop. Obvious troll was obvious.

It's just a lame attempt by an AMD PR/marketing hack to poison an otherwise flawless launch by nVidia.

fasedww
03-24-12, 09:41 PM
I'll be getting 2 EVGA 680s to put in SLI. Can I use a GTX 470 as a PhysX card, or will it be incompatible or make things worse? :o

Muppet
03-24-12, 09:53 PM
I'll be getting 2 EVGA 680s to put in SLI. Can I use a GTX 470 as a PhysX card, or will it be incompatible or make things worse? :o

That should work as far as I know. You are going to love these cards in SLI. They rock, and if you have Surround, then wow. Performance is through the roof compared to the GTX 580s I had.

I have tried out BF3, Skyrim and Stalker: Call of Pripyat. Performance has almost doubled in some situations, and in others it has more than doubled in Surround. In Stalker, for instance, I get 50-100 FPS at 6000x1200 in DX11 with everything maxed except AA, with Sun Shafts on. Performance is stellar. BF3 runs at between 60 and 100 FPS on custom Ultra (just turned off Motion Blur and AA), everything else maxed. I could only get 60 FPS on the 580s at High settings with no AA or Blur. Skyrim doesn't drop below 55 FPS on maxed settings @ 6000x1200. The 580s would have crawled at that res and those settings.

GitDat
03-24-12, 09:59 PM
My PhysX card is not supported in this driver set :(

Muppet
03-24-12, 10:07 PM
My PhysX card is not supported in this driver set :(

It's only early days yet with the drivers. I can run Surround on the HPs, but I can't enable the Samsung as the fourth monitor using the DisplayPort connection. Nvidia is aware of these issues and has said they are working on a fix.

RollinThundr
03-25-12, 12:01 AM
So, a question, folks; not sure if it's been answered, as I don't want to go through 14 pages of the thread: what would be a good power supply to run with this card?

WeReWoLf
03-25-12, 12:31 AM
Anything higher than 650W, I believe, would be sufficient.

Muppet
03-25-12, 02:58 AM
Nvidia recommends a minimum of 550 watts, with 38 amps on the 12-volt rail.
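
For anyone wondering what the 38-amp figure actually buys you, here is a quick back-of-the-envelope check. The 550W/38A numbers are Nvidia's from above; the arithmetic (and the snippet) is just my own rough illustration:

```python
# Rough sanity check of the GTX 680 PSU recommendation.
# 550 W minimum and 38 A on +12V come from Nvidia's spec; the math is trivial.
RAIL_VOLTAGE_V = 12.0      # the +12V rail
RECOMMENDED_AMPS = 38.0    # recommended +12V amperage

rail_watts = RAIL_VOLTAGE_V * RECOMMENDED_AMPS   # 456 W deliverable on the 12V rail
print(f"38 A on the 12 V rail = {rail_watts:.0f} W of the 550 W minimum")
```

In other words, most of the 550W recommendation is really about +12V rail capacity, so a quality 650W unit like the one suggested above has plenty of headroom.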

Muppet
03-25-12, 04:11 AM
Sounds great! Maybe it's because you gained 0.5GB with these. Glad you like the cards :)

Yes, after all the hassle with Skyrim, I just did an integrity check of the files and that fixed it.

I also found out the issue with the 4th monitor: you need to connect it with DisplayPort or HDMI, that is, if you are using Surround on the first 3 monitors. Otherwise it can be any connection you want, if it's 4 individual displays you're using. There is a bug in the drivers that won't allow DisplayPort to work with Surround, but HDMI works OK. Nvidia knows and is working on a fix.

As for the cards, I love them. :)

john19055
03-25-12, 10:01 AM
If this is their low-end chip, the GK110 should be a monster. But my GTX 470s still have at least another year in them, or longer, and since I have three, I can go to Tri-SLI when needed, since I just game at 1920x1080. The GTX 680 was more than I expected, though, given that the GK104 was supposed to be a midrange chip. The GK110 should be out by the holidays, and it should put the hammer down, since the GK104 is a midrange chip, or at least it used to be.

RollinThundr
03-25-12, 08:56 PM
Nvidia recommends a minimum of 550 watts, with 38 amps on the 12-volt rail.

Thank you for the info. I'm going to upgrade the 750-watt unit I have now before getting one of these cards. I think my machine is underpowered due to the LEDs and the lit front panel, since I am consistently getting the lovely "display driver has crashed and has been restarted" crap off and on, no matter what card is in the system or what Nvidia drivers I use.

GitDat
03-26-12, 09:29 AM
I thought I read in one of these threads (can't find it now) that there was a way to boost the FPS on the middle screen while lowering the FPS on the 2 side screens in Surround. Is that true? :wtf:

Johnny C
03-26-12, 09:45 PM
http://www.atomicmpc.com.au/Review/294751,nvidias-geforce-gtx-680---in-the-labs.aspx/2

"The Boost technology – in order to function effectively – needs to process data every few milliseconds and adjust core voltage and clock instantly to reduce any “lag” between clock changes and greater demand from the user. If they only updated clocks every few seconds, the experience would be rather clunky to say the least. As you can probably guess, changing core voltage and clock every few milliseconds while trying to benchmark at a 40% overclock isn’t really helping anyone. For that reason alone, this card is sure to fail at extreme level overclocking, and arguably for some amateur overclockers.

"For our sample, 1215MHz seemed to be all that we were able to boost to without any sort of reliability issues. This equates to a 159MHz overclock, past the default 1056MHz ceiling, and to be honest it isn’t really worth the hassle. Power draw goes up roughly 25 per cent to 45W, while performance only rose 1-10% over a range of applications. This removes most of the power advantage the GTX 680 has over the HD7970, and performs far worse than the latter when both are overclocked."
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
http://www.tomshardware.com/reviews/geforce-gtx-680-sli-overclock-surround,3162-11.html

"Overall, it’s pretty easy to see that AMD’s Radeon HD 7970 has more to gain from an aggressive overclock than Nvidia’s GeForce GTX 680."

"The other conclusion to fairly easily draw is that Nvidia’s GeForce GTX 680 sees much of its headroom exposed by GPU Boost already, leaving less on the table for overclocking."

For me... my next upgrade has to be a GTX 680... unless ATI drops its price before the end of the week, it makes no sense to get a 7970.

Good info.

I'm waiting 2 weeks to see if AMD prices drop; if not... I'm getting a 680. It will be the last upgrade for this rig. The next rig will regretfully be an Intel-based one.