
View Full Version : Official GeForce GTX 680 Review and Discussion thread



Muppet
03-26-12, 11:37 PM
An overclocking guide. Nice performance increase. :)

http://www.guru3d.com/article/geforce-gtx-680-overclock-guide/1

I thought I read in one of these threads (can't find it now) that there was a way to boost the FPS on the middle screen while lowering the FPS on the 2 side screens in Surround. Is that true? :wtf:

Yes, that is what I have read too.

ViN86
03-27-12, 12:02 AM
http://www.atomicmpc.com.au/Review/294751,nvidias-geforce-gtx-680---in-the-labs.aspx/2

"The Boost technology – in order to function effectively – needs to process data every few milliseconds and adjust core voltage and clock instantly to reduce any “lag” between clock changes and greater demand from the user. If they only updated clocks every few seconds, the experience would be rather clunky to say the least. As you can probably guess, changing core voltage and clock every few milliseconds while trying to benchmark at a 40% overclock isn’t really helping anyone. For that reason alone, this card is sure to fail at extreme level overclocking, and arguably for some amateur overclockers.

For our sample, 1215MHz seemed to be all that we were able to boost to without any sort of reliability issues. This equates to a 159MHz overclock, past the default 1056MHz ceiling, and to be honest it isn't really worth the hassle. Power draw goes up roughly 25 per cent to 45W, while performance only rose 1-10% over a range of applications. This removes most of the power advantage the GTX 680 has over the HD7970, and performs far worse than the latter when both are overclocked."
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
http://www.tomshardware.com/reviews/geforce-gtx-680-sli-overclock-surround,3162-11.html

"Overall, it’s pretty easy to see that AMD’s Radeon HD 7970 has more to gain from an aggressive overclock than Nvidia’s GeForce GTX 680."

"The other conclusion to fairly easily draw is that Nvidia’s GeForce GTX 680 sees much of its headroom exposed by GPU Boost already, leaving less on the table for overclocking."

For me, my next upgrade has to be a GTX 680. Unless ATI drops its price before the end of the week, it makes no sense to get a 7970.

Yea if AMD doesn't drop the price then there's no advantage. If they could sneak in under the 680 at the $450 mark, the OC headroom allows it to pull ahead of the 680. But at $550, it's a no-brainer for the 680.
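To put the Atomic numbers quoted above into perspective, here is a quick perf-per-watt sanity check. It just runs the arithmetic on their stated figures (+25% power draw, +1% to +10% performance); the script is illustrative only and not taken from either review:

```python
# Perf-per-watt check on the overclock figures quoted from the Atomic review.
# Assumed inputs (their numbers): +25% power draw, +1% to +10% performance.
base_clock = 1056  # MHz, quoted default boost ceiling
oc_clock = 1215    # MHz, their stable overclock

clock_gain = oc_clock / base_clock - 1  # roughly +15% more clock
power_gain = 0.25                       # +25% power draw

for perf_gain in (0.01, 0.10):
    perf_per_watt = (1 + perf_gain) / (1 + power_gain) - 1
    print(f"perf +{perf_gain:.0%}: perf/watt {perf_per_watt:+.0%}")
```

Even at the optimistic +10% end, perf-per-watt drops by about 12%, which lines up with the review's conclusion that overclocking erodes the 680's efficiency advantage.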

Johnny C
03-27-12, 06:39 PM
Yea if AMD doesn't drop the price then there's no advantage. If they could sneak in under the 680 at the $450 mark, the OC headroom allows it to pull ahead of the 680. But at $550, it's a no-brainer for the 680.

@$425 or even $450 the 7970 is the winnar......it just fails @ 550

i SPY
03-28-12, 07:29 PM
well it sucks in cuda-z as expected.. I just saw this over at overclock.net

http://i765.photobucket.com/albums/xx294/82globe/Driver%20testing/cuda-z.png
http://www.overclock.net/t/1232473/official-nvidia-gtx680-owners-thread/810#post_16833215

Btw it has 1.98 TFLOPS and not ~3 TFLOPS heh (lee), looks like that Samaritan demo will run without a problem on mine too, ok at 950MHz (1.81 TFLOPS).. :P



vs my Oc'ed 570 @ 965mhz
http://i765.photobucket.com/albums/xx294/82globe/Driver%20testing/cuda-z965mhz.png

Stock is still faster in the other 3..
http://i765.photobucket.com/albums/xx294/82globe/Driver%20testing/th_default.png (http://s765.photobucket.com/albums/xx294/82globe/Driver%20testing/?action=view&current=default.png)
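For what it's worth, those TFLOPS figures roughly match the usual theoretical single-precision formula: shader cores × 2 FLOPs per clock × shader clock. A quick sketch, assuming the commonly published specs (1536 cores at 1006MHz for the GTX 680; 480 Fermi cores on the 570, whose shader domain runs at twice the core clock), not numbers taken from the CUDA-Z screenshots:

```python
# Theoretical single-precision throughput: cores * 2 FLOPs/clock * shader clock.
# Core counts and clocks below are the commonly published specs (assumptions).
def sp_tflops(cores, shader_clock_mhz):
    return cores * 2 * shader_clock_mhz / 1e6

gtx680 = sp_tflops(1536, 1006)       # Kepler: shader clock == core clock
gtx570_oc = sp_tflops(480, 950 * 2)  # Fermi: shaders run at 2x the core clock
print(f"GTX 680 ~{gtx680:.2f} TFLOPS, OC'd GTX 570 ~{gtx570_oc:.2f} TFLOPS")
```

That puts the overclocked 570 at about 1.82 TFLOPS, close to the 1.81 figure in the post. CUDA-Z measures achieved throughput rather than the theoretical peak, which may be why its ~1.98 TFLOPS reading for the 680 lands well under the ~3.09 TFLOPS paper figure.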

Fotis
03-29-12, 08:28 AM
Gainward Geforce GTX680 Phantom (http://www.gainward.com/main/vgapro.php?id=863)
GPU Clockspeed : 1150 MHz (boost) / 1084 MHz (base)
Memory Clockspeed : 3150 MHz (DDR6300)
6 phase pwm
DrMOS
http://www.gainward.com/main/product/vga/pro/p00863/p00863_pic_11624f73b47a50605.jpg

Fotis
03-29-12, 10:18 AM
Indeed, a very good review. Some things just can't be measured simply by quoting min or avg frames.

Hambone
03-29-12, 03:42 PM
I just got an EVGA 680 in. I haven't had an Nvidia card since the ATI 5800 series came out, so I'm totally out of touch.

I'm reinstalling Win7 from scratch right now. I kind of felt the need for a clean install anyway. For this EVGA 680 - should I use EVGA drivers or reference drivers? And are there any tools on the disk that I would want - and, if so, should I even bother with the disk? Usually I just assume any disk has old drivers and go to the net...

Also, I may be weird but I still use Ghost 2003 to make images (I'm happy with how simple it is and don't like the extra bloat in Windows from a Windows-installed backup prog). Why would the 680 do this at the DOS level?

http://imageshack.us/photo/my-images/842/20120329155458.jpg/

Thanks!

FastRedPonyCar
03-29-12, 04:06 PM
I just got an EVGA 680 in. I haven't had an Nvidia card since the ATI 5800 series came out, so I'm totally out of touch.

I'm reinstalling Win7 from scratch right now. I kind of felt the need for a clean install anyway. For this EVGA 680 - should I use EVGA drivers or reference drivers? And are there any tools on the disk that I would want - and, if so, should I even bother with the disk? Usually I just assume any disk has old drivers and go to the net...

Thanks!

No, use these (assuming you're installing a 64-bit version of Windows 7, which you should be doing):

http://www.geforce.com/drivers/results/42929

Hambone
03-29-12, 04:13 PM
No, use these (assuming you're installing a 64-bit version of Windows 7, which you should be doing):

http://www.geforce.com/drivers/results/42929


Thanks - much appreciated.

Other question I had is - why would the 680 do this at the DOS level? I've never seen a DOS screen corrupted like this by any video card (yeah - I still use Ghost 2003 - very simple and easy to use, works on SSD's and no bloat at the OS level):

http://img842.imageshack.us/img842/5692/20120329155458.jpg

FastRedPonyCar
03-29-12, 04:44 PM
I've seen Ghost have artifacts like that on other systems too. I work with Ghost 2003 every day. Some GPUs just do that.

As long as it doesn't do it in Windows, I wouldn't worry about it. Install MSI Afterburner to monitor temps and overclock. Keep an eye on the temperature. There is a slight chance that the heatsink may not have proper contact with the GPU, but those are slim odds.

Blacklash
03-30-12, 12:17 AM
IMO below is the important part of the [H]ard GTX 680 SLi review. I've used SLi and Crossfire extensively. I concur with their sentiments. Raw FPS doesn't tell the whole story, particularly when discussing MGPU.

"We don't know what other descriptive word to use, other than "smoothness" to describe the difference we feel between SLI and CrossFireX when we play games. We've expressed this difference in gameplay feeling between SLI and CrossFireX in the past, in other evaluations, and we have to bring it up again because it was very apparent during our testing of 680 SLI versus 7970 CFX.

We can't communicate to you "smoothness" in raw framerates and graphs. Smoothness, frame transition, and game responsiveness is the experience that is provided to you as you play. Perhaps it has more to do with "frametime" than it does with "framerate." To us it seems like SLI is "more playable" at lower framerates than CrossFireX is. For example, where we might find a game playable at 40 FPS average with SLI, when we test CrossFireX we find that 40 FPS doesn't feel as smooth and we have to target a higher average framerate, maybe 50 FPS, maybe 60 FPS for CrossFireX to feel like NVIDIA's SLI framerate of 40 FPS. Only real-world hands on gameplay can show you this, although we can communicate it in words to you. Even though this is a very subjective realm of reviewing GPUs, it is one we surely need to discuss with you.

The result of SLI feeling smoother than CrossFireX is that in real-world gameplay, we can get away with a bit lower FPS with SLI, whereas with CFX we have to aim a little higher for it to feel smooth. We do know that SLI performs some kind of driver algorithm to help smooth SLI framerates, and this could be why it feels so much better. Whatever the reason, to us, SLI feels smoother than CrossFireX.

Personally speaking here, when I was playing between GeForce GTX 680 SLI and Radeon HD 7970 CrossFireX, I felt GTX 680 SLI delivered the better experience in every single game. I will make a bold and personal statement; I'd prefer to play games on GTX 680 SLI than I would with Radeon HD 7970 CrossFireX after using both. For me, GTX 680 SLI simply provides a smoother gameplay experience. If I were building a new machine with multi-card in mind, SLI would go in my machine instead of CrossFireX. In fact, I'd probably be looking for those special Galaxy 4GB 680 cards coming down the pike. After gaming on both platforms, GTX 680 SLI was giving me smoother performance at 5760x1200 compared to 7970 CFX. This doesn't apply to single-GPU video cards, only between SLI and CrossFireX."

http://www.hardocp.com/article/2012/03/28/nvidia_kepler_geforce_gtx_680_sli_video_card_review/9

MUYA
04-15-12, 09:44 PM
First unofficial GTX 680 4GB (from Galaxy) numbers at Expreview. The link below has a 5760x1080 res bench vs the 2GB 680 and the 7970!

http://www.expreview.com/19134-7.html

FastRedPonyCar
04-15-12, 09:51 PM
3 screen gaming!

Looks like there wasn't very much difference between the 2 and 4 gig models, to be honest. I'm glad I didn't wait. The 2 gig version's 3-screen benchmarks we've seen over the last few weeks have shown that it's more than enough to drive most games at high frame rates, and I think a 2nd card is what's needed for 60+ fps in games like Metro and Arkham City, rather than simply more RAM.

Muppet
04-16-12, 04:23 AM
I can't understand anything written in that review. It would be nice to know what AA settings they used. Other than that, there doesn't appear to be a great difference between 2GB and 4GB at that res.

Pennyboy
04-18-12, 11:50 PM
I've seen Ghost have artifacts like that on other systems too. I work with Ghost 2003 every day. Some GPUs just do that.

As long as it doesn't do it in Windows, I wouldn't worry about it. Install MSI Afterburner to monitor temps and overclock. Keep an eye on the temperature. There is a slight chance that the heatsink may not have proper contact with the GPU, but those are slim odds.

Hey FastRedPonyCar, do you know if MSI Afterburner is better than EVGA's Precision software?

Muppet
04-19-12, 01:16 AM
Hey FastRedPonyCar, do you know if MSI Afterburner is better than EVGA's Precision software?

I'm pretty sure they are both based on the same program, but have different skins. I think it may be RivaTuner, or from the maker of it.

FastRedPonyCar
04-19-12, 10:11 AM
I'm pretty sure they are both based on the same program, but have different skins. I think it may be RivaTuner, or from the maker of it.

Yeah, under the hood both look very similar.

Afterburner wasn't picking up any info from the 680; the overclock sliders were grayed out so I couldn't change anything.

For the EVGA tool, you don't actually have sliders to dictate the max frequency; you slide them to indicate how far BEYOND the stock boost clock you want the card to go. So if the card by default will boost itself to a max core clock of 1GHz and you want it to overclock to 1.2GHz, you set the core clock slider to +200MHz.
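In other words the slider is an offset on top of the stock boost clock, not an absolute target. A minimal sketch of that arithmetic (the 1GHz/1.2GHz figures are just the example numbers from the post, not any card's real clocks):

```python
# Precision X-style offset overclocking: the slider adds to the stock boost
# clock rather than setting an absolute frequency.
def effective_boost_mhz(stock_boost_mhz, offset_mhz):
    return stock_boost_mhz + offset_mhz

# Example from the post: stock boost 1000 MHz, target 1200 MHz.
offset = 1200 - 1000          # slider position: +200 MHz
print(effective_boost_mhz(1000, offset))
```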


Also, make sure if you use Precision X to overclock, you download OC scanner to test the overclock.

http://www.evga.com/ocscanner/

mojoman0
05-12-12, 04:54 PM
Everyone see the EVGA 680 Classified? 4GB, 14-phase PWM.

our old pal sneaking a glance

http://www.evga.com/forums/tm.aspx?m=1586778

fasedww
05-13-12, 12:03 PM
Ah crap! Just found out that the 680 has absolutely horrible iRacing performance :(

That puts an end to my 680 4GB Tri-SLI upgrade plan for good.

The only game that really matters to me is not running good at all right now.

Maybe it's just drivers at the moment. :)

Logical
05-16-12, 11:37 AM
That's most likely the case. I have to look into that more.

Hope the performance in iRacing is good because after nV's GK110 Tesla tease today, a desktop version is far out and I can't be using the 580s for so long...

I'm waiting for GK110; some of the information that has been teased says it's possibly getting 3072 shader processors. What do you mean by it's far out? From what I understand, GK110 will be used for a desktop card as well as the Teslas, and should be available by the end of the year.

Logical
05-16-12, 03:19 PM
nV showed the Tesla K20 today, the first GK110 product scheduled for a late 2012 release.
7.1 billion transistors, 2880 shader units if all clusters are enabled and a 384 bit memory interface.

So the first GK110 in Q4. Add a bit of delay to that and different scheduling for the desktop product and you'll end up with a 2013 release for the high-end desktop Kepler... Even the latest rumors indicate a 2013 release date.

That's too long for me. By then, it'll be time to upgrade the 680s again...

Ok, I'm with you now, and you're probably right too; a 2013 release is too far off. I was hoping to get GK110 by Christmas, but if that's not possible I may have to get a GTX 680 soon and maybe add another at the end of the year.

Ninja Prime
05-16-12, 11:48 PM
I don't know that it will be much, if any, faster. I'm still saying it's a GPGPU-only product. ~87% more units but running 43% slower means you end up, in raw numbers, only ~7% faster than the GTX 680. That's assuming perfect scaling, which we know doesn't happen. It would be lucky to break even with the GTX 680, except in DP, where it's 3x faster.

IMO, it's a DP-upscaled Tesla-only part.
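The raw-number claim above is easy to check; a quick sketch under the post's own perfect-scaling assumption (the unit and clock ratios are the post's estimates, not confirmed specs):

```python
# Perfect-scaling estimate from the post: ~87% more shader units,
# running at a ~43% lower clock than the GTX 680.
units_ratio = 1.87        # GK110 units relative to GTX 680 (post's estimate)
clock_ratio = 1 - 0.43    # 43% slower clock

relative_throughput = units_ratio * clock_ratio
print(f"~{relative_throughput - 1:+.0%} vs GTX 680, before scaling losses")
```

1.87 × 0.57 ≈ 1.07, so the perfect-scaling ceiling really is only about +7%, which is the post's point: any real-world scaling loss eats most of that margin.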

Muppet
05-18-12, 06:14 PM
Nice. I'm looking forward to hearing your thoughts on the 4GB versions and Surround.

GitDat
06-01-12, 08:08 PM
Nice. I'm looking forward to hearing your thoughts on the 4GB versions and Surround.

I still can't find a good 4GB vs. 2GB GTX 680 Surround comparison review :(

FastRedPonyCar
06-01-12, 08:15 PM
I still can't find a good 4GB vs. 2GB GTX 680 Surround comparison review :(

That's because there's virtually no real performance difference between the cards. The 2 gig card runs multi-monitor games at virtually the same frame rate as the 4 gig card.