nV News Forums > Graphics Card Forums > NVIDIA GeForce 200 Series

Old 08-26-08, 11:48 PM   #1
XxDeadlyxX
STALKER!!!!!!!!
 
XxDeadlyxX's Avatar
 
Join Date: Oct 2004
Location: NSW, Australia
Posts: 1,455
Default UT3 runs worse with 'Enable SLI Technology'

With my GTX 280 SLI setup, I'm having a weird problem.

If I enable SLI technology in the Nvidia Control Panel and set the 3D settings to 'Single GPU', in-game framerates are noticeably lower than when I choose 'Do not enable SLI technology'. Choosing 'Do not enable..' makes the game silky smooth, and it rarely dips under 60fps. With SLI technology enabled, walking around the same maps with no players and FRAPS running, the fps is noticeably lower when looking in certain directions, firing the rocket launcher there, etc. I've flipped back and forth while testing it many times, so I'm not imagining it.

In general, I've found that UT3 suffers from very slight input lag when I actually enable "Nvidia Recommended" or manually choose AFR2 in the game's profile settings. Setting Maximum Pre-rendered Frames to 3, compared to 0 or 8, seems to make little difference there as well. So, since UT3 is a very fast-paced game online where every bit of accuracy helps, I've left it on 'Single GPU', where there is no lag, just like with 'Do not enable SLI technology'.

When I was messing around with nHancer, choosing 2x2 Supersampling DID give a significant performance increase with SLI compared to Single GPU, but it's still not as smooth as "Do not enable SLI technology" with a single GPU and no AA.

I very much doubt there is a solution to this; it's probably a bug of some sort. Crysis runs really well with SLI, and obviously I want to be able to use SLI in other titles without this issue, so choosing "Do not enable SLI technology" isn't an option unless I changed it every time I played UT3.

Any thoughts?
__________________
Main PC [Core i7 980X @ 4.3GHz] [Gigabyte X58A-UD7] [7TB of HDDs] [2x Inno3D GTX 480 SLI @ stock] [12GB Team Xtreem 7-7-7-24 @ 1500mhz] [Antec 1200] [Alienware AW2310 (general use/3D Vision) and Samsung 46B650 @ 1080p24] [X-Fi Forte] [Pioneer Blu-ray BDC-S02] [Silverstone Strider 1500W] [Creative Gigaworks S750 7.1] [Windows 7 x64] 3DMark Vantage - 38010
XxDeadlyxX is offline   Reply With Quote
Old 08-27-08, 12:50 AM   #2
Kaguya
Registered User
 
Kaguya's Avatar
 
Join Date: May 2003
Posts: 661
Default Re: UT3 runs worse with 'Enable SLI Technology'

If you do find that UT3 runs better on a single card, I don't think you need to disable SLI manually each time. Can't you just use nHancer to force a different SLI profile for the game, say Single GPU, instead of whatever is there by default?

Also, just out of curiosity, have you tried enabling VSync (and triple buffering if UT3 has that)? I found in some games that enabling VSync smoothed them out noticeably in SLI.
__________________
Intel Core i7 960 @ 3.6Ghz
EVGA X58 3X SLI
OCZ Gold 12GB DDR3
EVGA GeForce GTX 470 x2 SLI
OCZ Vertex 2 EX & Corsair P256
LG GH22NS50 22X DVD-RW SATA
Antec TruePower Quattro 1000w SLI PSU
Acer HN274H 1920x1080 3D Ready LCD
Logitech X-530 5.1 Speakers
Antec 900 Ultimate Gamer Case
Kaguya is offline   Reply With Quote
Old 08-27-08, 01:18 AM   #3
XxDeadlyxX
STALKER!!!!!!!!
 
XxDeadlyxX's Avatar
 
Join Date: Oct 2004
Location: NSW, Australia
Posts: 1,455
Default Re: UT3 runs worse with 'Enable SLI Technology'

Quote:
Originally Posted by Kaguya View Post
If you do find that UT3 runs better on a single card, I don't think you need to disable SLI manually each time. Can't you just use nHancer to force a different SLI profile for the game, say Single GPU, instead of whatever is there by default?

Also, just out of curiosity, have you tried enabling VSync (and triple buffering if UT3 has that)? I found in some games that enabling VSync smoothed them out noticeably in SLI.
Already tried that. It doesn't seem to matter whether I choose Single GPU or any other mode (Auto, AFR, etc.): anything with the Nvidia CP set to 'Enable SLI technology' doesn't give as good fps as 'Do not use SLI technology'.

I am running Vsync.

The best way I can describe it: if I go on Deck with 'Do not use..' and no bots, the framerate pretty much does NOT go below 60fps, except to 58-59 occasionally. If "Enable SLI.." is chosen and I do the same thing, the fps drops to 50, and into the 40s when I fire rockets into the middle of the map.
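For anyone who wants to make this kind of comparison less eyeball-based: FRAPS can log per-frame times (its frametimes benchmark option records a running timestamp in milliseconds for each frame), and a few lines of Python will turn that into average and minimum fps. A minimal sketch, using synthetic timestamps in place of a real log just for illustration:

```python
def fps_stats(timestamps_ms):
    """Average and minimum instantaneous fps from cumulative frame
    timestamps in milliseconds (the shape of a FRAPS frametimes log)."""
    deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    fps = [1000.0 / d for d in deltas if d > 0]
    return sum(fps) / len(fps), min(fps)

# Synthetic run: 120 frames at 60fps, then a 30-frame dip to 40fps
stamps = [0.0]
for dt in [1000.0 / 60.0] * 120 + [25.0] * 30:
    stamps.append(stamps[-1] + dt)

avg, low = fps_stats(stamps)
print(f"avg {avg:.1f} fps, min {low:.1f} fps")  # avg 56.0 fps, min 40.0 fps
```

Minimum fps (or better, the spread of the frame-time deltas themselves) captures the "drops into the 40s" complaint much better than an average does.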
__________________
Main PC [Core i7 980X @ 4.3GHz] [Gigabyte X58A-UD7] [7TB of HDDs] [2x Inno3D GTX 480 SLI @ stock] [12GB Team Xtreem 7-7-7-24 @ 1500mhz] [Antec 1200] [Alienware AW2310 (general use/3D Vision) and Samsung 46B650 @ 1080p24] [X-Fi Forte] [Pioneer Blu-ray BDC-S02] [Silverstone Strider 1500W] [Creative Gigaworks S750 7.1] [Windows 7 x64] 3DMark Vantage - 38010
XxDeadlyxX is offline   Reply With Quote
Old 08-27-08, 02:28 AM   #4
Digital_Trans
 
Digital_Trans's Avatar
 
Join Date: May 2005
Location: Sin City
Posts: 2,796
Default Re: UT3 runs worse with 'Enable SLI Technology'

HAHAHA UT3 sucks~~~~
__________________
Case: HP Blackbird 002
PSU: Topower 1500W
Mobo: ASUS Striker Extreme II 790i Ultra
CPU: Intel Core 2 Extreme QX9770 Quad 3.8GHz LCi
RAM: Corsair 4GB DDR3 1800 (PC3 14400)
GPU: 3 X EVGA GTX 280 FTW Edition TRI-SLI
HDD: 3 X 150GB WD Raptor-X
HDD: 1 - 500GB Seagate Barracuda
HDD: 1 - 640GB WD Caviar SE
DVD: 2 X Ultra Slim 20X IDE DVD/CD Writer
SPU: Creative X-Fi Titanium Fatal1ty PCI-Express
LG Super Multi Blu-ray RW/HDDVD: GGW-H20L
Display: Dell UltraSharp 30" LCD
Gateway XHD3000 Black 30" LCD
Windows Vista Ultimate Edition x64 SP1
Digital_Trans is offline   Reply With Quote
Old 08-27-08, 10:32 AM   #5
Kaguya
Registered User
 
Kaguya's Avatar
 
Join Date: May 2003
Posts: 661
Default Re: UT3 runs worse with 'Enable SLI Technology'

Quote:
Originally Posted by XxDeadlyxX View Post
Already tried that. It doesn't seem to matter whether I choose Single GPU or any other mode (Auto, AFR, etc.): anything with the Nvidia CP set to 'Enable SLI technology' doesn't give as good fps as 'Do not use SLI technology'.

I am running Vsync.

The best way I can describe it: if I go on Deck with 'Do not use..' and no bots, the framerate pretty much does NOT go below 60fps, except to 58-59 occasionally. If "Enable SLI.." is chosen and I do the same thing, the fps drops to 50, and into the 40s when I fire rockets into the middle of the map.
Wow, that's nuts... there's something screwy going on. Have you tried uninstalling, cleaning out the old drivers, and installing the newest betas (177.92, I think)? They've been great for me.
__________________
Intel Core i7 960 @ 3.6Ghz
EVGA X58 3X SLI
OCZ Gold 12GB DDR3
EVGA GeForce GTX 470 x2 SLI
OCZ Vertex 2 EX & Corsair P256
LG GH22NS50 22X DVD-RW SATA
Antec TruePower Quattro 1000w SLI PSU
Acer HN274H 1920x1080 3D Ready LCD
Logitech X-530 5.1 Speakers
Antec 900 Ultimate Gamer Case
Kaguya is offline   Reply With Quote
Old 08-27-08, 05:37 PM   #6
XxDeadlyxX
STALKER!!!!!!!!
 
XxDeadlyxX's Avatar
 
Join Date: Oct 2004
Location: NSW, Australia
Posts: 1,455
Default Re: UT3 runs worse with 'Enable SLI Technology'

Quote:
Originally Posted by Kaguya View Post
Wow, that's nuts... there's something screwy going on. Have you tried uninstalling, cleaning out the old drivers, and installing the newest betas (177.92, I think)? They've been great for me.
Yeah, I just formatted and installed 177.92 clean, and it's the same.

What I have figured out, though, is that the lower fps with 'Enable SLI' seems to be directly related to CPU/memory speed. If I go back to stock 2.4GHz, the fps is about another 10 frames lower again (really bad). Increasing my memory clock from 800 4-4-4-12 to 1066 5-5-5-15 gains about 2-3fps.

Maybe enabling SLI automatically makes you heavily CPU-limited? That's what I initially thought, because my results at 640x480 are identical to those at 1920x1200.
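That resolution test is the classic diagnostic, and the logic can be spelled out in a toy sketch (nothing from the driver, just the reasoning): if dropping the resolution by a huge factor barely changes fps, the GPU has headroom to spare and something CPU-side is the ceiling.

```python
def likely_cpu_bound(fps_low_res, fps_high_res, tolerance=0.05):
    """If dropping the resolution barely moves the frame rate, the GPU
    wasn't the bottleneck: CPU-side work (game logic, draw calls, and
    with SLI enabled, extra driver overhead) is setting the ceiling."""
    return abs(fps_low_res - fps_high_res) / fps_high_res <= tolerance

print(likely_cpu_bound(52, 50))  # True: same fps at 640x480 as at 1920x1200
print(likely_cpu_bound(95, 50))  # False: big low-res gain means GPU-bound
```

It also fits the observation that overclocking the CPU and memory raises the minimum fps while resolution changes do nothing.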
__________________
Main PC [Core i7 980X @ 4.3GHz] [Gigabyte X58A-UD7] [7TB of HDDs] [2x Inno3D GTX 480 SLI @ stock] [12GB Team Xtreem 7-7-7-24 @ 1500mhz] [Antec 1200] [Alienware AW2310 (general use/3D Vision) and Samsung 46B650 @ 1080p24] [X-Fi Forte] [Pioneer Blu-ray BDC-S02] [Silverstone Strider 1500W] [Creative Gigaworks S750 7.1] [Windows 7 x64] 3DMark Vantage - 38010
XxDeadlyxX is offline   Reply With Quote
Old 08-28-08, 06:55 AM   #7
john19055
 
john19055's Avatar
 
Join Date: Jul 2002
Location: GREENVILLE,TX
Posts: 3,857
Default Re: UT3 runs worse with 'Enable SLI Technology'

There are some other games that run better with just one GTX 280 than with two in SLI, like Enemy Territory: Quake Wars and Tom Clancy's Ghost Recon: Advanced Warfighter 2, until you get up to 2560x1600. It doesn't make a lot of sense unless it's the drivers. Some say it's CPU-bound and needs a much faster CPU.
__________________
Intel i7-3820+Corsair H-100+Gigabyte X79-UD5+16gigs G.Skill PC1600DDR3+2-ASUS DirectCU II GTX-670 in SLI+Crucial 256g-SSD+1-3Gig Seagate+2- Samsung 1-TB+3-WesternDigtial 640g+LG-12x Blu-Ray Burner+850watt XFX+Antec-P280 case+50" Plasma PM6700+Logitech Mouse+Keyboard+Pioneer VSX-1020+Polk Audio Speakers
john19055 is offline   Reply With Quote
Old 08-28-08, 07:00 AM   #8
XxDeadlyxX
STALKER!!!!!!!!
 
XxDeadlyxX's Avatar
 
Join Date: Oct 2004
Location: NSW, Australia
Posts: 1,455
Default Re: UT3 runs worse with 'Enable SLI Technology'

Quote:
Originally Posted by john19055 View Post
There are some other games that run better with just one GTX 280 than with two in SLI, like Enemy Territory: Quake Wars and Tom Clancy's Ghost Recon: Advanced Warfighter 2, until you get up to 2560x1600. It doesn't make a lot of sense unless it's the drivers. Some say it's CPU-bound and needs a much faster CPU.
Yeah, the faster I make my CPU (and my memory MHz too), the higher the minimum fps is with 'SLI Enabled'.

I've found that with the CPU at 3.4-3.6GHz, memory at ~1000+ @ 5-5-5-15, and World Detail at 2 (LOL), I can get a constant 60fps with 'Enable SLI technology'. So that's what I'll use; I only play UT3 online now, so I'm not too concerned with the eye candy. A constant 60fps is the most important thing. Increasing World Detail, even to 3, results in noticeably worse fps.
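If you'd rather pin that setting down outside the in-game menu: UE3 games back their detail sliders with the [SystemSettings] section of the engine ini (for UT3 that's UTEngine.ini under My Documents\My Games\Unreal Tournament 3\UTGame\Config). A sketch of the kind of entries involved; I'm going from memory here, so treat the exact key names and the slider-to-key mapping as assumptions and check them against your own ini:

```ini
; Hypothetical UTEngine.ini excerpt -- key names assumed, verify locally
[SystemSettings]
DetailMode=2          ; rough analogue of the in-game World Detail slider
DynamicDecals=True
MotionBlur=False      ; worth toggling when chasing input lag
```

Editing the ini directly also makes it easy to keep a "UT3 online" copy of the file and swap it in, instead of re-dragging sliders every session.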
__________________
Main PC [Core i7 980X @ 4.3GHz] [Gigabyte X58A-UD7] [7TB of HDDs] [2x Inno3D GTX 480 SLI @ stock] [12GB Team Xtreem 7-7-7-24 @ 1500mhz] [Antec 1200] [Alienware AW2310 (general use/3D Vision) and Samsung 46B650 @ 1080p24] [X-Fi Forte] [Pioneer Blu-ray BDC-S02] [Silverstone Strider 1500W] [Creative Gigaworks S750 7.1] [Windows 7 x64] 3DMark Vantage - 38010
XxDeadlyxX is offline   Reply With Quote

Old 08-31-08, 07:08 AM   #9
FearMeAll
 
FearMeAll's Avatar
 
Join Date: Jul 2004
Location: Georgia, USA
Posts: 827
Default Re: UT3 runs worse with 'Enable SLI Technology'

This is why I'm sort of glad that after 3 years of using SLI, I'm back to a single card. So much fiddling you have to do... so much.
__________________
Consoles: Xbox Elite with HDDVD drive and PS3.
Screen:
Panasonic AX200U Widescreen Projector (1280x720), 160" HighPower Screen, AND A BIG COUCH.
HTPC: Intel Wolfdale 3GHz, 4 gigs DDR2 800 RAM, 500GB HDD, craptastic 7600GS, Windows 7 32-bit, XBMC installed :D
My Video Encoding/Office rig: Intel QX9650 @ 3GHz, 1333MHz FSB, 4 gigs Patriot Extreme DDR3 @ 1333MHz, XFX 790i 3-way SLI mobo, X-Fi Extreme Gamer, 400-watt fanless power supply, 9500GT 64-bit bus (ouch!), 8.75TB HDD, Windows 7 64-bit Ultimate. Everything at stock and completely boring..
FearMeAll is offline   Reply With Quote
Old 08-31-08, 12:22 PM   #10
MisterMister
Registered User
 
MisterMister's Avatar
 
Join Date: Jun 2008
Location: San Diego, CA
Posts: 118
Default Re: UT3 runs worse with 'Enable SLI Technology'

You know, I really don't get the big deal with SLI (problems-wise). I've been using my 9600GT SLI setup for about 6 months now without a single problem. In fact, the BIGGEST "issue" I've ever encountered is having to re-enable SLI in the Nvidia control panel after changing/updating the drivers!

On my current system/setup, I've played (in no specific order):

The Orange Box (HL, HL2, HL2 Ep. 1 & 2, TF2, & Portal); Need For Speed: Pro Street & NFS: Undercover; G.R.I.D.; Call of Duty 2 & COD 4; DOOM3; F.E.A.R. & its expansion packs; Supreme Commander; Black & White 2 & its "Battle Of The Gods" expansion pack; Crysis; Far Cry; Battlefield 2 & 2142; Tiger Woods PGA 2007 & 2008; Bioshock; Gears of War; Star Trek Legacy; UT04; UT3; Devil May Cry; IL2 Sturmovik; Americas Army; Quake III Arena; Fallout 2; GTA: Vice City; GUN; Rainbow Six 3; Microsoft Flight Sim X; Microsoft Combat Flight Sim 3; & Sins Of a Solar Empire... and that's just what I remember off the top of my head! And that's not counting the handful of game demos I've downloaded and played (with similar results).

I should also mention that I have run benchmarks including 3DMark 2001SE, 2003, 2005, 2006, & 3DMark Vantage... as well as AquaMark, PCMark06 & PCMark Vantage.

I have been able to play ALL of these games/programs with my SLI setup without issue ONE. No noticeable "micro stuttering", no huge drops in performance, no freezing/crashing, or any other issues that would scream "it's SLI's fault!"...

...And as you can see, I've played everything from classics from the '90s all the way to the latest and greatest current titles... as well as a myriad of graphics & system benchmark programs.

So with that, I have to say SLI has been VERY good to me. I get kick-ass benchmark scores, and I can play every game I just mentioned (with the exception of Crysis) at its respective maximum detail settings (including AA/AF) @ my screen's native resolution of 1680x1050. And when I say "I can play", I mean I am getting (very) near, at, or above a 60FPS average frame rate. All of this on a pair of mid-range video cards that can now be had for under $100 USD each (I did pay ~$140/ea when I first bought them 5+ months ago, of course)! Now, THAT'S "bang for your buck" any way you look at it, IMO.

To bring things to a finer point... going off of my first-hand experience with SLI... I must admit that I not only have NO qualms with it, but I also have no doubt I'll be using SLI and/or multi-GPU video card setups in the future.

-MM
__________________
Ultra M998 mid-ATX case
SilenX 650W 14dB SLI Power Supply Unit
Nvidia/EVGA 780i tri-SLi motherboard
Intel Q6600 quad core @ 3.4GHz (3.6 for benchmarking)
2GB OCZ ReaperX HPC Edition PC9200 DDR2 RAM (2x1GB)
2x EVGA 9600GT SC in SLI (750/1100Mhz over-clock)
2x WD Raptor HDD (1x 36GB; 1x 74GB)
Samsung Spinpoint HDD (250GB)
Thermalright TRUE-120 w/ dual 120mm fans in push-pull config.
Windows Vista Home Professional w/ SP1 (32-bit)
Samsung SyncMaster 206BW 20" LCD display (1680x1050)

3DMark06: 17,420 Points
MisterMister is offline   Reply With Quote
Old 09-13-08, 06:29 PM   #11
H3llF1re2008
Registered User
 
Join Date: Sep 2008
Posts: 2
Exclamation Re: UT3 runs worse with 'Enable SLI Technology'

Hey buddy. Just found this thread via a Google search, so thought I'd post my experiences. I've just recently upgraded my rig to GTX 280 SLI (managed to grab the cards for a really good price). nForce 680i, E6850 (stock clocks presently), 4GB GeIL Black Dragon, 1000W PSU, Vista 64-bit SP1 (and I have the Antec 1200 as well! :-)).

Generally I'm really happy with the cards - seeing great scaling in games like Crysis and Bioshock. Unfortunately, though, UT3 actually plays worse, if anything, than it did with my old (single) 8800 card! Like you, I get performance gains if I uncheck "Enable SLI Technology", but the FPS still isn't right, and tends to drop below 50 regularly during busy games on maps like Shangri La. Strangely, reducing detail levels doesn't seem to have a noticeable impact on performance (even though the FPS reads higher) - it still "feels" laggy to play! :-( Really hoping it's just a driver issue that they're working on.

Please let me know if you find any solutions or anything that helps!

Cheers,

H3.
H3llF1re2008 is offline   Reply With Quote
Old 09-13-08, 09:37 PM   #12
mailman2
Ducking & Dodging
 
Join Date: Mar 2008
Posts: 3,948
Default Re: UT3 runs worse with 'Enable SLI Technology'

Quote:
Originally Posted by FearMeAll View Post
This is why I'm sort of glad that after 3 years of using SLI, I'm back to a single card. So much fiddling you have to do... so much.

QFT. Never again am I gonna use dual GPUs; even the 9800GX2 did this to me. Single card FTW.
mailman2 is offline   Reply With Quote


Powered by vBulletin® Version 3.7.1
Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.
Copyright 1998 - 2014, nV News.