
View Full Version : SLi--Is it really WORTH it?



Xion X2
06-09-06, 11:04 AM
Not to me it's not, and I'll tell you why. It seems like a rush project that was just thrown together and completely unoptimized for the hardcore gamer. More of a bragging right than good performance--something you can spout off to all your buddies about (hey man, I got two graphics cards and you don't ..heh.. my machine crushes yours.. (pirate) ).

:thumbdwn:

Granted, I am a paranoid individual when it comes to my CPU because I am a perfectionist. However, I have the right to be after spending so much money on this thing lately. I will go ahead and say that my machine has been checked for viruses/spyware, I've properly cleaned off old drivers, the GPU is installed properly, etc. And my computer benchmarks fine, as you can see from my sig below.

What has me totally shocked right now is that even though I have this incredible graphics card with all this GPU power that can push just about any kind of graphical effect you throw at it, I am getting overall WORSE performance on my 1680x1050 resolution monitor than I was with my 7900GTX. Oh, it benchmarks just fine. You'd never know anything was wrong unless you were playing games--however, that's the primary reason I bought this GX2. Let me give just a few examples of the joys of SLi so far.

* No triple-buffering, unlike single-card configurations. I bold this because it is the absolute biggest disadvantage I've found so far.

* Flickering and tearing of the screen, even when not running DXTweaker for the triple-buffering option AND running with v-sync on in the Windows control panel or in-game. F.E.A.R., in particular, has this problem. It often happens when the point of view changes, like when you turn the camera, or in a room where lights are flickering on/off. (At one point, standing in a room with flickering lights, my screen tore into 4 rows from top to bottom.)

* Stuttering in locations even when your frame rate exceeds 60fps

* Distortion on your screen, like you can almost see different parts split up that don't quite fit together for some odd reason. Another form of improper v-sync. Parts of the screen seem to move at different intervals and are not totally in sync (This has happened w/ SFR, AFR, and AFR2, that I've noticed.)

* An immediate drop in frame-rates to 40 or many times 30 fps in games like FEAR and Oblivion that tax your system just enough to run right below your refresh rate (60Hz), due to loss of triple-buffering option. An immediate drop in fps from 60-75 to 30-40 causes serious stuttering and unsmooth gameplay.
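That last bullet is the classic double-buffered v-sync math: once the card can't finish a frame inside one refresh interval, every frame has to wait for the next interval, so the displayed rate snaps down to an even divisor of the refresh rate. Here's a toy sketch of the arithmetic (my own simplified model with a hypothetical `vsync_fps` helper, not anything from the drivers, which also queue frames):

```python
import math

# Toy model: with double buffering + v-sync, a finished frame must wait
# for a whole refresh interval before it can flip, so the displayed rate
# snaps to refresh/1, refresh/2, refresh/3, ...
def vsync_fps(render_fps: float, refresh_hz: float = 60.0) -> float:
    frame_time = 1.0 / render_fps   # seconds to render one frame
    interval = 1.0 / refresh_hz     # one refresh period
    # number of refresh intervals each frame occupies before it can flip
    intervals = math.ceil(frame_time / interval)
    return refresh_hz / intervals

print(vsync_fps(90))   # 60.0 -- comfortably above refresh, no penalty
print(vsync_fps(59))   # 30.0 -- just misses 60, halves immediately
print(vsync_fps(40))   # 30.0
```

Which is exactly the cliff described above: a card rendering at 59 fps under v-sync displays at 30, while one at 90 fps sails through at 60.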


And the craziest thing of it all is I hear few of you guys w/ SLi rigs talking about it. Do none of you care to have v-sync in your games so your screen doesn't tear right down the middle into 2, or sometimes 4 different pieces??? Do none of you care to have your games stuttering because of the constant fluctuation in frame-rates due to lack of triple-buffering? After spending so much money, wouldn't you want better performance than this?

When I turn v-sync off, I'm getting an average of 90 fps on FEAR. If Nvidia, or Microsoft, or whomever would just get their act together and allow some form of triple-buffering in D3D for SLi users (and some v-sync that actually WORKS) then it would be a truly kickass setup. As it is, console gaming still eats it for breakfast and the 360/PS3 are looking better and better right now.

Sorry to vent, but I'm just really frustrated right now. I once heard the saying:

"Hardware is only as good as the software that's written for it."

Unfortunately, that phrase has new meaning to me now. Hard lesson learned. Thanks, Nvidia. :(

OWA
06-09-06, 11:20 AM
SLI, in general, is a godsend for me since I need to game at 1920x1200. With the X1900XTX and with single Nvidia cards, I have to lower settings or the resolution for the performance to be acceptable. With SLI, I usually can run things maxed although there are still a few games that even SLI doesn't run acceptably at 1920x1200. I'm hoping quad-sli will eventually rectify that.

With a single 7950, it seems to work pretty well and it was only slightly slower than my true SLI system. With Quad-SLI, it's showing potential but isn't quite there yet. It's still fun to play with though. :)

Most of the issues you mention don't really apply to me since I don't use vsync. I do get some tearing in some games, but the majority of the time it's minimal and not that noticeable to me. I'm not sure how much of that is just being used to not having vsync. I haven't really used it in years and don't use it even with single cards (like the X1900XTX).

Xion X2
06-09-06, 11:52 AM
Thanks for your response, OWA. Any and all SLi-users feel free to chime in with your opinion and persuade me to keep this setup, because right now I'm thinking about dumping it all and going w/ a 360.

OWA--you're right. You definitely need two cards if you're going to game at that high a resolution. Unfortunately, in my opinion, there are no good options for gaming at that high a resolution right now because SLi and Crossfire are so poorly supported by DirectX9. It is impossible to get the same quality of imagery and smoothness of gameplay that you can get from a single-card setup w/ triple-buffering/vsync enabled through DXTweaker or ATI Tray Tools.

I respect your opinion and the fact that you can do w/o v-sync. Unfortunately, I despise screen-tearing and can't deal without some form of v-sync in my games. It is very noticeable to me because I game at my computer desk right in front of my 20" monitor.

What's really ridiculous is that I've spent over two grand on this computer, and more on a single graphics card than I would on a console with comparable graphics, one that will play games at consistent frame rates without screen-tearing.

lopri
06-10-06, 06:22 AM
You presented 5 examples but they're really all the same thing, which could be fixed via triple-buffering, or better yet via V-sync. Yes, it's one of the shortcomings of SLI, but it doesn't need to be inflated into 5 wanna-be-different traits.

V-sync will more or less take care of all 5 problems you're experiencing. Of course, V-sync can greatly reduce the benefit of SLI depending on the situation (well, if you get something like a minimum of 60 FPS in a certain game using SLI, you should definitely enable V-sync), but then again SLI is an option, not an obligation. Whether it's worthwhile to an individual is ultimately the individual's business.

I'm sorry to sound negative, and I can certainly understand your frustration with the expensive hardware, but I can't help getting bored of the same topics repeating themselves for almost 2 years since SLI was introduced.

I'd say you'd have a better chance sending an e-mail to NV about solving/evaluating your situation.

Edit: Also, I missed it at first reading, but F.E.A.R. not supporting widescreen isn't SLI's fault. I don't know exactly what resolution you're playing the game at, but if your sig represents the said SLI system, you're probably limited to 1280x1024 stretched horizontally? (Not sure about this.) If that's the case, you should have no problem playing F.E.A.R. with V-sync enabled and with perfectly smooth FPS. In most benches I've seen with the latest Forceware, F.E.A.R. gets near 100 FPS @1600x1200 with GTX SLI.

And mine: ;)
http://img58.imageshack.us/my.php?image=fearsli10249ln.jpg
http://img64.imageshack.us/my.php?image=fearsli16007yx.jpg

Superfly
06-10-06, 06:56 AM
I love my SLI setup and couldn't go back to single cards now. Like OWA, I have no issues with tearing at all, and likewise I never used v-sync on single cards.

Your issue is more of a feature request than an actual problem. SLI was certainly no rush job and is now very mature and stable.

ttfn.

Burnt_Ram
06-10-06, 07:34 AM
And mine: ;)
http://img58.imageshack.us/my.php?image=fearsli10249ln.jpg
http://img64.imageshack.us/my.php?image=fearsli16007yx.jpg

Holy crap! What res and settings are you using to get frame rates like that in F.E.A.R.? I get nowhere near that high :(

Maverickman
06-10-06, 08:29 AM
SLI is more beneficial in certain games and situations than in others. I've read some reviews of the 7950 GX2, and it has several drawbacks. The primary drawback is that it will only work on a limited number of motherboards. I don't think you can really judge SLI with this card. My 7800 GTX SLI setup is great, but there are situations where SLI can perform worse than a single card. This is primarily true in older games. Take a look at www.xbitlabs.com for an excellent review.

Remember that the 7950 GX2 is still NEW TECHNOLOGY. When you're an early adopter, you have to deal with all the bugs and glitches. That's why I don't rush out to buy the latest and greatest all the time. Read the review and post what you think!

lopri
06-10-06, 08:53 AM
Holy crap! What res and settings are you using to get frame rates like that in F.E.A.R.? I get nowhere near that high :(

It's max everything including soft shadow @1024x768 and @1600x1200, respectively.

Review
06-10-06, 09:14 AM
Been an SLI user for just under a year and I can happily say I'm satisfied. I never really miss v-sync or triple buffering and don't really notice any of the problems you've noted, apart from the odd bit of tearing; but it doesn't affect me in any way.

And to be honest I would only say SLI or quad-SLI is useful for those who want to game at native res on big screens (like the 24" Dell for example).

Kaguya
06-10-06, 09:30 AM
I've had SLI for nearly six months now and I'm satisfied with it completely. I never used V-Sync at all, EVER, so I didn't notice that you couldn't use it and triple buffering with SLI ;) I always find V-Sync noticeably slower when rotating left-right.

I especially like my setup for widescreen gaming and its AA/AF performance. My only worry regarding SLI is that my motherboard only supports x8 lanes in SLI, so if bandwidth ever becomes an issue I'll either swap out or go with a single-card solution... but I figure once the G80 comes around, a single card will kill any current dual SLI and hopefully even those Quad SLIs out there :D

rewt
06-10-06, 09:48 AM
* No triple-buffering like single-card configurations. I bold this because it is the absolute biggest disadvantage I've found so far.

Since when do Nvidia drivers allow forcing of triple-buffering in D3D? It seems like we've been discussing this issue for years. Single-card setups had the same problem.

*EDIT - oh, I see now. You mean triple-buffering doesn't work with SLI and DXTweaker..

In that case you always have the option of going back to a single-card setup if DXTweaker works with that.

Tho Jo Smale
06-10-06, 11:01 AM
SLi good, Yes


One Word


OBLIVION.

the only single card that can really pwn in Oblivion is the new 7950 GX2, which isn't a single card at all.

Plus, I enjoy playing Counter-Strike: Source at 8x SLI antialiasing and 16x anisotropic filtering while generating about 170fps... that's what SLI is for, BOYS

Now, I do agree that single-card solutions can still produce playable frame rates and good quality, but they just don't possess the graphics horsepower of an SLI system.

And as for screen tearing... when you're getting 170fps you don't get any tearing, trust me... and with 8x AA you can't see "jaggies" for miles

JC

kinnane
06-10-06, 12:00 PM
Yeah, I may only have lowly 6600 GTs, but they pwned all when I got them. AA doesn't scale very well on these cards, but they have let me play the majority of games damn well; it's only the hardcore GFX-killer games that I can't play at 1600x1200, like F.E.A.R. and NFS:MW. That's been my standard res in everything with all gfx settings turned up, but no extras like AA/AF except in DoW, which I run at SLI 16xAA :)

I think an SLI rig is probably good enough for a long stretch into the next gen of GPUs. For the little extra cost when building a rig, it's better to get SLI and enjoy games with ALL the eye candy and blazing FPS now, and still future-proof yourself for the next gen of cards.

I was an early adopter for SLI and it's paid off very well. If Nvidia can do the same with quad as they did for dual SLI then it's gonna rock. I've got my 7950 GX2 on order; I can't wait. Quad would be good, but by the time I have cash for a new mobo with two x16 PCI-E slots and a Conroe, the G80 will probably be out and I'll get that.

Redeemed
06-10-06, 01:18 PM
I have yet to get SLi, but it is something I'm going to do- soon. I've already got the mobo sitting here in my room, just need the rest of the system now. :D ;)

In fact, before long I'll be heading to the bank to deposit both my paychecks- and then when I get home today I'm placing an order for some more components of my new rig.

And from everything else I read, SLI is well worth it, far more than a game console.

I don't know about everybody else, but it seems to me that when more recent PC titles are made available on a console that is already 3+ years old, the graphics just don't look as good and clean as they do on a PC.

Then again, it's all up to the individual's opinion, I guess. Being a perfectionist, as you claim, nothing would truly please you in all reality. Not even the 360 or PS3. Hate to say it, but nothing is perfect. ;)

zaG
06-10-06, 03:01 PM
@ Xion X2

I thought I was alone with this. So much money to play games without v-sync?? I hate tearing, that's it... so I'll send them back and get a single high-end card :D

SLi = :thumbdwn:


Does ATI with Crossfire have the same problems?

grey_1
06-10-06, 04:01 PM
I can't really see doing without it atm. Of course, my cards are no longer high end, but I need them at 1920x1200 and I run 4xAA on pretty much every game.
FEAR is pretty much the only one where I get any tearing at all, and it's very minimal, usually anywhere there are bright, flickering lights.

V-sync is a must for me in Doom 3 and FEAR. The triple buffering, I guess someone would have to show me :) , I've never worried much about it, but enabling it with Doom 3 left me with the same fps as without. This was with v-sync enabled.

If I'm not understanding exactly what you're all looking at, someone please explain? Thanks

bagman
06-10-06, 09:27 PM
Please excuse my complete ignorance on this subject, but I am thinking of adding another 7800GTX to enable SLI.
Reading this post has brought to my attention something that I was completely unaware of, i.e. the lack of v-sync plus triple buffering.
Have I read that right..... once SLI is enabled, v-sync/triple buffering are disabled, unlike on a single card???

What other features do you miss out on in SLI compared to a single card ??

cheers and thanks
Bagman

Xion X2
06-10-06, 11:49 PM
Yup, that's right. With SLi, you lose the ability to run vsync + triple-buffering.

For those who have asked, triple-buffering is that godsend feature that keeps your frame-rates from dropping so severely when you have v-sync enabled in Direct3D games.

For example, you're running FRAPS and see that you're getting 60 (or, if you have a higher refresh rate monitor, maybe 75) FPS w/ vsync enabled, since vsync syncs your graphics to your monitor's refresh rate. Then, lo and behold, you walk into a room that is more demanding of your system and your FPS suddenly drops to 30 like a slot-machine. The reason this happens is that as soon as your system can't render frames as fast as your refresh rate, the double-buffered setup makes every frame wait a full extra refresh, so your frame rate gets cut right down the middle.

That doesn't happen w/ triple-buffering. I'm no programmer, so I can't tell you exactly what it is in technical terms, but I believe it adds a buffer in memory in between your refresh rate and your graphics, so that it keeps your frame rate as close to your refresh rate as possible--even when your system struggles to hit your refresh rate.

Basically, triple-buffering gives you the added bonus of no screen-tearing while, at the same time, not slowing down your system.
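To put that in concrete terms, here's a toy frame-pacing simulation (my own simplification with a hypothetical `frames_shown` helper; the real driver pipeline is more complicated). With two buffers the GPU stalls after finishing a frame until the next refresh flips it; with a third buffer it keeps rendering into the spare, so a card that takes 17 ms per frame shows close to its full render rate instead of being cut to 30:

```python
def frames_shown(render_ms: float, refresh_hz: int = 60,
                 seconds: float = 1.0, triple: bool = False) -> int:
    """Count frames displayed in `seconds` (simplified model).

    Double buffering: after finishing a frame, the GPU must wait for
    the vblank that flips it before starting the next frame.
    Triple buffering: the GPU renders continuously into a spare buffer
    and the newest completed frame is shown at each vblank.
    """
    refresh_ms = 1000.0 / refresh_hz
    n_vblanks = round(refresh_hz * seconds)
    shown = 0
    finish = render_ms                # when the first frame is ready
    for k in range(1, n_vblanks + 1):
        vblank = k * refresh_ms
        if finish <= vblank:          # a new frame is ready to flip
            shown += 1
            if triple:
                # drop stale frames; the GPU never stopped rendering
                while finish + render_ms <= vblank:
                    finish += render_ms
                finish += render_ms
            else:
                # GPU stalled; it starts the next frame at the flip
                finish = vblank + render_ms
    return shown

print(frames_shown(17))               # 30 -- double buffered, halved
print(frames_shown(17, triple=True))  # 58 -- nearly the full render rate
print(frames_shown(5))                # 60 -- fast enough either way
```

The numbers come from this simplified model, but the shape matches the complaint above: hovering just under 60 fps is exactly where double-buffered v-sync hurts the most.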

I just don't understand how you guys say the screen-tearing doesn't bother you. It is absolutely horrible on my monitor. I own a 20" 1680x1050 LCD, and my refresh rate only goes to 60Hz at this resolution. Is there a monitor out there that would better help cut down on the tearing that I'm experiencing?

If I run vsync in FEAR, my FPS will drop down to 30-40 quite often. I've tested it repeatedly. You wouldn't think it would, because I'm getting 73FPS w/ the stress test at 1680 resolution, but it does. Regardless if this worked or not, it doesn't speak well of future titles that would begin to stress my system. They would probably start stressing it enough to knock it down below 60FPS and I'd have the same problem.

I'm trying to get used to it, but it's really hard after playing w/ vsync/TB for so long.

Again, if anyone can point me to a better monitor that will cut down on the tearing then please do.

Xion X2
06-10-06, 11:51 PM
Does ATI with Crossfire have the same problems?

Not sure, this is something I've tried to find out.

shungokusatsu
06-10-06, 11:52 PM
Again, if anyone can point me to a better monitor that will cut down on the tearing then please do.

I really don't see a monitor cutting down the tearing from no vsync. I have a 5ms LCD and it still gets that tearing; even the 2ms models do as well.

Xion X2
06-10-06, 11:55 PM
The reason I asked is that some guys say they don't notice it at all. I don't see how one couldn't notice it. I have heard that higher refresh-rate monitors may help cut down on the tearing, but I'd like some feedback from someone who has one of these to be sure.

It doesn't matter if my games run at 100+ FPS--it's as clear as day on my monitor, but it does only go to 60Hz at this resolution.

shungokusatsu
06-10-06, 11:58 PM
The reason I asked is that some guys say they don't notice it at all. I don't see how one couldn't notice it.

It doesn't matter if my games run at 100+ FPS--it's as clear as day.

I agree, but you get used to it. I played Counter-Strike competitively for years, and you just get used to getting max fps, especially in chaotic multiplayer LANs. Then again, technology has changed drastically and you can now run everything at max with vsync on or off to get max fps. Anyone who says they don't notice it is BSing; it's definitely noticeable even on 2ms LCDs. Perhaps people just don't like being limited to 60 fps, or 100 if their monitors can handle it?

lopri
06-11-06, 03:49 AM
I noticed tearing in F.E.A.R. without V-sync with my 7600 GT. It's just as bad as in Doom 3. It really sucks that you have to pick either tearing or V-sync, especially when your minimum FPS is at the borderline of your monitor's refresh rate. (Think a consistent 55~59 FPS. :D )

In my opinion, it's partly the reviewers' fault in that they overlook this. If reviews told us which games tear badly with SLI, users could make better informed decisions. Then again, I wouldn't think that, in a game like F.E.A.R., a single-card config would give a better experience than an SLI setup, everything considered.

OP: Like I said, it sucks that you don't get the most out of your investment. Let's hope NV fixes this via drivers. Or maybe wait a few months and pick up another GX2 for cheap. Your rig will be flying through the roof. Regarding the original post, I really thought it's ONE, albeit very serious, issue, not 5 different ones, and I thought we'd all known about this. (Apparently not all of us, but then again it could be a blessing for them if they don't know what tearing is, since then they won't catch it in the game. For example, I still have no idea what "shimmering" is.)

Xion X2
06-11-06, 02:15 PM
Or maybe wait for a few months and pick up another GX2 for cheap. Your rig will be flying through the roof.

The last thing on my mind right now is dumping another $600 into a system that doesn't play as well as a $400 console.

It wouldn't matter if my system "flew" even faster than it does now (which is already fast as hell). I would still have tearing all over my games. I can play FEAR well above 60 FPS consistently with vsync off, so you would think I could vsync at a consistent 60FPS, but it doesn't happen. I get a maximum of 40FPS when I get into a graphics-intensive location, which is a hell of a lot choppier than 60FPS. Many times it even drops down to 30, which is unacceptable to me for an SLi rig that I've spent 2 grand on. I've tried everything I can to fix this. I've tried changing the SLi renderer to SFR, AFR and AFR2--it does the same on all three. I've tried changing the graphics in my control panel. Nothing. Nothing fixes this annoying problem, and nobody seems to give a crap about fixing it any time soon, which is most frustrating.

Regarding the original post, I really thought it's ONE, albeit a very serious, issue, not 5 different ones and I thought we've all known about this.

They are all related to the vsync/triple-buffering problem, but so what? It's still 5 key flaws in the system.

As I said before, even when I have vsync ON I get tearing all over the place in FEAR and some other games.

Bottom line: SLi is an unoptimized, overhyped, overpriced pile of cow dung.

shungokusatsu
06-11-06, 02:19 PM
The last thing on my mind right now is dumping another $600 into a system that doesn't play as well as a $400 console.

I thought it wasn't true SLI without an entirely separate 2nd card?