
Single GPU or Multi GPU?


JakeSteel
07-09-05, 11:09 AM
For an SLI configuration, which setting in the control panel is better? I've tried both and can't seem to find a difference either way. Anyone know exactly what these settings do? I've searched the forum and been to slizone.com, but I can't find a good explanation of which setting is best to use. I do know that the default setting is single GPU and that SLI works in this mode. Any input would be helpful.
Thanks :confused:

BrianG
07-09-05, 11:17 AM
Well, you need to have multi-GPU enabled to use SLI. I am not sure what you are looking at or what your question actually is.

What applications are you using where you don't see any difference? What are you trying to do? What are your system specs?

JakeSteel
07-09-05, 12:21 PM
Specs are as follows: an Asus A8N-SLI Premium, two BFG 7800 GTXs in SLI, two Hitachi Deskstar T7K250s in a RAID 0 array, a PC Power and Cooling 510 SLI power supply, and an FX-57 processor.
My system is stable and everything works as it should. I built this system for games only, and the games I play all have SLI profiles by default: CS: Source, UT2004, and Battlefield 2.
My question is about the SLI rendering tab under advanced options, where you can select either single GPU rendering or multi GPU rendering. The default is single GPU rendering, even though I have checked the box marked "enable SLI" in the control panel options.
I can switch back and forth between single and multi and notice no change whatsoever. I've checked "show load balancing" just to see if SLI is working, and it works in either mode.
So in short, I'm trying to find out what other SLI users like to leave their settings at. My guess is that single GPU lets the driver choose single or multi GPU based on SLI profiles, while multi GPU forces it on all the time regardless of whether the game has a profile or not.
I'm not sure, of course, which is why I'm asking. There doesn't seem to be much info on this setting that I can find.

Thanks for the help

Jake :D

SH64
07-09-05, 12:22 PM
No matter what you select, if the game already has a profile it will use that profile, and thus that rendering mode.
If it doesn't, that's where the setting might matter: multi-GPU might force either SFR or AFR mode in the game, though I'm not sure what that depends on. I use Coolbits to unlock all the rendering modes and choose what to force myself, so I never really bothered to test it.
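For anyone curious, Coolbits itself is just a registry value. Here's a rough sketch of setting it with Python's winreg module; the key path and the bitmask are assumptions based on tweak guides from the time, not anything official, so double-check before trying it:

    import winreg  # standard library; Windows only

    # Key path and value name come from period tweak guides, not NVIDIA
    # documentation -- treat both as assumptions.
    KEY_PATH = r"SOFTWARE\NVIDIA Corporation\Global\NVTweak"

    key = winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH)
    # The DWORD is a bitmask that unhides extra control-panel pages.
    # 0x3 was the value most guides quoted; which bits expose the SLI
    # rendering-mode page varied by driver version, so this is a guess.
    winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, 0x3)
    winreg.CloseKey(key)

Deleting the CoolBits value (or setting it to 0) puts the panel back to stock.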

JakeSteel
07-09-05, 12:40 PM
Yeah, I tried Coolbits and used auto-select, but I didn't see a difference there either.

However, after I tried Coolbits and removed the registry entry, the control panel setting stayed at multi GPU when I checked it. Weird... guess auto-select equals multi GPU or some such.

Thanks for the quick reply ;)

SH64
07-09-05, 02:29 PM
Yeah, I tried Coolbits and used auto-select, but I didn't see a difference there either.

As I said, that's because the games you are running might have their own profiles, so they will run according to those profiles no matter what you choose.
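If it helps, think of the driver as doing a lookup along these lines. This is a toy Python sketch of the behavior described above, not the driver's actual logic; the executable names and mode assignments are placeholders:

    # Toy model of profile-driven SLI mode selection. The real logic is
    # internal to the driver; the names and modes below are placeholders.
    SLI_PROFILES = {
        "hl2.exe": "AFR",     # CS: Source (placeholder mode)
        "ut2004.exe": "AFR",  # UT2004 (placeholder mode)
        "bf2.exe": "SFR",     # Battlefield 2 (placeholder mode)
    }

    def rendering_mode(exe_name: str, panel_setting: str) -> str:
        # A shipped profile wins no matter what the panel says, which is
        # why toggling single/multi GPU shows no difference in such games.
        if exe_name in SLI_PROFILES:
            return SLI_PROFILES[exe_name]
        # Only an unprofiled game falls back to the panel setting.
        return "AFR" if panel_setting == "multi" else "single"

    print(rendering_mode("bf2.exe", "single"))    # SFR -- profile wins
    print(rendering_mode("quake4.exe", "multi"))  # AFR -- the fallback

So the single/multi toggle only really comes into play for games without a profile.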

nippyjun
07-11-05, 06:37 PM
With Coolbits enabled, I still don't get the option of setting the SLI rendering mode, other than choosing between SLI multi and SLI single. Am I missing something that unlocks the other SLI settings?

OWA
07-11-05, 08:19 PM
Specs are as follows: an Asus A8N-SLI Premium, two BFG 7800 GTXs in SLI, two Hitachi Deskstar T7K250s in a RAID 0 array, a PC Power and Cooling 510 SLI power supply, and an FX-57 processor.

My system is stable and everything works as it should. I built this system for games only, and the games I play all have SLI profiles by default: CS: Source, UT2004, and Battlefield 2.

OT but what kind of memory are you using?

BrianG
07-11-05, 08:25 PM
With Coolbits enabled, I still don't get the option of setting the SLI rendering mode, other than choosing between SLI multi and SLI single. Am I missing something that unlocks the other SLI settings?
That is taken care of by the profiles unless you use one of the tweak utilities.

JakeSteel
07-12-05, 06:09 PM
OT but what kind of memory are you using?
Memory is now in my signature.