
G80 Quad SLi You Know it's Coming!



MaXThReAT
10-15-06, 02:00 PM
It may be two 8800GX2's, or maybe something else. We all know a dual-GPU G80 card will be coming out at some point, theoretically nearly doubling the performance, and in Quad... just think about it...

Single 8800GTX
575MHz Core Clock
900MHz Mem Clock
768MB GDDR3 memory
384-bit memory interface (86.4GB/s bandwidth)
128 unified shaders clocked at 1350 MHz
38.4 billion pixels per second theoretical texture fill-rate

In SLi
1536MB GDDR3 memory (1.5GB), 768MB per GPU
384-bit memory interface per GPU (~172.8GB/s combined bandwidth)
256 unified shaders clocked at 1350 MHz
~76.8 billion pixels per second theoretical texture fill-rate

Now in Quad SLi
3072MB GDDR3 memory (3GB), 768MB per GPU
384-bit memory interface per GPU (~345.6GB/s combined bandwidth)
512 unified shaders clocked at 1350 MHz
~153.6 billion pixels per second theoretical texture fill-rate
*800W PSU requirement FTW.
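
If you want to check the math yourself, here's a quick back-of-the-envelope sketch in Python (my own figuring, nothing official). It takes the single-card figures at face value and naively assumes perfect SLI scaling, which you'll never actually get:

```python
# Naive scaling math for the specs above. Assumes perfect SLI scaling,
# which real games never achieve.

GTX = {
    "mem_mhz": 900,     # GDDR3 clock; double data rate -> 1800MT/s effective
    "bus_bits": 384,    # memory interface width per GPU
    "vram_mb": 768,     # per GPU
    "shaders": 128,     # unified shaders @ 1350MHz
    "fill_gpix": 38.4,  # theoretical texture fill-rate, billions/sec
}

def bandwidth_gb_s(card):
    # bytes per transfer * effective transfer rate (x2 for DDR)
    return card["bus_bits"] / 8 * card["mem_mhz"] * 2 / 1000  # 86.4 for one GTX

for n, label in ((1, "Single"), (2, "SLi"), (4, "Quad SLi")):
    print(f"{label}: {GTX['vram_mb'] * n}MB VRAM, "
          f"~{bandwidth_gb_s(GTX) * n:.1f}GB/s, "
          f"{GTX['shaders'] * n} shaders, "
          f"~{GTX['fill_gpix'] * n:.1f} Gpix/s")
```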

I think I'm skipping SLI or quad with G7x and going SLi with G80. My EVGA step-up took too long to go SLi and two GTSC-KO's just don't look that attractive anymore.

|MaguS|
10-15-06, 02:03 PM
You do not gain more videoram by going SLi or QuadSLi...

MaXThReAT
10-15-06, 02:06 PM
You do not gain more videoram by going SLi or QuadSLi...

Yeah, that's true, each GPU will have 768MB dedicated to it. So technically you'll have 3GB of GDDR3 in the system.

SH64
10-15-06, 02:09 PM
I think it's too early to talk about that.

Anyway, I reckon Quad SLI will come later... maybe after the 8900GTX (i.e. an 8950GX2), so you might be waiting a while.

MaXThReAT
10-15-06, 02:19 PM
I think it's too early to talk about that.

Anyway, I reckon Quad SLI will come later... maybe after the 8900GTX (i.e. an 8950GX2), so you might be waiting a while.

I don't know, it's a difference of maybe 3 months. The 7900 was out in March '06 and the Quad GX2 in June '06, so I would think Quad G80 will be available near Feb-March. That's not long. Buy one G80 in Nov-Dec, then eBay it later to buy two 8800GX2's in March.

lee63
10-15-06, 02:27 PM
I think we have seen the last of Quad SLi as the power requirements would be astronomical for the next gen cards.

SH64
10-15-06, 02:27 PM
Well, the gap between generation refreshes should widen now that the technology is getting more complex, so expect longer roadmaps.
I'm not even sure if we are going to see an "8800GX2/8850GX2".

J-Mag
10-15-06, 02:31 PM
I think we have seen the last of Quad SLi as the power requirements would be astronomical for the next gen cards.

In 6-10 months you will see Nvidia adopt 65nm on high-end products. I bet you will see quad G80 then, sorta like how you saw Quad SLI after the 7900GTXs came out, which were essentially 7800GTX cores with a die shrink and some other minor modifications.

Xion X2
10-15-06, 03:00 PM
PFFT.. who cares. Doubt you'll ever see this anyway.

O--V--E--R--K--I--L--L .

SH64
10-15-06, 03:06 PM
Overkill for Crysis & Alan Wake @1920x1200, 16xAA, 16xAF, FP32 HDR? I don't think so...

jAkUp
10-15-06, 03:07 PM
Overkill for Crysis & Alan Wake @1920x1200, 16xAA, 16xAF? I don't think so...

How about Crysis at 2560x1600? :p

SH64
10-15-06, 03:08 PM
How about Crysis at 2560x1600? :p
The monitor will blow up :D

lee63
10-15-06, 03:11 PM
There's no such thing as overkill :)

Xion X2
10-15-06, 03:13 PM
Overkill for Crysis & Alan Wake @1920x1200, 16xAA, 16xAF, FP32 HDR? I don't think so...

Yes, overkill. When you consider what we're hearing now (that G80 is 2-2.5x as strong as the GX2 when running DX10, less CPU overhead, Vista more optimized for gaming, etc.), then yes, it's SERIOUS overkill.

I could almost see two G80's, but not four. Ridiculous. Not needed at this point. Find some better use for your money.

Xion X2
10-15-06, 03:15 PM
Tell me about it^^

I hope G80 SLI is enough for Crysis and Alan Wake (and later UT07) @ at least 1920x1200.

Are you serious? The only game the GX2 can't spin around like a toy at that res right now is Oblivion, and it still plays it at 30+FPS most of the time.

The G80 is thought to be at least twice as strong when running in DX10, and you think dual G80's are necessary for a good framerate at 1920?

I have to disagree. Strongly.

Xion X2
10-15-06, 03:26 PM
2560 is a killer for sure, but don't forget that these new cards are supposed to implement a form of AA that takes less of a performance hit. I've even heard it referenced as "free" up to 4x.

That's been the biggest performance killer in the past (along with resolution), because 16xAF is at the point where it's almost free on these cards now.

If anyone would need more than two G80's to turn up every graphical effect at a decent framerate, it would be the 2560 crowd and no one else. But I just don't see the sense in laying that extra money on the table to move your graphics options bar one or two slots to the right (like grass shadows in Oblivion, for instance; who gives a f*** when you can have everything else turned all the way up?).

superklye
10-15-06, 03:27 PM
There's no such thing as overkill :)
I agree. There is only "kill" and "underkill."

Xion X2
10-15-06, 03:33 PM
That would be great! Because I can't use AA most of the time at 2560 right now...

How many jaggies do you really see at that resolution, anyway? I'm sincerely curious, because I could tell a decrease going from 1280 to 1680 on my monitor. I run 2x on a lot of games now compared to having to run 4xAA on them before.

jAkUp
10-15-06, 03:34 PM
He will see jaggies. Yes, it's a high resolution, but the screen size is huge.

Xion X2
10-15-06, 03:39 PM
30-inch? If so, I wouldn't think he'd ever have to run more than 4xAA to clean things up. That cleans up everything on mine except the crawling textures in the distance, but from what I've heard, that has more to do with Nvidia's architecture than with AA settings.

Bman212121
10-15-06, 03:39 PM
We could argue about this forever. Provided the extra cores can be used, they will offer a benefit. I'm sure if Nvidia did Quad SLI again they would be able to build on what they've learned. DX10 may allow them many more features they couldn't use before and add some real performance benefits for Quad SLI. One example: rather than buying a dedicated PPU, you buy a dual-purpose card that can do everything at once. You can spend $500 on one card and another $250 on a PPU, or you can just buy a $600 card that does both.

All I can say is that time will tell. We all know there will be games that need more hardware to run, and if it takes four GPUs to do what you want, then someone's going to buy it.

I'm guessing they will do another Quad SLI, but they are probably still figuring out DX10 so they can make use of it.

jAkUp
10-15-06, 03:41 PM
I think with DX10, SFR is dead; all DX10 games should be rendered in AFR mode.
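
Roughly, the difference between the two modes looks like this. A minimal Python sketch; draw_region() is just a hypothetical stand-in for the driver handing work to a GPU:

```python
# Minimal sketch of the two SLI load-balancing modes. draw_region() is a
# hypothetical stand-in for the driver dispatching work to one GPU.

GPUS = [0, 1]  # two-way SLI

def draw_region(gpu, frame, y0, y1):
    print(f"GPU{gpu}: frame {frame}, rows {y0}-{y1}")

def render_afr(num_frames, height=1200):
    # Alternate Frame Rendering: each GPU renders whole frames in turn.
    # No mid-frame split, but inter-frame dependencies (render-to-texture
    # reuse) are what make AFR tricky for drivers.
    for frame in range(num_frames):
        draw_region(GPUS[frame % len(GPUS)], frame, 0, height)

def render_sfr(num_frames, height=1200):
    # Split Frame Rendering: each frame is cut into horizontal bands and
    # every GPU must finish its band (and sync) before the frame presents.
    band = height // len(GPUS)
    for frame in range(num_frames):
        for g in GPUS:
            draw_region(g, frame, g * band, (g + 1) * band)

render_afr(4)
render_sfr(2)
```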

Xion X2
10-15-06, 03:42 PM
Yeah, Bman hit on something important there. I wasn't taking into account the physics aspect of all this. That kinda makes it hard to make an accurate guess right now. There are just too many unknowns in the equation. Physics, operating system, DX10, etc.

Xion X2
10-15-06, 03:44 PM
I think with DX10, SFR is dead; all DX10 games should be rendered in AFR mode.

Have you heard whether or not they're supposed to fix the triple-buffering + vsync problem with SLi? Or am I the only one on this forum who gives a damn about that? :)

If they would fix that, then I'd think seriously about going dual-G80's to start off with.

Redeemed
10-15-06, 03:44 PM
If I'm able to play FEAR at 1600x1200 with 2xAA, 16xAF, and all in-game options maxed on my two 7600GTs with a s754 3400...

I'd cry if one G80 wasn't enough for high settings on Crysis.

Two would not be overkill if you think of future DX10 titles that are sure to abuse the GPU. But I can't see myself going quad with G80s.

Now, with G90s I might consider it, but that's a whole other system build we're talking about. It'd probably sport two quad-core Core 2s with 32GB of RAM by that time. :p

But for now, two G80s is all I'm shooting for, and my s754 3700 @ 3GHz should be plenty fast enough as well if the G80 actually does have its own physics processor integrated.

I only see SLi'ing two G80s as safeguarding your performance in games two years from now that should prove a significantly greater challenge for GPUs. Only then will two G80s prove to have been a worthy purchase. ;)