7950 GX2 Vs X1900 XT Crossfire

Blacklash
07-17-06, 12:42 PM
So I was bored and broke down the numbers from a X-bit labs review.

Here are the numbers, 7950 GX2 listed first, overclocked results excluded. Neg or pos refers to the GX2's numbers vs X1900 XT Crossfire. Stock 7950 GX2 vs X1900 XT Crossfire @ 1600x, 4xAA|16xAF (or HDR+16x AF):

X1900 XT x2 clocks 625|725, 7950 GX2 500|600 (core|memory, MHz).

Numbers were rounded up or down to the nearest whole FPS-

Battlefield 2- 126 vs 130, neg 4

Chronicles of Riddick- 99 vs 84, pos 15

Call of Duty 2- 67 vs 88, neg 21

Doom III- 83 vs 85, neg 2

Oblivion Indoor- 68 vs 74, neg 6

Oblivion Outdoor- 48 vs 57, neg 9

Far Cry Pier- 96 vs 103, neg 7

Far Cry Research- 133 vs 143, neg 10

F.E.A.R.- 70 vs 59, pos 11

G.R.A.W- 60 vs 23, pos 37 (G.R.A.W will work with Crossfire properly if you rename the .exe. Typical of the shoddy support available.)

Half Life 2- 98 vs 110, neg 12

HL2: Lost Coast- 64 vs 60, pos 4

Project: SnowBlind- 39 vs 54, neg 15 (works poorly with dual card|GPU setups)

Quake 4- 89 vs 94, neg 5

Serious Sam 2- 57 vs 59, neg 2

Splinter Cell:CT- 92 vs 116, neg 24

Pacific Fighters- 71 vs 55, pos 16 (ATi's SM 3.0 done right cost them here :p)

X3 Reunion- 61 vs 80, neg 19

Age of Empires 3- 57 vs 59, neg 2

Warhammer 40k: DoW- 81 vs 75, pos 6

The biggest win for Crossfire was plus 24, and for the 7950 GX2 plus 16. I did not count G.R.A.W against ATi because I know it can work properly. The lowest reported valid average FPS results were 55 for Crossfire and 48 for the GX2. I am not including bugged games in that statement, or results where Crossfire or multi-GPU isn't in use. If you are bored, count the number of valid benches where the FPS difference between the two configurations is 10 or less.
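If you'd rather not count by hand, here's a minimal Python sketch (my own, not from the review) that does the tally from the numbers above, with the two bugged titles left out as noted:

# FPS pairs from the X-bit Labs numbers above: (7950 GX2, X1900 XT Crossfire).
# G.R.A.W and Project: SnowBlind are excluded as bugged/invalid benches.
results = {
    "Battlefield 2": (126, 130),
    "Chronicles of Riddick": (99, 84),
    "Call of Duty 2": (67, 88),
    "Doom III": (83, 85),
    "Oblivion Indoor": (68, 74),
    "Oblivion Outdoor": (48, 57),
    "Far Cry Pier": (96, 103),
    "Far Cry Research": (133, 143),
    "F.E.A.R.": (70, 59),
    "Half Life 2": (98, 110),
    "HL2: Lost Coast": (64, 60),
    "Quake 4": (89, 94),
    "Serious Sam 2": (57, 59),
    "Splinter Cell: CT": (92, 116),
    "Pacific Fighters": (71, 55),
    "X3 Reunion": (61, 80),
    "Age of Empires 3": (57, 59),
    "Warhammer 40k: DoW": (81, 75),
}

# Count the valid benches where the two setups land within 10 FPS of each other.
close = {g: gx2 - cf for g, (gx2, cf) in results.items() if abs(gx2 - cf) <= 10}
print(f"{len(close)} of {len(results)} valid benches are within 10 FPS:")
for game, delta in sorted(close.items(), key=lambda kv: abs(kv[1])):
    print(f"  {game}: {delta:+d}")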

Source:

http://www.xbitlabs.com/articles/video/display/nvidia-gf7950gx2.html

My conclusion would be that the 7950 GX2 is a much better value. Often the differences are so small they will not be noticed in play. When you factor in space taken up, power consumption, and software support it is even more attractive. Not to mention, if you want to use an Intel 975X board it gives you 512MB 7900 GT SLi-level performance. Last I checked, X1900 XT Crossfire costs 290 USD more to own vs a regular 7950 GX2.

Riptide
07-17-06, 12:47 PM
Wow, nice of you to take the time to do that. X1900 x-fire seems to hold up pretty well. But yeah, not worth the extra $$ vs. the single GX2.

snilloconator
07-17-06, 01:45 PM
Very nice follow-up, Malficar! I'm also a happy owner of a 7950... This card is the best bang for the buck when it comes to high-end cards, IMO!

SH64
07-17-06, 02:04 PM
Could've used + & - instead of P.O.S. & neg :p

Nice comparison! Handy for those goin' XF.

Xion X2
07-17-06, 04:50 PM
Nice benches, but you can get a 1900XT now for around $350 on Newegg. No, that's still not cheaper (about $100 more than the GX2 for Crossfire), but it's not a huge difference, and you get better overall performance--especially at extra-high resolutions, where Crossfire seems to pull away a little from SLi.

Still, the GX2 is a good value, IMO. However, I'd be touting the fact that the GX2 uses a heck of a lot less power than Crossfire for similar performance if you're trying to trump ATi on something here. Duke Power blacks out whenever someone flips the switch on their Crossfire rig.

Airbrushkid
07-17-06, 05:14 PM
I'm an Nvidia fan all the way. But the 7950 is still 2 video cards in one. Come on: 2 GPUs, 2 boards, and each board has 512MB. All in one PCI Express slot. Before you know it they will sucker you guys into buying a card with 4 GPUs and 2 gigs of RAM that takes up 4 slots, and they'll call it 1 video card. They can't seem to go any further with one GPU so they make a card with 2. It's just like the dual-core processors.

Riptide
07-17-06, 05:42 PM
Before you know it they will sucker you guys into buying a card with 4 GPUs and 2 gigs of RAM that takes up 4 slots, and they'll call it 1 video card.

No offense intended, but who really gives a crap anyway? A rose by any other name would smell as sweet. Bottom line is most of us don't care if it takes 20 GPUs and 6 cards; as long as they fit in our cases and don't turn them into an inferno, we will deal with it in exchange for the performance.

IMO the GX2 looks damn sweet in a case with a window. Two cards sandwiched like that looks awesome.

K007
07-17-06, 07:27 PM
No offense intended, but who really gives a crap anyway? A rose by any other name would smell as sweet. Bottom line is most of us don't care if it takes 20 GPUs and 6 cards; as long as they fit in our cases and don't turn them into an inferno, we will deal with it in exchange for the performance.

IMO the GX2 looks damn sweet in a case with a window. Two cards sandwiched like that looks awesome.

Yea.. I mean, look at jAkUp's PC >< (nana2) (nana2)

Airbrushkid
07-18-06, 03:44 AM
Well, his isn't that bad yet. But when your light bill costs you more than your house and car payments each month, you'll end up with a power transformer attached to your house, and so will every house that has a wild computer. That's just for the power you'll need. But you say 20 GPUs; I'm glad that if it does come to that, you'll be able to waste about $6,000.00 or more for just the video cards!!

Yea.. I mean, look at jAkUp's PC >< (nana2) (nana2)


I think they should take the video card out of the computer case and give it its own case, away from the computer and the heat.

Shocky
07-18-06, 06:59 AM
Nice results, but it doesn't really take into account that the X1900XTs won't overclock much at all, while the 7950 will overclock a lot!

My Gainward Bliss 7950 (500/600) overclocks to 600/800.

K007
07-18-06, 07:13 AM
I would like to see the temps compared.. from what I recall the XT does get a **** load hot.. and we all know how hot the GX2 can get. I mean, coming from about 50 max on my GTX to my GX2 idling at 55ish, with load going up to 70ish.. I know it's normal but damn.. and it's winter in Aus.. can't wait to see summer... "Fire in the hole!"

Xion X2
07-18-06, 08:38 AM
Nice results, but it doesn't really take into account that the X1900XTs won't overclock much at all, while the 7950 will overclock a lot!



That's bull. My friend has his 1900XT overclocked from 625/1450 to 765/1700 with a software volt-mod. If you can cool it, the 1900XT overclocks easier than a 7900GTX because you don't have to hard-mod it to get more voltage.

Blacklash
07-18-06, 10:01 PM
I look at it like this: you have two GPUs running at 125MHz less on both the memory and core, keeping alarmingly close to faster-clocked ones. In six out of the twenty benches they actually provide greater FPS. In five benches where they provide less FPS, the difference is five FPS or less. In the three large wins for Crossfire, the FPS provided by the GX2 is never less than sixty-one.

The 7950 GX2 consumes much less power, takes less space, and creates less noise. In my experience SLi has better software support than Crossfire at this time; I encountered more odd, annoying bugs with Crossfire, and it also takes a while to enable itself on Windows boot. Last I checked, the cost for an X1900 XT plus an X1900 Master Edition card was a total of 848 USD.

When you get into resolutions of 1600x and greater with AA|AF, single GPUs do not cut it, particularly if you start to tack on transparency|adaptive AA. All things considered, I'll reiterate: based on personal experience, I believe the 7950 GX2 to be the best value out there for resolutions of 1600x and up with AA|AF active. I put up the X-bit review breakdown because it frames my point well.

Many know I was using X1900 XT Crossfire in my primary rig for quite a few months. After trying the 7950 GX2 in my secondary setup, I was so impressed with it I moved it to my main rig.

Naturally, folks ought to buy what they wish. Thank you for all the feedback and comments.

BTW, some of you may like this for fun. It's from a great overclocker in Japan: an E6600 @ 4GHz pushing an overclocked 7950 GX2.

http://img235.imageshack.us/img235/9517/fear1024768629dt3.jpg

Source (and more fun here):
http://www.xtremesystems.org/forums/showthread.php?t=106142

Q
07-18-06, 10:18 PM
For the record... there are people who care that the GX2 is two cards. And I actually have a reason.

Dual monitors. SLI is workable with dual monitors, but it's a bit of a hassle. Much less than it used to be, but things like wallpapers and desktop icons get messed up a lot more easily when switching back and forth with SLI.

grey_1
07-18-06, 10:22 PM
Very nice job Malficar, thanks!

K007
07-19-06, 06:52 AM
lol..999FPS...

Lazaredz
07-19-06, 08:09 AM
http://img235.imageshack.us/img235/9517/fear1024768629dt3.jpg


He-he-he-he ..... that makes me giddy!!! :)

Blacklash
07-20-06, 05:24 AM
Thought I'd tack this here instead of making another thread.

You need good CPU support to tell the difference in some games like Oblivion. When I first installed my 7950 GX2, I thought it wasn't working properly; I had my CPU at stock. I checked the driver CP and up to 16x AA was available, so I was then fairly sure I was good to go. When I cranked my CPU up, my FPS in one area in Oblivion where it was only 64 FPS @ 1600x res shot up to 109 FPS. I remembered my X1900 XTX Crossfire setup, supported by an X2 4400+ @ 2.84 GHz, put up 124 FPS in the same situation, so I discovered I simply needed to push up my CPU clock. You won't have that problem when you get your Conroe :p

EDIT:

Here ya go. 7950 GX2 @ 571|775 for the first two shots, 16x AF @ 1600x res, HDR active. I just loaded a saved game, didn't move, and did a screen capture.

CPU @ 3 GHz=63 FPS

http://img234.imageshack.us/img234/5099/oblivion2006071909521406nu8.th.jpg (http://img234.imageshack.us/my.php?image=oblivion2006071909521406nu8.jpg)

CPU @ 4.5 GHz=106 FPS

http://img205.imageshack.us/img205/8409/oblivion2006071909580950ri3.th.jpg (http://img205.imageshack.us/my.php?image=oblivion2006071909580950ri3.jpg)

CPU @ 4.5 GHz and 7950 GX2 stock 500|600=105 FPS

http://img48.imageshack.us/img48/3785/oblivion2006071910201087pp5.th.jpg (http://img48.imageshack.us/my.php?image=oblivion2006071910201087pp5.jpg)

Finally, CPU @ 4.5 GHz, 7950 GX2 stock, with multi-GPU off in the NV driver CP=64 FPS

http://img206.imageshack.us/img206/4434/oblivion2006072005030085zo5.th.jpg (http://img206.imageshack.us/my.php?image=oblivion2006072005030085zo5.jpg)

So if you are running a stock P4 @ 3GHz or a Pentium D, don't expect big things from dual GPUs. Looks like my CPU is holding it back even @ 4.5 GHz.
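Reading those four shots as data, the pattern is easy to check: FPS tracks the CPU clock almost linearly, while the GPU overclock barely registers. A minimal Python sketch of that arithmetic, using only the numbers above (the interpretation is mine):

# FPS from the four Oblivion shots above, all @ 1600x res, 16x AF, HDR.
# Keyed by (cpu_ghz, gx2_clocks, multi_gpu_on).
fps = {
    (3.0, "571|775", True):  63,
    (4.5, "571|775", True):  106,
    (4.5, "500|600", True):  105,
    (4.5, "500|600", False): 64,
}

# A 1.5x CPU clock bump at fixed GPU clocks scales FPS more than 1:1:
print(f"CPU 3.0 -> 4.5 GHz:      {fps[(4.5, '571|775', True)] / fps[(3.0, '571|775', True)]:.2f}x")
# A ~14% GPU core overclock at a fixed 4.5 GHz CPU barely registers:
print(f"GX2 500|600 -> 571|775:  {fps[(4.5, '571|775', True)] / fps[(4.5, '500|600', True)]:.2f}x")
# The second GPU only pays off once the CPU can feed it:
print(f"Multi-GPU off -> on:     {fps[(4.5, '500|600', True)] / fps[(4.5, '500|600', False)]:.2f}x")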

K007
07-20-06, 06:39 AM
Yea, I think an overclock and putting stress on the GPU is pointless with cards like this.. even the GTX itself.. it's really pointless unless you care about 3DMark scores.. it's much better to o/c the CPU. I am waiting myself to score an X2 once Price Wars Episode I begins ><.. should help a lot on those games that use them.

Xion X2
07-20-06, 12:13 PM
Malficar--I'd be interested to see just how much of a difference you see on the outdoor environments in Oblivion. Those shots are inside towns where lots of characters are walking around.

When I had my GX2, I could average around 55-60FPS outdoors at 1680x1050, but the second I walked into towns my framerate would drop to 35-40FPS.

The towns in Oblivion are much more CPU-limited than the outdoor environments are, I believe.

You could also test FEAR out. I averaged around 76FPS on 1680 res except when there were multiple characters on the screen. When that happened, I would again drop down to around 35FPS.

I think that, until they start multithreading games more, a heavy-hitting CPU isn't really necessary right now. The only times you really see a difference are when A.I. is all over the place, as with the impressive A.I. in FEAR and the towns in Oblivion where lots of characters are walking around.

If you get time to take some benchmarks of the outdoors on Oblivion with similar results, though, you could change my mind about that.

Blacklash
07-20-06, 03:37 PM
Malficar--I'd be interested to see just how much of a difference you see on the outdoor environments in Oblivion. Those shots are inside towns where lots of characters are walking around.

When I had my GX2, I could average around 55-60FPS outdoors at 1680x1050, but the second I walked into towns my framerate would drop to 35-40FPS.

The towns in Oblivion are much more CPU-limited than the outdoor environments are, I believe.

You could also test FEAR out. I averaged around 76FPS on 1680 res except when there were multiple characters on the screen. When that happened, I would again drop down to around 35FPS.

I think that, until they start multithreading games more, a heavy-hitting CPU isn't really necessary right now. The only times you really see a difference are when A.I. is all over the place, as with the impressive A.I. in FEAR and the towns in Oblivion where lots of characters are walking around.

If you get time to take some benchmarks of the outdoors on Oblivion with similar results, though, you could change my mind about that.

My point in all of this would be to balance your CPU|GPU choices appropriately for the resolutions you play at. The amount of eye candy you like to pile on is important as well. Many folks can get away with a single card @ 1600x with light or no AA|AF. When you dial those up and tack on something like transparency AA, multi-GPU setups become much more viable, and in some cases even needed.

I grow tired of folks that do things like bench Quad SLi with 3DMark06 @ default and then complain about FPS. It's simply ignorant.

I have seen plenty of evidence to suggest that if you do not have a certain level of CPU support, multi|dual GPU setups are a waste of time. Again, resolution and filtering factor into this too. The point of my screens was to show that my CPU at stock speed simply cannot take proper advantage of the strengths a multi-GPU solution can provide in Oblivion. IMO a single 7900 GT would be more suitable for this CPU @ stock.

Creature A.I., real-time rendering in combat situations, tangent space transformation, skinning, and shadows are certainly some things that can heavily stress the CPU. Your dynamic moment-to-moment decision-making as a player may affect these things too.

I am not here to change anyone's mind; rather, I am offering a warning: you will be very disappointed if you invest in dual|multi GPUs and ignore crucial variables like the resolution you play at and proper CPU support.

I begin to see some decent returns in Oblivion at 1600x when I hit 3.6GHz with this CPU using a 7950 GX2. I still see no reason to overclock this card, even with the CPU @ 4.5GHz. I'll be able to give you a more detailed opinion when I can test Conroe; I'll return to this when I can contrast an overclocked E6600 with my current chip in Oblivion using a stock 7950 GX2. I'll try cities, outdoors, heavy grass areas, and dungeons, then report. So, updates and feedback sometime after the 27th.

Slammin
07-20-06, 07:39 PM
Yes, one of the mistakes (sort of) I made was choosing this 2405 monitor. 1920x1200 is just at the ragged edge of what my dual 7800GTXs can handle.

Live and learn.....

Xion X2
07-20-06, 08:43 PM
Malficar--I'm just not convinced that a processor (a good one) is going to end up being much of a bottleneck right now. I used to think like you and feel it made a big difference, but I don't so much anymore, and here's why:

http://www.bit-tech.net/hardware/2006/01/10/amd_athlon_64_fx-60/6.html
http://www.bit-tech.net/content_images/amd_athlon_64_fx-60/hq-fear.png

http://www.firingsquad.com/hardware/amd_athlon_64_fx-57/page9.asp
http://www.firingsquad.com/hardware/amd_athlon_64_fx-57/images/d31600.gif

http://www.firingsquad.com/hardware/amd_athlon_64_fx-60_review/page9.asp
http://www.firingsquad.com/hardware/amd_athlon_64_fx-60_review/images/fear1600.gif

http://www.driverheaven.net/reviews/fx60/FEAR.htm


Here's a good discussion about it; I was actually the one arguing the same point you are now, but this guy's changed my mind about how much of a bottleneck the processor really is in these games:

http://forums.extremeoverclocking.com/showthread.php?t=223684&page=2&highlight=gx2

jAkUp
07-20-06, 10:55 PM
That's bull. My friend has his 1900XT overclocked from 625/1450 to 765/1700 with a software volt-mod. If you can cool it, the 1900XT overclocks easier than a 7900GTX because you don't have to hard-mod it to get more voltage.


Come on man, you can't compare vmodded cards. You can vmod a 7950GX2 too.
