help: 7950 GX2 or G80?



redxix
08-06-06, 08:20 AM
Hi, I am new to nV News and I wanted to ask everyone's opinion. I am looking to build a new system; a few requirements are HDCP compatibility and good performance for HL2: Ep1, Counter-Strike: Source, and F.E.A.R. (at least decent fps for F.E.A.R.).

I am interested in the 7950 GX2 and the G80. I would like to have this built by the end of August or early September. Depending on how good the G80 is, I could wait a little longer. Does anyone think the G80 will surpass the performance of the 7950 GX2 at 1920x1200? Additionally, I have heard that the G80 will support HDR+AA; is this true?

Does anyone know when the Nvidia nForce 590 mobo for Intel will be released?

I will list the important parts I am planning on purchasing, because I think it is important to know the resolution I want to play at and also whether there are any foreseeable compatibility issues:

-Conroe E6400 (want to get a 30-50% OC)
-Asus P5B mobo
-2 GB 667MHz DDR2 RAM
-Raptor 150GB HDD
-Samsung 24" 1920x1200 244t
-Sony Blu-ray Drive

K007
08-06-06, 08:31 AM
Well...the 7950 is there to be put up against the single cards..so I would assume that the 7950 will have more power to play with..but something like 8800GTs in SLI would probably kick the single 7950's ass..

It is hard to say how much power the G80 will bring, but I would think that maybe the 8800GT..or GTX in single will probably not beat the 7950..might be close..but the G80 in SLI would probably kick the 7950's ass..and by SLI on G80 I mean something like the 8800GT..wouldn't surprise me to see an 8950GX2..now that would be hot -.-....quad SLI those..future proof :).

Wouldn't the 6600 be a better choice over the 6400?...and what's with the Blu-ray drive -.-

Just remember the 7950 has a lot of juice for a "single card" solution...and quad SLI makes it even better..but the G80 does bring DX10..but with that said I think I am looking forward to the R600 more than the G80..either way..if you want to buy a card now..then I say wait at least for the X1950XTX..

Mr_LoL
08-06-06, 08:37 AM
Yes, but don't forget that the G80 will be DX10 compatible while the 7950 is not. Personally I will wait for the G80.

1337_Like_ThaT
08-06-06, 08:44 AM
Purchase the 7950 GX2 right now from EVGA and utilize their StepUp Program to upgrade to the G80 once released :)

retsam
08-06-06, 09:40 AM
Buy a stock 7900GT now... then when the G80 comes out, buy it. :)
This is what I did. Once the G80 is released I'm going for it... I just hope they have a 1GB version on release.

redxix
08-06-06, 09:41 AM
Thanks for all the input! I didn't know that EVGA had a step up program.

I will have to look into that. Can the EVGA be overclocked like the XFX (the high end one)?

K007: yes, I am looking at the GX2 as a single card. Although I know that SLI would be better, I am looking to spend closer to $500-600 rather than $1000-1200. Additionally, I am running a PC Power and Cooling 510 SLI power supply and I don't want to have to upgrade to the 1000W model, because I think that the power envelope for the G80 will be 175W per card and I am running a Raptor and 3 WD 400GB SATA drives.
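
As a rough sketch of the budget I'm worried about (every wattage below is my own guess for illustration, not a measurement, and the 175W G80 figure is just the rumor):

```python
# Rough power-budget sketch. Every wattage below is an assumption for
# illustration, not a measurement; the 175W G80 figure is the rumor above.
PSU_WATTS = 510  # PC Power and Cooling 510 SLI

base_system = {  # hypothetical per-component draw estimates (W)
    "overclocked Conroe": 100,
    "mobo + 2GB DDR2": 50,
    "Raptor 150GB": 10,
    "3x WD 400GB SATA": 30,
}

def headroom(gpu_watts, label):
    total = sum(base_system.values()) + gpu_watts
    print(f"{label}: ~{total}W total, {PSU_WATTS - total}W of headroom")

headroom(175, "single G80 (rumored 175W)")   # ~365W, comfortable
headroom(2 * 175, "two G80s in SLI")         # ~540W, over budget
```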

----------------off topic------------------

K007: With Conroe, I believe that the E6400 has the best price-to-performance ratio taking into account overclockability. I think that the E6300, E6400, and E6600 can attain a 50% overclock, which would yield 2.8GHz, 3.2GHz, and 3.6GHz respectively, at a cost of about $215, $240, and $340 respectively.
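
For anyone who wants to check my math, here is the arithmetic (stock clocks are Intel's launch specs; prices are the Monarch figures I mentioned):

```python
# Sanity check of the 50% overclock targets. Stock clocks are Intel's
# launch specs; prices are the Monarch figures quoted above.
chips = {  # model: (stock GHz, approx. price USD)
    "E6300": (1.86, 215),
    "E6400": (2.13, 240),
    "E6600": (2.40, 340),
}

for model, (ghz, price) in chips.items():
    oc = ghz * 1.5  # a 50% overclock
    print(f"{model}: {ghz:.2f}GHz stock -> ~{oc:.1f}GHz at +50% (${price})")
```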

Although the E6300 can be found cheaper, it might end up being the most popular and hence prices could drift higher or stock run out quicker. Additionally, I don't mind spending $35 extra to go from 2.8 to 3.2GHz. (I am using the pricing from Monarch since they ship to APO addresses; not many e-tailers do.)

As for the Blu-ray question, I have two uses for it: storing data and watching movies.

Data: other alternatives would be DVDRs and HD-DVDRs; HD-DVDRs will only be 15GB and 30GB, while Blu-ray will be 25 and 50. I need to re-burn some of my older data DVDRs since they are starting to laser-rot (the dye is becoming discolored on some discs). I had to throw away about 100 DVDRs because they were no longer readable. DVDRs are cheaper per disc to re-burn, but I figure I'd consolidate the files onto a larger-capacity media format.
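
As a back-of-the-envelope on the consolidation (using the nominal capacities above and 4.7GB per DVDR; the 100-disc figure is just to illustrate the scale):

```python
import math

# Back-of-the-envelope for archiving. Capacities are the nominal figures
# above; 4.7GB per single-layer DVDR. The 100-disc batch is illustrative.
DVDR_GB = 4.7
formats = {"HD-DVD-R SL": 15, "HD-DVD-R DL": 30, "BD-R SL": 25, "BD-R DL": 50}

total_gb = 100 * DVDR_GB  # roughly a spindle's worth of old data discs

for name, cap_gb in formats.items():
    discs = math.ceil(total_gb / cap_gb)
    print(f"{name}: ~{cap_gb / DVDR_GB:.1f} DVDRs per disc, "
          f"{discs} discs to hold {total_gb:.0f}GB")
```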

Either way, I figure that Blu-ray will be the better format for archiving. The Sony Blu-ray drive will be released at $750, but the drives have been out for a while. I think HD-DVD burners are starting to hit the market at about $3000. Based on a side-by-side comparison I read, I think I'd rather watch movies on HD-DVD, but I think Blu-ray movies will still look better than regular DVDs.

redxix
08-06-06, 10:02 AM
I looked through the EVGA step-up program and I think it might be difficult since I am overseas and it would require a long downtime to do the upgrade.

I did some googling and it "sounds" like the 8800 GTX will be released in October at the earliest, but most likely November. I don't think I can live on a 7900 GT for that long.

Since I am planning my next upgrade to be in about 12 months, I think I will just get the 7950 GX2 now... and get a G80 GX2 or G90 in mid 2007. The 45nm Intel quad-core should be around then also =)

Lfctony
08-06-06, 11:06 AM
Just keep in mind that the 6300/6400 have 1MB of cache per core, while the 6600 has 2MB. The chips aren't identical, and the extra cache comes in handy.

Mr_LoL
08-06-06, 12:15 PM
Purchase the 7950 GX2 right now from EVGA and utilize their StepUp Program to upgrade to the G80 once released :)

Does that mean that the G80 will be released within 90 days then?

J-Mag
08-06-06, 12:42 PM
I just hope they have a 1GB version on release.

Why? I have seen Oblivion push 400-500MB of frame buffer (modded INI for increased draw distance, max in-game settings, 1600x1200 4xAA), but nothing else even comes close at similar resolutions. I guess it is possible to spill over the 512MB mark, but trying to run that game at any higher settings will turn it into a slideshow.
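
To put those numbers in perspective, here is some rough render-target math (a simplification: it ignores textures and geometry, which is where most of that 400-500MB actually goes):

```python
# Rough render-target math at 1600x1200 with 4xAA. A simplification: it
# ignores textures, geometry, and driver overhead, which dominate in practice.
W, H = 1600, 1200
BYTES_COLOR = 4   # 32-bit color
BYTES_DEPTH = 4   # 24-bit depth + 8-bit stencil
AA = 4            # 4x multisampling

msaa_color = W * H * BYTES_COLOR * AA
msaa_depth = W * H * BYTES_DEPTH * AA
resolved = 2 * W * H * BYTES_COLOR  # resolved front + back buffers

total_mb = (msaa_color + msaa_depth + resolved) / 2**20
print(f"Render targets alone: ~{total_mb:.0f}MB")  # ~73MB; the rest is assets
```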

I am guessing we won't see 1GB cards (the 7950 counts as 512MB) until the latter half of '07.

redxix
08-06-06, 01:24 PM
Lfctony: L2 cache does make a difference, but I do not think I will need it. I need performance for video encoding and games. A few minutes' difference on a few hours' encode is fine by me. Additionally, games such as HL2: Ep1 at high resolution do not benefit that much from the extra cache:

http://www.bit-tech.net/hardware/2006/07/14/intel_core_2_duo_processors/5.html

I just don't think it is worth the extra $100 for the 1% difference in HL2 and a few minutes in Vdub.

SH64
08-06-06, 04:14 PM
The G80 will most likely arrive next year (along with Vista).. I don't see a point in waiting for it if you are building your PC now or within the next couple of months.

Go ahead & get the 7950GX2.. & if HDR+AA is a priority for you & you can't live without it, then either get the X1900XTX or wait a little longer for the X1950.

DataMatrix
08-06-06, 04:31 PM
Just wait until the AMD 4x4 and the G80 are out...

shungokusatsu
08-06-06, 06:00 PM
Wait for the G80; we still don't know if it will be dual-GPU based or not. A 7950 will give you excellent frames in all the games you listed; go Quad and you'll get even more, especially in FEAR. I'd personally wait it out for Vista and DX10.

a12ctic
08-06-06, 06:25 PM
I'd suggest the extra cache; it seemed like a HUGE difference in the reviews I read.

redxix
08-06-06, 10:29 PM
I'd suggest the extra cache; it seemed like a HUGE difference in the reviews I read.

Most of the reviews I've seen compare the E6300 2MB at 1.86 GHz to the E6600 4MB at 2.4 GHz. It is difficult to conclude the benefit of the extra cache when the clock speeds are different.

The link I provided above shows that in Half-Life 2: Episode 1 there is only a 1% difference between 2MB cache and 4MB cache at 1600x1200; I imagine that the difference would be smaller at 1920x1200, which is the resolution I will be running at.
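
One rough way to untangle clock from cache, assuming roughly linear clock scaling (which is an assumption, not a given); the fps numbers here are placeholders, not real benchmark results:

```python
# One way to untangle cache from clock: assume roughly linear clock scaling
# (an assumption, not a given) and see what residual the cache would explain.
# The fps figures below are placeholders, NOT numbers from any review.
e6300_ghz, e6600_ghz = 1.86, 2.40
e6300_fps, e6600_fps = 90.0, 120.0  # hypothetical review scores

# What an E6300 "should" score at E6600 clocks if cache made no difference:
clock_scaled = e6300_fps * (e6600_ghz / e6300_ghz)
cache_gain = (e6600_fps - clock_scaled) / clock_scaled
print(f"Clock-normalized E6300: {clock_scaled:.1f} fps; "
      f"residual gain attributable to cache: {cache_gain:+.1%}")
```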

redxix
08-06-06, 10:32 PM
I looked online and I found the BFG 7950 GX2 OC edition... but it doesn't list the clocks; it has them as xxx MHz and xxxx MHz... what does this mean???

K007
08-07-06, 03:40 AM
Most of the reviews I've seen compare the E6300 2MB at 1.86 GHz to the E6600 4MB at 2.4 GHz. It is difficult to conclude the benefit of the extra cache when the clock speeds are different.

The link I provided above shows that in Half-Life 2: Episode 1 there is only a 1% difference between 2MB cache and 4MB cache at 1600x1200; I imagine that the difference would be smaller at 1920x1200, which is the resolution I will be running at.

Yeah, at high res it is not as much..but I think if you check out Jakup's SLI thread, it's worth it to get the E6600 and o/c that to really give the GX2 a boost..or whatever card you plan to get..the GX2 has a lot of power going to waste on crappy CPUs ><..like mine..I should o/c this sucker; I plan to get a second one when Crysis comes out -.-

Also, the O/C GX2 = complete waste of money. Sure it runs faster..but in the end it is not worth the extra pressure you put on the card considering how fragile the 7900 series was even at stock speeds..you can really give the GX2 a lot of power by simply o/c'ing the CPU or getting a better CPU/RAM..simply not worth the pressure you put on the GX2..plus I saw the XFX one go for some really insane prices compared to a stock card..just for like a 1-2 fps increase..an o/c'd CPU is the better/safer choice.

It's easier to get the CPU fixed than to return a broken GX2.

Dazz
08-07-06, 05:12 AM
I would say pick up a 7900GT card, as the 7950GX2 is two GPUs, so if games don't benefit from SLI then it's a waste; then get a G80 later down the line. If all you want to play is HL2, then a 7900GT is the best bang-for-the-buck card.

K007
08-07-06, 05:50 AM
I would say pick up a 7900GT card, as the 7950GX2 is two GPUs, so if games don't benefit from SLI then it's a waste; then get a G80 later down the line. If all you want to play is HL2, then a 7900GT is the best bang-for-the-buck card.

Who said the GX2 needs profiles like a traditional SLI setup? Did Nvidia say the GX2 uses SLI and makes use of AFR or SFR?...I sure haven't..and I don't need profiles to get mine to work..

Lfctony
08-07-06, 07:26 AM
Lfctony: L2 cache does make a difference, but I do not think I will need it. I need performance for video encoding and games. A few minutes' difference on a few hours' encode is fine by me. Additionally, games such as HL2: Ep1 at high resolution do not benefit that much from the extra cache:

http://www.bit-tech.net/hardware/2006/07/14/intel_core_2_duo_processors/5.html

I just don't think it is worth the extra $100 for the 1% difference in HL2 and a few minutes in Vdub.

Yeah I agree there, I was just pointing out that they have a difference in the cache pool in case you were not aware of that fact. :)

redxix
08-07-06, 11:06 AM
Yeah I agree there, I was just pointing out that they have a difference in the cache pool in case you were not aware of that fact. :)

Ah no, don't worry about it. I am guessing there are some people wondering "how much" of a performance difference there is between 2MB and 4MB, and when they read this particular thread, hopefully they can make an informed decision. ;)

I really try to keep useful information in the threads I create.

SH64
08-07-06, 12:31 PM
I would say pick up a 7900GT card, as the 7950GX2 is two GPUs, so if games don't benefit from SLI then it's a waste; then get a G80 later down the line. If all you want to play is HL2, then a 7900GT is the best bang-for-the-buck card.
90% of games benefit from SLI (even if by a small margin).. so there's no "waste" here.

grimreefer
08-07-06, 08:12 PM
Get a 7900GT and an E6600 and wait for the G80....
Don't waste money on an expensive graphics card with no SM4.0 support.

KoRnfR3ak
08-08-06, 03:03 PM
Well...the 7950 is there to be put up against the single cards..so I would assume that the 7950 will have more power to play with..but something like 8800GTs in SLI would probably kick the single 7950's ass..

It is hard to say how much power the G80 will bring, but I would think that maybe the 8800GT..or GTX in single will probably not beat the 7950..might be close..
One 7950GX2 has around the same performance as 2x 7900GT in SLI.
A single G80 will be faster than 2x 7900GT in SLI.
Do you guys remember what happened last year?
May 2005: the top nVidia card was the 6800Ultra.
June 2005: the first Geforce7 was released, the 7800GTX 256MB.
A single 7800GTX 256MB managed to beat two 6800Ultras in SLI. How? 2048x1536 4xAA 16xAF, that's how.

Anandtech: Splinter Cell 3 (http://www.anandtech.com/video/showdoc.aspx?i=2451&p=14), Half Life 2 (http://www.anandtech.com/video/showdoc.aspx?i=2451&p=13), Battlefield 2 (http://www.anandtech.com/video/showdoc.aspx?i=2451&p=8)

Firingsquad:
The Asus 7800GTX (Far Cry (http://firingsquad.com/hardware/asus_extreme_n7800_gtx_top/page6.asp), HalfLife 2 (http://firingsquad.com/hardware/asus_extreme_n7800_gtx_top/page8.asp), Battlefield 2 (http://firingsquad.com/hardware/asus_extreme_n7800_gtx_top/page9.asp), F.E.A.R. (http://firingsquad.com/hardware/asus_extreme_n7800_gtx_top/page10.asp))
The MSI 7800GTX (Battlefield 2 (http://firingsquad.com/hardware/msi_nx7800_gtx_review/page9.asp))
The eVGA 7800GTX (Far Cry (http://firingsquad.com/hardware/evga_e-geforce_7800_gtx_ko_acs3_review/page7.asp), HalfLife 2 (http://firingsquad.com/hardware/evga_e-geforce_7800_gtx_ko_acs3_review/page9.asp), Battlefield 2 (http://firingsquad.com/hardware/evga_e-geforce_7800_gtx_ko_acs3_review/page10.asp))
Techreport:
3Dmark05 - Game Test #1 - Return to Proxycon (http://www.techreport.com/etc/2005q3/hires-gaming/index.x?pg=9)
Splinter Cell 3 (http://www.techreport.com/etc/2005q3/hires-gaming/index.x?pg=8), The Chronicles Of Riddick (http://www.techreport.com/etc/2005q3/hires-gaming/index.x?pg=7), Half Life 2 (http://www.techreport.com/etc/2005q3/hires-gaming/index.x?pg=6), Doom3 (http://www.techreport.com/etc/2005q3/hires-gaming/index.x?pg=4)
Check only the 2048x1536 tests. In high resolutions like that, a single 7800GTX 256MB has the same (and sometimes even better) performance than two 6800Ultras in SLI. In some of the 2048x1536 tests they only test a single 6800Ultra, but you can clearly see that the single 6800Ultra sometimes has half (or less than half) of the single 7800GTX's frame rate, meaning that even two 6800Ultras wouldn't match the single 7800GTX's performance at that resolution.

If the technology leap from Geforce7 to Geforce8 is similar to the Geforce6-to-Geforce7 leap we saw in 2005, then at those high resolutions, with lots of AA and AF, and with recent games, a single first card of the next generation (one G80) might equal or exceed the performance of two of the best cards of the previous generation in SLI (2x 7900GTX).
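
To make that extrapolation concrete, here is a sketch with made-up index numbers (none of these are benchmark results, and the SLI scaling factor is an assumption):

```python
# Sketch of the extrapolation above, with made-up index numbers (not benchmarks).
baseline = 100.0                 # arbitrary 6800Ultra index at 2048x1536
gtx7800 = 2.0 * baseline         # ~2x per the high-res links above
leap = gtx7800 / baseline        # generational leap factor (assumed ~2x)

gtx7900 = 130.0                  # hypothetical refresh uplift over baseline
g80_projected = leap * gtx7900   # if the G80 repeats the same leap
sli_7900gtx = 2 * gtx7900 * 0.9  # two 7900GTXs, assuming ~90% SLI scaling

print(f"Projected single G80: {g80_projected:.0f} "
      f"vs 2x 7900GTX SLI: {sli_7900gtx:.0f}")
```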

You might ask, "But who plays games at such high resolutions?" Notice that since 2005, 24" 1920x1200 LCD monitors (Dell, BenQ, Samsung) have become more mainstream. In 2007, even 30" 2560x1600 LCD monitors (Dell, BenQ) might become a little less expensive.
So for the owners of those monitors, a single G80 will prove itself better than a single 7950GX2. Likewise for 2x G80 vs. 2x 7950GX2.
And I'm only talking about DX9 games. With DX10 games, the G80's lead over the 7950GX2 will be even greater.