
View Full Version : nVidia 7800 GTX in SLI lower in performance in Oblivion compared to X1900XT ATI!!



sandeep
04-16-06, 07:16 AM
People who have a 7900GTX/7800GTX, single or SLI, and want to test: please load this save file and take a screenshot to compare. When you load the game, do not move your mouse; just take a screenshot, for the BEST comparison.

My driver version is 84.43 (latest official beta).
Run the game without any tweaks to the .ini file.
All in-game settings are MAX!
1600*1200
HDR ON
No AF
http://www.uploadtemple.com/view.php/1145179702.rar

In this scene, my SLI gets 11fps and a single X1900XTX gets 20fps.
SLI 7800GTX = 11 and 22fps, single X1900XTX = 20 and 31fps, respectively
http://img506.imageshack.us/img506/434/oblivion20060416023323506eh.th.jpg (http://img506.imageshack.us/my.php?image=oblivion20060416023323506eh.jpg)http://img53.imageshack.us/img53/8633/oblivion20060416023127296vp.th.jpg (http://img53.imageshack.us/my.php?image=oblivion20060416023127296vp.jpg)http://img510.imageshack.us/img510/350/sandeep2nr.th.jpg (http://img510.imageshack.us/my.php?image=sandeep2nr.jpg)http://img51.imageshack.us/img51/1319/sandeep16tw.th.jpg (http://img51.imageshack.us/my.php?image=sandeep16tw.jpg)

SLI 7800GTX = 47fps, single X1900XTX = 55fps, respectively
http://img77.imageshack.us/img77/7127/oblivion20060416175359769xq.th.jpg (http://img77.imageshack.us/my.php?image=oblivion20060416175359769xq.jpg)http://img55.imageshack.us/img55/3103/sandeep56cb.th.jpg (http://img55.imageshack.us/my.php?image=sandeep56cb.jpg)

After a long investigation, I have found that nVidia cards run **** compared to ATI.

More details can be found here:
http://forums.tweakguides.com/showthread.php?p=29854#post29854

I have done the tests personally, and that is my conclusion.

Cheers

I think CF X1900XTX will run great.

|MaguS|
04-16-06, 07:34 AM
Shoo troll... Oblivion never dips to 11fps for me at 1600x1200 16xAF HDR on.

sandeep
04-16-06, 07:50 AM
Prove it?;)

Try to download that save file and place it in your MyDocuments\Games\Oblivion\Save folder.

Please remember: no optimisations to the .ini file. Load the save file and take a screenshot with FRAPS. It's all easy and simple.

1600*1200 everything MAX
HDR on
No AA/AF

Cheers

N.B. Please be honest.
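
For anyone unsure where the save goes, here is a rough Python sketch of the copy step. The extracted filename below is just an example, and the destination mirrors the folder mentioned above:

import os
import shutil

# Rough sketch: copy the extracted save into Oblivion's save folder.
# "sandeep.ess" is only an example name; use whatever the .rar extracts.
# The destination mirrors the MyDocuments\Games\Oblivion\Save path above.
save_file = "sandeep.ess"
dest = os.path.expanduser(r"~\My Documents\Games\Oblivion\Save")
shutil.copy(save_file, dest)
print("Copied", save_file, "to", dest)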

FearMeAll
04-16-06, 08:45 AM
Prove it?;)

Try to download that save file and place it in your MyDocuments\Games\Oblivion\Save folder.

Please remember: no optimisations to the .ini file. Load the save file and take a screenshot with FRAPS. It's all easy and simple.

1600*1200 everything MAX
HDR on
No AA/AF

Cheers

N.B. Please be honest.
Err, ok. I cannot run at 1600x1200, but 1680x1050, which is roughly equivalent. Also, I will not decrease my settings and tweaks, because I have them maxed out to their fullest extent. I have increased foliage distance by 5000 in the ini and have not messed with density. I am also using the super-high-resolution textures by captain kill, and I am running with 16x aniso. I really didn't want to lower my settings at all.
I am getting 30fps in this scene with settings maxed beyond what the game calls for. You are doing something wrong to get FPS that low. Also, I am using the same drivers as you.
Proof is in the pudding. I even have my vid card clocks lowered by 10MHz on core and memory.
Having said that, the x1900xt is a kickass card and I praise its efficient and well-crafted design. But I don't think my FPS "suck" in Oblivion, as you put it. I will run with the low aniso settings and foliage settings and post again if you really want. Hell, I'll even take away my pretty new textures.
(edit: just noticed you are running the 256 meg version of the 7800gtx, with much lower clocks and half the memory of an x1900xtx. No friggin' wonder.)

a12ctic
04-16-06, 08:50 AM
ATI fanboy, please leave. Just because the 1900xtx wins in a few games doesn't mean it's the all-out better card... If Oblivion were in OpenGL, nvidia would be getting 60fps and ati would be getting about 30, and you want to know why? Because ati can't write a driver... I remember a few months ago ati cards couldn't play Civ 4, a very hyped and in my opinion great game, for weeks after release, because ati can't write a driver. Also, ati doesn't provide alternative-OS drivers, which makes it a stupid choice for anybody who wants a stable OS...

sandeep
04-16-06, 08:57 AM
I am getting 30fps in this scene with settings maxed beyond what the game calls for. You are doing something wrong to get FPS that low. Also, I am using the same drivers as you.
Proof is in the pudding. I even have my vid card clocks lowered by 10MHz on core and memory.

Mate, I called for honesty here. Clearly your Self Shadows option is turned 'off' and your shadows on grass are 'off'.

Please look at the pictures I have posted. You will see the self shadow on the farmer on the two cards, BUT not on yours. Also notice the shadow cast by the horse on the wood.

If I turn my shadows 'off' I get 31fps!!

ATI fanboy, please leave. Just because the 1900xtx wins in a few games doesn't mean it's the all-out better card... If Oblivion were in OpenGL, nvidia would be getting 60fps and ati would be getting about 30, and you want to know why? Because ati can't write a driver... I remember a few months ago ati cards couldn't play Civ 4, a very hyped and in my opinion great game, for weeks after release, because ati can't write a driver. Also, ati doesn't provide alternative-OS drivers, which makes it a stupid choice for anybody who wants a stable OS...

P.S. This is not a life-and-death situation; there is nothing to brag about in a graphics card. It's just a matter of money and quality. I had a 7800GTX and bought another one ONLY to find out the X1900XTX toasts them. If anything, I have always preferred nVidia cards for all my purchases, but that might just change. However, I did recommend my mate a 9800 Pro even when I had an FX5900 Ultra.
So please don't stereotype anyone with a fanboy remark without proper research. I am not doing anything wrong here; I just want the truth. Oblivion is by far a VERY good game and I would LOVE to play it with a quality experience. After all, I buy graphics cards to enjoy my gaming experience, not just for the sake of having one.

MUYA
04-16-06, 08:59 AM
Flamebaitish thread... I have changed the thread title. Sandeep, in future would you please pick a less provocative title. You can discuss the merits of different cards, but do not incite a flame war.

sandeep
04-16-06, 09:05 AM
Flamebaitish thread... I have changed the thread title. Sandeep, in future would you please pick a less provocative title. You can discuss the merits of different cards, but do not incite a flame war.
Cheers MUYA. Yep, sorry about the title and thanks for changing it.

FearMeAll
04-16-06, 09:08 AM
Mate, I called for honesty here. Clearly your Self Shadows option is turned 'off' and your shadows on grass are 'off'.

Please look at the pictures I have posted. You will see the self shadow on the farmer on the two cards, BUT not on yours. Also notice the shadow cast by the horse on the wood.

If I turn my shadows 'off' I get 31fps!!



P.S. This is not a life-and-death situation; there is nothing to brag about in a graphics card. It's just a matter of money and quality. I had a 7800GTX and bought another one ONLY to find out the X1900XTX toasts them. If anything, I have always preferred nVidia cards for all my purchases, but that might just change. However, I did recommend my mate a 9800 Pro even when I had an FX5900 Ultra.
So please don't stereotype anyone with a fanboy remark without proper research. I am not doing anything wrong here; I just want the truth. Oblivion is by far a VERY good game and I would LOVE to play it with a quality experience. After all, I buy graphics cards to enjoy my gaming experience, not just for the sake of having one.
Why would anyone play with self shadows on? They look absolutely horrid and they are buggy. I don't like playing a game where the women have beards.
Here's an idea: why don't you compare cards of the same generation instead of two cards from different ones? A 7800gtx 256, even in SLI, is going to be nowhere near as powerful as an x1900xtx. That card is almost a year old; you should compare the 7900gtx to an x1900. Yep, the x1900 will win in this game, but you know what, even all of our top-end cards are barely playing this game. It is moot to argue over who is king in this game, because as it stands right now the best super rigs are taking everything they can to keep this game playable.
In that shot I posted, you call 20fps playable on the x1900xtx?... Please. I would still disable self shadows and shadows on grass to get it above 30fps. This thread can basically be summed up like this: "the 7800 sucks in this game, but the x1900 sucks less". It is unplayable on both cards with these two shadow features enabled (one of them borked); it is just a measure of "how unplayable". Get where I'm going with this? I like never dipping below 30fps, personally.

Quijonsith
04-16-06, 09:17 AM
That's funny; I read a review specifically about Oblivion and various card combinations. In some cases 7900gt SLI beat out x1900xt Crossfire, much less 7900gtx SLI. Sounds to me like you are bottlenecked, buddy. SLI needs a lot of CPU power running it, as well as plenty of RAM. So what are your specs? Also, did you use the Oblivion SLI profile that enables SLI support for the game?

sandeep
04-16-06, 09:22 AM
My specs are in my sig. :) I have run my Opteron 175 at 2.5GHz and 2.7GHz; no difference.

Btw, the X1900XTX screenshot was taken on an FX-57.

scott123
04-16-06, 09:26 AM
Oblivion plays fine on the 1900 and 7900 cards. It doesn't matter which card you have; the game will play fine and look awesome, so there's no need to debate.

I can tell you that rFactor plays MUCH better on Nvidia, as an example. My 7800GTX was approx 30% faster with all options maxed (game setting = DX9 mode) compared to my 1900 (weird). This just shows that it really boils down to development support rather than hardware specifications. The 1900 and 7900 cards are very even overall. Purchasing either card would be the right decision.

It really seems kinda stupid to start these silly "look what I found" threads when we all know it does nothing but start a flame war between the two sides.

Quijonsith
04-16-06, 09:33 AM
My specs are in my sig. :) I have run my Opteron 175 at 2.5GHz and 2.7GHz; no difference.

Btw, the X1900XTX screenshot was taken on an FX-57.
Wow, yeah, that's my cue to go to bed, when I didn't even notice your sig. Still didn't answer whether you used the SLI profile, though. Regardless, I agree with scott123. A couple of frames here and there doesn't matter; they're all good.

FearMeAll
04-16-06, 09:36 AM
Oblivion plays fine on the 1900 and 7900 cards. It doesn't matter which card you have; the game will play fine and look awesome, so there's no need to debate.

I can tell you that rFactor plays MUCH better on Nvidia, as an example. My 7800GTX was approx 30% faster with all options maxed (game setting = DX9 mode) compared to my 1900 (weird). This just shows that it really boils down to development support rather than hardware specifications. The 1900 and 7900 cards are very even overall. Purchasing either card would be the right decision.

It really seems kinda stupid to start these silly "look what I found" threads when we all know it does nothing but start a flame war between the two sides.
agreed ;)

sandeep
04-16-06, 10:05 AM
Wow, yeah, that's my cue to go to bed, when I didn't even notice your sig. Still didn't answer whether you used the SLI profile, though. Regardless, I agree with scott123. A couple of frames here and there doesn't matter; they're all good.
I used the SLI profile that came with the drivers, and it's working.

OWA
04-16-06, 10:30 AM
Comparing the 7800 GTX 512s in SLI to an X1900 XTX in my systems, SLI is usually faster but not by a huge amount. If you compare optimizations on for ATI versus optimizations off for Nvidia, they're very close, with maybe even ATI coming out ahead some of the time. It also depends on where you test. In some areas (like when you just get out of the sewer and make your way to the city), even a single 7800 is right there with the X1900 XTX. As far as Crossfire goes, in the comparison I saw, it didn't do very well, but apparently it wasn't really working due to a driver bug. I haven't seen a comparison since the chuck patch was released (which was supposed to help with that).

Here are various results I posted in the Oblivion forum. You can find the actual links by searching on my username and Frames (that's how I found them, anyway).

1920x1200 with HDR on, no AA, and usually 16xAF forced. The X2 4800+s were normally set to the same speeds when doing the comparisons. The exception is the AF test, since I was only trying to show the hit caused by AF (the cpu speeds varied in that one, but I noted them) and to see how the CPU affected the particular area I was testing. The drivers used were the 84.25s and the Cat 6.3s (pre-chuck patch).

Grassy Area

HQ

2006-04-01 09:50:46 - Oblivion (Nv Save 25, HQ, single)
Frames: 407 - Time: 26143ms - Avg: 15.568 - Min: 7 - Max: 22

2006-04-01 13:02:30 - Oblivion (Nv Save 25, HQ, SLI)
Frames: 717 - Time: 25300ms - Avg: 28.340 - Min: 15 - Max: 38

2006-04-01 09:38:29 - Oblivion (ATI save 25, HQ)
Frames: 576 - Time: 27426ms - Avg: 21.002 - Min: 17 - Max: 25

Q

2006-04-01 09:59:42 - Oblivion (Nv Save 25, Q, Single)
Frames: 546 - Time: 27009ms - Avg: 20.215 - Min: 10 - Max: 28

2006-04-01 13:04:36 - Oblivion (NV Save 25, Q, SLI)
Frames: 827 - Time: 25306ms - Avg: 32.680 - Min: 18 - Max: 42

2006-04-01 10:06:57 - Oblivion (ATI Save 25, Q)
Frames: 671 - Time: 25440ms - Avg: 26.376 - Min: 20 - Max: 31


In the City

HQ

2006-04-01 10:46:02 - Oblivion (Nv Save 17, HQ, CO, Single)
Frames: 1648 - Time: 52677ms - Avg: 31.285 - Min: 17 - Max: 53

2006-04-01 13:06:46 - Oblivion (NV Save 17, HQ, SLI)
Frames: 2781 - Time: 52095ms - Avg: 53.383 - Min: 26 - Max: 87

2006-04-01 11:02:27 - Oblivion (ATI Save 17 HQ CO)
Frames: 2502 - Time: 53391ms - Avg: 46.862 - Min: 26 - Max: 79


Q

2006-04-01 10:49:51 - Oblivion (Nv Save 17, Q, CO, Single)
Frames: 2020 - Time: 51987ms - Avg: 38.856 - Min: 20 - Max: 66

2006-04-01 13:09:12 - Oblivion (NV Save 17, Q, SLI)
Frames: 3263 - Time: 52565ms - Avg: 62.076 - Min: 26 - Max: 106

2006-04-01 11:05:17 - Oblivion (ATI Save 17 Q CO)
Frames: 2901 - Time: 52898ms - Avg: 54.841 - Min: 26 - Max: 88

================================================================

With both CPUs running at stock (2.4ghz), 1920x1200, HDR on, 16xAF forced on

Just out of the sewer

2006-03-22 20:00:10 - Oblivion (7800 GTX 512 SLI forceware 84.25)
Frames: 2947 - Time: 64699ms - Avg: 45.549 - Min: 23 - Max: 71

2006-03-22 20:03:06 - Oblivion (7800 GTX 512 forceware 84.25)
Frames: 1814 - Time: 65616ms - Avg: 27.646 - Min: 13 - Max: 43

2006-03-22 20:25:00 - Oblivion (X1900 XTX Cat 6.3s)
Frames: 1866 - Time: 64704ms - Avg: 28.839 - Min: 15 - Max: 44

Single cards (quality vs cat ai on)

2006-03-22 21:24:32 - Oblivion (7800)
Frames: 2160 - Time: 65093ms - Avg: 33.183 - Min: 18 - Max: 48

2006-03-22 21:19:41 - Oblivion (XTX)
Frames: 2363 - Time: 66378ms - Avg: 35.599 - Min: 24 - Max: 49


================================================================

Here are some results I just made. The test is forced 16xAF in the CP versus leaving it at Application Control in the CP. I also wanted to see how the CPU impacted things and the result was, not much, in this test anyway.


Nvidia AF Comparison

SLI
2006-03-23 17:04:16 - Oblivion (16xAF, D3D 0 ahead, SLI, cpu@2.7)
Frames: 2837 - Time: 64834ms - Avg: 43.758 - Min: 22 - Max: 66

2006-03-23 17:07:33 - Oblivion (App Ctl, D3D 0 ahead, SLI, cpu@2.7)
Frames: 3419 - Time: 64106ms - Avg: 53.334 - Min: 32 - Max: 76

Single
2006-03-23 17:12:37 - Oblivion (16xAF, D3D 0 ahead, single gpu, cpu@2.7)
Frames: 1804 - Time: 65592ms - Avg: 27.503 - Min: 13 - Max: 42

2006-03-23 17:09:38 - Oblivion (App Ctl, D3D 0 ahead, single gpu, cpu@2.7)
Frames: 2329 - Time: 67581ms - Avg: 34.462 - Min: 19 - Max: 49

2006-03-23 17:31:41 - Oblivion (16xAF, D3D 0 ahead, single gpu, cpu@2.4)
Frames: 1708 - Time: 62515ms - Avg: 27.321 - Min: 13 - Max: 42

2006-03-23 17:34:45 - Oblivion (App Ctl, D3D 0 ahead, single gpu, cpu@2.4)
Frames: 2255 - Time: 64808ms - Avg: 34.795 - Min: 20 - Max: 51

ATI AF Comparison

2006-03-23 17:20:23 - Oblivion (16xAF, cpu@2.41)
Frames: 1972 - Time: 66224ms - Avg: 29.778 - Min: 15 - Max: 43

2006-03-23 17:23:07 - Oblivion (App Ctl, cpu@2.41)
Frames: 2554 - Time: 65019ms - Avg: 39.281 - Min: 28 - Max: 54
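
If anyone wants to sanity-check or collate FRAPS summaries like the ones above, here is a rough Python sketch. The regex matches the "Frames: ... - Time: ...ms" lines, and recomputing frames divided by seconds should land on the Avg column:

import re

# Matches FRAPS benchmark summary lines such as:
#   Frames: 407 - Time: 26143ms - Avg: 15.568 - Min: 7 - Max: 22
PATTERN = re.compile(
    r"Frames:\s*(\d+)\s*-\s*Time:\s*(\d+)ms\s*-\s*Avg:\s*([\d.]+)"
    r"\s*-\s*Min:\s*(\d+)\s*-\s*Max:\s*(\d+)"
)

def parse(line):
    """Return (frames, time_ms, avg, fps_min, fps_max, recomputed_avg)."""
    m = PATTERN.search(line)
    if not m:
        return None
    frames, ms, avg, fps_min, fps_max = m.groups()
    recomputed = int(frames) / (int(ms) / 1000.0)  # frames per second
    return (int(frames), int(ms), float(avg),
            int(fps_min), int(fps_max), recomputed)

print(parse("Frames: 407 - Time: 26143ms - Avg: 15.568 - Min: 7 - Max: 22"))
# recomputed average is ~15.57fps, matching the Avg column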

|MaguS|
04-16-06, 11:46 AM
Prove it?;)

Try to download that save file and place it in your MyDocuments\Games\Oblivion\Save folder.

Please remember: no optimisations to the .ini file. Load the save file and take a screenshot with FRAPS. It's all easy and simple.

1600*1200 everything MAX
HDR on
No AA/AF

Cheers

N.B. Please be honest.

Why would I lower my settings to prove it? The tweaks I have enabled increase the draw distance and improve the overall picture quality; I don't use any performance-aiding tweaks other than the one that increases the memory available.

Just for ****s and giggles, I ran it and was getting 20FPS. I did notice, though, that once I took a step forward the FPS jumped to 30; it seems the NPC is a big performance hit.

Screenshot (faster than 20FPS :p)
http://img54.imageshack.us/img54/6163/oblivion9bh.th.jpg (http://img54.imageshack.us/my.php?image=oblivion9bh.jpg)

Saintster
04-16-06, 11:48 AM
People who have a 7900GTX/7800GTX, single or SLI, and want to test: please load this save file and take a screenshot to compare. When you load the game, do not move your mouse; just take a screenshot, for the BEST comparison.

My driver version is 84.43 (latest official beta).
Run the game without any tweaks to the .ini file.
All in-game settings are MAX!
1600*1200
HDR ON
No AF
http://www.uploadtemple.com/view.php/1145179702.rar

In this scene, my SLI gets 11fps and a single X1900XTX gets 20fps.
SLI 7800GTX = 11 and 22fps, single X1900XTX = 20 and 31fps, respectively
http://img506.imageshack.us/img506/434/oblivion20060416023323506eh.th.jpg (http://img506.imageshack.us/my.php?image=oblivion20060416023323506eh.jpg)http://img53.imageshack.us/img53/8633/oblivion20060416023127296vp.th.jpg (http://img53.imageshack.us/my.php?image=oblivion20060416023127296vp.jpg)http://img510.imageshack.us/img510/350/sandeep2nr.th.jpg (http://img510.imageshack.us/my.php?image=sandeep2nr.jpg)http://img51.imageshack.us/img51/1319/sandeep16tw.th.jpg (http://img51.imageshack.us/my.php?image=sandeep16tw.jpg)

SLI 7800GTX = 47fps, single X1900XTX = 55fps, respectively
http://img77.imageshack.us/img77/7127/oblivion20060416175359769xq.th.jpg (http://img77.imageshack.us/my.php?image=oblivion20060416175359769xq.jpg)http://img55.imageshack.us/img55/3103/sandeep56cb.th.jpg (http://img55.imageshack.us/my.php?image=sandeep56cb.jpg)

After a long investigation, I have found that nVidia cards run **** compared to ATI.

More details can be found here:
http://forums.tweakguides.com/showthread.php?p=29854#post29854

I have done the tests personally, and that is my conclusion.

Cheers

I think CF X1900XTX will run great.


Go away, you are a troll; that is what I conclude after reading your post. :thumbdwn:

sandeep
04-16-06, 01:22 PM
Just for ****s and giggles, I ran it and was getting 20FPS. I did notice, though, that once I took a step forward the FPS jumped to 30; it seems the NPC is a big performance hit.

Well, you still don't have self shadows turned on (there is no shadow on the farmer's hand). But never mind... I don't intend to bother you, but I was really curious, because it appears that ATI does perform very well, superior to its nVidia counterpart.
When walking south from Bravil, the dense forest hits performance hard, almost making the game unplayable (around mid-20s fps) on my 7800GTX SLI at 1600*1200, HDR, max in-game settings, no AF. So I wondered what the X1900XTX gets, and to my surprise, it yields more fps than SLI in areas with dense vegetation. It's also very obvious from OWA's tests:
Grassy Area

HQ
2006-04-01 09:50:46 - Oblivion (Nv Save 25, HQ, single)
Frames: 407 - Time: 26143ms - Avg: 15.568 - Min: 7 - Max: 22

2006-04-01 13:02:30 - Oblivion (Nv Save 25, HQ, SLI)
Frames: 717 - Time: 25300ms - Avg: 28.340 - Min: 15 - Max: 38

2006-04-01 09:38:29 - Oblivion (ATI save 25, HQ)
Frames: 576 - Time: 27426ms - Avg: 21.002 - Min: 17 - Max: 25


Anyway, I can see why X1900XTX owners are happy, and those with CF must be seeing impressive numbers.
cheers

Marvel_us
04-16-06, 01:46 PM
The 256MB 7800GTX is not the X1900XTX's counterpart, though.

Compare Xfired X1800s to the 7800GTXs, and that would be a better battle.

|MaguS|
04-16-06, 01:50 PM
Well, you still don't have self shadows turned on (there is no shadow on the farmer's hand). But never mind... I don't intend to bother you, but I was really curious, because it appears that ATI does perform very well, superior to its nVidia counterpart.


You're comparing different generations of cards, first of all, and I would also think that 16xAF and HQ tweaks would be a larger performance hit than self shadows, which are broken on both vendors (though worse on ATI cards).

parecon
04-16-06, 02:13 PM
The 256MB 7800GTX is not the X1900XTX's counterpart, though.

Compare Xfired X1800s to the 7800GTXs, and that would be a better battle.

Dang! You beat me...

Seriously, compare dual X1800s to dual 7800s. Should I try and compare my 7900 GTX to dual X1800s in Quake 4? It makes no sense; of course newer, better cards will be faster.

OWA
04-16-06, 02:17 PM
It almost seems like a bug at the initial save point. Things are crawling on my nvidia system. My framerate with SLI was around 14 or 15. On the XTX, it was 20. But, using fraps to record a benchmark by walking around the inside of the fence (basically a lap inside the fenced in area), things change a little bit. The nvidia cards smooth out just a little ways away from your save point and when returning to the save point area are still smooth. I have to leave now so I'll do a single card later and post the screenshots, double-check my results, etc.

Edit: The save point area is also weird in that optimizations on versus optimizations off give almost the same results (on both systems).

2006-04-16 14:57:04 - Oblivion (sandeep save, ATI, App AF, 16x12, HDR On, Cat AI off)
Frames: 1068 - Time: 34524ms - Avg: 30.935 - Min: 17 - Max: 47

2006-04-16 15:04:55 - Oblivion (sandeep save, ATI, App AF, 16x12, HDR On, Cat AI On)
Frames: 1132 - Time: 36213ms - Avg: 31.259 - Min: 14 - Max: 52


2006-04-16 14:42:21 - Oblivion (sandeep save point, SLI, 16x12, HDR on, no AF, HQ)
Frames: 1080 - Time: 29688ms - Avg: 36.378 - Min: 12 - Max: 65

2006-04-16 14:49:27 - Oblivion (sandeep save point, SLI, 16x12, HDR on, no AF, Q)
Frames: 1154 - Time: 31132ms - Avg: 37.068 - Min: 12 - Max: 67

2006-04-16 14:58:53 - Oblivion (sandeep save point, SLI, 16x12, HDR on, app AF, Q)
Frames: 1475 - Time: 38140ms - Avg: 38.673 - Min: 12 - Max: 69

2006-04-16 15:00:59 - Oblivion (sandeep save point, SLI, 16x12, HDR on, app AF, HQ)
Frames: 1409 - Time: 37322ms - Avg: 37.753 - Min: 12 - Max: 68

sandeep
04-16-06, 02:41 PM
It almost seems like a bug at the initial save point. Things are crawling on my nvidia system.
Hehe LOL (kngt), things were crawling on my nVidia system as well :-)
Well, it's not a bug. It's just that while playing the game, I arrived at the Bravil stable and was shocked by the performance. As a result, I happened to save at the most demanding area and asked other forum members to test with their ATI cards.

An important thing I noticed was that the shadows rendered by nVidia were better in quality than on the X1900XTX, self shadows and shadows from grass in particular (both of which have options in the in-game menu). So perhaps the low fps nVidia gets is due to its image quality being better??? Maybe you could also disable self shadows and shadows on grass and then compare the fps?
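
If you want to script that comparison instead of flipping the menu options by hand, here is a rough Python sketch. The section and key names are assumptions taken from tweak guides (double-check them against your own Oblivion.ini), and the path is just an example:

import configparser
import os

# Rough sketch: turn both shadow options off in Oblivion.ini so their
# combined fps cost can be measured. The [Display] keys below are
# assumptions from tweak guides; verify them in your own ini first.
ini_path = os.path.expanduser(r"~\My Documents\My Games\Oblivion\Oblivion.ini")

cfg = configparser.ConfigParser()
cfg.optionxform = str  # preserve the ini's mixed-case key names
cfg.read(ini_path)
cfg["Display"]["bActorSelfShadows"] = "0"  # assumed key for self shadows
cfg["Display"]["bShadowsOnGrass"] = "0"    # assumed key for shadows on grass
with open(ini_path, "w") as f:
    cfg.write(f)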

But yeah, the X1900XTX got 20fps while my 2*7800GTX 256MB got 11fps. Our results are very similar.

Thanks for your time, OWA!!! Much appreciated, mate. I look forward to your results.

Cheers

MindBlank
04-16-06, 03:51 PM
No, there's something wrong with either the game or the drivers.

I get 5 FPS in that exact area at 1280x1024 max with my 7900GT (can't go above that rez - TFT).

But as soon as I turn off Shadows on Grass, it jumps to 20-21.

If I take one or two steps forward, the framerate jumps to 25+. It gets smooth wherever I look, except in this scene. There is definitely something wrong here.