X1900XTX beats up a 7900GTX (Oblivion related)



Shamrock
03-31-06, 11:58 PM
Uhm...this is bad.

In Oblivion, the 7900GTX gets beat on by an X1900XTX.

http://www.bit-tech.net/gaming/2006/03/31/elder_scrolls_oblivion/4.html

http://www.bit-tech.net/content_images/elder_scrolls_oblivion/highend_bps.jpg

They even had to turn off Grass Shadows to get the 7900 playable. The XTX had HQ AF and max details.

Clay
04-01-06, 12:03 AM
OWA will probably post here with his findings. He has all of the hardware above, and the last I knew, he was seeing similar results with either card.

killahsin
04-01-06, 12:51 AM
I had grass shadows on, with the max details the game provides, and the game was quite playable on a GT, so I suspect there's more to it than that. In fact, I just recently turned grass shadows off in order to render everything else 2x as far (via the ini), and I also set the grass density to 120 as a trade-off. So I call FUD or bad testing on this one. Their machine is meaner than mine, yet I could play at higher settings than they deemed playable?
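For reference, the kind of ini lines I mean look roughly like this. I'm going from memory on the section and key names, so treat it as a sketch and check against your own Oblivion.ini rather than pasting it in blindly:

[Grass]
; higher value = sparser grass; this is the "density 120" trade-off
iMinGrassSize=120
; grass draw distance, roughly doubled from the default
fGrassEndDistance=16000.0000
; grass shadows off (exact key name from memory, may differ)
bShadowsOnGrass=0

The tree/object draw distances are separate fade settings elsewhere in the ini.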

superklye
04-01-06, 01:16 AM
I guess it's all in the eye of the beholder. I was getting 17FPS or so, sometimes less in crazy outdoor areas, using a single 6800GT and I thought it was definitely playable. Going to SLI really made it buttery smooth though...maybe they consider the absolute minimum 30 or 40FPS?

parecon
04-01-06, 03:29 AM
Yeah, but who the hell likes those lame catalyst drivers and .net framework :)

OWA
04-01-06, 03:30 AM
Yeah, my results were very close (about a 1-5 fps difference), but I'll try to test some areas more like what they're testing. Mine were mixed between low-framerate areas and some more normal areas, trying to get a little of everything (grass, trees, water, grass-less areas, and some of the city paths/roads). In most areas I've checked, changing settings hasn't really made much of a difference in performance, so I'm a little surprised they had to turn grass shadows off, for example. That's part of what is frustrating about this game (besides the widely fluctuating framerate): changing settings doesn't seem to have much of an effect (at least for me).

OWA
04-01-06, 10:31 AM
There does seem to be a bigger difference in some of the lower-framerate areas, especially if you compare ATI with Cat AI on versus HQ mode on the Nvidia side. In the other area I was testing, even comparing Cat AI on versus HQ, there wasn't a large difference.

2006-04-01 09:50:46 - Oblivion (Nv Save 25, HQ)
Frames: 407 - Time: 26143ms - Avg: 15.568 - Min: 7 - Max: 22

2006-04-01 09:38:29 - Oblivion (ATI save 25, HQ)
Frames: 576 - Time: 27426ms - Avg: 21.002 - Min: 17 - Max: 25

2006-04-01 09:59:42 - Oblivion (Nv Save 25, Q)
Frames: 546 - Time: 27009ms - Avg: 20.215 - Min: 10 - Max: 28

2006-04-01 10:06:57 - Oblivion (ATI Save 25, Q)
Frames: 671 - Time: 25440ms - Avg: 26.376 - Min: 20 - Max: 31

Here are a few screenshots showing the area I was testing in and the framerate at the start of the benchmark.

1. Nvidia HQ, 1920x1200, 16xAF, HDR On - http://img2.freeimagehosting.net/image.php?e38d7c07a0.jpg
2. ATI Cat AI Off, 1920x1200, 16xAF, HDR On, HQ AF off - http://img2.freeimagehosting.net/image.php?c3ffc3a6a8.jpg
3. Nvidia Q, 1920x1200, 16xAF, HDR On - http://img2.freeimagehosting.net/image.php?b8bcbe5c7e.jpg
4. ATI Cat AI On, 1920x1200, 16xAF, HDR On, HQ AF off - http://img2.freeimagehosting.net/image.php?432cf90f7d.jpg

I'll do a city walk around the block test next and maybe an inside test also to see how those compare.

Edit: Hmm, even more of a gap in the city. Remember, this is the 7800 GTX 512 vs the X1900XTX, but still, the gap between them is a little surprising to me.

2006-04-01 10:46:02 - Oblivion (Nv Save 17, HQ, CO)
Frames: 1648 - Time: 52677ms - Avg: 31.285 - Min: 17 - Max: 53

2006-04-01 11:02:27 - Oblivion (ATI Save 17 HQ CO)
Frames: 2502 - Time: 53391ms - Avg: 46.862 - Min: 26 - Max: 79

2006-04-01 10:49:51 - Oblivion (Nv Save 17, Q, CO)
Frames: 2020 - Time: 51987ms - Avg: 38.856 - Min: 20 - Max: 66

2006-04-01 11:05:17 - Oblivion (ATI Save 17 Q CO)
Frames: 2901 - Time: 52898ms - Avg: 54.841 - Min: 26 - Max: 88

1. Nvidia HQ, 1920x1200, 16xAF, HDR On - http://img2.freeimagehosting.net/image.php?c0e95415e8.jpg
2. ATI Cat AI Off, 1920x1200, 16xAF, HDR On, HQ AF off - http://img2.freeimagehosting.net/image.php?98874f797a.jpg
3. Nvidia Q, 1920x1200, 16xAF, HDR On - http://img2.freeimagehosting.net/image.php?94bc0b1a3c.jpg
4. ATI Cat AI On, 1920x1200, 16xAF, HDR On, HQ AF off - http://img2.freeimagehosting.net/image.php?974e292a5d.jpg

Sazar
04-01-06, 12:07 PM
It's probably coz you aren't running and gunning.

But as long as it's playable, what the heck right?

:)

Nice pics, is the HDR possible with AA in this game?

Sazar
04-01-06, 12:09 PM
Yeah, but who the hell likes those lame catalyst drivers and .net framework :)

It's not like people live in the console for the graphics card m8.

Q
04-01-06, 12:13 PM
OWA... are those results with SLI vs. the X1900 or just an individual card? I ask because I know you're running two of those in that rig, and that would be a bigger difference if it was SLI.

-=DVS=-
04-01-06, 12:23 PM
Well ATI wins this one hands down (wack)

OWA
04-01-06, 12:37 PM
OWA... are those results with SLI vs. the X1900 or just an individual card? I ask because I know you're running two of those in that rig, and that would be a bigger difference if it was SLI.
It was a single 7800 GTX 512 vs the X1900 XTX.

OWA
04-01-06, 12:42 PM
Nice pics, is the HDR possible with AA in this game?
No, they don't allow you to use both together.

Q
04-01-06, 12:45 PM
It was a single 7800 GTX 512 vs the X1900 XTX.

Thank sweet baby Jesus, I was worried. My other 7800 GTX gets here Monday and I was gonna poop myself if it was gonna get completely owned.

killahsin
04-01-06, 12:53 PM
Again, people keep saying ATI wins this one hands down, but there is little real-world performance difference between the two. Literally. Any site that uses this game as a benchmark in its current form is a site trying solely to promote a specific card, because the engine fluctuates so much depending not only on the area but on the constantly changing AI and physics. The only place you can actually use for a good comparison is the exit of the first dungeon, and ONLY when you exit it. There are far too many random things happening in this engine to reproduce results repeatedly to the standard expected of benchmarks. And yes, I would still be saying the same thing if the NV card came out on top, because it's the truth.

OWA
04-01-06, 01:25 PM
Thank sweet baby Jesus, I was worried. My other 7800 GTX gets here Monday and I was gonna poop myself if it was gonna get completely owned.

Here are the SLI results.

2006-04-01 13:02:30 - Oblivion (Nv Save 25, HQ, SLI)
Frames: 717 - Time: 25300ms - Avg: 28.340 - Min: 15 - Max: 38

2006-04-01 13:04:36 - Oblivion (NV Save 25, Q, SLI)
Frames: 827 - Time: 25306ms - Avg: 32.680 - Min: 18 - Max: 42

2006-04-01 13:06:46 - Oblivion (NV Save 17, HQ, SLI)
Frames: 2781 - Time: 52095ms - Avg: 53.383 - Min: 26 - Max: 87

2006-04-01 13:09:12 - Oblivion (NV Save 17, Q, SLI)
Frames: 3263 - Time: 52565ms - Avg: 62.076 - Min: 26 - Max: 106

Sazar
04-01-06, 01:48 PM
No, they don't allow you to use both together.

Those bastards (pirate)

Q
04-01-06, 01:55 PM
Thanks for those SLI results. It seems like it really helps in those save 17 benches.

Son Goku
04-01-06, 03:06 PM
I guess it's all in the eye of the beholder. I was getting 17FPS or so, sometimes less in crazy outdoor areas, using a single 6800GT and I thought it was definitely playable. Going to SLI really made it buttery smooth though...maybe they consider the absolute minimum 30 or 40FPS?

tbh, anything below 30 fps does appear to stutter to my eye... Even 29 fps is stuttering, so if it were me I'd want a higher frame rate... People's eyes do vary, however: some claim not to see much difference between 30 and 60 fps, while others claim to see differences up around 90 fps even...

However, slight variation in people's eyes isn't unknown. Just take color vision, for instance: on one end of the spectrum some people are color blind with respect to certain colors (probably pairs handled by a particular set of cones, such as red/green, which are distinguished by one set of cones), while on the other end there are people who can distinguish color to a greater than normal degree... The same probably happens here to some degree.

NoDamage
04-01-06, 04:49 PM
Again, people keep saying ATI wins this one hands down, but there is little real-world performance difference between the two. Literally. Any site that uses this game as a benchmark in its current form is a site trying solely to promote a specific card, because the engine fluctuates so much depending not only on the area but on the constantly changing AI and physics. The only place you can actually use for a good comparison is the exit of the first dungeon, and ONLY when you exit it. There are far too many random things happening in this engine to reproduce results repeatedly to the standard expected of benchmarks. And yes, I would still be saying the same thing if the NV card came out on top, because it's the truth.

Sorry, I am going to have to disagree. It is easy to figure out whether engine fluctuations are causing discrepancies simply by loading the same save multiple times and noting the average framerate for each run. As long as the average framerates are consistent then I think the results are fine. Besides, loading a save game should produce reasonably consistent results anyway because the game state is identical at the time of loading.

There is only a problem if the same hardware on the same settings produces wildly inconsistent results for each run, and such a thing would be easily noticeable. If one card gets 10 fps, 30 fps, and then 20 fps across three runs then yes there would be a problem, but I highly doubt something like that is occurring.
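To make the check concrete, here's a rough sketch in Python of the kind of consistency test I mean (the per-run averages are made up just for illustration):

# Sketch: decide whether repeated runs of the same save are consistent enough
# to compare cards. The per-run averages below are hypothetical.

def run_spread(avgs):
    mean = sum(avgs) / len(avgs)
    worst = max(abs(a - mean) for a in avgs)
    return mean, worst

runs = {
    "Card A, Save 25, HQ": [15.6, 16.1, 15.2],   # hypothetical numbers
    "Card B, Save 25, HQ": [21.0, 20.4, 21.5],   # hypothetical numbers
}

for label, avgs in runs.items():
    mean, worst = run_spread(avgs)
    verdict = "consistent" if worst < 2.0 else "too noisy to compare"
    print(f"{label}: mean {mean:.1f} fps, worst deviation {worst:.1f} fps -> {verdict}")

If one card's runs swing as wildly as 10/30/20 fps, the deviation blows past any sensible threshold and you throw the comparison out; if they stay within a frame or two, the averages are fair game.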

Shamrock
04-01-06, 10:19 PM
OWA, you benched a 7800GTX, I quoted a 7900GTX, and it got beat too!

killahsin
04-02-06, 04:02 AM
Sorry, I am going to have to disagree. It is easy to figure out whether engine fluctuations are causing discrepancies simply by loading the same save multiple times and noting the average framerate for each run. As long as the average framerates are consistent then I think the results are fine. Besides, loading a save game should produce reasonably consistent results anyway because the game state is identical at the time of loading.

There is only a problem if the same hardware on the same settings produces wildly inconsistent results for each run, and such a thing would be easily noticeable. If one card gets 10 fps, 30 fps, and then 20 fps across three runs then yes there would be a problem, but I highly doubt something like that is occurring.

Sure, that all works nicely, but there are many things in this game that are random and change between loads, including mob placement and spawning, NPC placement (of NPCs you can't see), the AI, and so forth. Nothing reproduces consistent results outside. And yes, that does occur; do a test yourself and see.

The only really consistent way to benchmark this engine is INSIDE a dungeon or inside a town building, sadly, which tends to be an issue because every card performs great inside them. Outside is EXTREMELY inconsistent; it's why you don't see hundreds of sites posting their usual 15-page "benchmark of the new uber game engine" articles.

rflair
04-02-06, 08:05 AM
Sure, that all works nicely, but there are many things in this game that are random and change between loads, including mob placement and spawning, NPC placement (of NPCs you can't see), the AI, and so forth. Nothing reproduces consistent results outside. And yes, that does occur; do a test yourself and see.

The only really consistent way to benchmark this engine is INSIDE a dungeon or inside a town building, sadly, which tends to be an issue because every card performs great inside them. Outside is EXTREMELY inconsistent; it's why you don't see hundreds of sites posting their usual 15-page "benchmark of the new uber game engine" articles.


Ya, but if results show that ATI performs better outdoors overall, then your thinking is wrong; the randomness of the outdoors, like you said, would affect both ATI and NV cards alike. The only ones who can test this are the ones with both cards, and even OWA has shown that ATI is performing better outdoors and better overall in this game.

Bah!
04-02-06, 08:57 AM
Wow, has anyone looked at the settings these guys use for their comparisons? Some of those are ridiculous to be using on this game; using 16xAF with this engine is just stupid for the performance hit/IQ ratio. It gets even worse in their mid-range IQ tests.

I'd put my IQ and FPS (6800GT) up against their "best playable settings" any day of the week... and I'd win.

I don't care if the 1900 is faster in Oblivion, those guys are idiots.

swaaye
04-02-06, 04:12 PM
I use 4X AF. Higher adds too much sharpness to distant things and makes it look like there's even more aliasing.