
View Full Version : So whats better?



ATi
05-22-04, 02:12 PM
What's better, a HIS X800XT or a Gainward PowerPack Ultra/2600 Golden Sample "6800"? Anyone know where I can pre-order a Gainward Ultra/2600?

Kamel
05-22-04, 02:20 PM
your name and avatar confuse the hell out of me...

it depends on what, really. afaik they are nearly the same (sometimes the x800 is better, sometimes the 6800 is better). the 6800 does have some other features that the ati doesn't have, but ati's x800 is out, so no wait. the 6800 is better in opengl games, so if you're a linux user the choice is easy.

in other words, that's for you to find out ;/. this thread is a duplicate btw, do a search for x800 2600 i'm sure you'll find your answer repeated a million times, lol.

D.K.Tronics
05-22-04, 02:24 PM
The Gainward PowerPack Ultra/2600 Golden Sample ?
You mean, like this ?

http://www.atariage.com/images/general/icon_2600.gif

Only now, it's in gold, lol. :p

Sorry, just messing around. ;)

Caliche
05-22-04, 02:55 PM
Both are about the same performance-wise, but from my view Nvidia is a better holdover card.

Caliche

CaiNaM
05-22-04, 03:24 PM
Both are about the same performance-wise, but from my view Nvidia is a better holdover card.

Caliche

statements like these are silly.. why is it a "better hold over card"? because it supports sm3? was the fx5800 a "better hold over card" because it supported sm2? obviously not.

the fact is there is no right answer. each has their own strengths and weaknesses. 6800's have a better feature set, but they're not working yet. performance may or may not get better with newer drivers. far cry (a twimtbp game) has some serious performance/iq issues with the early 6800 software/hardware. the x800's seem to hold a distinct advantage at hi res/aa/af. will this continue? no one can say.

if you're really THAT worried about which is better, then hold off until shipping cards and drivers are avail from nvidia. at that time you should have a better picture, tho the effectiveness of sm3 feature set may still be vague. hopefully far cry will shed some light on this soon.

personally i don't think anyone will be disappointed in the performance of either card, but at this point it's simply too early to tell whether one will continue to have much of an advantage over the other.

jimmyjames123
05-22-04, 04:30 PM
statements like these are silly.. why is it a "better hold over card"? because it supports sm3? was the fx5800 a "better hold over card" because it supported sm2? obviously not.

The FX5800 supported SM 2.0, but the R3xx cards also supported SM 2.0, and had much faster pixel shader 2.0 speed in comparison. On the other hand, SM 3.0 is the standard that the entire industry is moving towards, and many developers are embracing this technology as we speak.

the fact is there is no right answer. each has their own strengths and weaknesses. 6800's have a better feature set, but they're not working yet. performance may or may not get better with newer drivers.

I agree that each has its own strengths and weaknesses. However, the NV4x's SM 3.0 feature set cannot be exposed until DirectX 9.0c is out. Shouldn't be long now though. As for driver issues, this is also valid at the moment. The current 6x.xx series Forceware drivers are somewhat raw at the moment, and they are optimizing for what is in many ways an entirely new architecture, so we can only hope for the best here.

far cry (a twimtbp game) has some serious performance/iq issues with the early 6800 software/hardware.

This is also true at the moment, and it is clear that the current beta Forceware 6x.xx drivers have some major bugs with FarCry. At the same time, we also know that the NV driver team is making this a top priority, and CryTek will also be releasing a SM 3.0 add-on, presumably sometime this summer.

the x800's seem to hold a distinct advantage at hi res/aa/af. will this continue? no one can say.

That statement is not entirely accurate. The X800XT does have the advantage over the 6800U in some games at 1600x1200 with 4xAA/8xAF, but the 6800U also has the advantage in other games. Also, the 6800GT generally seems to be slightly faster than the X800Pro. Finally, let's not forget that some reviews compared the NV cards using full trilinear filtering vs ATI's optimized approach.

Skynet
05-22-04, 04:36 PM
Finally, let's not forget that some reviews compared the NV cards using full trilinear filtering vs ATI's optimized approach.

heh um yea some of us may have heard of this issue LOL.

Honestly the best answer is to wait. The 6800's are not even in peoples hands yet. In 2-3 months things will be a lot clearer.

jimmyjames123
05-22-04, 04:40 PM
I agree. I don't think it is really worthwhile preordering any of these products, unless you absolutely must have the product in your hands asap. Shortly after they are released, prices should naturally go down anyway, because of price competition between vendors.

CaiNaM
05-22-04, 06:56 PM
The FX5800 supported SM 2.0, but the R3xx cards also supported SM 2.0, and had much faster pixel shader 2.0 speed in comparison. On the other hand, SM 3.0 is the standard that the entire industry is moving towards, and many developers are embracing this technology as we speak.

which, as i stated means nothing at this point. we have no idea what kind of performance enhancements, if any, will result, nor what advantages, if any, the upcoming sm3 games will see in performance, iq, or visual enhancements. it could be useful, it could also be useless. either way, anything we might envision is simply a guess at this point.

it's like dx9/sm2 was on fx5800 - useless as the 5800 simply did not have the power.

I agree that each has its own strengths and weaknesses. However, the NV4x's SM 3.0 feature set cannot be exposed until DirectX 9.0c is out. Shouldn't be long now though. As for driver issues, this is also valid at the moment. The current 6x.xx series Forceware drivers are somewhat raw at the moment, and they are optimizing for what is in many ways an entirely new architecture, so we can only hope for the best here.

again, as i stated previously, something we can only speculate on at this point.

This is also true at the moment, and it is clear that the current beta Forceware 6x.xx drivers have some major bugs with FarCry. At the same time, we also know that the NV driver team is making this a top priority, and CryTek will also be releasing a SM 3.0 add-on, presumably sometime this summer.

there seems to be a theme, eh? once again, speculative at best.. we'll know more in a few months, but at this time, it's all conjecture. i hope (and assume) this will be the case as i want an nv40 to go along with my r420.

That statement is not entirely accurate. The X800XT does have the advantage over the 6800U in some games at 1600x1200 with 4xAA/8xAF, but the 6800U also has the advantage in other games. Also, the 6800GT generally seems to be slightly faster than the X800Pro. Finally, let's not forget that some reviews compared the NV cards using full trilinear filtering vs ATI's optimized approach.

i see it as entirely accurate. the majority of times w/ high res/aa/af the x800xt wins - the pro even garners some wins. will this hold true? well, i'd like to think that due to the new nv architecture, nv drivers have more headroom for improvement, but again, that's speculation (tho it's a logical assumption).

you would also think that yields will be better and there's more headroom for clockspeeds as well, however early reports of nv40 not being able to attain "extreme" speeds with standard cooling are disappointing. perhaps this will change.. but at this time, there doesn't seem to be much headroom for increased ultra core speeds.

i agree somewhat on the trilinear optimization (tho many were benched with nvidia running similar optimizations, as the "opt off" setting was broken in the forceware drivers).

Kamel
05-22-04, 08:37 PM
I agree. I don't think it is really worthwhile preordering any of these products, unless you absolutely must have the product in your hands asap. Shortly after they are released, prices should naturally go down anyway, because of price competition between vendors.


not true, if you have linux the choice is a total no-brainer. ati drivers cause about a 25-40% slow down in linux.

buuut, i agree to wait. the prices will drop quickly after they stop selling out every time they get in stock. i say give it till right after christmas time and prices will be mint.

SH64
05-22-04, 08:46 PM
I'd go for Gainward's 6800U .. looks tempting to me :drooling:
& overclocking-wise too.

Nv40
05-22-04, 08:46 PM
The Gainward PowerPack Ultra/2600 Golden Sample ?
You mean, like this ?

http://www.atariage.com/images/general/icon_2600.gif

Only now, it's in gold, lol. :p

Sorry, just messing around. ;)


hehe.. that was a cool machine.. :D
i find it amazing how much we enjoyed game graphics at that time :)

D.K.Tronics
05-22-04, 09:09 PM
hehe.. that was a cool machine..

And I thought I was the old(er) git around here ;)


i find it amazing how much we enjoyed game graphics at that time

When I saw 2600 Asteroids, I nearly crapped myself. lol.
But then, I also remember seeing E.T. :eek: And I was scarred for life.
I wonder if it's true that they buried a few thousand of the buggers ?

Riptide
05-22-04, 09:20 PM
The rumor about them burying all those ET cartridges is true. I think I saw something about it over on snopes.

NightFire
05-22-04, 10:41 PM
Two words can effectively describe ET on the Atari 2600:
Hopping penis

Skynet
05-22-04, 10:56 PM
At one time Atari made more ET games than game consoles in use, so to sell every one of them, all 2600 users had to buy at least one. Very smart marketing there. And we wonder why Atari the hardware company is not around anymore. (I have a 2600 in the box unopened)

I know this has been asked a billion times, but what is the ETA of the 6800, any version?

Clay
05-22-04, 11:55 PM
(I have a 2600 in the box unopened)

Seriously? I mean I believe you but I'd die to see a pic of it if you care to entertain. :) How did you get it? Is it rare to own an unopened 2600? I had one when I was about six...played the crap out of that thing. I loved Yars Revenge, River Raid and Pitfall.

Sorry to get OT guys...wood grain, manual switches and single button joysticks...those were the days. :D

jimmyjames123
05-23-04, 03:46 AM
which, as i stated means nothing at this point.

Whether you like it or not, SM 3.0 is for real, and it is the standard that the entire industry is moving towards. That is hardly comparable to SM 2.0 support for the 5800 FX cards! What is very obvious here is that NV arguably has the leg up with respect to advanced featureset that is the basis for the next DirectX update. Developers are embracing this technology as we speak. ATI's next gen hardware will have full SM 3.0 support. This is hardly comparable to SM 2.0 support on the FX 5800, because the FX cards did not enjoy any tangible advantage in SM 2.0 support vs the R3xx cards.

it's like dx9/sm2 was on fx5800 - useless as the 5800 simply did not have the power.

No, this situation is hardly applicable at all here. The 6800 has very fast pixel shader 2.0 speed. SM 3.0 is intended to enhance performance relative to SM 2.0, and is intended to help developers too with respect to ease of programming.

again, as i stated previously, something we can only speculate on at this point.

I can pretty much guarantee, using common sense, that the experience in FarCry will be enhanced once DirectX 9.0c and the add-on for FarCry are released and once the Forceware drivers become more mature. There would be no reason to code an add-on otherwise. Let's revisit this "pure speculation" in two months time and see what happens.

i see it as entirely accurate. the majority of times w/ high res/aa/af x800t wins - the pro even garners some wins. will this hold true? well, i'd like to think that due to the new nv architecture, nv drivers have more headroom for improvement, but again, that's speculation (tho it's a logical assumption).

Your statement was not accurate, for the reasons I explained above. It depends entirely on what game was tested, and what settings were used in the testing. See MikeC's chart on performance victories for reference. I'd say that the X800XT is generally slightly faster than the 6800U, and the 6800GT is generally slightly faster than the X800Pro, but then again it depends entirely on what game is being tested and what settings are being used. Some reviews also compared ATI's optimized filtering to NV's trilinear filtering with optimizations off. Also be careful about comparing "R420 vs NV40" in general terms because performance differences will depend entirely on what model is being compared within each lineup.

CaiNaM
05-23-04, 04:57 AM
Whether you like it or not, SM 3.0 is for real, and it is the standard that the entire industry is moving towards. That is hardly comparable to SM 2.0 support for the 5800 FX cards! What is very obvious here is that NV arguably has the leg up with respect to advanced featureset that is the basis for the next DirectX update. Developers are embracing this technology as we speak. ATI's next gen hardware will have full SM 3.0 support. This is hardly comparable to SM 2.0 support on the FX 5800, because the FX cards did not enjoy any tangible advantage in SM 2.0 support vs the R3xx cards.

lol.. it has nothing to do with what i like. it has everything to do with what it's capable of - which is something we don't know at this point. doesn't matter if the competition can do it or not. the 6800 hasn't proven it can. that's the point.

maybe it will, or maybe won't; we'll just have to wait and see.

No, this situation is hardly applicable at all here. The 6800 has very fast pixel shader 2.0 speed. SM 3.0 is intended to enhance performance relative to SM 2.0, and is intended to help developers too with respect to ease of programming.

how so? intent is great, but it doesn't prove anything. the proof is "in the pudding" as they say, and the pudding hasn't been served.

I can pretty much guarantee, using common sense, that the experience in FarCry will be enhanced once DirectX 9.0c and the add-on for FarCry are released and once the Forceware drivers become more mature. There would be no reason to code an add-on otherwise. Let's revisit this "pure speculation" in two months time and see what happens.

see..that's it.. you can't PROVE anything, and neither can I. your last sentence is exactly what i've been saying - we'll just have to see what happens :)

Your statement was not accurate, for the reasons I explained above.

the reasons you used prove nothing.

It depends entirely on what game was tested, and what settings were used in the testing. See MikeC's chart on performance victories for reference.

you can't really use it as reference for anything - there's no consistency.

I'd say that the X800XT is generally slightly faster than the 6800U, and the 6800GT is generally slightly faster than the X800Pro, but then again it depends entirely on what game is being tested and what settings are being used.

there is a trend. dx9 games with high aa/af run faster for the most part on ati. just aa runs a bit faster on nv for the most part (when using the terms ati/nv or nv40/r420, you'll just have to use your head and keep em in the same class; pro vs gt for example). the reason for this is based in the architecture, where nv40 has to share alu when texture filtering is applied, impacting performance more than r420, which uses separate units.
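The shared-ALU argument above can be sketched as a toy Python cost model. All the numbers and the `frame_cost` function are invented for illustration; real GPUs are far more complicated, and this only shows the shape of the claim, not a measurement of either chip:

```python
# Toy throughput model of the shared- vs separate-unit argument.
# Purely illustrative: op counts are made up.

def frame_cost(shader_ops, filter_ops, shared_units):
    """Cycles per pixel under a crude serial-vs-parallel model."""
    if shared_units:
        # Filtering competes with shader math for the same ALUs,
        # so the costs add up (the nv40-style claim).
        return shader_ops + filter_ops
    # Separate units can overlap, so the slower side dominates
    # (the r420-style claim).
    return max(shader_ops, filter_ops)

# A shader-heavy pixel with heavy (8xAF-style) filtering enabled:
print(frame_cost(20, 12, shared_units=True))   # 32 cycles
print(frame_cost(20, 12, shared_units=False))  # 20 cycles
```

Under this cartoon model, turning on heavy filtering costs the shared design 12 extra cycles and the separate design nothing, which is the shape of the trend described above; whether that is actually what the benchmarks were measuring is, as the thread itself keeps concluding, speculation.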

Some reviews also compared ATI's optimized filtering to NV's trilinear filtering with optimizations off.

also keep in mind most used the newer beta nv driver, where opt off was broken in forceware, resulting in brilinear filtering.

Also be careful about comparing "R420 vs NV40" in general terms because performance differences will depend entirely on what model is being compared within each lineup.

again, most here are aware enough to distinguish appropriate models when comparing, as they generally follow the same pattern when compared across comparable models. the one downside is the lack of gt parts for comparison.

again, everything is speculation when it comes to nv40 and sm3. let's hope flat rocks are not the only "improvements" offered by the new far cry patch :p

jimmyjames123
05-23-04, 02:57 PM
maybe it will, or maybe won't; we'll just have to wait and see.

Let's put it more clearly: there is little chance that SM 3.0 support will hurt the 6800 cards. I think 3dCenter said it best: NV is giving the option of using SM 3.0, while ATI is not. It is always good to have options. This is also an "option" that the entire industry is moving towards. There are already at least about a dozen games that will be using SM 3.0, as you can see from the NV launch details. This is not speculation, this is fact. It is just a matter of time really.

how so? intent is great, but it doesn't prove anything. the proof is "in the pudding" as they say, and the pudding hasn't been served.

The 6800GT/U PS 2.0 speed has been well documented and tested, and we can see that these cards have very fast PS 2.0 speed in general. That is already a given. SM 3.0 is designed to improve efficiency in certain cases vs SM 2.0, in terms of performance and ease of programming. CryTek was able to code a SM 3.0 add-on to FarCry in only 3 weeks! The CEO of CryTek has stated himself that the use of SM 3.0 in FarCry was used primarily to improve performance given a set of effects relative to SM 2.0. Sure, the pudding has not been served, but it will be soon enough. Place your bets now ;)
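The efficiency claim can be illustrated with a toy Python cost model (all op counts invented). SM 2.0 pixel shaders had no dynamic branching, so a conditional effect was typically evaluated for every pixel and the unwanted result discarded, while SM 3.0 lets the shader skip the expensive path outright:

```python
# Toy cost model (invented op counts) of SM 3.0 dynamic branching.

EXPENSIVE_OPS = 40   # hypothetical cost of the fully-lit shading path
CHEAP_OPS = 4        # hypothetical cost of the early-out path

def sm2_cost(lit_pixels, total_pixels):
    # No dynamic branching: every pixel pays for both paths and
    # the result is selected afterwards.
    return total_pixels * (EXPENSIVE_OPS + CHEAP_OPS)

def sm3_cost(lit_pixels, total_pixels):
    # Dynamic branch: only lit pixels run the expensive path.
    dark_pixels = total_pixels - lit_pixels
    return lit_pixels * EXPENSIVE_OPS + dark_pixels * CHEAP_OPS

total, lit = 1_000_000, 250_000
print(sm2_cost(lit, total))  # 44000000 ops
print(sm3_cost(lit, total))  # 13000000 ops
```

Whether real-world FarCry performance would actually follow this model was, at the time of the thread, exactly the open question being argued.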

the reasons you used prove nothing.

Well, if one doesn't look at MikeC's consolidated reviewer data with a blind eye, then one will see that your statement was not entirely accurate.

you can't really use it as reference for anything - there's no consistency.

This is just consolidation of general reviewer results. More reviewers than not have the X800XT and 6800U trading off victories in benchmarks using 4xAA/8xAF at 1600x1200, and it depends entirely on what game is being tested.

there is a trend. dx9 games with high aa/af run faster for the most part on ati.

LOL! What is your sample size, two games, FarCry and Tomb Raider AOD? I think it is a bit premature right now to declare a winner in DirectX 9.0 games, especially considering how buggy the NV cards are in FarCry. Did you know that FarCry in its current form uses primarily PS 1.1 shaders, with only relatively few PS 2.0 shaders?

the reason for this is based in the architecture, where nv40 has to share alu when texture filtering is applied, impacting performance more than r420, which uses separate units.

I am well aware of this argument being made at B3D for differences in the AF performance. This is somewhat speculative as well, as we have no good way of knowing how much of the performance difference is due to an issue such as this. I could just as easily argue that the differences in AF performance are due to ATI's optimized filtering approach. I don't think that this issue is as simple as it sounds, and that's why you see the NV cards trading off victories with the ATI cards in many games.

also keep in mind most used the newer beta nv driver, where opt off was broken in forceware, resulting in brilinear filtering.

Some reviewers did use the 61.11 drivers, some reviewers did not. Also, several reviewers tested both the 60.72 and 61.11's. In more games than not, the 61.11's had little to no performance boost. In fact, in some games, performance decreased relative to 60.72. It was mainly in FarCry that the 61.11 drivers resulted in very noticeable gains. On the ATI cards, performance may also have been dependent on whether or not the reviewer set AF through control panel or through application. Anyway, using NV's trilinear optimizations turned on, vs ATI's standard optimizations that cannot be turned off, is arguably an appropriate approach.

again, most here are aware enough to distinguish appropriate models when comparing, as they generally follow the same pattern when compared across comparable models.

And again, this is somewhat simplistic. The X800XT is a 16 pipeline part, while the X800Pro is a 12 pipeline part. The 6800U and 6800GT are 16 pipeline parts. The 6800GT also has higher memory clocks than the X800Pro, a situation that is not true when comparing X800XT to 6800U. If you look at the Shadermark PS 2.0 tests, you will see that the X800XT generally has slightly faster performance than the 6800U, while the 6800GT generally has slightly faster performance than the X800Pro.

I realize that you are the owner of a shiny new X800Pro, but really it is not logical to argue that the 6800 cards are not more futureproof than the X800 cards, considering that the entire industry is moving towards SM 3.0. Take a look at MSFT's roadmap and ATI's roadmap moving forward, and this becomes quite clear.

Blacklash
05-23-04, 05:07 PM
It depends on which reviewers you listen to.

If you read them all and feel for a middle ground, you will find this round settles squarely at user preference. Both cards will play anything you want at good frame rates. So the answer depends on what the end user values. Both cards are good investments. "Better" is relative to what you value and who you ask.

reever2
05-23-04, 05:09 PM
CryTek was able to code a SM 3.0 add-on to FarCry in only 3 weeks!

And this is supposed to be an amazing feat when they already have an installed Sm2.0 code base to work with?

I think it is a bit premature right now to declare a winner in DirectX 9.0 games, especially considering how buggy the NV cards are in FarCry.

Usually when you fix bugs which disabled effects you don't gain performance...

Did you know that FarCry in its current form uses primarily PS 1.1 shaders, with only relatively few PS 2.0 shaders?

No, FarCry in its current form on an NV40 is running less PS 2.0 code; there is a difference. Do you expect performance to jump up by a noticeable amount when using code that runs slower?

Blacklash
05-23-04, 05:11 PM
I know this has been asked a billion times, but what is the ETA of the 6800, any version?

If we are lucky, some vendors like EVGA (6800/6800U) will ship the 29th of May. Everyone should be on the same page by June 7th. Unless things change again :D

Lfctony
05-23-04, 05:29 PM
I think Albatron will ship next week.

TheTaz
05-23-04, 05:47 PM
As others have said, I don't think you can go wrong with either card.

For me, it will boil down to price / performance.

A small advantage that ATi has: since their cards are already available, and some people are buying the Pros ABOVE the $399 MSRP, it will be easier for ATi and its partners to drop prices when nVidia's cards do hit the market.

I would like to thank all the "gimmie" people that rush out and buy a new card over suggested retail price. They pay for my discounts! :D :p

Hehehehe,

Taz