
View Full Version : digit life farcry 1.2 review



videocardguy
07-16-04, 04:33 PM
looks like nvidia marketing is at it again. this is getting pathetic. nvidia is hurting the games industry because it is unable to produce hardware that can compete with ati's.

http://www.ixbt-labs.com/articles2/gffx/fc12.html

the bottom part is very interesting. apparently vertex instancing is completely useless. and

"FarCry 1.2 example demonstrated that Shaders 3.0 are real and effective and that many games will be capable of their support in case of a flexible engine. Obtained results are somewhat ambiguous: on the one hand, we witnessed performance gain in several places; on the other hand, it seems that the improvement, which brought considerable rendering gain (several light sources at one pass), could have been implemented with Shaders 2.x (possibly even 2.0). Thus I would like to wish the developers not only to promote the latest technologies, but also not to forget and optimize the game performance with the existing video cards. However all performance gains in the above tests are actually due to pass reduction at rendering lighting. That is the current implementation has no merits achieved via the features of Shaders 3.0. "

now every time i read some uninformed jackass rant on about shader model 3 in the current gen i will just post this
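For readers who want to see what the quoted article means by "several light sources at one pass", here is a minimal sketch (not CryEngine's actual code; the shader sources, constant layout, and the CompileBoth helper are invented for illustration). The ps_2_0-style shader handles one light and would be drawn once per light with additive blending, while the ps_3_0 shader loops over all the lights in a single pass:

```cpp
// Minimal sketch (not CryEngine code) of the "pass reduction" idea from the
// quoted article. kPerLightPS20 shades one light and would be drawn once per
// light with additive blending (N lights = N passes); kMultiLightPS30 loops
// over all lights in a single pass. All names here are invented.
#include <cstring>
#include <d3d9.h>
#include <d3dx9.h>   // D3DXCompileShader (link d3dx9.lib)

static const char kPerLightPS20[] =
    "float4 lightPos;                                            \n"
    "float4 lightCol;                                            \n"
    "float4 main(float3 wpos : TEXCOORD0,                        \n"
    "            float3 nrm  : TEXCOORD1) : COLOR                \n"
    "{                                                           \n"
    "    float3 l = normalize(lightPos.xyz - wpos);              \n"
    "    return float4(saturate(dot(nrm, l)) * lightCol.rgb, 1); \n"
    "}                                                           \n";

static const char kMultiLightPS30[] =
    "int    numLights;                                           \n"
    "float4 lightPos[4];                                         \n"
    "float4 lightCol[4];                                         \n"
    "float4 main(float3 wpos : TEXCOORD0,                        \n"
    "            float3 nrm  : TEXCOORD1) : COLOR                \n"
    "{                                                           \n"
    "    float3 c = 0;                                           \n"
    "    for (int i = 0; i < numLights; ++i)                     \n"
    "    {                                                       \n"
    "        float3 l = normalize(lightPos[i].xyz - wpos);       \n"
    "        c += saturate(dot(nrm, l)) * lightCol[i].rgb;       \n"
    "    }                                                       \n"
    "    return float4(c, 1);                                    \n"
    "}                                                           \n";

// Compile each string against its profile; only the target differs.
bool CompileBoth(LPD3DXBUFFER* ps20, LPD3DXBUFFER* ps30)
{
    return SUCCEEDED(D3DXCompileShader(kPerLightPS20, (UINT)strlen(kPerLightPS20),
                                       NULL, NULL, "main", "ps_2_0", 0,
                                       ps20, NULL, NULL))
        && SUCCEEDED(D3DXCompileShader(kMultiLightPS30, (UINT)strlen(kMultiLightPS30),
                                       NULL, NULL, "main", "ps_3_0", 0,
                                       ps30, NULL, NULL));
}
```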

evilchris
07-16-04, 04:42 PM
looks like nvidia marketing is at it again. this is getting pathetic. nvidia is hurting the games industry because it is unable to produce hardware that can compete with ati's.

http://www.ixbt-labs.com/articles2/gffx/fc12.html

the bottom part is very interesting. apparently vertex instancing is completely useless. and

"FarCry 1.2 example demonstrated that Shaders 3.0 are real and effective and that many games will be capable of their support in case of a flexible engine. Obtained results are somewhat ambiguous: on the one hand, we witnessed performance gain in several places; on the other hand, it seems that the improvement, which brought considerable rendering gain (several light sources at one pass), could have been implemented with Shaders 2.x (possibly even 2.0). Thus I would like to wish the developers not only to promote the latest technologies, but also not to forget and optimize the game performance with the existing video cards. However all performance gains in the above tests are actually due to pass reduction at rendering lighting. That is the current implementation has no merits achieved via the features of Shaders 3.0. "

now every time i read some uninformed jackass rant on about shader model 3 in the current gen i will just post this

:lol2: :retard:

SM 3 can produce effects faster than SM 2.0 on a card which supports it. Sorry. It can also generate more complex effects. Again, sorry. Did you register just for a ban? SM 3.0 is also a MICROSOFT SPECIFICATION AS PART OF DIRECTX 9. It isn't a creation of NVIDIA's nor marketing hype. ATI is only downplaying it because they don't have it. End of story.

Clay
07-16-04, 04:44 PM
Say it with me now... F U D :D Anyone that's ever written a lick of code can clearly see the advantages of SM3.0 over v2.0. :rolleyes:

videocardguy
07-16-04, 04:46 PM
oh so digitlife is just making this all up? and crytek was lying when they said they were using STATIC branching because dynamic is too slow.....some of you need to wake up
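For reference on the static-versus-dynamic branching distinction being argued here, a hedged sketch (illustrative only, not Crytek's source; the shader source and names are made up). A branch on a uniform bool set per draw call is static flow control: every pixel in the draw takes the same path. A branch on a value computed per pixel is dynamic flow control, which ps_3_0 supports but which can be slow when neighbouring pixels diverge:

```cpp
// Hedged sketch of static vs. dynamic flow control (not Crytek's code).
// A branch on a uniform bool is static: set per draw call, same path for every
// pixel. A branch on a per-pixel value is dynamic, which ps_3_0 allows but
// which can cost performance when pixels diverge.
#include <cstring>
#include <d3d9.h>
#include <d3dx9.h>

static const char kBranchDemoPS[] =
    "sampler2D shadowMap : register(s0);                            \n"
    "bool   receiveShadows;   // uniform -> static flow control     \n"
    "float4 lightCol;                                               \n"
    "float4 main(float2 uv         : TEXCOORD0,                     \n"
    "            float  sceneDepth : TEXCOORD1) : COLOR             \n"
    "{                                                              \n"
    "    float shadow = 1.0;                                        \n"
    "    if (receiveShadows)              // static: per draw call  \n"
    "    {                                                          \n"
    "        float stored = tex2D(shadowMap, uv).r;                 \n"
    "        if (stored < sceneDepth)     // dynamic: per-pixel     \n"
    "            shadow = 0.0;                                      \n"
    "    }                                                          \n"
    "    return lightCol * shadow;                                  \n"
    "}                                                              \n";

LPD3DXBUFFER CompileBranchDemo()
{
    LPD3DXBUFFER code = NULL;
    // ps_3_0 accepts both kinds of flow control; plain ps_2_0 accepts neither.
    D3DXCompileShader(kBranchDemoPS, (UINT)strlen(kBranchDemoPS), NULL, NULL,
                      "main", "ps_3_0", 0, &code, NULL, NULL);
    return code; // NULL on compile failure
}
```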

videocardguy
07-16-04, 04:47 PM
:lol2: :retard:

SM 3 can produce effects faster than SM 2.0 on a card which supports it. Sorry. It can also generate more complex effects. Again, sorry. Did you register just for a ban? SM 3.0 is also a MICROSOFT SPECIFICATION AS PART OF DIRECTX 9. It isn't a creation of NVIDIA's nor marketing hype. ATI is only downplaying it because they don't have it. End of story.

im not saying it cant, im saying in this generation its nothing but a marketing feature. and why should i be banned? this is a very valid point.

evilchris
07-16-04, 04:53 PM
im not saying it cant, im saying in this generation its nothing but a marketing feature. and why should i be banned? this is a very valid point.

Your post reads more like a biased, fanatic flame. Tone it down a bit.
nvidia is hurting the games industry because it is unable to produce hardware that can compete with ati's.


That is not a valid point. That is a crock of BS. NVIDIA's hardware is showing itself to be superior to ATI's in a lot of respects. That's why a 150MHz slower clocked BFG 6800 GT OC can beat an X800XT in a lot of games. In fact, NVIDIA is competing SO WELL with ATI, ATI is in a frenzy trying to come up with new SKU's ( the rumored X800 GT ) as the 6800 GT is completely superior to its intended rival, the X800 Pro.

Razor1
07-16-04, 05:03 PM
im not saying it cant, im saying in this generation its nothing but a marketing feature. and why should i be banned? this is a very valid point.

Hmm, interesting. So if ATi came out with SM 3.0 support, would you say the same thing?

I'm a programmer; I have used both SM 3.0 and branching in 2.0. Branching in 2.0 is limited to certain effects; it can't do what SM 3.0 can any way you put it, unless you want to code specific branch paths for each type of shader. That just defeats the purpose of SM 3.0's ease of programming. You don't really need to worry about it too much other than a couple of calls.

Was it Humus who made the demo for SM 2.0 for branching for the number of lights? Let's see him do that with multipass reflective surfaces with raytracing and with texture lookup in a vertex shader. How about HDRL in a single pass with SM 2.0?

First off, SM 2.0 doesn't support texture lookups in vertex shaders.

The cards that are out now are very similar in performance. nV's are cheaper for comparable cards, and have more features. So why complain about SM 3.0? It's an added benefit.
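To illustrate the "texture lookup in a vertex shader" point above, a minimal sketch of vertex texture fetch, which is new in vs_3_0 (the shader source, constant names, and helper function are invented for the example; vs_2_0 has no texture instructions, so the same source fails to compile against that profile):

```cpp
// Hedged sketch of vertex texture fetch, the "texture lookup in a vertex
// shader" feature mentioned above (new in vs_3_0). Names are invented.
#include <cstring>
#include <d3d9.h>
#include <d3dx9.h>

static const char kDisplaceVS[] =
    "sampler2D heightMap : register(s0);                               \n"
    "float4x4  worldViewProj;                                          \n"
    "float     displaceScale;                                          \n"
    "void main(float4 pos : POSITION,  float2 uv : TEXCOORD0,          \n"
    "          out float4 oPos : POSITION, out float2 oUv : TEXCOORD0) \n"
    "{                                                                 \n"
    "    // Sample a height map per vertex and displace along Y.       \n"
    "    float h = tex2Dlod(heightMap, float4(uv, 0, 0)).r;            \n"
    "    pos.y  += h * displaceScale;                                  \n"
    "    oPos = mul(pos, worldViewProj);                               \n"
    "    oUv  = uv;                                                    \n"
    "}                                                                 \n";

LPD3DXBUFFER CompileDisplacementVS()
{
    LPD3DXBUFFER code = NULL;
    // Compiles with "vs_3_0"; swapping the profile for "vs_2_0" returns an error.
    D3DXCompileShader(kDisplaceVS, (UINT)strlen(kDisplaceVS), NULL, NULL,
                      "main", "vs_3_0", 0, &code, NULL, NULL);
    return code; // NULL on compile failure
}
```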

anzak
07-16-04, 05:07 PM
im not saying it cant, im saying in this generation its nothing but a marketing feature.

That is an opinion, not a fact. Everyone said the same thing about the Radeon 9700 Pro when it was released.

videocardguy
07-16-04, 05:07 PM
Hmm, interesting. So if ATi came out with SM 3.0 support, would you say the same thing?

I'm a programmer; I have used both SM 3.0 and branching in 2.0. Branching in 2.0 is limited to certain effects; it can't do what SM 3.0 can any way you put it, unless you want to code specific branch paths for each type of shader. That just defeats the purpose of SM 3.0's ease of programming. You don't really need to worry about it too much other than a couple of calls.

Was it Humus who made the demo for SM 2.0 for branching for the number of lights? Let's see him do that with multipass reflective surfaces with raytracing and with texture lookup in a vertex shader. How about HDRL in a single pass with SM 2.0?

First off, SM 2.0 doesn't support texture lookups in vertex shaders.

The cards that are out now are very similar in performance. nV's are cheaper for comparable cards, and have more features. So why complain about SM 3.0? It's an added benefit.

lets see the nv40 run those effects at playable rates in an actual game.

videocardguy
07-16-04, 05:08 PM
Your post reads more like a biased, fanatic flame. Tone it down a bit.


That is not a valid point. That is a crock of BS. NVIDIA's hardware is showing itself to be superior to ATI's in a lot of respects. That's why a 150MHz slower clocked BFG 6800 GT OC can beat an X800XT in a lot of games. In fact, NVIDIA is competing SO WELL with ATI, ATI is in a frenzy trying to come up with new SKU's ( the rumored X800 GT ) as the 6800 GT is completely superior to its intended rival, the X800 Pro.

its very valid. nvidia is paying off crytek to not include PIXEL SHADER 2.0 OPTIMIZATIONS on ati cards.

Razor1
07-16-04, 05:08 PM
lets see the nv40 run those effects at playable rates in an actual game.


LOL nice comeback, let's see ATI's do it too?

videocardguy
07-16-04, 05:09 PM
That is an opinion, not a fact. Everyone said the same thing about the Radeon 9700 Pro when it was released.

well seeing as how crytek said dynamic branching is too slow (which is the only feature of ps 3.0 that isnt in ps 2.0b; 512 instructions is more than will ever be used this or next generation) id say its more of a fact. digit life just proved that vertex instancing is doing nothing.
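For context on what the "vertex instancing" (geometry instancing) option in FarCry 1.2 refers to at the API level, here is a hedged sketch of Direct3D 9 hardware instancing via SetStreamSourceFreq, which D3D9 only exposes on vs_3_0-capable hardware; the function name and buffer layout are assumptions for illustration:

```cpp
// Hedged sketch of what "vertex instancing" (geometry instancing) means at the
// D3D9 API level: draw many copies of one mesh with a single draw call by
// stepping a second vertex stream once per instance. Names are illustrative.
#include <d3d9.h>

HRESULT DrawInstanced(IDirect3DDevice9* dev,
                      IDirect3DVertexBuffer9* meshVB, UINT meshStride,
                      IDirect3DVertexBuffer9* instVB, UINT instStride,
                      IDirect3DIndexBuffer9*  ib,
                      UINT numVerts, UINT numTris, UINT numInstances)
{
    // Stream 0: the mesh geometry, repeated numInstances times.
    dev->SetStreamSource(0, meshVB, 0, meshStride);
    dev->SetStreamSourceFreq(0, D3DSTREAMSOURCE_INDEXEDDATA | numInstances);

    // Stream 1: per-instance data (e.g. a world matrix), advanced once per instance.
    dev->SetStreamSource(1, instVB, 0, instStride);
    dev->SetStreamSourceFreq(1, D3DSTREAMSOURCE_INSTANCEDATA | 1u);

    dev->SetIndices(ib);
    HRESULT hr = dev->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0,
                                           numVerts, 0, numTris);

    // Restore the default frequencies so later draws are unaffected.
    dev->SetStreamSourceFreq(0, 1);
    dev->SetStreamSourceFreq(1, 1);
    return hr;
}
```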

anzak
07-16-04, 05:10 PM
So why complain about SM 3.0? It's an added benefit.

Thank you! SM3.0 is a benefit. People say, why buy a card with SM3.0 if it's not going to be used? Well, I say, why not. What is wrong with supporting new tech?

videocardguy
07-16-04, 05:11 PM
LOL nice comeback, let's see ATI's do it too?

ati isnt running around claiming how superior 3.0 is for this time frame. they also dont have millions of people on forums posting every other post about how good sm 3.0 is

Revgen
07-16-04, 05:11 PM
im not saying it cant, im saying in this generation its nothing but a marketing feature. and why should i be banned? this is a very valid point.

The article is half-right: one-pass per-pixel lighting can be done in PS 2.0. The problem is that it would take programmers a lot longer to implement it in PS 2.0 than in PS 3.0.

You need to remember that it took Crytek only four weeks to implement these changes. PS 3.0 gives programmers more options and flexibility when they program shaders in their games, which allows them to optimize their code in less time.

The bottom line is that SM 3.0 shaders are usually faster than SM 2.0 shaders because they are easier for programmers to code for. Sorry, but Nvidia wins this argument.

for more info about SM 3.0 go here http://www.beyond3d.com/previews/nvidia/nv40/index.php?p=6
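As a small, hedged illustration of how an engine might choose between the shader paths debated in this thread (the function name and fallback policy are assumptions for illustration, not Crytek's code):

```cpp
// Hedged sketch: query the device caps and fall back from SM 3.0 to 2.0.
#include <d3d9.h>

const char* PickPixelShaderProfile(IDirect3DDevice9* dev)
{
    D3DCAPS9 caps;
    if (FAILED(dev->GetDeviceCaps(&caps)))
        return "ps_2_0";                                   // conservative fallback

    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
        return "ps_3_0";   // single-pass lighting path, flow control available
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return "ps_2_0";   // multipass lighting path
    return NULL;           // pre-DX9 part: fixed function only
}
```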

videocardguy
07-16-04, 05:12 PM
Thank you! SM3.0 is a benefit. People say, why buy a card with SM3.0 if it's not going to be used? Well, I say, why not. What is wrong with supporting new tech?

i never said ps 3.0 was BAD (altho it is useless in this generation), i said the way nvidia is using its marketing campaign to hurt the games industry is.

evilchris
07-16-04, 05:12 PM
its very valid. nvidia is paying off crytek to not include PIXEL SHADER 2.0 OPTIMIZATIONS on ati cards.

No it is invalid. You don't know wtf NVIDIA is doing with its money. Quit spewing BS which is YOUR OPINION and labeling it AS FACT. NVIDIA's hardware competes with ATI's and in many ways SURPASSES IT. I think behind closed doors even ATI would admit this. IMO ATI right now is busy playing "catch up" in R&D.

Can you explain to me how a 370MHz GT keeps up with a 520MHz X800XT in a lot of games? Can you explain to me why the GT *smokes* the X800 Pro if NVIDIA "can't produce hardware that competes" ??

Oh, and FYI:

FARCRY IS NOT THE BE ALL END ALL OF 3D CARD BENCHMARKING

videocardguy
07-16-04, 05:13 PM
Your post reads more like a biased, fanatic flame. Tone it down a bit.


That is not a valid point. That is a crock of BS. NVIDIA's hardware is showing itself to be superior to ATI's in a lot of respects. That's why a 150MHz slower clocked BFG 6800 GT OC can beat an X800XT in a lot of games. In fact, NVIDIA is competing SO WELL with ATI, ATI is in a frenzy trying to come up with new SKU's ( the rumored X800 GT ) as the 6800 GT is completely superior to its intended rival, the X800 Pro.

which area is it superior in? i cant think of one. it has better drivers, but thats not hardware is it

evilchris
07-16-04, 05:13 PM
i never said ps 3.0 was BAD (altho it is useless in this generation), i said the way nvidia is using its marketing campaign to hurt the games industry is.

YOU don't know if it is useless or not. Why do you think it is? BECAUSE ATI TOLD YOU SO? It's only "useless" in their MARKETING HYPE because THEY DON'T HAVE IT.

anzak
07-16-04, 05:13 PM
ati isnt running around claiming how superior 3.0 is for this time frame. they also dont have millions of people on forums posting every other post about how good sm 3.0 is

No, because no programs take advantage of it yet. In 6 months we might have games that support SM3.0; if not, who cares. Since the GeForce 6800 is every bit as fast, if not faster, than the Radeon X800 in PS2.0, what's wrong with supporting SM3.0 whether it will be used or not?

evilchris
07-16-04, 05:14 PM
which area is it superior in? i cant think of one. it has better drivers, but thats not hardware is it

Game performance. Compatibility. AVAILABILITY. Still waiting to see an X800 Pro at a B&M. I see 100's of GTs around town.

Razor1
07-16-04, 05:15 PM
ati isnt running around claiming how superior 3.0 is for this time frame. they also dont have millions of people on forums posting every other post about how good sm 3.0 is

OK, so it can't do it, right?

Also, if the card can't do it, how can a programmer make software for it? Have you looked at the new games out today? They are all CPU limited. Forget the graphics cards; you need a high-end CPU to fully utilize the 6800's or the X800's true potential. So the cards are running Far Cry at, what, 60 fps? The minimum is 30 fps. That means nV's cards will be just fine for next-gen games too; they will last for 2 years. Just like the ATi 9700.

And don't say the 9700 didn't last 2 years.

videocardguy
07-16-04, 05:15 PM
No it is invalid. You don't know wtf NVIDIA is doing with its money. Quit spewing BS which is YOUR OPINION and labeling it AS FACT. NVIDIA's hardware competes with ATI's and in many ways SURPASSES IT. I think behind closed doors even ATI would admit this. IMO ATI right now is busy playing "catch up" in R&D.

Can you explain to me how a 370MHz GT keeps up with a 520MHz X800XT in a lot of games? Can you explain to me why the GT *smokes* the X800 Pro if NVIDIA "can't produce hardware that competes" ??

Oh, and FYI:

FARCRY IS NOT THE BE ALL END ALL OF 3D CARD BENCHMARKING

the gt doesnt keep up with the xt in any games except old opengl games. when you bench actual game play, and not demos which have all sorts of optimizations, the pro is usually matching the ultra. the xt just blows it away.

lets see cuz it has 4 more pixel pipelines and more memory bandwidth?

videocardguy
07-16-04, 05:17 PM
where are you guys getting that the 6800 ultra is faster than the xt?? the xt is the fastest card available....

anzak
07-16-04, 05:18 PM
nvidia is using its marketing campaign to hurt the games industry is.

Hurt the game industry? How the **** is it hurting the game industry? nVidia is only giving people options! So what if it's not used; ATI doesn't have it. WHAT IS YOUR POINT!