
View Full Version : CryTek Comments on Shader 3.0



Nv40
04-29-04, 06:01 PM
nice interview.. it clears up the misunderstandings that may still exist about SM3.0 and Far Cry. the answers come straight from the developers this time :)


1) As a developer, what are the most convincing arguments for the use of Shader 3.0 over 2.0?

· In the VS3.0 shader model it is actually possible to support general displacement mapping (with smart shader design, the vertex shader has something to do while waiting for texture access). (See the first sketch after this list.)

· In PS3.0 shaders it’s possible to decrease the number of shaders using dynamic branching (a single shader for general lighting) and in that way decrease the number of shader switches and passes, and as a result increase speed; we can also utilize dynamic conditional early reject in some cases in both PS and VS, and this will also increase speed. On NV40 it is generally possible to use co-issue better to take advantage of the super-scalar architecture (we can execute 4 instructions per cycle in a single pipeline). (See the second sketch after this list.)

· We can handle several light sources in a single pixel shader by using dynamic loops in PS3.0 (also shown in the second sketch).

· We can decrease the number of passes for 2-sided lighting using the additional face register in PS3.0 (see the third sketch after this list).

· We can use geometry instancing to decrease the number of draw calls (removing CPU limitations as much as possible).

· We can use unrestricted dependent texture read capabilities to produce more advanced post-processing effects and other complex in-game particle/surface effects (like water).

· We can use full swizzle support in PS3.0 to enable better instruction co-issue and, as a result, speed up performance.

· We can take advantage of the 10 texture interpolators in the PS3.0 shader model to reduce the number of passes in some cases.
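
To make the displacement-mapping point concrete, here is a minimal vs_3_0 HLSL sketch. This is not CryEngine code; every name in it (g_heightMap, g_worldViewProj, g_displaceScale) is made up for illustration. VS3.0 adds vertex texture fetch via tex2Dlod, which is what makes general displacement mapping possible in the vertex shader:

// Minimal sketch; all names are illustrative, not CryEngine's.
// Assumption: on NV40 the vertex texture uses a supported format
// such as R32F.
sampler2D g_heightMap;       // hypothetical height texture
float4x4  g_worldViewProj;   // hypothetical combined transform
float     g_displaceScale;   // hypothetical displacement amount

void DisplaceVS(float4 pos    : POSITION,
                float3 normal : NORMAL,
                float2 uv     : TEXCOORD0,
                out float4 oPos : POSITION,
                out float2 oUv  : TEXCOORD0)
{
    // tex2Dlod is the vertex texture fetch VS3.0 adds; it has high
    // latency, which is why the interview mentions giving the vertex
    // shader other work to do while waiting for the texture access.
    float h = tex2Dlod(g_heightMap, float4(uv, 0, 0)).r;
    float4 displaced = pos + float4(normal * h * g_displaceScale, 0);
    oPos = mul(displaced, g_worldViewProj);
    oUv  = uv;
}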
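
And a minimal ps_3_0 sketch of the dynamic-loop/dynamic-branch idea (again, the constant names and attenuation formula are made up for illustration): one shader walks a light list set by the application, instead of compiling and switching between a separate shader per light count:

// Minimal sketch, not CryEngine code; names are hypothetical.
int    g_numLights;        // set by the application each frame
float3 g_lightPos[8];
float3 g_lightColor[8];

float4 GeneralLightingPS(float3 worldPos : TEXCOORD0,
                         float3 normal   : TEXCOORD1) : COLOR
{
    float3 result = 0;
    // Dynamic loop: one shader handles any light count, so fewer
    // shader switches and passes (the "single shader for general
    // lighting" idea above).
    for (int i = 0; i < g_numLights; i++)
    {
        float3 toLight = g_lightPos[i] - worldPos;
        float  atten   = saturate(1.0 - length(toLight) * 0.01);
        // Dynamic conditional early reject: skip lights that
        // cannot contribute to this pixel.
        if (atten > 0)
        {
            float ndotl = saturate(dot(normal, normalize(toLight)));
            result += g_lightColor[i] * ndotl * atten;
        }
    }
    return float4(result, 1);
}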
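
Finally, a sketch of the PS3.0 face register: the VFACE input is positive on front faces and negative on back faces, so a single pass can light both sides of thin geometry (the light-direction constant here is hypothetical):

// Sketch of the PS3.0 face register; g_lightDir is illustrative.
float3 g_lightDir;   // assumed normalized, pointing toward the light

float4 TwoSidedPS(float3 normal : TEXCOORD0,
                  float  face   : VFACE) : COLOR
{
    // VFACE is positive on front faces and negative on back faces,
    // so one pass lights both sides instead of two passes.
    float3 n = normalize(normal) * sign(face);
    return float4(saturate(dot(n, g_lightDir)).xxx, 1);
}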


http://www.pcper.com/article.php?aid=36

creedamd
04-29-04, 06:02 PM
7) What aspects of the screenshots seen at the launch event are specific examples of the flexibility and power of Shader 3.0?

In the current engine there is no visible difference between PS2.0 and PS3.0. PS3.0 is used automatically for per-pixel lighting, depending on certain conditions, to improve rendering speed.

creedamd
04-29-04, 06:03 PM
I still think they tried to pull a fast one with those 1.1 vs 3.0 shots. I can't believe Pelly fell for it.

Nv40
04-29-04, 06:08 PM
just remember that Far Cry SM3.0 is a mod.. not a new game..
so they are not going to fully exploit its potential there.. only some enhancements here and there. Most likely in their next project they will. An NV40 will run Far Cry on the SM3.0 path.. which will not be available until DirectX 9.0c is officially released this summer.

creedamd
04-29-04, 06:09 PM
I think PS3.0 may become important, but it will probably be for stuff that gamers won't see. I really don't think it is going to give you a huge leap in fps or anything like that.

walkndude
04-29-04, 06:11 PM
we get the point digi.. erm, uhm, creed...

creedamd
04-29-04, 06:14 PM
we get the point digi.. erm, uhm, creed...

you got a mouse in your pocket?

CaiNaM
04-29-04, 06:17 PM
I still think they tried to pull a fast one with those 1.1 vs 3.0 shots. I can't believe Pelly fell for it.

lol. yes, despite the fact that, along with the screenshots, nvidia explicitly stated the comparison was between 1.1 and 3.0 shaders.

Recently there’s been some controversy surrounding a set of screenshots NVIDIA sent to the press involving shader model 3.0 and Far Cry. In the screenshots, NVIDIA provides a comparison of 1.1 shaders to 3.0 shaders. According to NVIDIA: “The following before-and-after images are from the CryEngine. Developer Crytek uses Shader Model 3.0 techniques (vs. 1.x shaders) to add more depth and realism to the scenes. Notice the more realistic look and feel when SM 3.0 is applied to the bricks and stones that make up the staircase. In the scene featuring the Buddha, the full image comes to life with the use of SM 3.0, with the technique applied to multiple objects.” (Editor’s Note: The preceding quote came directly from NVIDIA’s email to members of the press which accompanied the screenshots) :

http://firingsquad.com/hardware/far_cry_nvidia/

creedamd
04-29-04, 06:23 PM
lol. yes, despite the fact that, along with the screenshots, nvidia explicitly stated the comparison was between 1.1 and 3.0 shaders.

http://firingsquad.com/hardware/far_cry_nvidia/

they said that in an email after the lan event. if I'm not mistaken, Brian Burke (nvidia PR) said that it was indeed ps2.0 vs ps3.0. The email was to smooth things over. But too many nvidiots had plastered the pics all over the web saying "PS3.0 ROXORS YOUR SOXORS!! N YO FACE ATI!!" It was hilarious.

walkndude
04-29-04, 06:26 PM
I pointed that quote out here awhile back but was ignored...

It is still misleading though: in Far Cry, if you turn shadows and lighting to "high" you're still in 1.x land, and that still looks miles better than the comparison screens.

Still a mountain out of a molehill situation either way.

creedamd
04-29-04, 06:26 PM
From here:
http://www.gamers-depot.com/events/nvidia/6800_launch/002.htm

Pixel Shader 3.0 is probably what we considered the single most stunning change in the way that graphics are delivered to you, the gamer—NVIDIA’s CEO made it clear during his time on stage that the company’s focus continues to be on beauty and speed—and Pixel Shader 3 comes through in a big way. Examples of this last night were titles like Far Cry, which is a flat gorgeous title as it stands now—but with NVIDIA’s new Pixel Shader added into the mix, what was stunning to look at before became awe-inspiring—from any angle and distance, bump-mapping, texturing, real-time shadows, and water reflections look flawless and most definitely run smoother. The folks at Crytek, who also joined us at the event, were so impressed with NVIDIA’s new Pixel Shader that they dropped a bomb on stage last night and announced that they were releasing a mod in the very near future for Far Cry, which allows the entire game to be run using NVIDIA’s new Pixel Shader 3 technology. When asked how long it took to convert the entire Far Cry game engine to utilize Pixel Shader 3, Crytek’s answer was “3 weeks.” Unconfirmed reports have us looking for that conversion mod sometime this June.

Comedy folks!

CaiNaM
04-29-04, 06:30 PM
they said that in an email after the lan event. if I'm not mistaken, Brian Burke (nvidia PR) said that it was indeed ps2.0 vs ps3.0. The email was to smooth things over. But too many nvidiots had plastered the pics all over the web saying "PS3.0 ROXORS YOUR SOXORS!! N YO FACE ATI!!" It was hilarious.

watch the presentation video.. he never said that at all.. although he also didn't state it was 1.1 either...

creedamd
04-29-04, 06:34 PM
watch the presentation video.. he never said that at all.. although he also didn't state it was 1.1 either...

http://www.bit-tech.net/feature/43/

They fell for it too.. the point is they wanted to "wow" people and knew that they had to be deceptive to do so. They wow'd people, but then it ran out of gas because there was no foundation. Looks shady to me: why show a bunch of slides comparing the specs of ps2.0 vs ps3.0, then show screenshots of something different? Yeah, that's who I want to believe.

Jarred
04-29-04, 06:34 PM
Why would pixel shader 3.0 be a BAD feature to have?

I'm sick of the same issue coming up. ps3.0 has its advantages, and you will probably see those advantages within a year, so most people that want a card to last them at least a year will benefit from buying a card that supports ps3.0.

I haven't seen too many posts of people claiming it's the greatest **** since sliced bread, it's just a GOOD thing to have.

Does this mean that cards that don't support ps3.0 are going to suck donkey balls? my guess is, ...no.

so, please, let's stop trying to downplay or overplay PS3.0 :rolleyes:

walkndude
04-29-04, 06:35 PM
Is there any way to block quotes from users on your ignore list? I could have sworn it happened automatically before the last bbs update.

creed is blatantly trolling at this point so I hope others can resist the urge to bite...

CaiNaM
04-29-04, 06:38 PM
http://www.bit-tech.net/feature/43/

They fell for it too.. the point is they wanted to "wow" people and knew that they had to be deceptive to do so. They wow'd people, but then it ran out of gas because there was no foundation. Looks shady to me: why show a bunch of slides comparing the specs of ps2.0 vs ps3.0, then show screenshots of something different? Yeah, that's who I want to believe.

well, you believe what you want to believe regardless of the facts.. so what's your point? just showing us more of your "selective" reasoning?

Jarred
04-29-04, 06:38 PM
http://www.bit-tech.net/feature/43/

They fell for it too.. the point is they wanted to "wow" people and knew that they had to be deceptive to do so. They wow'd people, but then it ran out of gas because there was no foundation. Looks shady to me: why show a bunch of slides comparing the specs of ps2.0 vs ps3.0, then show screenshots of something different? Yeah, that's who I want to believe.

Dude, if you really want to get into people or companies trying to mislead their consumers, both parties (ati and nvidia) are guilty, so let's not dwell on issues here, or whine and point fingers at who said what.

The proof is in the benchies.

creedamd
04-29-04, 06:41 PM
Why would pixel shader 3.0 be a BAD feature to have?

I'm sick of the same issue coming up. ps3.0 has its advantages, and you will probably see those advantages within a year, so most people that want a card to last them at least a year will benefit from buying a card that supports ps3.0.

I haven't seen too many posts of people claiming it's the greatest **** since sliced bread, it's just a GOOD thing to have.

Does this mean that cards that don't support ps3.0 are going to suck donkey balls? my guess is, ...no.

so, please, let's stop trying to downplay or overplay PS3.0 :rolleyes:

Nah, any feature is good to have. I would love it if the R420 had PS3.0, just to have another feature for my money. PS3.0 isn't bad, it just hasn't proved itself to be a "great thing" to have, or a reason to sacrifice something else for. But who knows what the future will bring.

creedamd
04-29-04, 06:43 PM
Dude, if you really want to get into people or companies trying to mislead their consumers, both parties (ati and nvidia) are guilty, so let's not dwell on issues here, or whine and point fingers at who said what.

The proof is in the benchies.

All companies do it. That's not the argument.

creedamd
04-29-04, 06:45 PM
well, you believe what you want to believe regardless of the facts.. so what's your point? just showing us more of your "selective" reasoning?

My point was based on the thread's title. You don't like it, so you cry. Just like all of the other threads. Talk about the topic or leave.

creedamd
04-29-04, 06:46 PM
Is there any way to block quotes from users on your ignore list? I could have sworn it happened automatically before the last bbs update.

creed is blatantly trolling at this point so I hope others can resist the urge to bite...

pm a mod, or use the feedback forum... but you knew that, didn't you?

Jarred
04-29-04, 06:47 PM
Nah, any feature is good to have. I would love it if the R420 had PS3.0, just to have another feature for my money. PS3.0 isn't bad, it just hasn't proved itself to be a "great thing" to have, or a reason to sacrifice something else for. But who knows what the future will bring.

I think what my purchase is seriously going to come down to at this point is this:

6800: may be slightly slower (and we're talking under 10fps in most cases),
takes a whopping power supply...
BUT has better features.

X800XT: slightly faster,
cheaper on the power supply end,
BUT doesn't have features I may want in the future.

Both will probably be REALLY good buys, but to me it also depends on who can get it out faster, and what games will be out to take advantage of it.

If DOOM III comes out on June 15th like it's supposed to, then I'm gettin a 6800 :P

creedamd
04-29-04, 06:52 PM
I think what my purchase is seriously going to come down to at this point is this:

6800: may be slightly slower (and we're talking under 10fps in most cases),
takes a whopping power supply...
BUT has better features.

X800XT: slightly faster,
cheaper on the power supply end,
BUT doesn't have features I may want in the future.

Both will probably be REALLY good buys, but to me it also depends on who can get it out faster, and what games will be out to take advantage of it.

If DOOM III comes out on June 15th like it's supposed to, then I'm gettin a 6800 :P

I can understand that way of thinking. If that's the case, the 6800U will be in my top 2 machines for sure! I'd like to go back to Nvidia this generation. It will take a slam dunk by ati to take my eyes off that 6800U :drooling:

walkndude
04-29-04, 06:53 PM
Gee, is someone trying to pump their post count?

Dirty
04-29-04, 06:54 PM
Shader 3 is going to rock. Just look at the Nalu (or whatever that lady fish is) video/pics - it freaking owns.