
Why all the fuss about PS1.4?



DSC
02-23-03, 01:22 PM
The Matrox Parhelia, 3DLabs P10, and SiS Xabre ALL DO NOT support PS1.4, only 1.3. Why is there so much fuss about it? If everyone but Nvidia supported it, then I'd see a reason to bash Nvidia, but as it stands the rest of the industry players don't even do PS1.4.

I know it's a more flexible PS, but what's the use when no one else supports it? fanATIcs seem to be keen on using this against Nvidia, when the truth is no one else supports it: not Matrox, not 3DLabs, not SiS. :confused:

Shinri Hikari
02-23-03, 01:37 PM
They need a reason to bash nVidia for bad-mouthing Futuremark over using PS1.4 (an ATI invention :rolleyes: ).

SlyBoots
02-23-03, 03:05 PM
"not Matrox, not 3DLabs, not SiS"

as if they equal Microshaft :(

ahebl
02-23-03, 03:30 PM
It was made by Microsoft, not ATI; the 8500 was just the first card that supported it. Note that PS 2.0 includes all of the instructions in 1.4. It's a fricking standard, not an ATI invention.

DX 9 hardware is required to support it, and when DX 9 software comes out, PS 1.4 will be used, A LOT, by every company that has DX 9 hardware.

Shinri Hikari
02-23-03, 04:47 PM
Microsoft decided to support it; that does not mean they invented it.;) You might want to research the issue a little deeper.:cool:

ChrisW
02-23-03, 05:33 PM
What it all comes down to is that the current generation of games uses pixel shader versions 1.1 to 1.3, and the next generation of games is going to use 1.4 and 2.0. All DirectX 9 cards are required to support version 1.4, either in hardware or by emulation.
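(For reference, a DX9 title can detect this at startup through the Direct3D caps bits. A minimal sketch, assuming the standard DX9 SDK headers; the printed labels are just illustrative:)

    // Minimal sketch: query the highest pixel shader version the adapter reports.
    // Assumes d3d9.h from the DirectX 9 SDK; labels below are illustrative only.
    #include <d3d9.h>
    #include <cstdio>

    int main()
    {
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) return 1;

        D3DCAPS9 caps;
        d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

        // PixelShaderVersion packs major/minor; compare with the D3DPS_VERSION macro.
        if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
            std::printf("PS 2.0 class hardware (can also run 1.4 shaders)\n");
        else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
            std::printf("PS 1.4 capable (Radeon 8500/9000 class)\n");
        else
            std::printf("PS 1.1-1.3 or no pixel shaders\n");

        d3d->Release();
        return 0;
    }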

The GeForce 4 didn't have version 1.4 for one reason and one reason only... the 8500 had it. This is why nVidia spent so much time slamming 1.4. They didn't include it in an attempt to kill it, and they knew that if they didn't include support for it, most games released would not use it. As for the other graphics cards, they didn't include support for version 1.4 simply because nVidia didn't.

There is a reason why Carmack said the 8500 will run Doom3 better than a GeForce4 and that is because of pixel shader version 1.4. NVidia knew the 8500 would be faster than the GeForce4 which is why they didn't include 1.4. Since most developers optimize for nVidia's cards, leaving out pixel shader 1.4 meant games would not use it, and because of that the GeForce4 would be faster than the 8500.

The fact of the matter is that the next generation of games will not be using pixel shader version 1.1, 1.2, or 1.3! That is old technology running on outdated hardware. Game developers know the GeForce4 MX will not be a target for any future games, as it has absolutely no pixel shaders. They do know, however, that a customer will be able to purchase a Radeon 9000 card for about the price of a game, and that is all it takes to make it the lowest common denominator. As for the GeForce4 Ti 4200-4800... I feel your pain. But you have to realise the next generation of games is just not going to be written for that hardware. That was the gamble nVidia made, and they will have to pay the consequences for it. Sure, you will be able to run these games, but they will probably not even be as fast as a Radeon 9000.

SlyBoots
02-23-03, 06:01 PM
Originally posted by Shinri Hikari
Microsoft decided to support it; that does not mean they invented it.;) You might want to research the issue a little deeper.:cool:

You mean the same way they decided to support PS1.1 rather than 1.0?

what goes around comes around, eh!:p

Shinri Hikari
02-23-03, 06:08 PM
Hopefully they will support PS2.0 as well...:D :cool: Then I would not mind the momentary one-sided support.:rolleyes: This is especially important to me, as I am going to get the Ti 4200;) :cool: and then maybe the next NVxx.:eek: :p

ChrisW
02-23-03, 06:09 PM
Another thing is that although ATI invented version 1.4, they made it available to everyone for free. Had nVidia invented it, they would have put the NV_ extension on everything and charged everyone else to use it. And Microsoft liked 1.4 so much that they based 2.0 on it. If you are not slamming version 2.0 then you have no place to slam 1.4.

John Reynolds
02-23-03, 06:51 PM
Originally posted by ChrisW
Another thing is that although ATI invented version 1.4, they made it available to everyone for free. Had nVidia invented it, they would have put the NV_ extension on everything and charged everyone else to use it. And Microsoft liked 1.4 so much that they based 2.0 on it. If you are not slamming version 2.0 then you have no place to slam 1.4.

Outstanding point, Chris. There's nothing open about proprietary GL extensions.

Skuzzy
02-23-03, 09:47 PM
Well... here is one developer who has already migrated all shaders to 1.4 and later, in preparation for moving to the DX9 SDK.
I think it is a safe bet that all DX9 games will use PS1.4 or later. PS1.3 and earlier are dead; those instruction sets were pulled from the PS2.0 specification, which means some form of software emulation for the older shaders.
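To show what that migration looks like at the source level, here is a rough, illustrative before/after of a simple effect in the old assembly-style syntax (register usage is just an example, not taken from any real game):

    // Illustrative only: effects targeted at ps.1.1 and ps.1.4.
    // In 1.4 the destination register number selects the sampler, the source
    // register supplies the coordinates, and 'phase' enables dependent reads.
    const char* kModulate_ps11 =
        "ps.1.1\n"
        "tex t0\n"             // sample texture stage 0
        "tex t1\n"             // sample texture stage 1
        "mul r0, t0, t1\n";    // output = base * detail

    const char* kDependentRead_ps14 =
        "ps.1.4\n"
        "texld r0, t0\n"       // phase 1: fetch offsets from a lookup map
        "phase\n"
        "texld r1, r0\n"       // phase 2: dependent read using r0 as texcoords
        "mov r0, r1\n";        // final colour must end up in r0

The 1.1 path has no general dependent read like the one above, which is the kind of thing that forces the rewrite described here.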

Good news though: it will be another year, possibly longer, before DX9 games come out.

night
02-24-03, 01:03 AM
I thought 1.4 was the DX8.1 spec?

Chalnoth
02-24-03, 03:13 AM
Originally posted by night
I thought 1.4 was the DX8.1 spec?
No. I believe PS 1.2 and PS 1.3 were also added in DX 8.1.

Lezmaka
02-24-03, 04:27 AM
Originally posted by ChrisW
There is a reason why Carmack said the 8500 will run Doom3 better than a GeForce4 and that is because of pixel shader version 1.4. NVidia knew the 8500 would be faster than the GeForce4 which is why they didn't include 1.4.

Care to point to a link for that?

What I found seems to contradict what you say he said. In the previous paragraphs, he was talking about GF3/4 vs 8500.
http://webdog.org/cgi-bin/finger.plm?id=1&time=20020211165445
I can set up scenes and parameters where either card can win, but I think that current Nvidia cards are still a somewhat safer bet for consistent performance and quality.

The GF4MX line will be a target for game devs because it (along with the GF2MX, etc.) is the mass market card. There will be lots of them in use. That doesn't make it a good thing, but the devs will take those people into consideration. It would be stupid to make a game that requires cards with shader support, because it would immediately shrink the potential market by quite a bit.

Originally posted by ChrisW
Another thing is that although ATI invented version 1.4, they made it available to everyone for free. Had nVidia invented it, they would have put the NV_ extension on everything and charged everyone else to use it.

Man, that's kinda funny. PS 1.4 is DirectX, yet you talk about extensions, which is what would allow people to use something that isn't in the current OpenGL standard. I've never heard of DirectX extensions; is that something new?
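(For what it's worth, the practical difference is that in OpenGL a vendor feature is exposed as a named extension you probe at runtime, while in Direct3D a pixel shader version is either in the spec or it isn't. A rough sketch of the OpenGL side, assuming a context is already current; the extension names are the real NV/ATI ones, but the reporting is just illustrative:)

    // Rough sketch: probing for vendor-specific OpenGL extensions at runtime.
    // Assumes an OpenGL context is already current (on Windows, include <windows.h> first).
    #include <GL/gl.h>
    #include <cstring>
    #include <cstdio>

    static bool HasExtension(const char* name)
    {
        // Naive substring match; good enough for a sketch.
        const char* exts = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
        return exts != 0 && std::strstr(exts, name) != 0;
    }

    void ReportFragmentPaths()
    {
        // These are the vendor paths roughly corresponding to DX8-class pixel shading.
        if (HasExtension("GL_ATI_fragment_shader"))
            std::printf("ATI fragment shader path (PS 1.4 class) available\n");
        if (HasExtension("GL_NV_register_combiners"))
            std::printf("NV register combiner path available\n");
    }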

Chalnoth
02-24-03, 08:22 AM
By the way, after looking, the GeForce4 soundly trounces the Radeon 8500 in 3DMark03 (on my CPU, looking at the ORB, the GF4 Ti 4200 gets around 2000, while the 8500 gets around 1500).

Hanners
02-24-03, 08:42 AM
Originally posted by Chalnoth
By the way, after looking, the GeForce4 soundly trounces the Radeon 8500 in 3DMark03 (on my CPU, looking at the ORB, the GF4 Ti 4200 gets around 2000, while the 8500 gets around 1500).

Which is as you'd expect, considering the Radeon 8500 was released to compete with the GeForce3 Ti cards, not the GeForce4.

I'm not sure if anyone has run an 8500 with the pixel shaders forced down to 1.1 with the latest version of Rage3D Tweak yet; it would be interesting to see how big a difference the lack of PS 1.4 makes to the scores.

Chalnoth
02-24-03, 08:45 AM
Well, just wanted to say that PS 1.4 once again shows that it cannot really do much for the Radeon 8500.

I had thought earlier that nVidia's main complaints with 3DMark03 were about their GeForce4 line, but now it no longer looks like that's the case. It looks like the GF4 does just fine compared to other DX8 cards (however, its score is disproportionately low vs. DX9 cards compared to any real games, now and likely in the near future).

Moose
02-24-03, 08:57 AM
Originally posted by DSC
The Matrox Parhelia, 3DLabs P10, and SiS Xabre ALL DO NOT support PS1.4, only 1.3. Why is there so much fuss about it? If everyone but Nvidia supported it, then I'd see a reason to bash Nvidia, but as it stands the rest of the industry players don't even do PS1.4.

I know it's a more flexible PS, but what's the use when no one else supports it? fanATIcs seem to be keen on using this against Nvidia, when the truth is no one else supports it: not Matrox, not 3DLabs, not SiS. :confused:

I think you are very wrong on this..

all of ATI's cards 8500 and up support ps1.4
the Matrox Parhelia supports it
the upcoming SIS Xabre II will support it
all future DX9 cards (including nvidia's if they ever ship them) must support it.

Only the older cards do not support it.

Hanners
02-24-03, 09:11 AM
Originally posted by Chalnoth
Well, just wanted to say that PS 1.4 once again shows that it cannot really do much for the Radeon 8500.

I really don't see how you can say that unless you have a side-by-side comparison of the 8500 running PS 1.1 against an 8500 running PS 1.4. Who knows how much less the 8500 would score using PS 1.1?

I find it hard to believe that rendering effects in a single pass with fewer polygons, rather than multiple passes with more polygons, is not going to make any difference to the final score...
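(Roughly what that looks like from the application side; the device calls are Direct3D 9 style, and DrawSceneGeometry plus the shader handles are placeholders, not code from any real benchmark:)

    // Sketch of why pass count matters: with a less capable shader version the
    // same effect has to resubmit the geometry and blend in the framebuffer.
    #include <d3d9.h>

    void DrawSceneGeometry(IDirect3DDevice9* dev);    // placeholder for the app's draw calls

    void RenderEffectTwoPass(IDirect3DDevice9* dev,
                             IDirect3DPixelShader9* passA,
                             IDirect3DPixelShader9* passB)
    {
        dev->SetPixelShader(passA);
        DrawSceneGeometry(dev);                       // first pass: geometry processed once

        dev->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
        dev->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_ONE);
        dev->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_ONE);
        dev->SetPixelShader(passB);
        DrawSceneGeometry(dev);                       // second pass: same vertices all over again

        dev->SetRenderState(D3DRS_ALPHABLENDENABLE, FALSE);
    }

    void RenderEffectSinglePass(IDirect3DDevice9* dev,
                                IDirect3DPixelShader9* combined)
    {
        dev->SetPixelShader(combined);                // PS 1.4 version of the whole effect
        DrawSceneGeometry(dev);                       // geometry submitted only once
    }

The second version touches every vertex and every framebuffer pixel half as often, which is where the single-pass advantage comes from.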

jbirney
02-24-03, 09:33 AM
Actually, the P10 can support PS1.4 as well (it requires being "programmed" for it). Remember, that card was very flexible.

R.Carter
02-24-03, 10:37 AM
Originally posted by Moose
I think you are very wrong on this..

all of ATI's cards 8500 and up support ps1.4
the Matrox Parhelia supports it
the upcoming SIS Xabre II will support it
all future DX9 cards (including nvidia's if they ever ship them) must support it.

Only the older cards do not support it.

The Matrox Parhelia is a DirectX 8.1 card, as it doesn't support PS 2.0. I've no idea if it can be made to support PS1.4 or not, but currently I think it only supports PS1.3.

Chalnoth
02-24-03, 12:14 PM
Originally posted by Hanners
I really don't see how you can say that unless you have a side-by-side comparison of the 8500 running PS 1.1 against an 8500 running PS 1.4. Who knows how much less the 8500 would score using PS 1.1?
I meant in terms of how poor the design is compared to the GF4. The GF4 is at quite a disadvantage in most shader benchmarks, and yet still manages to (usually) come out ahead. I believe the main reason is actually the crossbar memory controller.

sebazve
02-24-03, 12:51 PM
Originally posted by DSC
The Matrox Parhelia, 3DLabs P10, and SiS Xabre ALL DO NOT support PS1.4, only 1.3. Why is there so much fuss about it? If everyone but Nvidia supported it, then I'd see a reason to bash Nvidia, but as it stands the rest of the industry players don't even do PS1.4.

I know it's a more flexible PS, but what's the use when no one else supports it? fanATIcs seem to be keen on using this against Nvidia, when the truth is no one else supports it: not Matrox, not 3DLabs, not SiS. :confused:

I don't know, I have a GF4MX lol jajajajajaja! :lol2: :lol2:

Well really, because you can do all the pretty things of PS 1.1 and 1.3 in one pass...

jbirney
02-24-03, 02:08 PM
Originally posted by Chalnoth
I meant in terms of how poor the design is compared to the GF4. The GF4 is at quite a disadvantage in most shader benchmarks, and yet still manages to (usually) come out ahead. I believe the main reason is actually the crossbar memory controller.

Yep, I would be willing to bet that's the reason as well. Kudos to the GF4 for having that :)

StealthHawk
02-24-03, 03:14 PM
Originally posted by Moose
I think you are very wrong on this..

all of ATI's cards 8500 and up support ps1.4
the Matrox Parhelia supports it
the upcoming SIS Xabre II will support it
all future DX9 cards (including nvidia's if they ever ship them) must support it.

Only the older cards do not support it.

The Matrox Parhelia does not support PS1.4, and the guy you quoted was talking about the Xabre, not the Xabre II.