nV News Forums > Hardware Forums > Benchmarking And Overclocking
Old 02-26-03, 12:09 PM   #49
Chalnoth
Registered User
 
Join Date: Jul 2002
Posts: 1,293
Default

Quote:
Originally posted by jbirney
It's been shown that currently Cg only supports features found in NV hardware (no support for TruForm or PS 1.4). If a developer has all the tools he needs to write HLSL, which is a standard for all hardware, why not write just in HLSL?
Cg has full support for PS 2.0 and the ARB fragment/vertex program extensions (meaning it has full support for the R300 architecture). TruForm needs no support in shaders.
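For context, a Cg fragment program looks essentially like its HLSL counterpart, and the same source can be fed to different back-end profiles. A minimal sketch (all file, entry, and sampler names are illustrative), assuming the cgc offline compiler from NVIDIA's Cg toolkit:

```cg
// Compile the same source against different profiles, e.g.:
//   cgc -entry main -profile ps_2_0 diffuse.cg   (Direct3D PS 2.0)
//   cgc -entry main -profile arbfp1 diffuse.cg   (OpenGL ARB_fragment_program)
struct FragIn {
    float2 uv    : TEXCOORD0;
    float4 color : COLOR0;
};

float4 main(FragIn IN,
            uniform sampler2D diffuseMap : TEXUNIT0) : COLOR
{
    // Modulate the interpolated vertex color by the diffuse texture.
    return IN.color * tex2D(diffuseMap, IN.uv);
}
```

The profile chosen at compile time, not the source, determines which hardware instruction set the program targets.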
__________________
"Physics is like sex. Sure, it may give some practical results, but that's not why we do it." - Richard P. Feynman
Chalnoth is offline   Reply With Quote
Old 02-26-03, 12:18 PM   #50
UDawg
Retired spammer
 
Join Date: Nov 2002
Posts: 1,799
Default

I can't shoot anything or interact with anything in 3DMark 2003, so I have no use for it. In other words, it isn't a game.
UDawg is offline   Reply With Quote
Old 02-26-03, 01:41 PM   #51
gokickrocks
Registered User
 
Join Date: Nov 2002
Posts: 409
Default

2k3 = 2k3, let's just leave it at that; people get the idea

as for 2k3 being a shortcut, that's just absurd, and plain laziness for not typing one more number
__________________
"never argue with an idiot, they will bring you down to their level, and beat you with experience"

Last edited by gokickrocks; 02-26-03 at 01:47 PM.
gokickrocks is offline   Reply With Quote
Old 02-26-03, 02:14 PM   #52
kyleb
Registered User
 
Join Date: Jan 2003
Posts: 364
Default

Aw, it is not laziness, it is just conciseness. Also, 2k3c would be 2300, at least in my part of the world, but that would be pointless for sure.
kyleb is offline   Reply With Quote
Old 02-26-03, 08:01 PM   #53
Cotita
Nvidia God
 
Join Date: Jul 2002
Posts: 341
Default

Quote:
Originally posted by jbirney
It's been shown that currently Cg only supports features found in NV hardware (no support for TruForm or PS 1.4). If a developer has all the tools he needs to write HLSL, which is a standard for all hardware, why not write just in HLSL?
TruForm has nothing to do with either Cg or HLSL.

If ATI wants PS 1.4 support, they can always make their own compiler.
__________________
Sometimes I hate being right every time.
Cotita is offline   Reply With Quote
Old 02-26-03, 08:05 PM   #54
Cotita
Nvidia God
 
Join Date: Jul 2002
Posts: 341
Default

Quote:
Originally posted by DaveBaumann
But this is still a DirectX benchmark. What is the point of using a non-Microsoft, non-DirectX-specific HLSL when DirectX 9 comes with its own? DX9 HLSL is optimised: it's optimised for DirectX, which is exactly what Futuremark are looking for.
Again, Futuremark didn't use HLSL, so your argument is not valid.

By the way, HLSL won't produce ATI-optimized code either. So why develop in HLSL when using Cg will deliver faster NVIDIA performance and the same ATI performance as HLSL?
__________________
Sometimes I hate being right every time.
Cotita is offline   Reply With Quote
Old 02-26-03, 08:23 PM   #55
DaveBaumann
Registered User
 
Join Date: Jan 2003
Posts: 98
Default

Quote:
Originally posted by Cotita
Again, Futuremark didn't use HLSL. So your argument is not valid.
Eh? I know, I asked, and I documented that! I'm merely pointing out that if they were to have used an HLSL, it would make no sense for them to use Cg, given that they are making a DirectX benchmark and DirectX ships with its own native HLSL.

Quote:
By the way HLSL won't produce ATI optimized code either. So why develop in HLSL when using Cg will deliver faster nvidia performance and same ATI performance as HLSL?
And that's exactly what Futuremark do not want: they have created optimised code, but it is not optimised for any particular vendor's product; it's optimised for DirectX.

Futuremark are already being called 'biased' for sticking to what DirectX offers. If they started using HLSLs produced by a vendor, which generate code optimised for that vendor, DESPITE the API they are working on providing a perfectly functional and neutral HLSL, what credibility would they have then?
DaveBaumann is offline   Reply With Quote
Old 02-26-03, 08:29 PM   #56
kyleb
Registered User
 
Join Date: Jan 2003
Posts: 364
Default

Well, I am sure NVIDIA would put them up on a white horse again, Dave.
kyleb is offline   Reply With Quote

Old 02-26-03, 08:49 PM   #57
jbirney
Registered User
 
Join Date: Jul 2002
Posts: 1,430
Default

Quote:
Originally posted by Cotita
If ATI wants PS1.4 support they can alway make their own compiler.
Umm, PS 1.4 is already supported by HLSL, so why should ATI spend time and money to write a back end for Cg to support PS 1.4 again?
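For reference, PS 1.4 is a native compile target of the DX9 toolchain itself. A sketch, assuming fxc.exe, the HLSL command-line compiler shipped with the DirectX 9 SDK (file and entry names are illustrative):

```
rem /T selects the compile target, /E the entry point.
fxc /T ps_1_4 /E main shader.fx
rem The same source can also target PS 2.0 with no vendor back end involved.
fxc /T ps_2_0 /E main shader.fx
```

So targeting PS 1.4 from HLSL requires no extra work from ATI.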
jbirney is offline   Reply With Quote
Old 02-26-03, 09:39 PM   #58
Chalnoth
Registered User
 
Join Date: Jul 2002
Posts: 1,293
Default

I think that Futuremark's "sticking with the standards to remain neutral" argument is flawed. They should optimize for all video cards, not for the standards.

As a side note, I had thought I read that 3DMark03 did use DX9 HLSL to aid in writing the shaders (I believe DX9 HLSL, however, has no option for runtime compiling, which makes for a serious flaw).
__________________
"Physics is like sex. Sure, it may give some practical results, but that's not why we do it." - Richard P. Feynman
Chalnoth is offline   Reply With Quote
Old 02-26-03, 10:04 PM   #59
gokickrocks
Registered User
 
Join Date: Nov 2002
Posts: 409
Default

Since this 2k3 thing has not been solved, here is my take on it...

In engineering notation (and scientific notation, for that matter):
k = 10^3

You don't put digits after the prefix unless they are part of the power, so you would have to read the trailing 3 as a multiplication.

It would be...
2(10^3)(3) = 6000

For 2003, you would have to write...
2.003k or 2k+3
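Written out explicitly, with k standing for the factor 10^3:

```latex
2\mathrm{k}3 \;\rightarrow\; 2 \times 10^{3} \times 3 = 6000,
\qquad
2.003\mathrm{k} = 2.003 \times 10^{3} = 2003,
\qquad
2\mathrm{k}{+}3 = 2 \times 10^{3} + 3 = 2003.
```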
__________________
"never argue with an idiot, they will bring you down to their level, and beat you with experience"
gokickrocks is offline   Reply With Quote
Old 02-26-03, 10:06 PM   #60
kyleb
Registered User
 
Join Date: Jan 2003
Posts: 364
Default

Quote:
Originally posted by Chalnoth
I think that Futuremark's "sticking with the standards to remain neutral" is flawed. They should optimize for all video cards, not the standards.

Exactly, screw standards, let's make this all much harder on programmers because it is too easy for them to make good games any other way!
kyleb is offline   Reply With Quote


Powered by vBulletin® Version 3.7.1
Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.
Copyright 1998 - 2014, nV News.