nV News Forums > Hardware Forums > Benchmarking And Overclocking

Old 09-17-03, 01:59 PM   #109
serAph
The Original Superfreak
Join Date: Jul 2003
Location: Atlanta, GA
Posts: 341

Quote:
Originally posted by Skuzzy
I kind of hope both will stay around. The competition between them is good, and it does push the envelope in the design of the APIs.

Microsoft will never adopt OpenGL, at least not directly. It may slide in over time, but it won't be one of those "Hey, let's forget about DX and go to OpenGL" kind of things.

Of course, DX10 may have some surprises for a lot of people. 'Surprises' may be too weak a word. The future looks neat.
/me is fantasizing about having DX built into the OS - a totally GPU-rendered GUI...

damn you longhorn - hurry up and COME OUT!!
Old 09-17-03, 02:05 PM   #110
ChrisRay
Registered User
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101

I want OpenGL and DX to stay around for a while. What really gets me is that a lot of the arguments to kill OpenGL rest on a few fundamentally flawed points right now.

#1. ATI's OpenGL support is not as good as NVIDIA's. (So what? That's not a reason to kill an API.)

#2. NVIDIA uses proprietary extensions. (So what? OpenGL is an open, extensible standard. This is one of the perks.)

#3. OpenGL is behind DirectX 9.0. (Again, OpenGL can always catch up, through its support for on-the-fly extensions. These features can be enabled before they get ARB'd.)
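The third point - vendor extensions shipping before the ARB versions exist - rests on runtime extension detection. Here is a minimal sketch of the standard whole-token check an app would run against the string from glGetString(GL_EXTENSIONS); the function name and sample strings are illustrative, and a plain strstr() alone is not enough, because a query for "GL_EXT_texture" would match inside "GL_EXT_texture3D":

```c
#include <string.h>

/* Return 1 if `name` appears as a complete space-delimited token in
 * `exts`, the kind of string glGetString(GL_EXTENSIONS) returns.
 * Illustrative helper: it checks token boundaries so that a query for
 * "GL_EXT_texture" does not false-positive on "GL_EXT_texture3D". */
int has_gl_extension(const char *exts, const char *name)
{
    size_t len = strlen(name);
    const char *p = exts;

    while ((p = strstr(p, name)) != NULL) {
        int starts_token = (p == exts) || (p[-1] == ' ');
        int ends_token   = (p[len] == '\0') || (p[len] == ' ');
        if (starts_token && ends_token)
            return 1;           /* found as a whole token */
        p += len;               /* partial match; keep scanning */
    }
    return 0;
}
```

An app can then enable, say, NV_vertex_program when present and move to the ARB path once the feature is ARB'd, without waiting for a new core GL version.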
Old 09-17-03, 02:07 PM   #111
serAph
The Original Superfreak
Join Date: Jul 2003
Location: Atlanta, GA
Posts: 341

Quote:
Originally posted by ChrisRay
#3. OpenGL is behind DirectX 9.0. (Again, OpenGL can always catch up, through its support for on-the-fly extensions. These features can be enabled before they get ARB'd.)
I think this means that DX is inherently flawed and can never be as powerful as OpenGL until it inherits this same nature. That's just my opinion, though...

anyway, ATI will get OpenGL down eventually, so that's no biggie.
Old 09-17-03, 02:10 PM   #112
Nutty
Sittin in the Sun
Join Date: Jul 2002
Location: United Kingdom
Posts: 1,835

Quote:
#3. OpenGL is behind DirectX 9.0. (Again, OpenGL can always catch up, through its support for on-the-fly extensions. These features can be enabled before they get ARB'd.)
In what respect? I'm no DX expert, so I don't know, but as far as fragment processing goes, you can get more power in GL if you're prepared to use NV's own extension. The ARB version is probably identical to DX9.

NV's own vertex shader extension again has more power, due to full dynamic branching, which isn't due in DX until VS3.0, IIRC.

The only part I think GL currently lacks is floating-point render targets, though I could be wrong. NV currently only allows floating-point textures of type rectangle, whereas ATI offers much better support. But I don't think this is ARB'ified yet.

But seriously, if you can list the differences I'd be interested and grateful.


EDIT: Meant VS3.0 not PS3.0

Last edited by Nutty; 09-17-03 at 02:13 PM.
Old 09-17-03, 02:11 PM   #113
Skuzzy
Bit Bumper
Join Date: Aug 2002
Location: Here
Posts: 782

OpenGL is not going anywhere. It is the API of choice for business-class graphics application developers (CAD, and most movie generation/editing software). It has plenty of support outside of the gaming industry, while DX has little to none outside of gaming.
Old 09-17-03, 02:15 PM   #114
Skuzzy
Bit Bumper
Join Date: Aug 2002
Location: Here
Posts: 782

Nutty, what version of OpenGL added PS/VS shaders? Which one added the fragment branching, or is that NVIDIA specific?

I can correlate feature additions if I know what revisions of OpenGL added things, as opposed to DX. Currently, my understanding is that ATI/NVIDIA are at version 1.5 or so of the ARB spec? Is that a fair assumption?

EDIT: We are getting very OT here. Maybe another discussion thread about this is in order. Sorry for the OT, mods and thread starter.
Old 09-17-03, 02:30 PM   #115
serAph
The Original Superfreak
Join Date: Jul 2003
Location: Atlanta, GA
Posts: 341

Quote:
Originally posted by Skuzzy
EDIT: We are getting very OT here. Maybe another discussion thread about this is in order. Sorry for the OT, mods and thread starter.

yes - let's start a thread in General Software or something. We'll get fewer noobs in there anyway.

edit:

ok - I moved us:

http://www.nvnews.net/vbulletin/show...049#post197049

Last edited by serAph; 09-17-03 at 02:42 PM.
Old 09-17-03, 03:00 PM   #116
Nutty
Sittin in the Sun
Join Date: Jul 2002
Location: United Kingdom
Posts: 1,835

Quote:
Nutty, what version of OpenGL added PS/VS shaders? Which one added the fragment branching, or is that NVIDIA specific?

I can correlate feature additions if I know what revisions of OpenGL added things, as opposed to DX. Currently, my understanding is that ATI/NVIDIA are at version 1.5 or so of the ARB spec? Is that a fair assumption?
Well, NV introduced vertex shaders a long time ago. Around Detonator 7 they were available as emulation in the drivers. This was months before the GF3 emerged with them in hardware. This was NV_vertex_program, which was the equivalent of DX8 vertex shaders; however, DX8 wasn't out at that point.

DX8-class pixel shaders arrived with the GF3, in the form of NV_texture_shader and NV_register_combiners. Technically they could achieve more than DX8 pixel shaders, but they were a bugger to code for.

Current drivers support OpenGL 1.4. Currently the ARB standards, ARB_vertex_program (which, come to think of it, is more akin to DX8-class vertex shaders) and ARB_fragment_program (a DX9-class pixel shader), are only ARB extensions. They should be in the core of 1.5, though.

NV_vertex_program2 is the GF-FX vertex shader, which has dynamic branching. NV_fragment_program is basically PS2.0, but slightly better: it has conditional statements (not true branching) and much higher instruction counts.

Now, if we're talking about pure ARB-only stuff, we currently have:
ARB_vertex_program = DX8-class vertex shader
ARB_fragment_program = DX9-class pixel shader

Now, the new GLslang language introduces two new extensions, ARB_vertex_shader and ARB_fragment_shader. These are the vertex and fragment shader implementations for the upcoming GLslang high-level shading language compiler.

I haven't looked at them much, but they're probably similar to the DX9 equivalents. Perhaps with a few extra features, I don't know.

Anyone know if a new vertex shader model was introduced in DX9, and what features it has over DX8 vertex shaders?
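The NV-versus-ARB layering described above is what a renderer's startup code turns into a fallback chain: prefer the vendor path, fall back to the ARB standard, then to older paths. A hypothetical sketch (the extension names are the real ones discussed in this thread; the path labels and function name are invented, and a real loader should match whole tokens rather than use strstr()):

```c
#include <string.h>

/* Pick a fragment-processing path from the extension string, best
 * path first. Hypothetical sketch: the labels are invented for
 * illustration only. */
const char *pick_fragment_path(const char *exts)
{
    if (strstr(exts, "GL_NV_fragment_program"))
        return "nv30";   /* PS2.0-class and then some (GeForce FX) */
    if (strstr(exts, "GL_ARB_fragment_program"))
        return "arb";    /* DX9-class pixel shading, vendor-neutral */
    if (strstr(exts, "GL_NV_register_combiners"))
        return "nv20";   /* DX8-class, GeForce3 era */
    return "fixed";      /* fixed-function fallback */
}
```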

Last edited by Nutty; 09-17-03 at 03:03 PM.

Old 09-17-03, 03:23 PM   #117
Skuzzy
Bit Bumper
Join Date: Aug 2002
Location: Here
Posts: 782

OK, so we will do this here, hehe. So if you stick to the generic path in OpenGL, you are sort of at a combination of DX8 and DX9? The other option is to use the NVIDIA path, correct? But that path will only work with NVIDIA cards, correct? (Sounds like a DOH! kind of question, but I do not like to assume.)

The question about new shaders does not really apply to DX (if I am reading the intent correctly). The language has been enhanced for all shaders. DX9 defines up to PS3.0 (which no one supports just yet, due to the hardware changes required). PS2.0, 2.x, and 3.0 are defined in DX9, with PS1.4 and earlier in DX8/7.
PS2.x (the extended profile NVIDIA targets) and PS3.0 define full branching and loops.

In DX, MS just modifies the assembly code for each revision of DX. This led to a real mess: up to PS1.3, every minor number change represented a whole new set of instructions, with no requirement to be backward compatible.
This slowed shader usage in DX tremendously. Finally, with 2.0 and later, video cards are required to support all previous versions. Microsoft added the further assurance that all future shader versions will require full backward compatibility.
Also, Microsoft had never made a requirement about what level of shader support was needed, and NVIDIA and ATI never used the same version. Microsoft ended that with DX9 by making 2.0 required.

These changes finally made DX shaders worth using, and a nightmare for video card makers to implement (better them than me).

DX9 shaders are the first to come with a high-level language. Previous shaders were all written using assembly shader ops.
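On the DX side, the "what level does this card support" check described above is done against D3DCAPS9::PixelShaderVersion, which packs the shader model into a DWORD. Below is a standalone sketch with the packing macros re-declared (mirroring D3DPS_VERSION and D3DSHADER_VERSION_MAJOR/MINOR from d3d9caps.h) so it compiles without the DirectX headers; supports_ps is a made-up helper:

```c
#include <stdint.h>

/* Re-declared here for a self-contained example; the real macros live
 * in d3d9caps.h (pixel shader versions carry the 0xFFFF0000 marker). */
#define PS_VERSION(major, minor) \
    ((uint32_t)(0xFFFF0000u | ((uint32_t)(major) << 8) | (uint32_t)(minor)))
#define VERSION_MAJOR(v) (((v) >> 8) & 0xFFu)
#define VERSION_MINOR(v) ((v) & 0xFFu)

/* Made-up helper: does the reported caps value meet a minimum
 * shader model, e.g. the PS2.0 baseline DX9 makes mandatory? */
int supports_ps(uint32_t reported, unsigned major, unsigned minor)
{
    unsigned rmaj = VERSION_MAJOR(reported);
    unsigned rmin = VERSION_MINOR(reported);
    return rmaj > major || (rmaj == major && rmin >= minor);
}
```

In real code `reported` would come from the caps structure filled in by GetDeviceCaps.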
Old 09-17-03, 03:30 PM   #118
Nutty
Sittin in the Sun
Join Date: Jul 2002
Location: United Kingdom
Posts: 1,835

Quote:
So, if you stick to the generic path in OpenGL you are sort of at a combination of DX8 and 9?
Yeah. NV_vertex_program was the equivalent of the DX8 vertex shader, and ARB_vertex_program is pretty much the same as the NVIDIA one.

I don't know what changed in vertex shaders from DX8 to 9.

Quote:
DX9 shaders are the first to come with a high-level language. Previous shaders were all written using assembly shader ops.
You can actually create NV2X-style pixel shaders and vertex shaders in Cg, so technically those were the first to have a high-level language. Although Cg isn't very flexible for those profiles, and DX9's high-level language didn't come out until months after Cg debuted.
Old 09-17-03, 03:51 PM   #119
eesa
the original postmasta'
Join Date: Aug 2003
Posts: 386

Quote:
Originally posted by lukar
OpenGL doesn't have to offer anything over DX9.0.
That's a fact, and I was talking about OpenGL being dead in that sense.
Everything that comes to the OpenGL API comes from DX.
If I have to make a choice between DX or OpenGL in order to run a game, I choose DX.

OGL is absolutely amazing. In general it's more stable, better looking, less CPU dependent, not platform dependent, and it's not M$. I care mostly about the first three, but the last two are important perks. Just look at any game that has both implementations: the OGL renderer is ALWAYS better. Even with the original UT, with that hacked-together OGL renderer, the fps was still more stable than D3D. I'm sorry, but in this case I can't even respect your opinion. It's really sad that D3D seems to dominate these days. It's because of M$ pumping all this money into the program and manipulating developers. These days D3D is beginning to become more advanced than OGL, but back then D3D was a piece of garbage, and yet somehow, with all the cash M$ has, it managed to brute-force its way in and get used.
Old 09-17-03, 03:55 PM   #120
eesa
the original postmasta'
Join Date: Aug 2003
Posts: 386

Quote:
Originally posted by Deathlike2
No one forced you to use Linux. Linux isn't perfect... if you don't like it, don't use it (same goes for hardware, software, etc.).

People running servers practically all use Linux (if not Windows)... there's still a long way to go for Linux friendliness in comparison to Windows, but complaining about things that YOU don't understand is pointless.
yeah, but Windows costs money, and the idea of paying M$ doesn't exactly excite me. In so many ways Linux is a more elegant and trustworthy OS, and the more M$-exclusive things get introduced, the worse the situation gets. It's at the point where I would prefer to use Linux for everything else, but I still want to play games, so I have no choice but to use Windoze. Sucks, don't you think?