nV News Forums > Hardware Forums > Benchmarking And Overclocking

Old 09-17-03, 03:55 PM   #121
saturnotaku
Apple user. Deal with it.
 
Join Date: Jul 2001
Location: The 'burbs, IL USA
Posts: 12,502

Quote:
Originally posted by eesa
but back then D3D was a piece of garbage, and yet somehow, with all the cash M$ has, it managed to brute-force its introduction and adoption.
Except the original Lithtech engine (talkin' Shogo here). It was amazing for its time.
Old 09-17-03, 03:56 PM   #122
eesa
the original postmasta'
 
Join Date: Aug 2003
Posts: 386

Quote:
Originally posted by Deathlike2
Too bad on the original HL, OpenGL mode runs BETTER than DirectX... IQ wise
And faster. Any time there's a choice of renderers, why in the world would anyone choose D3D? It's more CPU-dependent, uglier, and has these dips in fps that just aren't there with OGL.
Old 09-17-03, 04:02 PM   #123
eesa
the original postmasta'
 

Quote:
Originally posted by serAph
hey you never know around here. Well - actually, I bet if one did a tally of post count vs. company preference, the higher post-count ppl would probably prefer ATi atm...

...even back when GeForce 3's were new and stuff, was this place so anti-nvidia?
no man. we worshiped nvidia.
Old 09-17-03, 04:04 PM   #124
Skuzzy
Bit Bumper
 
Join Date: Aug 2002
Location: Here
Posts: 782

NUTTY, GO TO THE OTHER THREAD! hehe
__________________
Stuff Happenz!

Last edited by Skuzzy; 09-17-03 at 04:12 PM.
Old 09-17-03, 04:07 PM   #125
saturnotaku
Apple user. Deal with it.
 

Quote:
Originally posted by eesa
no man. we worshiped nvidia.
You got that right. We had to beat fan ATIcs off with a stick when the supposed GeForce3-killing 8500 hit the streets. Sure it could score 1,000 points higher in 3DMark, but it couldn't play games for diddly. Ask digitalwanderer, he'll regale you with many a tale of his exploits with his 8500.
Old 09-17-03, 04:09 PM   #126
eesa
the original postmasta'
 

Quote:
Originally posted by Hellbinder

OpenGL Will be a major player in the industry until the Day Carmack Dies
Carmack picked up a d3d manual, said wtf, and went back to his ogl hacking. Good for him.
Old 09-17-03, 04:09 PM   #127
serAph
The Original Superfreak
 
Join Date: Jul 2003
Location: Atlanta, GA
Posts: 341

Quote:
Originally posted by saturnotaku
You got that right. We had to beat fan ATIcs off with a stick when the supposed GeForce3-killing 8500 hit the streets. Sure it could score 1,000 points higher in 3DMark, but it couldn't play games for diddly. Ask digitalwanderer, he'll regale you with many a tale of his exploits with his 8500.
that really shows you where nVidia has gone recently...

*points towards the garbage can next to him*
Old 09-17-03, 04:24 PM   #128
Skuzzy
Bit Bumper
 

Quote:
Originally posted by eesa
OGL is absolutely amazing. In general it's more stable, better looking, less CPU-dependent, not platform dependent, and it's not M$. I care mostly about the first three, but the last two are important perks. Just look at any game that has both implementations. The OGL renderer is ALWAYS better. Even with the original UT, with that hacked-together OGL renderer, fps was still more stable than D3D. I'm sorry, but in this case, I can't even respect your opinion. It's really sad that D3D seems to dominate these days. It's because of M$ pumping all this money into the program and manipulating developers. These days D3D is beginning to become more advanced than OGL, but back then D3D was a piece of garbage, and yet somehow, with all the cash M$ has, it managed to brute-force its introduction and adoption.
You are waaaaaaaay off base, eesa. There are many reasons DX dominates, but MS has not had to twist devs' arms to get them to use it. I have been in development for a long time, and Microsoft has never contacted me about using DX. It was my choice - purely a marketing choice.

DX solved a lot of problems for developers. The actual look of a game is up to the developer; it can look better on either API. That is a programming problem, not an API issue.
DX gained momentum because it comes with every Windows box and has support for every device in the gaming system. It's not just a graphics API.

Today, DX is just as stable as OpenGL is.

I'm not in this business to make a political statement. I just want to feed my family and DX does a good job of allowing me to do that.
__________________
Stuff Happenz!

Old 09-17-03, 04:29 PM   #129
AnteP
Nordic Nerd
 
Join Date: Dec 2002
Location: Sweden, Lund
Posts: 552

Quote:
Originally posted by serAph
While yer at it smarty - why don't you look for a comparison between the Quadro FX 3000 and the FireGL equivalent that ATi butchered and offered against it. It might blow your fragile little mind.
The FireGL X2 256 costs less than half of a Quadro FX 3000, and yet it delivers almost equal performance in most tests and will surely outperform it in shader-intensive work in the future.

I find it hard to see anything special about a 2000 USD video card outperforming a 900 USD video card by 20% or so; actually, I find it a pretty uninteresting difference considering how much you're paying for it.
Old 09-17-03, 04:31 PM   #130
serAph
The Original Superfreak
 

Quote:
Originally posted by AnteP
The FireGL X2 256 costs less than half of a Quadro FX 3000, and yet it delivers almost equal performance in most tests and will surely outperform it in shader-intensive work in the future.

I find it hard to see anything special about a 2000 USD video card outperforming a 900 USD video card by 20% or so; actually, I find it a pretty uninteresting difference considering how much you're paying for it.
nah dude - it friggen embarrasses it. Did you read that XBit labs comparison? Also the CineFX addon is SWEEET in 3dsMax.

it's more like a 40-70% difference, and in a LARGE number of tests too - not just 2 or 3 out of 5....

it's like 6 out of 7.

Last edited by serAph; 09-17-03 at 04:34 PM.
Old 09-17-03, 04:37 PM   #131
reever2
Registered User
 
Join Date: Apr 2003
Posts: 489

Quote:
Originally posted by serAph
nah dude - it friggen embarrasses it. Did you read that XBit labs comparison? Also the CineFX addon is SWEEET in 3dsMax.

it's more like a 40-70% difference, and in a LARGE number of tests too - not just 2 or 3 out of 5....

it's like 6 out of 7.
que? http://www.3dchips.net/content/review.php?id=62&page=22

The Quadro 3000 is a total waste of money unless you need the extra features: it gives a 2% increase over the 2000 yet adds hundreds of bucks to the price for features no normal person would use.
Old 09-17-03, 05:09 PM   #132
ChrisRay
Registered User
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101

Quote:
Originally posted by reever2
que? http://www.3dchips.net/content/review.php?id=62&page=22

The Quadro 3000 is a total waste of money unless you need the extra features: it gives a 2% increase over the 2000 yet adds hundreds of bucks to the price for features no normal person would use.
Since when have professional graphics cards been designed with the "normal" user in mind?


Quote:
In what respect? I'm no DX expert, so I don't know, but as far as fragment processing goes, you can get more power in GL if you're prepared to use NV's own extension. The ARB version is probably identical to DX9.

NV's own vertex shader extension again has more power, due to full dynamic branching, which is not due until VS3.0 IIRC.

The only part I think GL currently lacks is floating-point render targets, though I could be wrong. NV currently only allows floating-point textures of type rectangle, whereas ATI offers much better support. But I don't think this is ARB'ified yet.

But seriously, if you can list the differences I'd be interested and grateful.


EDIT: Meant VS3.0 not PS3.0
Actually, this argument dates back to the old OpenGL 1.4 vs. DX 9.0 era, before the Shader 2.0 equivalent was put into OpenGL.

Of course that old complaint is a bit dated. I am curious, though: does OpenGL have an ARB'd vertex shader 2.0?
__________________
|CPU: Intel I7 Lynnfield @ 3.0 Ghz|Mobo:Asus P7P55 WS Supercomputer |Memory:8 Gigs DDR3 1333|Video:Geforce GTX 295 Quad SLI|Monitor:Samsung Syncmaster 1680x1080 3D Vision\/Olevia 27 Inch Widescreen HDTV 1920x1080

|CPU: AMD Phenom 9600 Black Edition @ 2.5 Ghz|Mobo:Asus M3n HT Deluxe Nforce 780A|Memory: 4 gigs DDR2 800| Video: Geforce GTX 280x2 SLI

Nzone
SLI Forum Administrator

NVIDIA User Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the members