nV News Forums > Hardware Forums > Benchmarking And Overclocking

05-27-03, 04:16 AM   #25
Morrow
Atari STE 4-bit color
Join Date: May 2003
Posts: 798

Quote:
Originally posted by StealthHawk
GAMES optimize. ie, the coders who write the game are the ones optimizing. nvidia shouldn't be able to "optimize" whatever way they choose, because it is not their place to be doing such "optimization."

...nvidia does the same thing with Cg.
No, Cg is not an optimization kit or whatever you want to call it. Far from it. Cg is a tool for developing high-quality shaders using a high-level language, but you can't optimize your game with it; read: your games won't run faster when using Cg.

So, if hardware vendors are not allowed to optimize their drivers (that's what you have been saying in your post), then I really wonder what the ATI and nvidia driver coders are doing all day long... bug fixes? For ATI, maybe.

You don't seem to know that ATI and nvidia both have special teams working on the drivers whose only purpose is to optimize the drivers for one particular game! That's how it works these days.

Anyway, it seems that some people are trying more eagerly than ever to get a GF FX 5900 Ultra:

e-bay

current bid is $900

Last edited by Morrow; 05-27-03 at 04:38 AM.

05-27-03, 03:35 PM   #26
StealthHawk
Guest

Quote:
Originally posted by Morrow
No, Cg is not an optimization kit or whatever you want to call it. Far from it. Cg is a tool for developing high-quality shaders using a high-level language, but you can't optimize your game with it; read: your games won't run faster when using Cg.
Fact: Cg is a compiler for shader programs which optimizes shader code for nvidia GPUs. It can also output standard shader code that should run on every GPU. It is a high-level language that compiles to optimized DX9 code or optimized OGL code.

Opinion: It seems obvious that nvidia created Cg not to speed up and ease the adoption of shaders, but to optimize shaders for nvidia hardware, which has been shown to need tweaking to run well compared to using standards (DX9 and non-proprietary extensions in OGL). Note that there is no debugging feature in Cg AFAIK... something I'm sure would benefit coders. So either nvidia's purpose was never to make shader coding in general easier (its purpose is to make creating nvidia-optimized shaders easier), or else this dream was poorly realized.
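
To make the "Fact" part concrete: below is a minimal C sketch against the Cg 1.x runtime showing the same shader source compiled once for a vendor-neutral ARB profile and once for the NV30-specific profile. The shader itself is a made-up example, and which profiles are available depends on your Cg release; treat this as an illustration, not a recipe.

[code]
#include <stdio.h>
#include <Cg/cg.h>   /* Cg runtime; link with -lCg */

/* A trivial Cg fragment shader: halve the incoming color. */
static const char *src =
    "float4 main(float4 color : COLOR) : COLOR\n"
    "{ return color * 0.5; }\n";

static void compile_for(CGcontext ctx, CGprofile profile, const char *name)
{
    CGprogram prog = cgCreateProgram(ctx, CG_SOURCE, src, profile, "main", NULL);
    CGerror err = cgGetError();
    if (err != CG_NO_ERROR) {
        printf("%s: %s\n", name, cgGetErrorString(err));
        return;
    }
    /* Dump the assembly the compiler produced for this profile. */
    printf("--- %s ---\n%s\n", name,
           cgGetProgramString(prog, CG_COMPILED_PROGRAM));
    cgDestroyProgram(prog);
}

int main(void)
{
    CGcontext ctx = cgCreateContext();
    compile_for(ctx, CG_PROFILE_ARBFP1, "generic ARB_fragment_program");
    compile_for(ctx, CG_PROFILE_FP30,   "NV30-specific NV_fragment_program");
    cgDestroyContext(ctx);
    return 0;
}
[/code]

The point is simply that the output differs per profile: the arbfp1 output should run on any card with ARB_fragment_program, while the fp30 output only runs on NV3x.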

05-27-03, 06:52 PM   #27
Star_Hunter
Registered User
Join Date: Oct 2002
Posts: 104

Simply put, not everything needs FP32; some things can look almost the same with FP16, and some even with FX12. But since it isn't written in the program what precision everything needs to run at, nvidia has to put into the drivers what each value should be. Once you go off the rails, the drivers no longer have the correct values, so the IQ goes down a lot and there are artifacts. The NV3x, like I said, likes to split it up: if something doesn't need FP32 it runs at FP16, and if it doesn't even need FP16, at FX12, etc.

So if 3DMark03 put precision values on everything (an NV3x pathway), then you could move around off the rails and it would be rendered ALMOST the same as intended. In most cases you won't see the difference unless you look really hard, and in some you might see none at all. Just accept that this is how NV3x works. It's kind of like: don't bring a tank to a fight, just bring yourself (FP16, maybe even FX12), unless it's a war zone; then you need the tank (FP32).
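
To put rough numbers on the precision argument, here's a small C sketch of what dropping a value to FX12 does. I'm assuming FX12 is 12-bit fixed point covering [-2, 2), i.e. steps of 1/1024, which is how the NV30 format is usually described; treat the details as an approximation.

[code]
#include <stdio.h>
#include <math.h>   /* compile with -lm */

/* Quantize to an assumed NV3x-style FX12: 12-bit fixed point
   over [-2, 2), i.e. a step of 4/4096 = 1/1024. */
float to_fx12(float x)
{
    if (x < -2.0f) x = -2.0f;
    if (x > 2.0f - 1.0f / 1024.0f) x = 2.0f - 1.0f / 1024.0f;
    return floorf(x * 1024.0f + 0.5f) / 1024.0f;
}

int main(void)
{
    float c = 0.7071068f;   /* e.g. a normalized vector component */
    printf("FP32: %.7f\n", c);
    printf("FX12: %.7f (error %.7f)\n", to_fx12(c), fabsf(c - to_fx12(c)));
    return 0;
}
[/code]

The error here is under 0.0001, which is far below one step of an 8-bit framebuffer (1/255, about 0.004), so for plain color math FX12 really can look ALMOST the same. It's long dependent math, like per-pixel lighting over many instructions, where the error piles up and artifacts show.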

05-27-03, 11:48 PM   #28
StealthHawk
Guest

Quote:
Originally posted by Star_Hunter
Simply put, not everything needs FP32; some things can look almost the same with FP16, and some even with FX12. But since it isn't written in the program what precision everything needs to run at, nvidia has to put into the drivers what each value should be. Once you go off the rails, the drivers no longer have the correct values, so the IQ goes down a lot and there are artifacts. The NV3x, like I said, likes to split it up: if something doesn't need FP32 it runs at FP16, and if it doesn't even need FP16, at FX12, etc.
Not all of the IQ problems were caused by inserting clipping planes. And I'll leave it at that.

06-01-03, 08:10 PM   #29
The Analog Kid
Registered User
Join Date: Dec 2002
Location: New Jersey, USA
Posts: 75

I don't care much; I don't play benchmarks, therefore it doesn't matter. I care about in-game FPS and that's it. In-game, the 5900 crushes the 9800 Pro. People put too much worry into benchmarks.
__________________
Microsoft Palladium: Where will we let you go today?

06-01-03, 10:07 PM   #30
StealthHawk
Guest

Quote:
Originally posted by The Analog Kid
I don't care much; I don't play benchmarks, therefore it doesn't matter. I care about in-game FPS and that's it. In-game, the 5900 crushes the 9800 Pro. People put too much worry into benchmarks.
Um, no. Using in-game benchmarks the 5900 wins against the 9800. Once again I ask, how do you know they aren't cheating in the timedemo benchmarks as well? The answer is that you don't.

06-02-03, 01:42 AM   #31
ChrisRay
Registered User
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101

Quote:
Originally posted by StealthHawk
Um, no. Using in-game benchmarks the 5900 wins against the 9800. Once again I ask, how do you know they aren't cheating in the timedemo benchmarks as well? The answer is that you don't.

Judging by the capabilities (and lack thereof), I don't think Nvidia would need to cheat in the current games. UT2003 is just standard shaders, Doom 3 is already optimized, and no other game really taxes the FX hardware, as everything is using standard DX 8.1 shaders, where the NV30 doesn't really have issues.
__________________
|CPU: Intel I7 Lynnfield @ 3.0 Ghz|Mobo:Asus P7P55 WS Supercomputer |Memory:8 Gigs DDR3 1333|Video:Geforce GTX 295 Quad SLI|Monitor:Samsung Syncmaster 1680x1080 3D Vision / Olevia 27 Inch Widescreen HDTV 1920x1080

|CPU: AMD Phenom 9600 Black Edition @ 2.5 Ghz|Mobo:Asus M3n HT Deluxe Nforce 780A|Memory: 4 gigs DDR2 800| Video: Geforce GTX 280x2 SLI

Nzone
SLI Forum Administrator

NVIDIA User Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the members.

06-02-03, 03:29 AM   #32
StealthHawk
Guest

Quote:
Originally posted by ChrisRay
Judging by the capabilities (and lack thereof), I don't think Nvidia would need to cheat in the current games. UT2003 is just standard shaders, Doom 3 is already optimized, and no other game really taxes the FX hardware, as everything is using standard DX 8.1 shaders, where the NV30 doesn't really have issues.
They might not need to cheat to win. But maybe they cheat to increase their advantage and make their cards look better? It's certainly a possibility.

06-02-03, 03:54 AM   #33
ChrisRay
Registered User
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101

Quote:
Originally posted by StealthHawk
They might not need to cheat to win. But maybe they cheat to increase their advantage and make their cards look better? It's certainly a possibility.

It's possible. But judging by its specs, it seems to be performing about where it "should", IMO. There's definitely no guarantee, though.

06-02-03, 01:24 PM   #34
Megatron
Powered by 6800GT
Join Date: Jul 2002
Location: Massachusetts
Posts: 239

Quote:
Originally posted by silence
I also hear people bitch about how Doom III has optimizations for NV cards. So what?? ... If they're going to give better IQ and more fps in-game using those optimizations -> I AM HAPPY.

I could be mistaken here, but I was under the impression that the "optimizations" you're thinking of for Nvidia-based cards in Doom 3 sacrificed some image quality for speed.

Supposedly (again, if I'm not mistaken), the higher IQ settings are on the ARB path (which the ATI-based cards are running), not on the optimized paths the NV30 is running. Nvidia's cards crawl when running the higher-detail paths.

Now if I'm mistaken, or have mixed up the facts, I apologize. However, if I'm not off the mark, I wonder... are you still happy?
__________________
Athlon64 3200+
1Gb PC3200
BFG 6800GT
Windows XP

06-02-03, 04:00 PM   #35
ChrisRay
Registered User
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101

Quote:
Originally posted by Megatron
I could be mistaken here, but I was under the impression that the "optimizations" you're thinking of for Nvidia-based cards in Doom 3 sacrificed some image quality for speed.

Supposedly (again, if I'm not mistaken), the higher IQ settings are on the ARB path (which the ATI-based cards are running), not on the optimized paths the NV30 is running. Nvidia's cards crawl when running the higher-detail paths.

Now if I'm mistaken, or have mixed up the facts, I apologize. However, if I'm not off the mark, I wonder... are you still happy?

It's like this. The quality ranking, from Carmack's own mouth:

ARB2 path (Nvidia) > ARB2 path (ATI) > NV30 path (Nvidia)

But Carmack has stated numerous times that the quality difference between them is marginal at best.
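
For rough context, and these numbers come from the public chip specs rather than anything in this thread: the NV30 runs the ARB2 path at FP32 (23-bit mantissa, steps of about 2^-23), the R300 runs it at FP24 (16-bit mantissa, about 2^-16), and the NV30 path drops mostly to FP16/FX12 (10-bit mantissa, about 2^-10, roughly 0.001). Since one step of an 8-bit framebuffer is 1/255, about 0.004, a single FP16 rounding error is still invisible on screen; it only becomes visible when it accumulates over long shader sequences, which fits the "marginal at best" description.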