Old 03-30-03, 10:30 PM   #25
ChrisRay
Registered User
 
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101

Quote:
Originally posted by StealthHawk
can you explain this? why does the R200 path "have to" use FP16 for the R300?
Actually, I believe it's 16-bit fixed, because it's using R200 pixel shader functions. I'm not sure how backward compatible the R300 is, meaning whether its Pixel Shader 1.4 support is separate or whether the Pixel Shader 2.0 hardware runs the 1.4 shaders like the NVIDIA card does.

So I can only speculate.
__________________
|CPU: Intel I7 Lynnfield @ 3.0 Ghz|Mobo:Asus P7P55 WS Supercomputer |Memory:8 Gigs DDR3 1333|Video:Geforce GTX 295 Quad SLI|Monitor:Samsung Syncmaster 1680x1080 3D Vision / Olevia 27 Inch Widescreen HDTV 1920x1080

|CPU: AMD Phenom 9600 Black Edition @ 2.5 Ghz|Mobo:Asus M3n HT Deluxe Nforce 780A|Memory: 4 gigs DDR2 800| Video: Geforce GTX 280x2 SLI

Nzone
SLI Forum Administrator

NVIDIA User Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the members.
Old 03-30-03, 10:47 PM   #26
StealthHawk
Guest
 
Posts: n/a

Quote:
Originally posted by ChrisRay
Actually, I believe it's 16-bit fixed, because it's using R200 pixel shader functions. I'm not sure how backward compatible the R300 is, meaning whether its Pixel Shader 1.4 support is separate or whether the Pixel Shader 2.0 hardware runs the 1.4 shaders like the NVIDIA card does.

So I can only speculate.
Yeah, you're right about that, I think. I have heard that the R300 supports 16-bit fixed, which I take to be INT16, but not FP16.
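
To make the distinction concrete, here's a rough C sketch (my own illustration, nothing official) contrasting a 16-bit fixed-point encoding with IEEE-style FP16 (1 sign / 5 exponent / 10 mantissa bits). The FP16 encoder is simplified for positive normal values only and skips zero, NaN, infinity, and denormals:

Code:
/* Contrast: 16-bit fixed point vs. FP16 (half) float. Both use 16 bits,
 * but fixed point spreads precision uniformly over its range, while
 * FP16 trades range for variable precision. Simplified sketch only. */
#include <math.h>
#include <stdint.h>
#include <stdio.h>

/* Fixed point: map [0,1] uniformly onto 0..65535. */
static uint16_t fixed16_from_float(float f)
{
    return (uint16_t)(f * 65535.0f + 0.5f);
}

/* Simplified FP16 encode, positive normal values only. */
static uint16_t fp16_from_float(float f)
{
    int e;
    float m = frexpf(f, &e);                /* f = m * 2^e, m in [0.5,1) */
    uint16_t exp = (uint16_t)(e - 1 + 15);            /* biased exponent */
    uint16_t man = (uint16_t)((m * 2.0f - 1.0f) * 1024.0f);  /* 10 bits */
    return (uint16_t)((exp << 10) | man);
}

int main(void)
{
    float v = 0.5f;
    printf("0.5 -> fixed16 0x%04x, fp16 0x%04x\n",
           fixed16_from_float(v), fp16_from_float(v));
    return 0;
}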
Old 03-30-03, 11:05 PM   #27
Uttar
Registered User
 
 
Join Date: Aug 2002
Posts: 1,354

AFAIK, the R300 doesn't support INT16. Could be wrong, though.
The R300 is all about massive FP24 performance.


Uttar
Old 03-31-03, 01:06 AM   #28
Steppy
Radeon 10K Pro
 
Join Date: Aug 2002
Posts: 351

Quote:
Originally posted by ChrisRay
Since apparently it's just 24-bit downsampling to 16-bit (which I think is retarded for any number of reasons)
Either way, I think ATI's implementation of its floating-point precision leaves a little to be desired, especially when you consider the current DirectX 9.0 specification. As ATI's card is just the bare minimum for DX 9.0, I'm not quite sure why they chose to stick with strict 24-bit precision, DX 9.0 specifications be damned. Probably to save die space on their already crazily overloaded 0.15-micron process.

From a programmer's point of view, they leave little room for modification or tweaking, and that's always a bad thing. I can see why John Carmack stated he has become limited by the R300's programmability. Kinda disappointing to me. Oh well.
You're making some faulty statements here.

1. Both 32-bit FP (which would be 128-bit color, as it's 32-bit PER CHANNEL) and 24-bit FP results are downsampled when they are rendered. The FP precision people are talking about is for color calculations, NOT output (see the sketch after this list). It's much the same as the old 3dfx 22-bit color thing: it rendered higher-quality 16-bit because all of its calculations were done in 32-bit.

2. It's been shown time and time again that going above DX specs yields nothing, as those features never get supported in games using that API. They may get used in OGL, but few games use them.

3. Carmack's comments had NOTHING to do with ATI being 24-bit as opposed to 32; they were about the instruction count in situations he was screwing around with, NOT game situations, and this has already been addressed in the R350, which surpasses the FX's DX9 "+" specs.

4. ATI's card being "minimum DX9" is pretty irrelevant, as we're probably at LEAST a year away from games making ANY use of DX9, and TWO years away from games using DX9 as much as DX8 is used now.
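
To illustrate point 1, here's a minimal C sketch (mine, not driver code) of internal float math being quantized to 8 bits per channel only at the framebuffer write; the per-channel layout and clamp behavior are assumptions for illustration:

Code:
/* Shader-style math runs in floating point; precision is lost only
 * once, when the result is written to an 8-bit-per-channel target. */
#include <stdio.h>

/* Quantize one [0,1] float channel to an 8-bit framebuffer value. */
static unsigned char to_framebuffer(float channel)
{
    if (channel < 0.0f) channel = 0.0f;      /* clamp out-of-range */
    if (channel > 1.0f) channel = 1.0f;
    return (unsigned char)(channel * 255.0f + 0.5f);
}

int main(void)
{
    /* Pretend this is a pixel shader: chained ops in full float,
     * with no intermediate rounding between them. */
    float texel = 0.30f, light = 0.85f, specular = 0.12f;
    float color = texel * light + specular;

    printf("internal %f -> output %u\n", color, to_framebuffer(color));
    return 0;
}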
__________________
Here's my clever comment
Old 03-31-03, 02:52 AM   #29
StealthHawk
Guest
 
Posts: n/a

Quote:
Originally posted by Uttar
AFAIK, the R300 doesn't support INT16. Could be wrong, though.
The R300 is all about massive FP24 performance.


Uttar
B3D said it could use INT16 when they did their technical comparison of a gfFX vs. R9700. Some of that info about the gfFX is probably wrong, as more things became known later, but the R9700 info should be correct.
Old 03-31-03, 04:11 AM   #30
AGP64
 
Join Date: Dec 2002
Posts: 33

Quote:
Originally posted by Chalnoth
I think the problem is that DirectX offers no way to expose FX12 functionality (12-bit integer).

Since the FX can execute FX12 and floating-point ops in serial, this is a major problem for the performance of the FX.
I would say it's the other way around. Both ATI and NVIDIA knew the DirectX 9.0 specs when they went off and designed the R300 and NV30. It just seems that one made some smarter decisions than the other.
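
For anyone wondering what FX12 actually looks like, here's a hedged C sketch of a 12-bit fixed-point format. It assumes the commonly cited NV30 layout of 12 bits covering roughly [-2, 2), i.e. 10 fractional bits; the real hardware's exact rounding and clamping may differ:

Code:
/* "FX12"-style signed fixed point: assumed 1 sign + 1 integer + 10
 * fraction bits, range about [-2, 2). Stored in 16 bits here for
 * convenience; hardware would pack it into 12. */
#include <stdint.h>
#include <stdio.h>

#define FX12_FRAC_BITS 10
#define FX12_ONE (1 << FX12_FRAC_BITS)

typedef int16_t fx12;

static fx12 fx12_from_float(float f)
{
    if (f >  1.999f) f =  1.999f;            /* clamp to representable */
    if (f < -2.0f)   f = -2.0f;
    return (fx12)(f * FX12_ONE);
}

static float fx12_to_float(fx12 v) { return (float)v / FX12_ONE; }

/* Fixed-point multiply: widen, multiply, shift back down. */
static fx12 fx12_mul(fx12 a, fx12 b)
{
    return (fx12)(((int32_t)a * (int32_t)b) >> FX12_FRAC_BITS);
}

int main(void)
{
    fx12 a = fx12_from_float(0.5f), b = fx12_from_float(1.5f);
    printf("0.5 * 1.5 = %f\n", fx12_to_float(fx12_mul(a, b)));
    return 0;
}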
Old 03-31-03, 08:42 AM   #31
Chalnoth
Registered User
 
Join Date: Jul 2002
Posts: 1,293

Quote:
Originally posted by AGP64
I would say it's the other way around. Both ATI and NVIDIA knew the DirectX 9.0 specs when they went off and designed the R300 and NV30. It just seems that one made some smarter decisions than the other.
Um, no. Both chips were in development for at least two years before the release of DirectX 9. Do you really believe that the specs of DX9 were finalized two years before its release?

The features and specs of DX9 are there because of the hardware that nVidia and ATI developed, not the other way around.
__________________
"Physics is like sex. Sure, it may give some practical results, but that's not why we do it." - Richard P. Feynman
Old 03-31-03, 08:43 AM   #32
Chalnoth
Registered User
 
Join Date: Jul 2002
Posts: 1,293

Quote:
Originally posted by Uttar
AFAIK, the R300 doesn't support INT16. Could be wrong, though.
The R300 is all about massive FP24 performance.

Uttar
The R300 certainly doesn't appear to have any separate execution units, but the FP24 units could be used to emulate INT16 quite easily (since it does have a 1-bit sign and 15-bit mantissa).
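
A quick C sketch of that emulation argument, using IEEE FP32 as a stand-in since C has no 24-bit float type: any float format whose sign and significand together cover at least 16 bits can hold every INT16 value exactly, so the FP units can do the integer math without dedicated INT16 hardware:

Code:
/* Verify that every INT16 value survives a round trip through a
 * float. FP32's 24-bit significand makes this exact; the same idea
 * is what lets a wide-enough FP24 unit stand in for INT16 math. */
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    int exact = 1;
    for (int32_t i = INT16_MIN; i <= INT16_MAX; i++) {
        float f = (float)i;             /* integer stored as a float */
        if ((int32_t)f != i)            /* round trip must be exact */
            exact = 0;
    }
    printf("%s\n", exact ? "all INT16 values round-trip exactly"
                         : "precision loss detected");
    return 0;
}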
__________________
"Physics is like sex. Sure, it may give some practical results, but that's not why we do it." - Richard P. Feynman

Old 03-31-03, 09:17 AM   #33
RobHague
I like cheese.
 
 
Join Date: Mar 2003
Location: UK
Posts: 904

Quote:
and this has already been addressed in R350 which surpasses the FX's dx9"+"specs.
No, the R350 is an optimized R300 core. It has added programmability in some areas; however, it does not surpass the FX for programmability. NVIDIA has that at least: if you take a look at the comparison charts around the net for the R300/350 against DX9 against the FX, the FX clearly has more programmability. But then if it never gets used, it's not going to matter, is it? I'm sure it was Carmack who said he was going to be basing his future projects on the things the NV30 made possible.

*edit*

Found the quote on this...

Quote:
Developers on GeForce FX - John Carmack (id Software)
NVIDIA is the first of the consumer graphics chip companies to firmly understand what is going to be happening with the convergence of consumer realtime and professional offline rendering. The architectural decision in the NV30 to allow full floating point precision all the way to the framebuffer and texture fetch, instead of just in internal paths, is a good example of far-sighted planning. It has been obvious to me for some time how things are going to come together, but NVIDIA has made moves on both the technical and company strategic fronts that are going to accelerate the timetable over my original estimations.

My current work on Doom is designed around what was made possible on the original GeForce, and reaches an optimal implementation on the NV30. My next generation of work will be designed around what is made possible on the NV30.
__________________

There used to be a signature here, but now there isn't.

Last edited by RobHague; 04-02-03 at 05:10 PM.
Old 03-31-03, 09:20 AM   #34
Captain Beige
 
Join Date: Feb 2003
Posts: 59

Quote:
Originally posted by RobHague
No, the R350 is an optimized R300 core. It has added programmability in some areas; however, it does not surpass the FX for programmability. NVIDIA has that at least: if you take a look at the comparison charts around the net for the R300/350 against DX9 against the FX, the FX clearly has more programmability. But then if it never gets used, it's not going to matter, is it? I'm sure it was Carmack who said he was going to be basing his future projects on the things the NV30 made possible.
NV30's "extra" programmability is meaningless due to its hugely pathetic performance when using it.
__________________
"If you want a picture of the future, imagine a fan blowing on a human face - forever." ([I]GeForce Orwell, 2004[/I])
Old 03-31-03, 09:22 AM   #35
RobHague
I like cheese.
 
 
Join Date: Mar 2003
Location: UK
Posts: 904

lol, in your completely unbiased opinion. If Carmack thinks it's worthwhile, then that's more than enough credentials for me. I look forward to Doom 3 very much.

I think the "FX performance is rubbish" thing is kinda old now; stop flogging a dead horse lol.
__________________

There used to be a signature here, but now there isn't.
Old 03-31-03, 09:36 AM   #36
Uttar
Registered User
 
 
Join Date: Aug 2002
Posts: 1,354

Quote:
Originally posted by RobHague
lol, in your completely unbiased opinion. If Carmack thinks it's worthwhile, then that's more than enough credentials for me. I look forward to Doom 3 very much.

I think the "FX performance is rubbish" thing is kinda old now; stop flogging a dead horse lol.
WTF are you talking about?!
Carmack didn't say that the extra programmability is worthwhile for customers! In fact, he even said he won't really use it before his next engine!

And the NV30's extra programmability on the vertex shader front could be an advantage in real-time scenarios. But the extra programmability in pixel shaders *can't* be; it's pretty much death at 60 instructions already! Maybe less; I'm not sure, it's been a long time since I've seen those numbers.

The NV30's PS performance *is* pathetic, like it or not. That's the main reason why, a few months ago, I went from "The NV30 isn't all that great, but it's still quite good" to "The NV30 isn't good."


Uttar