Old 04-01-03, 03:23 PM   #73
AngelGraves13
Registered User
 
Join Date: Aug 2002
Posts: 2,383
Default

I don't expect these cards to run DirectX 9 games at full speed and quality... they're meant to run the DirectX 8.1 games at full speed. When newer cards come out in a year or so, those will run DirectX 9 a lot better. These cards are next-gen in features, but not in horsepower, and that's the way it's been for years. You buy a new card to play recent games better, not future ones, and if you think otherwise you're mistaken. By the time games are released that fully use the features of a new card, that card is going to run them like a dog.
Old 04-01-03, 04:06 PM   #74
StealthHawk
Guest
 
Posts: n/a
Default

Quote:
Originally posted by jbirney
Huh? You can already buy DX9 cards at a variety of price points, with the R9500 non-Pro going for $132 ATM. The 9800 is not available yet, so I understand your point about the high end (even though the MSRP for the R9700 Pro is now $300, and it can be found for less online, which can cover high-end needs for now). But in the mid-range, the RV350 is just ATI increasing its profit margins on a part with higher throughput. If a gamer wants, he can go out and buy a DX9 video card from ATI today, whereas he may not be able to if he wanted to buy an NV DX9 card. How is that a service?
I was only being semi-serious with my statement. I definitely don't believe ATI paper-launching the R9600 and R9200 is doing anyone a service. If you can even call it a paper launch, since there have been no benchmarks. I heard there were going to be benchmarks today... so far there is nothing.

But the point of my post was more to poke fun at CaptainBeige. He said ATI wasn't "standing still" while NVIDIA pushed DX9 to the mainstream, and yet I don't think paper-launching said cards a month early is anything but standing still. With no benchmarks yet, cards probably won't ship for a few weeks after benchmarks appear, get my drift? In other words, even if benchmarks are released today, we still have a wait ahead of us.
Old 04-01-03, 04:36 PM   #75
Myrmecophagavir
Registered User
 
Join Date: Dec 2002
Location: Oxford, UK
Posts: 102
Default

Quote:
Originally posted by ChrisRay
It wasn't an argument so much as a fact, until it was dragged out by someone who shall remain nameless. It's common knowledge that the R300 is more limited from a programmability point of view.

It's not so much the instruction counts that bother me as the limited options for pixel shader precision. I'm going to college to become a graphics engineer, so these things are relevant to me.

Anywho, as I said, it wasn't an argument so much as a statement of fact that the R300 offers more limited programmability than the R350 and the NV30 series.

And frankly, it was never meant to be more than that.
Sorry, I meant "argument" as in "line of reasoning", rather than an argument with someone else.
Old 04-01-03, 05:41 PM   #76
Steppy
Radeon 10K Pro
 
Join Date: Aug 2002
Posts: 351
Default

Quote:
Originally posted by ChrisRay
That is completely dependent on the program you're using and on what you are trying to render. There are cases where the extra shading power is relevant, whether you're creating a nifty screensaver or developing a texture embossing method for a PlayStation emulator.

Your mind is narrowed down to the games we have right now, and as such you're only thinking about the games we have today.

Uhh, from a programmer's point of view, 16-bit floating point calculations would be useful. As I said, it's relevant to the specific situation. In PSX emulation it would be preferable to do texture embossing in 16-bit, as there would be no benefit to using 24-bit.

In this case, the R300 cannot benefit from the extra speed given by the lower precision.

We're talking about values that are subjective to the specific coder.

I never said he ran into limits with its precision. As a matter of fact, I said he preferred 16-bit precision for rendering in Doom 3, savvy? The interview is there; it specifically says he would prefer 16-bit precision.

And the only reason he did some things with 24-bit textures is because that's the way the R300 is programmed.

That's all fine and dandy for you, but a lot of people who buy a GeForce4 Ti 4400 or a Radeon 9500 Pro will not upgrade those cards for 3-4 years, so yes, it's relevant.

Future proofing does exist, in a limited fashion, and most non-enthusiasts buy computers because they believe they are future proof.

Using Pixel Shader 2.0 in 16-bit precision to emulate Pixel Shader 1.4 would not be the cause of its current slowdown, and there's nothing wrong with emulating 1.4 in 16-bit precision, as there's no benefit to using the higher precision with Pixel Shader 1.4.
OK, so other than a screensaver or an emulator of a 7-year-old console, what uses have you got? And you certainly DID allude to Carmack saying he ran into limitations using the 24-bit floating point of the R300. Do you even read what you type? You go on with a paragraph about the FP capabilities of the cards, and then say you can see where Carmack ran into limitations, without mentioning that Carmack specifically said he ran into limitations in the NUMBER OF INSTRUCTIONS. I'm also pretty sure the R200 supported 8-bit integer max. I also beg to differ that most non-enthusiasts buy computers because they think they're futureproof... they buy them because they're cheap, usually regardless of what's inside. I can tell you've gotta be an 18-year-old kid or younger, because you don't know much about the real world, and your ideas go against what has been shown generation after generation for YEARS in the computer world.
__________________
Here's my clever comment
Old 04-01-03, 06:38 PM   #77
ChrisRay
Registered User
 
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default

Quote:
Originally posted by Steppy
OK, so other than a screensaver or an emulator of a 7-year-old console, what uses have you got? And you certainly DID allude to Carmack saying he ran into limitations using the 24-bit floating point of the R300. Do you even read what you type? You go on with a paragraph about the FP capabilities of the cards, and then say you can see where Carmack ran into limitations, without mentioning that Carmack specifically said he ran into limitations in the NUMBER OF INSTRUCTIONS. I'm also pretty sure the R200 supported 8-bit integer max. I also beg to differ that most non-enthusiasts buy computers because they think they're futureproof... they buy them because they're cheap, usually regardless of what's inside. I can tell you've gotta be an 18-year-old kid or younger, because you don't know much about the real world, and your ideas go against what has been shown generation after generation for YEARS in the computer world.
*sigh* Pixel shader embossing for textures is relevant in many cases; it's just amazing that you don't see that.

And other features do indeed make it relevant. I was simply giving you a scenario where the pixel shader limitations are relevant for anyone who is doing programming for fun.

But you just wanna come off insulting rather than provide real arguments.

And yes, a computer novice who goes to a store and buys a 3 GHz Pentium or whatever buys their computer in the hope that they won't have to buy another one for another 4 years. But I imagine someone with limitless money wouldn't understand that.

And no, I stated that I can understand why Carmack chose the NV30 series over the R300 series as his basis for future platforms. Offering more instructions and more programmable precision levels makes it a better basis for future engines.

You simply don't see that, and you'd rather come off insulting because I think the R300 has a limited pixel shader array with a fixed floating point precision.

Sorry you disagree. Frankly I don't care, because something tells me you aren't doing much OpenGL/Direct3D programming for it to matter to you.
__________________
|CPU: Intel I7 Lynnfield @ 3.0 Ghz|Mobo:Asus P7P55 WS Supercomputer |Memory:8 Gigs DDR3 1333|Video:Geforce GTX 295 Quad SLI|Monitor:Samsung Syncmaster 1680x1080 3D Vision/Olevia 27 Inch Widescreen HDTV 1920x1080

|CPU: AMD Phenom 9600 Black Edition @ 2.5 Ghz|Mobo:Asus M3n HT Deluxe Nforce 780A|Memory: 4 gigs DDR2 800| Video: Geforce GTX 280x2 SLI

Nzone
SLI Forum Administrator

NVIDIA User Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the members
Old 04-01-03, 07:10 PM   #78
gokickrocks
Registered User
 
Join Date: Nov 2002
Posts: 409
Default

I read a few of the posts at the beginning and just skipped to this page, so excuse me if this has been asked...

This is directed at ChrisRay, btw...

Where did you find all the "preferred" specs that you mentioned, and that Microsoft supposedly gave? Please direct me to some link on Microsoft's site... I went through the DX9 docs and only came up with the minimum specs.

Btw, the 9700 supports 64- and 128-bit float frame buffers IIRC, and temp registers need to be FP24, not FP16, to meet DX9 specs.
__________________
"never argue with an idiot, they will bring you down to their level, and beat you with experience"

Last edited by gokickrocks; 04-01-03 at 07:21 PM.
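
On the minimum-spec point above, a small Windows-only C++ sketch (an illustration only, not something from this thread or the DX9 docs) can show what a DX9 part actually reports to applications through the D3D9 caps structure; the PS 2.0 sub-caps include the instruction-slot and temp-register counts that keep coming up in this thread. Field names are taken from the d3d9 headers, and the numbers printed obviously depend on the card and driver.

[code]
// Windows-only sketch: query what the installed driver reports for DX9
// pixel shader support. Build against the DirectX 9 SDK and link d3d9.lib.
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        // PixelShaderVersion packs the major/minor version in its low two bytes.
        std::printf("Pixel shader version: %lu.%lu\n",
                    (caps.PixelShaderVersion >> 8) & 0xFF,
                    caps.PixelShaderVersion & 0xFF);
        if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0)) {
            // PS 2.0 sub-caps: instruction slots and temp registers the part exposes.
            std::printf("PS 2.0 instruction slots: %d, temp registers: %d\n",
                        caps.PS20Caps.NumInstructionSlots, caps.PS20Caps.NumTemps);
        }
    }
    d3d->Release();
    return 0;
}
[/code]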
Old 04-01-03, 07:36 PM   #79
ChrisRay
Registered User
 
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default

Quote:
Originally posted by gokickrocks
I read a few of the posts at the beginning and just skipped to this page, so excuse me if this has been asked...

This is directed at ChrisRay, btw...

Where did you find all the "preferred" specs that you mentioned, and that Microsoft supposedly gave? Please direct me to some link on Microsoft's site... I went through the DX9 docs and only came up with the minimum specs.

Btw, the 9700 supports 64- and 128-bit float frame buffers IIRC, and temp registers need to be FP24, not FP16, to meet DX9 specs.

The preferred 32-bit is for texture input only, so I should have been clearer on that; "preferred" texture output is kinda irrelevant anyway.

And yes, FP16 does not meet DX 9.0 specs, mainly because of the 24-bit texture coordinates.

DX 9.0 used to (or still does) allow partial 16-bit precision on floating point reads etc., but I'm not sure if the GeForce FX is doing that.

But that has nothing to do with texture output maps :P
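
To put rough numbers on the FP16-versus-FP24 coordinate issue, here is a self-contained C++ sketch (an illustration only, assuming 10 explicit mantissa bits for FP16, 16 for ATI's FP24 and 23 for FP32) that rounds a texture coordinate to each width. With only 10 mantissa bits, a coordinate inside a 2048-texel-wide texture can already be off by about half a texel, which is roughly why a 16-bit float falls short for texture addressing.

[code]
#include <cmath>
#include <cstdio>

// Round x to a float with 'mantissaBits' explicit mantissa bits
// (FP16 = 10 bits, ATI's FP24 = 16, FP32 = 23), ignoring exponent range.
float quantize(float x, int mantissaBits) {
    if (x == 0.0f) return 0.0f;
    int e;
    float m = std::frexp(x, &e);                      // x = m * 2^e, 0.5 <= |m| < 1
    float scale = std::ldexp(1.0f, mantissaBits + 1); // 2^(bits + 1)
    return std::ldexp(std::round(m * scale) / scale, e);
}

int main() {
    // A texel centre deep inside a 2048-texel-wide texture.
    const float u = 1537.5f / 2048.0f;
    const int widths[] = {10, 16, 23};
    for (int bits : widths) {
        float q = quantize(u, bits);
        std::printf("%2d mantissa bits: u = %.9f, error = %g texels\n",
                    bits, q, std::fabs(q - u) * 2048.0f);
    }
    return 0;
}
[/code]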
Old 04-01-03, 09:12 PM   #80
Steppy
Radeon 10K Pro
 
Join Date: Aug 2002
Posts: 351
Default

Don't know how a double post happened hours apart.

Last edited by Steppy; 04-01-03 at 09:18 PM.

Old 04-02-03, 12:54 AM   #81
Steppy
Radeon 10K Pro
 
Join Date: Aug 2002
Posts: 351
Default

Quote:
Originally posted by ChrisRay
*sigh* Pixel shader embossing for textures is relevant in many cases; it's just amazing that you don't see that.

And other features do indeed make it relevant. I was simply giving you a scenario where the pixel shader limitations are relevant for anyone who is doing programming for fun.

But you just wanna come off insulting rather than provide real arguments.

And yes, a computer novice who goes to a store and buys a 3 GHz Pentium or whatever buys their computer in the hope that they won't have to buy another one for another 4 years. But I imagine someone with limitless money wouldn't understand that.

And no, I stated that I can understand why Carmack chose the NV30 series over the R300 series as his basis for future platforms. Offering more instructions and more programmable precision levels makes it a better basis for future engines.

You simply don't see that, and you'd rather come off insulting because I think the R300 has a limited pixel shader array with a fixed floating point precision.

Sorry you disagree. Frankly I don't care, because something tells me you aren't doing much OpenGL/Direct3D programming for it to matter to you.
How is pixel shader embossing in 16-bit more important than doing it in 24-bit? I'd just like to hear some things that can be done in 16-bit that can't be done in 24. It seems to me that you're assuming the R300 would get a big performance boost using 16-bit over 24-bit in the same way the FX gets a boost from doing 16 over 32. That's how I'm interpreting you.
Old 04-02-03, 01:07 AM   #82
Steppy
Radeon 10K Pro
 
Join Date: Aug 2002
Posts: 351
Default

Oh, and you HAVE programmed for variable FP precision T&L cards to come to your position, correct?
Old 04-02-03, 01:11 AM   #83
ChrisRay
Registered User
 
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default

Quote:
Originally posted by Steppy
How is pixel shader embossing in 16-bit more important than doing it in 24-bit? I'd just like to hear some things that can be done in 16-bit that can't be done in 24. It seems to me that you're assuming the R300 would get a big performance boost using 16-bit over 24-bit in the same way the FX gets a boost from doing 16 over 32. That's how I'm interpreting you.
It's impossible to speculate whether it would or would not, because it can't. I imagine if it were capable of doing so, then yes, I would predict a small performance boost.

I should clarify what pixel shader texture embossing is (I'm not meaning to insult your intelligence on the matter, but some people might be interested).

It's using the pixel shader to do the texture mapping on a surface. I used PSX emulation, which, like most feasible emulation right now, is limited to 16-bit, merely as an example.

It could be done to sharpen the textures by spending GPU fill rate rather than stressing bandwidth.

It's rather a unique effect though.

It's not more important, mind you, but I believe a higher level of configurability at the user's end is. Maybe one user wants to turn up a specific setting and turn down another; the more configurable the options are, the more a user can sacrifice one thing to get something else.

Now this might not matter for PSX emulation, but the upcoming Dreamcast emulation in Icarus might be able to benefit from this, etc.

It also has possibilities elsewhere. Texture embossing is a DirectX 9.0 feature, btw...
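
For anyone wondering what the emboss operation itself looks like, here is a minimal CPU sketch in C++ (my own illustration of the classic emboss filter, not ChrisRay's shader): a pixel shader version would do the same subtract-an-offset-sample-and-recentre arithmetic per fragment.

[code]
#include <algorithm>
#include <cstdint>
#include <vector>

// CPU illustration of a classic emboss: subtract a diagonally offset sample
// and re-centre the difference around mid-grey.
// 'src' is an 8-bit grayscale image, width*height pixels, row-major.
std::vector<uint8_t> emboss(const std::vector<uint8_t>& src, int width, int height) {
    std::vector<uint8_t> dst(src.size());
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            int sx = std::max(x - 1, 0);   // clamp the offset sample at the edge
            int sy = std::max(y - 1, 0);
            int v = int(src[y * width + x]) - int(src[sy * width + sx]) + 128;
            dst[y * width + x] = uint8_t(std::clamp(v, 0, 255));
        }
    }
    return dst;
}
[/code]

The +128 recentres the signed difference around mid-grey, which is what gives the relief-like look.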
Old 04-02-03, 01:13 AM   #84
ChrisRay
Registered User
 
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default

Quote:
Originally posted by Steppy
Oh, and you HAVE programmed for variable FP precision T&L cards to come to your position, correct?
I have been messing with pixel shader precision a bit, just toying with the effects. No, I am no master programmer by any means; I consider myself a novice at best. But it's nice to play with old engines like the Doom ones, for example (though OpenGL works best for Doom..).

As I said, I am six months into studying graphic design... we have been toying with DirectX lately...