nV News Forums > Hardware Forums > Benchmarking And Overclocking


Old 03-31-03, 04:30 PM   #49
RobHague
I like cheese.
 
RobHague's Avatar
 
Join Date: Mar 2003
Location: UK
Posts: 904
Default

Quote:
Originally posted by Captain Beige
you really think many programmers will code for the NV30's DX9+ abilities, given how few are available and the NV30 is crippled by its 128-bit memory bus anyway?
Yes, because the NV30 is now going mainstream, OK? The fact here, one that you might not like, is that they will sell on the name alone. Nvidia has been the dog's whatsits for a long time, and the new versions will sell on that. So developers are more likely to code games for the card with the highest market share and take advantage of the features therein.

Let's just see what happens, shall we? None of us really knows, and I for one am getting bored of the whole NV30 vs 9700 vs 9800 vs R400 blah blah, the IQ comparisons and benchmarks... ffs, it's not worth it unless something has changed. New drivers, fine, but people are just regurgitating the same arguments over and over (and I'm not just talking about this forum - everywhere).
RobHague is offline   Reply With Quote
Old 03-31-03, 04:39 PM   #50
jjjayb
Registered User
 
Join Date: Jan 2003
Posts: 101
Default

Quote:
Originally posted by Nutty
Using the 42.68 drivers, yes, PS2.0 performance _was_ crap.

Regardless of the IQ implications, PS2.0 performance using the latest 43.45 drivers in 3DMark03 surpasses the 9700 now.

They can probably get it faster too.

It's debatable whether Nvidia just lowered its precision or actually improved pixel shader 2.0 performance.

I imagine a little of both has happened.

I'd have to lean towards lowered precision. Here are some new screenshots from the PS2.0 tests in 3DMark03:

http://www.beyond3d.com/forum/viewtopic.php?t=5127

Towards the end they include the reference picture. I'd suggest downloading all four pictures and comparing. The 43.00 drivers were by far the worst, but even the 43.45 drivers aren't rendering correctly. The most obvious difference is the lighting on the pedestal. Maybe someday Nvidia will be back to "gold standard" drivers, but for now it just ain't happening.
jjjayb is offline   Reply With Quote
Old 03-31-03, 04:39 PM   #51
Captain Beige
 
Join Date: Feb 2003
Posts: 59
Default

Quote:
Originally posted by RobHague
Yes, because the NV30 is now going mainstream ok.
by mainstream you mean the 5600 and the "DX9 for $79" 5200? The 5200 and 5600 will be so crippled by low bandwidth that whether they support 50 or 5000 vertex/pixel instructions, or whatever else you want to claim makes the NV30 so fantastic, will be completely academic, since they will be unusable at acceptable levels of performance.

And while Nvidia is about to push DX9 to the mainstream, ATI is obviously just standing still...

It's not like they've already sold out every 9800 Pro pre-order. Oh wait... yes they have, and that's just the 128MB version...
__________________
"If you want a picture of the future, imagine a fan blowing on a human face - forever." ([I]GeForce Orwell, 2004[/I])
Captain Beige is offline   Reply With Quote
Old 03-31-03, 04:43 PM   #52
RobHague
I like cheese.
 
RobHague's Avatar
 
Join Date: Mar 2003
Location: UK
Posts: 904
Default

Quote:
Originally posted by Captain Beige
by mainstream you mean the 5600 and the "DX9 for $79" 5200? The 5200 and 5600 will be so crippled by low bandwidth that whether they support 50 or 5000 vertex/pixel instructions, or whatever else you want to claim makes the NV30 so fantastic, will be completely academic, since they will be unusable at acceptable levels of performance.

And while Nvidia is about to push DX9 to the mainstream, ATI is obviously just standing still...

It's not like they've already sold out every 9800 Pro pre-order. Oh wait... yes they have, and that's just the 128MB version...
5800 Ultras disappear instantly too. Komplett got about 40 TerraTec cards in last week; they now have none.

The reason I think you're so biased, btw, is your signature and your avatar text.

Anyway, have you seen the price of the 256MB version? How long will it take to arrive? I'm not sure, but it's not like the 9800 128MB version has been delayed... oh wait, it has.

But you weren't listening to my statement anyway. What ATI are doing for the mainstream means **** all. What I said was that the cards will sell on the name alone. The average Joe Bloggs is going to walk in looking for the latest Nvidia card, not ATI, and he will find the 5600 Ultra. Performance means nothing at all - it's the name. Apart from that, the 9500 looks good against the 5600 right now, but the 9500 is being phased out, and ATI are being rather quiet about its replacement's benchmarks... funny, that (or have they actually released any yet?).
__________________

There used to be a signature here, but now there isn't.

Last edited by RobHague; 03-31-03 at 04:46 PM.
RobHague is offline   Reply With Quote
Old 03-31-03, 04:50 PM   #53
StealthHawk
Guest
 
Posts: n/a
Default

Quote:
Originally posted by Captain Beige
by mainstream you mean the 5600 and the "DX9 for $79" 5200? The 5200 and 5600 will be so crippled by low bandwidth that whether they support 50 or 5000 vertex/pixel instructions, or whatever else you want to claim makes the NV30 so fantastic, will be completely academic, since they will be unusable at acceptable levels of performance.
You assume there will be no driver updates. The 43.45 driver increased 3DMark2001 performance on the GeForce FX 5200; who knows if any games benefited?

Quote:
And while Nvidia is about to push DX9 to the mainstream, ATI is obviously just standing still...

It's not like they've already sold out every 9800 Pro pre-order. Oh wait... yes they have, and that's just the 128MB version...
Yeah, ATI is doing gamers a real service by paper-launching products. Where the hell are the R9200 and R9600 benchmarks? The cards were launched about four weeks ago now, along with the R9800!
  Reply With Quote
Old 03-31-03, 05:46 PM   #54
ChrisRay
Registered User
 
ChrisRay's Avatar
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default

Quote:
Originally posted by Captain Beige
No, you are still talking BS. Phrases like "Offering the most limited DirectX 9.0 functionality" are BS because DX9 is not just about programmability. The R300 currently offers the best DX9 functionality because it is the only one with WHQL-certified drivers and is DX9 compliant ALL the time (and is also widely available and relatively cheap - the 9500 Pro is the best bang-for-buck card ever). Also, the programmability you're talking about is nothing to do with DX9 at all; it's DX9+.

If you want to talk about programmability separately from any other issues (e.g. actually being available to buy), then don't use such BS general phrases as "Offering the most limited DirectX 9.0 functionality".

say things like "NV30 supports x number of instructions, whereas R300 has y number".

And you're also only talking about specs. You really think many programmers will code for the NV30's DX9+ abilities, given how few are available and the NV30 is crippled by its 128-bit memory bus anyway? (You think many will even code for DX9 at all until we're into the R400+ era?)
You obviously have a limited understanding of programming. I actually doubt you have any.

There are several reasons why the R300 is limited programmability-wise while the NV30/R350 line is less so:

more precision options, and more programmable pixel shader instructions.

But that concept is above you, I believe, because you obviously have no capability of understanding it.

Never mind the limited number of instructions; the R300 also lacks half precision, selectable precision, and fixed integers at certain floating point precision levels. Again, constant 24-bit FP is not always a good thing.


Quote:
Also, the programmability you're talking about is nothing to do with DX9 at all; it's DX9+.
This statement shows that you have absolutely no comprehension of DirectX 9.0, nor any idea what you are talking about.

The DirectX 9.0 specification calls for more than the R300 is capable of, i.e. preferred precision, half precision, integers, etc. The R300 meets the "bare minimum" - the absolute minimum necessary to be called DirectX 9.0 compliant.

The R300 pixel shader engine leaves no room for half precision on reads/writes where desired, and it also lacks the true 32-bit "recommended" precision specified by Microsoft's DirectX.

Again, both the NV30 and R350 were a step towards fuller DirectX 9.0 compliance.

Hopefully by the time we see the R400 and NV40, Pixel Shader 3.0 and Vertex Shader 3.0 (both part of DirectX 9.0, and supported by neither current card) will be available in hardware. Yes, these are part of DirectX 9.0.
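To put rough numbers on the precision argument above: the practical difference between these formats is mantissa width, which sets the relative rounding error of every shader calculation. A quick sketch, using the commonly cited mantissa widths for FP16, ATI's FP24, and FP32:

```python
# Relative rounding step (machine epsilon) for each shader precision format
# discussed above. Commonly cited mantissa widths: FP16 = 10 bits,
# FP24 (R300) = 16 bits, FP32 = 23 bits; epsilon is 2**-bits.
formats = {"FP16 (NV30 half)": 10, "FP24 (R300)": 16, "FP32 (NV30 full)": 23}
for name, bits in formats.items():
    print(f"{name}: relative error ~ 2**-{bits} = {2.0 ** -bits:.1e}")
```

The step from FP16 to FP24 buys roughly two extra decimal digits of precision, and FP24 to FP32 roughly two more, which is why a single fixed precision can be overkill for some shader math and too coarse for other shader math at the same time.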


Quote:
If you want to talk about programmability separately from any other issues (e.g. actually being available to buy), then don't use such BS general phrases as "Offering the most limited DirectX 9.0 functionality".
It IS limited compared to the NV30 and R350. It offers less programmability: fewer features, fewer instructions, and fewer precision options. And ALL I have been talking about is the programmability of the R300, which IS limited in comparison to the competition.


Quote:
And you're also only talking about specs. You really think many programmers will code for the NV30's DX9+ abilities, given how few are available and the NV30 is crippled by its 128-bit memory bus anyway? (You think many will even code for DX9 at all until we're into the R400+ era?)
Again, in the future, yes. Heck, even Carmack is using the NV30 as the basis for future products, i.e. he will use its extended programmability features as the baseline of his next engine and future engines.

Just because the NV30 has a crippled architecture does not mean it's not a step forward in programmability, compilers, and rendering features.

And this, for whatever reason, seems to have gone beyond your grasp.

The GeForce 256 was a crippled card that never realized its potential, but its architecture was the baseline for MANY products to come, including every card that adopted hardware T&L.

The R300 "is" a good card; its shader instructions, functions, and precision are simply limited in comparison to what is becoming available.

Like it or not, the R300 is now outdated in instruction/precision capabilities, just as the NV30 is outdated in instruction capabilities now that the R350 is out. This is because of a little thing called "progress" in computer technology.

But I wouldn't want to insult the precious R300. My god, man, you'd think you were married to that piece of silicon.
__________________
|CPU: Intel I7 Lynnfield @ 3.0 Ghz|Mobo:Asus P7P55 WS Supercomputer |Memory:8 Gigs DDR3 1333|Video:Geforce GTX 295 Quad SLI|Monitor:Samsung Syncmaster 1680x1080 3D Vision\/Olevia 27 Inch Widescreen HDTV 1920x1080

|CPU: AMD Phenom 9600 Black Edition @ 2.5 Ghz|Mobo:Asus M3n HT Deluxe Nforce 780A|Memory: 4 gigs DDR2 800| Video: Geforce GTX 280x2 SLI

Nzone
SLI Forum Administrator

NVIDIA User Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the members
ChrisRay is offline   Reply With Quote
Old 03-31-03, 07:11 PM   #55
Myrmecophagavir
Registered User
 
Join Date: Dec 2002
Location: Oxford, UK
Posts: 102
Default

Quote:
Originally posted by ChrisRay
Like it or not, The r300 is out dated now in instruction/precision capabilities, Just like the Nv30 is outdated in instruction capabilities now that the r350 is out. This is because we have a little thing called "progress" in computer technology
I think you're overestimating the R350's programmability enhancements over the R300. What are you specifically thinking of here? The f-buffer? ATI has been surprisingly quiet about that apart from the initial press flurry. It doesn't do that much anyway; it just seems to remove instruction count limits. The core doesn't extend the R300's instruction set, does it?

More generally, your argument that the R300 core is the least programmable D3D9 core available is a bit pointless - there are only two cores available, the R300 and the NV30! Unless they're identical, of course one of them is going to be the least programmable, and wouldn't you know it, it's the one that was released over six months earlier. Obviously when the NV31/34 and RV350 actually hit the shops the situation will be more pronounced, assuming the NV31/34 do indeed offer exactly the same programmability as the NV30.
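For concreteness, the per-pass pixel shader caps usually quoted for the two available cores look like this. This is a sketch using commonly reported D3D9 shader-model figures, not measurements:

```python
# Commonly reported per-pass pixel shader limits, circa 2003. The R300
# exposes D3D9's baseline ps_2_0 model; the NV30 exposes an extended
# ps_2_x-style model. Treat the figures as approximate context for the
# thread, not a spec dump.
limits = {
    "R300 (baseline ps_2_0)": {"arith_ops": 64, "texture_ops": 32},
    "NV30 (extended ps_2_x)": {"arith_ops": 512, "texture_ops": 512},
}
for core, caps in limits.items():
    print(f"{core}: {caps['arith_ops']} arithmetic / "
          f"{caps['texture_ops']} texture ops per pass")
```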
Myrmecophagavir is offline   Reply With Quote
Old 03-31-03, 07:23 PM   #56
Chalnoth
Registered User
 
Join Date: Jul 2002
Posts: 1,293
Default

Well, the f-buffer may also be able to remove other limits, such as the number of textures sampled (I don't know the specifics of ATI's f-buffer implementation, but it is theoretically possible) and the number of temporaries available. It may even be possible to use this for data-dependent branching (it would be very slow, but it may be possible).

Hopefully nVidia has similar technology coming. It really is essential to remove all such limitations.
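The pass-splitting idea behind the f-buffer can be sketched in a few lines. The pass size and the instruction list here are invented for illustration; the real hardware mechanism (spilling per-fragment intermediates between passes) is far more involved:

```python
# Toy sketch of f-buffer-style multipass splitting: when a shader exceeds
# the hardware's per-pass instruction limit, chunk it into passes, with
# intermediate per-fragment values assumed to be spilled to a buffer
# between passes. The limit and "instructions" are made up for illustration.
def split_into_passes(instructions, max_per_pass=64):
    """Chunk a long instruction list into hardware-sized passes."""
    return [instructions[i:i + max_per_pass]
            for i in range(0, len(instructions), max_per_pass)]

program = [f"op{i}" for i in range(200)]   # a hypothetical 200-instruction shader
passes = split_into_passes(program)
print(len(passes), [len(p) for p in passes])   # 4 [64, 64, 64, 8]
```

Each extra pass costs bandwidth to spill and reload intermediates, which is why emulating data-dependent branching this way would indeed be very slow.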
__________________
"Physics is like sex. Sure, it may give some practical results, but that's not why we do it." - Richard P. Feynman
Chalnoth is offline   Reply With Quote

Old 03-31-03, 07:28 PM   #57
ChrisRay
Registered User
 
ChrisRay's Avatar
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default

Quote:
Originally posted by Myrmecophagavir
I think you're overestimating the R350's programmability enhancements over the R300. What are you specifically thinking of here? The f-buffer? ATI has been surprisingly quiet about that apart from the initial press flurry. It doesn't do that much anyway; it just seems to remove instruction count limits. The core doesn't extend the R300's instruction set, does it?

More generally, your argument that the R300 core is the least programmable D3D9 core available is a bit pointless - there are only two cores available, the R300 and the NV30! Unless they're identical, of course one of them is going to be the least programmable, and wouldn't you know it, it's the one that was released over six months earlier. Obviously when the NV31/34 and RV350 actually hit the shops the situation will be more pronounced, assuming the NV31/34 do indeed offer exactly the same programmability as the NV30.
It wasn't an argument but rather a fact, until it was dragged out by someone who shall remain nameless. It's common knowledge that the R300 is more limited from a programmability standpoint.

It's not so much the instructions that bother me as the limited pixel shader precision options. I'm going to college to become a graphics engineer, so these things are relevant to me.

Anywho, as I said, it wasn't an argument but rather a fact that the R300 offers more limited programmability than the R350 and the NV30 series.

And frankly, it was never meant to be more than that.

Last edited by ChrisRay; 03-31-03 at 07:31 PM.
ChrisRay is offline   Reply With Quote
Old 03-31-03, 10:23 PM   #58
Bopple
Registered User
 
Join Date: Mar 2003
Posts: 208
Default

My first post... I've only been reading until now.

I don't think you're wrong about the R300's programmability limits, ChrisRay.
But by the same logic you could say, "The GeForce 4 doesn't support DX9. There is much still to be done. It's a fact."
*shrug* The R300/NV25 isn't/wasn't outside the standards.
Nothing is really wrong with it.
And to address the weaknesses, ATI and Nvidia brought out the R350 and NV30.

Be it fact or not, it seems rather pointless.
__________________
Handsome fighter never loses battle.

Last edited by Bopple; 03-31-03 at 10:38 PM.
Bopple is offline   Reply With Quote
Old 03-31-03, 10:39 PM   #59
ChrisRay
Registered User
 
ChrisRay's Avatar
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default

Quote:
But by the same logic you could say, "The GeForce 4 doesn't support DX9. There is much still to be done. It's a fact."
Well, that's absolutely correct :P

I'd rather program the R300's pixel shader functions than a GeForce 4's, just like I'd rather program the NV30/R350's pixel shader functions than the R300's. It's evolution, is all.
ChrisRay is offline   Reply With Quote
Old 03-31-03, 11:25 PM   #60
ahebl
Registered User
 
Join Date: Feb 2003
Posts: 15
Default ok

hey chrisray,
give up on captain beige, he is a lost cause.

let me try:
here's what he's saying, read carefully.

r300 --> least programmable, excellent card (but that doesn't f#cking matter for this argument).

r350, GFFX --> more programmable (NOTE: it does not f#cking matter which card is the best overall in this argument).

There. Is that CLEAR?

A point from me:
The added programmability leaves a lot of room for new stuff, but at the current stage it doesn't really matter because we don't have any DX9 games yet.

So, in current apps, the 9700 looks just as good as these two cards and performs better than the FX with AA/AF. Indisputable, right?

Sorry for the swearing, but I was angry at the ignorance.
ahebl is offline   Reply With Quote
