Old 04-01-03, 04:39 AM   #61
Captain Beige
 
Join Date: Feb 2003
Posts: 59
Default

Quote:
Originally posted by RobHague
Anyway, have you seen the price of the 256MB version? How long will it take to arrive?

Not as long as the FX.

Quote:
Originally posted by RobHague
I'm not sure, but it's not like the 9800 128MB version has been delayed though... oh wait, it has.

Yeah, it's not like you can actually buy it here in the UK. Oh, wait... yes you can: http://www.blisware.com/all_products..._connect3d.htm
__________________
"If you want a picture of the future, imagine a fan blowing on a human face - forever." ([I]GeForce Orwell, 2004[/I])

Last edited by Captain Beige; 04-01-03 at 04:43 AM.
Captain Beige is offline   Reply With Quote
Old 04-01-03, 05:25 AM   #62
mongoled
Registered User
 
mongoled's Avatar
 
Join Date: Dec 2002
Location: Sotira, Cyprus
Posts: 34
Default

Having read the thread in its entirety, I have a few things to say myself. First of all, it's good to see such an open discussion about the many technological features these cards offer, some more than others.

I'm not able to grasp a lot of the technical info, but I'm slowly learning.

My next point is directed at ChrisRay: yo dude, you keep harping on about the R300 being the least programmable DX9 card available; give it a rest, you have made your point. Shouldn't that be expected? It's a nine-month-old card, it was the first DX9-compatible card, AND it is available to buy in almost any outlet store right now!

So dude, chill. Tell us something which hasn't been said before and quit trying to put your point across over and over again to Captain Beige. He has his opinion, which he is entitled to.

To those of you talking about the availability of the GeForce FX Ultra, please get real: it IS NOT readily available to the masses, and there are a lot of indications that it may never be. From a consumer's point of view this is what is important; there is absolutely NO POINT banging on about a card's superior programmability if the majority of people CAN'T buy it!!! Geez, use some common sense here. As for the comments directed at the availability of some of the ATI range and their delays, again, this is quite lame IMHO. We are not talking about a 6-month delay here, are we? So why is this being brought into the conversation?

I believe the thread title was about something completely different from what a lot of people have turned it into: another battle between ATI and NVIDIA.



Please get back on topic; I was enjoying the points Uttar was trying to put across and some of the comments from other people here.
I already know which card is deemed the best buy right NOW, irrespective of programmability; I want to hear Uttar's theory. I hope this thread gets back on track.

L8rs.
__________________
*Team CpuCity*
24/7 Settings, Prime95 Stable +9hrs
C3D X800GTO (16P) 580/580
DFI nforce4 Ultra-D
Opteron 146/CABNE0530APMW
@ 3000mhz (300x10) 1.55v (DMM)
300mhz 2.5-4-3-7 (1:1)
2x512 GSkill LC
2x80GB Hitachi SATA-II Raid0
WaterCooled. Fans @ 5v
mongoled is offline   Reply With Quote
Old 04-01-03, 06:02 AM   #63
RobHague
I like cheese.
 
RobHague's Avatar
 
Join Date: Mar 2003
Location: UK
Posts: 904
Default

Quote:
Originally posted by Captain Beige
Yeah, it's not like you can actually buy it here in the UK. Oh, wait... yes you can: http://www.blisware.com/all_products..._connect3d.htm
Oh wait, no you can't. Did you hover your mouse over the "Buy now" button?

DELIVERY 6-7 WEEKS for the 9800 PRO.

For the 9800 (non-Pro), DELIVERY 10 WEEKS.

2 1/2 months... lol, no thanks.
__________________

There used to be a signature here, but now there isn't.

Last edited by RobHague; 04-01-03 at 06:16 AM.
RobHague is offline   Reply With Quote
Old 04-01-03, 06:20 AM   #64
Hanners
Elite Bastard
 
Hanners's Avatar
 
Join Date: Jan 2003
Posts: 984
Default

Quote:
Originally posted by RobHague
Oh wait, no you can't. Did you hover your mouse over the "Buy now" button?

DELIVERY 6-7 WEEKS for the 9800 PRO.

For the 9800 (non-Pro), DELIVERY 10 WEEKS.

2 1/2 months... lol, no thanks.
You really shouldn't use the UK as a benchmark for how long products take to ship after launch, you'll always end up being disappointed...
__________________
Owner / Editor-in-Chief - Elite Bastards
Hanners is offline   Reply With Quote
Old 04-01-03, 06:27 AM   #65
RobHague
I like cheese.
 
RobHague's Avatar
 
Join Date: Mar 2003
Location: UK
Posts: 904
Default

Quote:
Originally posted by Hanners
You really shouldn't use the UK as a benchmark for how long products take to ship after launch, you'll always end up being disappointed...
It was Sir Captain Beige who said it was magically available over here now. But yes, you're right; even the US site I had a pre-order of the 9800 PRO with has pushed the date back. It's the FX all over again, and a company has told me they are getting in some FXs today, so if they do, then I'll go NVIDIA rather than hang around waiting for yet another card to appear.
__________________

There used to be a signature here, but now there isn't.
RobHague is offline   Reply With Quote
Old 04-01-03, 07:00 AM   #66
Hanners
Elite Bastard
 
Hanners's Avatar
 
Join Date: Jan 2003
Posts: 984
Default

I think it's a little early to say 'it's the FX all over again', seeing as the reviews of the 9800 Pro started popping up around March the 6th, so the 30 days until shipping aren't even up yet.

Even then, if I was that impatient for a new card (and I have to admit, I am getting pretty impatient for one myself!) then I'd go for a 9700 Pro over the GeForceFX.
__________________
Owner / Editor-in-Chief - Elite Bastards
Hanners is offline   Reply With Quote
Old 04-01-03, 07:12 AM   #67
RobHague
I like cheese.
 
RobHague's Avatar
 
Join Date: Mar 2003
Location: UK
Posts: 904
Default

Yeah, well, I've been waiting since January, as in being able to actually afford to "buy" one.

The 9700 is a last-ditch option for me; if I really can't secure anything else then I'm going to go 9700, because I need to get a decent graphics card sorted out.
__________________

There used to be a signature here, but now there isn't.
RobHague is offline   Reply With Quote
Old 04-01-03, 10:10 AM   #68
ChrisRay
Registered User
 
ChrisRay's Avatar
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default

Quote:
My next point is directed at ChrisRay: yo dude, you keep harping on about the R300 being the least programmable DX9 card available; give it a rest, you have made your point. Shouldn't that be expected? It's a nine-month-old card, it was the first DX9-compatible card, AND it is available to buy in almost any outlet store right now!

So dude, chill. Tell us something which hasn't been said before and quit trying to put your point across over and over again to Captain Beige. He has his opinion, which he is entitled to.
Uh, of course it's to be expected; that was the entire point. If you think I was harping on Captain Beige, I suggest you reread the thread.

It's sad when you have to repeat yourself because people refuse to read or take a point. It was never meant to be said more than once, until the guy came off as insulting and called me on it.
__________________
|CPU: Intel I7 Lynnfield @ 3.0 Ghz|Mobo:Asus P7P55 WS Supercomputer |Memory:8 Gigs DDR3 1333|Video:Geforce GTX 295 Quad SLI|Monitor:Samsung Syncmaster 1680x1080 3D Vision\/Olevia 27 Inch Widescreen HDTV 1920x1080

|CPU: AMD Phenom 9600 Black Edition @ 2.5 Ghz|Mobo:Asus M3n HT Deluxe Nforce 780A|Memory: 4 gigs DDR2 800| Video: Geforce GTX 280x2 SLI

Nzone
SLI Forum Administrator

NVIDIA User Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the members
ChrisRay is offline   Reply With Quote

Old 04-01-03, 11:56 AM   #69
jbirney
Registered User
 
jbirney's Avatar
 
Join Date: Jul 2002
Posts: 1,430
Default

Quote:
The average Joe Bloggs is going to walk in and look for the latest Nvidia card, not ATI, and he will find the 5600 Ultra. Performance means nothing at all; it's the name.
While that may have been true a few months ago, it's slowly starting to change. Little by little, people are learning that nVidia no longer makes the fastest card. Also, ATI racking up OEM wins only helps to sell their name.

Quote:
Apart from that, the 9500 looks good against the 5600 right now, but the 9500 is being phased out and ATI are being rather quiet about its replacement's benchmarks... funny that (or have they actually released any yet?).
From news we learned at B3D, ATI waited until they saw the first reviews of the 56/52 series so they could finalize their clock speeds. They wanted to place it so they not only get good yields but also stay on top performance-wise. You should know from playing cards that sometimes it's wise not to show your whole hand first. BTW, B3D does have (or had) a review sample of the RV350.

Quote:
Yeah, ATI is doing gamers a service by paper-launching products. Where the hell are the R9200 and R9600 benchmarks? The cards were launched ~4 weeks ago now, along with the R9800!
Huh? You can already buy DX9 cards at a variety of price points, with the R9500 non-Pro going for $132 ATM. The 9800 is not available yet, so I understand your point for the high end (even though the MSRP for the R9700 Pro is now $300, and it can be found for less online, which can cover high-end needs for now). But in the midrange, the RV350 is just there so ATI can increase their profit margins on a part with higher throughput. If a gamer wants, he can go out and buy a DX9 video card today from ATI, whereas he may not be able to if he wanted to buy an NV DX9 card. How is that a service?

Quote:
You obviously have a limited understanding of programming. I actually doubt you have any. There are several reasons why the R300 is limited programmability-wise and the NV30/R350 line is less limited.
You should think a bit before you leap. The R300 can loop back to run almost any effect that the NV30 can. It's more work and requires a bit more code, yes, but the point is that it can if it has to.
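To make the loop-back point concrete: an effect whose shader exceeds the R300's per-pass instruction limit can, in principle, be split into several passes, with intermediate results written to a render target between them. The Python sketch below only illustrates that splitting idea; the 64-instruction per-pass limit and the greedy chunking are simplifying assumptions for illustration, not ATI's actual hardware or driver behaviour.

Code:
# Illustrative sketch only: splitting a long shader into chunks that each fit
# an assumed per-pass instruction limit. Real multi-pass rendering would also
# have to write and re-read intermediate results between passes.
PER_PASS_LIMIT = 64  # assumed ALU instruction budget per pass

def split_into_passes(instructions, limit=PER_PASS_LIMIT):
    """Greedily chop an instruction list into per-pass chunks."""
    return [instructions[i:i + limit] for i in range(0, len(instructions), limit)]

# A hypothetical 150-instruction effect: too long for one pass, but
# expressible as three passes (64 + 64 + 22) plus the extra cost of
# storing and reloading intermediate buffers.
long_shader = [f"op_{i}" for i in range(150)]
for n, chunk in enumerate(split_into_passes(long_shader), start=1):
    print(f"pass {n}: {len(chunk)} instructions")

The extra render-target writes and reads between passes are the "more work and a bit more code" in question, and they are also the overhead ChrisRay objects to in the next post.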
jbirney is offline   Reply With Quote
Old 04-01-03, 12:03 PM   #70
ChrisRay
Registered User
 
ChrisRay's Avatar
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default

Quote:
You should think a bit before you leap. The R300 can loop back to run almost any effect that the NV30 can. It's more work and requires a bit more code, yes, but the point is that it can if it has to.
Uh, that's the point. Sometimes bloated code is not an efficient way to do things.
__________________
|CPU: Intel I7 Lynnfield @ 3.0 Ghz|Mobo:Asus P7P55 WS Supercomputer |Memory:8 Gigs DDR3 1333|Video:Geforce GTX 295 Quad SLI|Monitor:Samsung Syncmaster 1680x1080 3D Vision\/Olevia 27 Inch Widescreen HDTV 1920x1080

|CPU: AMD Phenom 9600 Black Edition @ 2.5 Ghz|Mobo:Asus M3n HT Deluxe Nforce 780A|Memory: 4 gigs DDR2 800| Video: Geforce GTX 280x2 SLI

Nzone
SLI Forum Administrator

NVIDIA User Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the members
ChrisRay is offline   Reply With Quote
Old 04-01-03, 01:31 PM   #71
Steppy
Radeon 10K Pro
 
Join Date: Aug 2002
Posts: 351
Default

Quote:
Originally posted by ChrisRay
Look buddy, I disagree with you on several levels here, so let me tackle them. Improved programmability is never a bad thing, coming from a programmer.

And the situation where the R300 is "limited" in programmability compared to the R350 and GeForce FX is somewhat disappointing. Its inability to be programmed at multiple levels of precision puts it "under" the DirectX 9.0 specification and limits what can be done with DirectX 9.0. There's no argument the R300 is DX 9.0 compliant in a way that leaves no room for imagination. Savvy?

Ok, I'll tackle it this way: ALL THREE OF THESE CARDS ARE TOO SLOW WITH SHADERS EVEN APPROACHING THE LIMITS OF THE R300, MUCH LESS THE FX OR THE 9800. I don't believe I said that technically it wasn't less programmable, but there ARE other limiting factors, you know, that little thing called performance. Your "imagination" is already limited by the speed of these cards, at least in the gaming arena, which is what 95% of us do with these cards. The extra shader capability is fine and dandy for non-real-time effect rendering and such, but that is pointless for most of us.

Quote:
Originally posted by ChrisRay
Irrelevant. Whether it gets used in DirectX 9.0 or DirectX 10, more programmability in the future is good. The R300 is the most limited DirectX 9.0 card available right now. If you don't buy for new tech, then what are you buying for? People claim the R300 is very future-proof; in actuality, it's not. Your argument basically reiterates that.

I'm buying it to play games right now and for the next year or maybe two. I'm NOT buying it for a feature that will not be used until this card is the equivalent of a GF2 MX, and that's how fast it will run the games current at that time. I'll bet all those people with GF1s got a ton of games that pushed its T&L unit to the max in its lifetime... oh wait, they didn't. I'll bet all those people who got the Radeon for its third texture unit made heavy use of it in games while they had the card... oh wait, they didn't. Well, those people with the original GF3s got a whole ton of games pushing that programmable T&L unit to the max... oh wait, they didn't. Those people who're saying "future proof" are fooling themselves, because there really is no such thing. It should have a slightly longer life than most cards before it did, simply because it was SO fast in games at its release (name another card that could run a current game at its release at 1600x1200 with 4xAA and 8x aniso). I'm glad I reiterated that it's not "future proof", because there is no such thing, nor did I ever claim it was.


Quote:
Originally posted by ChrisRay
What the hell are you talking about here? He specifically stated he would prefer 16-bit precision; he also stated that 24-bit yielded no significant IQ improvement over the R200's 16-bit, and that 32-bit had only marginal image quality improvements over 24-bit.

If you don't believe me, all you gotta do is load up your browser, go to beyond3d.com and check out their Carmack interviews. I'd point you there, but I am not gonna spoon-feed it.

The only time I specifically saw Carmack state he'd run into limits with the R300 was when he was talking about instruction count. Where I saw him mention FP precision was when he talked about the codepaths available to the cards: the FX has two modes (16-bit and 32-bit), whereas the 9700 has one (24-bit). The FX ran its mixed codepath slightly faster most of the time than the R300's 24-bit one, and the 9700 got a marginal increase in quality in return. The FX runs its 32-bit FP path at HALF the speed of the 9700's 24-bit path, with only a marginal quality boost. I don't remember him saying that he'd run into "limits" with it only doing 24-bit. Anyway, the ONLY difference between running these different depths SHOULD be the bandwidth required to transmit the data, as the chip should still do the same number of calculations in a given amount of time (i.e. if the FX can do one 16-bit op every 10 clock cycles, it SHOULD be able to do one 32-bit op every 10 cycles as long as there is plenty of bandwidth for both, and the FX benchmarks don't indicate this to be so). If we looked at the same interview at Beyond3D, I think YOU may need the spoon-feeding here, buddy.
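For readers following the precision part of the argument: the practical gap between fp16, fp24 and fp32 comes down to mantissa width, commonly cited as 10, 16 and 23 bits respectively. The Python check below treats those widths as given and compares each format's step size near 1.0 against the 1/255 step of an 8-bit color channel; it is a back-of-the-envelope illustration, not a statement about any particular card's internals.

Code:
# Rough comparison of shader precision formats by mantissa width.
# The widths (fp16: 10, fp24: 16, fp32: 23 bits) are assumptions for
# illustration, not a hardware specification.
formats = {"fp16": 10, "fp24": 16, "fp32": 23}

for name, mantissa_bits in formats.items():
    eps = 2.0 ** -mantissa_bits      # smallest relative step near 1.0
    levels = (1.0 / 255.0) / eps     # fp steps per one 8-bit color level
    print(f"{name}: step ~ {eps:.2e}, ~{levels:.0f} steps per 8-bit color level")

By this measure both fp24 and fp32 are far finer than anything an 8-bit framebuffer can display; the difference between them mainly starts to matter when error accumulates over long instruction sequences or feeds texture address calculations.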


Quote:
Originally posted by ChrisRay
Uh, it's irrelevant, because it's not actually outputting at that precision. How that's being done is irrelevant; it's the end result that matters.

Here is what you said:

Quote:
Originally posted by ChrisRay
"Since apparently it's just 24-bit downsampling to 16-bit (which I think is retarded for any given number of reasons)." Either way, I think ATI's implementation of its floating-point precision kinda leaves a little bit to be desired, especially when you consider the current DirectX 9.0 specification. As ATI's card is just the bare minimum for DX 9.0, I'm not quite sure why they chose to stick with strict 24-bit precision, DX 9.0 specifications be damned. Probably to save die space on their already crazily overloaded 0.15-micron process.

From a programmer's point of view, they leave little room for modification or tweaking, and that's always a bad thing. I can see why John Carmack stated he has become limited by the R300's programmability. Kinda disappointing to me. Oh well.


From a programmer's point of view there should be NO REASON to "downsample" something to 16-bit on the R300. The only reason to do so would be to cater to the NV30.

Quote:
Originally posted by ChrisRay
I find this quite strange, since people have been dogging on Nvidia for using its Pixel Shader 2.0 to emulate Pixel Shader 1.4.

I think people are dogging it because PS 2.0 is pretty slow on the NV30 (right now), hence using it to emulate 1.4 is also slow.
__________________
Here's my clever comment

Last edited by Steppy; 04-01-03 at 01:40 PM.
Steppy is offline   Reply With Quote
Old 04-01-03, 01:59 PM   #72
ChrisRay
Registered User
 
ChrisRay's Avatar
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default

Quote:
Originally posted by Steppy
Ok, I'll tackle it this way: ALL THREE OF THESE CARDS ARE TOO SLOW WITH SHADERS EVEN APPROACHING THE LIMITS OF THE R300, MUCH LESS THE FX OR THE 9800. I don't believe I said that technically it wasn't less programmable, but there ARE other limiting factors, you know, that little thing called performance. Your "imagination" is already limited by the speed of these cards, at least in the gaming arena, which is what 95% of us do with these cards. The extra shader capability is fine and dandy for non-real-time effect rendering and such, but that is pointless for most of us.
That is completely dependent on the program you're using and what you are trying to render. There are cases where the extra shading power is relevant, whether you're creating a nifty screensaver or developing a texture embossing method for a PlayStation emulator.

Your mind is narrowed down only to the games we have right now, and as such you're only thinking about the games we have today.


Quote:
From a programmer's point of view there should be NO REASON to "downsample" something to 16-bit on the R300. The only reason to do so would be to cater to the NV30.
Uhh, from a programmer's point of view, 16-bit floating-point calculations would be useful. As I said, it's relevant to the specific situation: in PSX emulation it would be preferable to do texture embossing in 16-bit, as there would be no benefit to using 24-bit.

In this case, the R300 cannot benefit from the extra speed offered by the lower precision.

We're talking about values that are subjective to the specific coder.
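For what it's worth, an emboss pass of the sort mentioned here is arithmetically very simple: each output pixel is roughly the difference between a pixel and its diagonal neighbour, biased back to mid-grey, so the values involved never need much fractional precision. A small Python sketch of that idea (plain nested loops over a grayscale array, standing in for the per-pixel work a shader or emulator plugin would do):

Code:
# Naive emboss filter on a grayscale image stored as floats in [0, 1].
# Each output pixel is (pixel - diagonal neighbour) + 0.5, clamped to [0, 1];
# arithmetic this simple is well within fp16-level precision.
def emboss(image):
    h, w = len(image), len(image[0])
    out = [[0.5] * w for _ in range(h)]
    for y in range(h - 1):
        for x in range(w - 1):
            value = image[y][x] - image[y + 1][x + 1] + 0.5
            out[y][x] = min(1.0, max(0.0, value))
    return out

# Tiny example: a vertical edge shows up as a dark ridge in the embossed output.
img = [[0.2, 0.2, 0.8, 0.8] for _ in range(4)]
for row in emboss(img):
    print(["%.2f" % v for v in row])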


Quote:
The only time I specifically saw Carmack state he'd run into limits with the R300 was when he was talking about instruction count. Where I saw him mention FP precision was when he talked about the codepaths available to the cards: the FX has two modes (16-bit and 32-bit), whereas the 9700 has one (24-bit). The FX ran its mixed codepath slightly faster most of the time than the R300's 24-bit one, and the 9700 got a marginal increase in quality in return. The FX runs its 32-bit FP path at HALF the speed of the 9700's 24-bit path, with only a marginal quality boost. I don't remember him saying that he'd run into "limits" with it only doing 24-bit. Anyway, the ONLY difference between running these different depths SHOULD be the bandwidth required to transmit the data, as the chip should still do the same number of calculations in a given amount of time (i.e. if the FX can do one 16-bit op every 10 clock cycles, it SHOULD be able to do one 32-bit op every 10 cycles as long as there is plenty of bandwidth for both, and the FX benchmarks don't indicate this to be so). If we looked at the same interview at Beyond3D, I think YOU may need the spoon-feeding here, buddy.

I never said he ran into limits with its precision. As a matter of fact, I said he preferred 16-bit precision for rendering in Doom 3, savvy? The interview is there; it specifically says he would prefer 16-bit precision.

And the only reason he did some things at 24-bit is because that's the way the R300 is programmed.


Quote:
I'm buying it to play games right now and for the next year or maybe two. I'm NOT buying it for a feature that will not be used until this card is the equivalent of a GF2 MX, and that's how fast it will run the games current at that time. I'll bet all those people with GF1s got a ton of games that pushed its T&L unit to the max in its lifetime... oh wait, they didn't. I'll bet all those people who got the Radeon for its third texture unit made heavy use of it in games while they had the card... oh wait, they didn't. Well, those people with the original GF3s got a whole ton of games pushing that programmable T&L unit to the max... oh wait, they didn't. Those people who're saying "future proof" are fooling themselves, because there really is no such thing. It should have a slightly longer life than most cards before it did, simply because it was SO fast in games at its release (name another card that could run a current game at its release at 1600x1200 with 4xAA and 8x aniso). I'm glad I reiterated that it's not "future proof", because there is no such thing, nor did I ever claim it was.

That's all fine and dandy for you, but a lot of people who buy a GeForce4 Ti 4400 or a Radeon 9500 Pro will not upgrade those cards for like 3-4 years, so yes, it's relevant.

Future-proofing does exist, in a limited fashion, and most non-enthusiast buyers purchase computers because they believe they are future-proof.


Quote:
I think people are dogging it because PS 2.0 is pretty slow on the NV30 (right now), hence using it to emulate 1.4 is also slow.
Using Pixel Shader 2.0 at 16-bit precision to emulate Pixel Shader 1.4 would not be the cause of its current slowdown. And there's nothing wrong with emulating 1.4 at 16-bit precision, as there's no benefit to using higher precision for Pixel Shader 1.4.
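As a rough illustration of why fp16 can be adequate for PS 1.x-style work: PS 1.x arithmetic ultimately lands in an 8-bit-per-channel framebuffer, and a value carried at fp16 precision survives quantization to 8 bits unchanged. The sketch below models fp16 granularity with a fixed 2^-10 step, which is a deliberate simplification rather than how any driver or emulator actually rounds.

Code:
# Quantizing a color value computed at roughly fp16 precision down to an
# 8-bit channel. The fixed 2**-10 step stands in for fp16's mantissa
# granularity near 1.0; this is an illustration, not real driver behaviour.
def to_fp16_precision(x):
    step = 2.0 ** -10
    return round(x / step) * step

def to_8bit(x):
    return max(0, min(255, round(x * 255)))

# Every one of the 256 possible channel levels survives the fp16 round-trip.
mismatches = sum(
    1 for level in range(256)
    if to_8bit(to_fp16_precision(level / 255.0)) != level
)
print("levels changed by fp16 rounding:", mismatches)  # expect 0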
__________________
|CPU: Intel I7 Lynnfield @ 3.0 Ghz|Mobo:Asus P7P55 WS Supercomputer |Memory:8 Gigs DDR3 1333|Video:Geforce GTX 295 Quad SLI|Monitor:Samsung Syncmaster 1680x1080 3D Vision\/Olevia 27 Inch Widescreen HDTV 1920x1080

|CPU: AMD Phenom 9600 Black Edition @ 2.5 Ghz|Mobo:Asus M3n HT Deluxe Nforce 780A|Memory: 4 gigs DDR2 800| Video: Geforce GTX 280x2 SLI

Nzone
SLI Forum Administrator

NVIDIA User Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the members

Last edited by ChrisRay; 04-01-03 at 02:04 PM.
ChrisRay is offline   Reply With Quote