Old 03-14-03, 10:38 PM   #13
Moose
Cheating is for losers!!!
 
Join Date: Jul 2002
Posts: 241

Quote:
Originally posted by digitalwanderer
I'm not disagreeing with you on any of that... but WHICH set of drivers is that based on? That's my big question.

They're pinning their "victory" on the 'review drivers' unless I'm badly mistaken, which means they ain't only not king, they cheated to try and usurp the title.

My comments were based on the "review" drivers that they are suggesting everyone use to benchmark with.

I haven't seen too many reviews where the 43.00s are used.

I guess once nVidia gets around to releasing some WHQL drivers we may really know for sure.

Too bad; WHQL used to be for ensuring compatibility, not for discouraging cheating. Ah well, times change, I guess.
Old 03-14-03, 11:24 PM   #14
Hellbinder
 
 
Join Date: Nov 2002
Location: CDA
Posts: 1,510

Here is the single most important thing people overlook or forget, IMO.

The NV30 *SHOULD* have the same level of performance in FP32 as the 9700 has in FP24.

The 9700 operates at full speed in FP24 because it is designed that way through the entire chip. It only does FP24, and it does it with a grade of A+.

People will say it is unfair that the NV30 has to operate at FP32. Not if it was CORRECTLY DESIGNED. The NV30 should be running at full speed 100% of the time. This is not the same thing as PS/VS instruction execution; the chip simply should execute full FP32 at full speed. It is not an unfair comparison to force the NV30 to run at FP32.

Look at the evidence. M$'s minimum requirement for DX9 is FP24, because they were making an EXCEPTION for ATi's 96-bit color. The ideal was FP32. It was clearly put across to everyone inside the industry that the NV30 would run at FP32 as its DEFAULT, just like FP24 is the 9700 Pro's default, with everything dithered down to 16- and 32-bit color for older games. This, however, is not the truth, as we have all learned. nVidia have either been lying to even M$ and the industry in general, OR the NV30's hardware is simply a disaster of bugs.

Take your pick; it's really hard to say. Personally, I think it's a bug-ridden piece of hardware even in its current revision, with bugs that go far deeper than just fog, etc. The bugs also follow all the way through to the NV31/NV34. If true, this is the single most bug-ridden piece of hardware since the Savage 2000. It is also absurd to try to blame TSMC for all the problems. Some of this screams of simply BAD DESIGN at the deepest levels of the core.
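
For reference, the color-depth figures being thrown around work out as follows; a minimal sketch (not from this thread) assuming the commonly cited bit splits: FP24 on the R300 is 1 sign / 7 exponent / 16 mantissa bits, FP16 is 1/5/10, and FP32 is the IEEE-style 1/8/23.

[code]
/* Minimal sketch of the shader float formats under discussion,
 * assuming the commonly cited bit splits (see note above). */
#include <stdio.h>

struct fmt { const char *name; int sign, exp, mant; };

int main(void) {
    struct fmt f[] = {
        { "FP16 (NV30 partial)", 1, 5, 10 },
        { "FP24 (R300)",         1, 7, 16 },
        { "FP32 (NV30 full)",    1, 8, 23 },
    };
    for (int i = 0; i < 3; i++) {
        int per_channel = f[i].sign + f[i].exp + f[i].mant;
        /* Four channels (RGBA) per pixel give the marketing figure:
           4 x 24 = 96-bit color for FP24, 4 x 32 = 128-bit for FP32. */
        printf("%-20s %2d bits/channel -> %3d-bit RGBA color\n",
               f[i].name, per_channel, 4 * per_channel);
    }
    return 0;
}
[/code]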
__________________
Overam Mirage 4700
3.2ghz P4 HT
SIS 748FX Chipset 800mhz FSB
1Gig DDR-400
60Gig 7200RPM HD
Radeon 9600M Turbo 128 (400/250)
Catalyst 4.2
Latest good read: [url]http://www.hardocp.com/article.html?art=NTc4LDE=[/url]
Old 03-14-03, 11:25 PM   #15
digitalwanderer
 
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944
Re: Re: Is the 5800 Ultra really the king?

Quote:
Originally posted by StealthHawk
there have never been WHQL NV30 drivers... right?
I thought the 43.00 set was... but I don't know. I'm gonna go poke around and find out.

(BTW - I like the 43.00 9x/ME driver set on me GF3 a lot after RT modifying 'em... so please don't think I'm knocking them. I just sort of assumed they were WHQL and included NV30 support for some reason; I could very easily be wrong.)
__________________
[SIZE=1][I]"It was very important to us that NVIDIA did not know exactly where to aim. As a result they seem to have over-engineered in some aspects creating a power-hungry monster which is going to be very expensive for them to manufacture. We have a beautifully balanced piece of hardware that beats them on pure performance, cost, scalability, future mobile relevance, etc. That's all because they didn't know what to aim at."
-R.Huddy[/I] [/SIZE]
Old 03-15-03, 12:29 AM   #16
SurfMonkey
QuadCore G80 PS3 Overload
 
 
Join Date: Jul 2002
Location: In a small room surrounded by vast, inscrutable, machines...
Posts: 491

Quote:
Originally posted by Hellbinder
The NV30 *SHOULD* have the same level of performance in FP32 as the 9700 has in FP24. [...] Some of this screams of simply BAD DESIGN at the deepest levels of the core.

Maybe ATi just took the easy route out? What happens after ArtX? Do ATi have the balls, the cash, and the technical expertise to better this generation? What if, with ArtX, they blew their load?

Maybe FP24 was simple, but does it actually push us anywhere forward? It's already in the spec; yesterday's news for developers. In the R350 they included the F-buffer. Hmmm, where does that fit in? Or is it simply a way into the offline rendering market?

Kudos to ATi; they took the simple and cheap way out and it works. But will it take us any further into a future that isn't anyone's vision other than M$'s... naaah. Now, I'm not knocking ATi; they made a smart decision. But was it future-proof? I don't know, and nobody does until it happens.
__________________
Folding for Beyond3D
"A lie gets halfway around the world before the truth has a chance to get its pants on."
Sir Winston Churchill

"Halflife2 got halfway around the world before Gabe had a chance to get his pants on."
Anon
Old 03-15-03, 01:06 AM   #17
Hellbinder
 
 
Join Date: Nov 2002
Location: CDA
Posts: 1,510

Quote:
Originally posted by SurfMonkey
Maybe ATi just took the easy route out? [...] But was it future-proof? I don't know, and nobody does until it happens.
What does this have to do with the NV30 being designed correctly???

FP24 is not simple; it's 96-bit floating-point color, for crying out loud. Give me a break, man... yesterday's news?? I think you'd better go figure out what you are talking about and come back. It's pretty clear you are associating FP16/24/32 with something it's not. "Simple and cheap way out"... um, OK... I guess you are trying to say something mean about ATi?? It might actually work if you would go figure out what the terms of the discussion mean first.

Trust me, the next generation is going to impress everyone. This is not fanboy talk.
__________________
Overam Mirage 4700
3.2ghz P4 HT
SIS 748FX Chipset 800mhz FSB
1Gig DDR-400
60Gig 7200RPM HD
Radeon 9600M Turbo 128 (400/250)
Catalyst 4.2
Latest good read: [url]http://www.hardocp.com/article.html?art=NTc4LDE=[/url]
Old 03-15-03, 05:22 AM   #18
Unit01
Registered User
 
Join Date: Feb 2003
Posts: 209

To me, it ain't the king.
Old 03-15-03, 05:49 AM   #19
Hanners
Elite Bastard
 
 
Join Date: Jan 2003
Posts: 984

Quote:
Originally posted by SurfMonkey
Maybe ATi just took the easy route out? What happens after ArtX? Do ATi have the balls, the cash, and the technical expertise to better this generation? What if, with ArtX, they blew their load?
I'm not too sure what you mean here - ATi bought out ArtX and took on their engineers, who were spread across both of ATi's design teams.

ArtX did not create the R300 - it was designed by one of ATi's design teams, which was a mixture of ATi and former ArtX engineers.

On top of that, those ArtX guys won't have just vanished once the R300 was released - they are still there, working on future ATi products.


I think it's important to remember that designing a GPU isn't just about cramming in as many features as you can; you have to trade off features against performance to create a well-balanced card. This is where ATi succeeded and nVidia failed this time around, IMO - the R300 is an almost perfectly balanced chip, whereas the NV30 is much less so. As for future products, we'll have to wait and see.



Finally, to answer digitalwanderer's initial question, I believe StealthHawk is right in saying that no WHQL-certified drivers for the NV30 have yet been released.
__________________
Owner / Editor-in-Chief - Elite Bastards
Old 03-15-03, 08:02 AM   #20
Uttar
Registered User
 
 
Join Date: Aug 2002
Posts: 1,354

Quote:
Originally posted by Hellbinder
Trust me, the next generation is going to impress everyone. This is not fanboy talk.
First of all, let me stress that the rumored R400 info *does* impress me.

But...

1. Does ATI have working R400 samples right now?
2A. If they do, do they work as intended?
2B. If they don't, you've got no idea whether it's going to be even more bug-ridden than the NV30.

If you got to 2A and answered "yes", then please say so and I'll shut up.

But otherwise, I think it's unfair to already declare it the world's best thing. Problems *can* happen. Your post about the NV30 above proved it, so thanks for doing that for me.

The R400 is a very ambitious architecture. And anything ambitious can go very wrong, very fast.

I'd love it if ATI delivered with the R400. From the very little we know, it could be an absolutely amazing card.
But then again, so could the NV30 have been. If the NV30 had anywhere near the theoretical performance its "8x500" configuration suggests, it would have owned the R300 in every PS-limited benchmark.
Sounds like the R300 OwNz it, however...
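
To put rough numbers on that "8x500" claim (my figures, not the poster's, assuming the advertised 8 pixels per clock at 500 MHz against the 9700 Pro's commonly quoted 8 pipelines at 325 MHz):

[code]
/* Back-of-the-envelope: peak theoretical fill rate = pixels/clock * clock.
 * Figures are assumptions taken from the cards' advertised specs of the day. */
#include <stdio.h>

int main(void) {
    double nv30 = 8 * 500e6; /* "8x500": 8 pixels per clock at 500 MHz */
    double r300 = 8 * 325e6; /* Radeon 9700 Pro: 8 pipes at 325 MHz    */
    printf("NV30 theoretical: %.1f Gpixels/s\n", nv30 / 1e9); /* 4.0 */
    printf("R300 theoretical: %.1f Gpixels/s\n", r300 / 1e9); /* 2.6 */
    return 0;
}
[/code]

On paper that is a better-than-50% advantage; the point above is that the shipping NV30 gets nowhere near it in PS-limited tests.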


Uttar

Old 03-15-03, 10:27 AM   #21
Moose
Cheating is for losers!!!
 
Join Date: Jul 2002
Posts: 241

Quote:
Originally posted by SurfMonkey
So I would say that if you just wanted to play games and not stand around and look at the scenery, then yes, the GFFX is the fastest card you can buy (and that maybe by clock speed only). Otherwise... who cares?
The only problem with this is that my old Ti 4600 could play games extremely fast with no eye candy turned on. That wasn't enough for me. I wanted both: high FPS and high levels of AA and AF on all the time. Once you have seen the IQ these features provide without a huge FPS penalty, there's just no going back. 4xAA and 16xAF is the minimum level of eye candy that is acceptable to me. The R9700 can do this in every game I have and still provide very decent FPS. From what I've seen of the GFFX, this is not the case.

If a person is only concerned about playing at high FPS with no thought of good IQ, then there is no point in getting either card, since the last generation could play games just fine.

I thought the whole point of the dawn of the cinematic age of computer games was to provide both high speed and IQ that rivals movie quality.

IMO, the R9700 (and higher) comes much closer to achieving this than the GFFX. Just look at the benches all over the net. The R9700 smokes the GFFX when the highest IQ settings are used.
Old 03-17-03, 02:51 AM   #22
ChrisW
"I was wrong", said Chris
 
Join Date: Jul 2002
Location: standing in the corner!
Posts: 620

I really can't understand people criticizing ATi for purchasing ArtX. I think it was a very wise decision. With that purchase they got the Nintendo chip, the FireGL line of professional graphics cards, Linux drivers, and some extremely talented engineers. I think they have done an excellent job so far. On the other hand, what did nVidia get from their purchase of 3dfx? I thought the GFFX was supposed to include all this theoretically insane 3dfx technology that was about to be released in their next card before they went bankrupt. Can you name any of this 3dfx technology in the GFFX?
Old 03-17-03, 03:36 AM   #23
rwolf
Rock Star
 
Join Date: Oct 2002
Posts: 122

I think the FX is a poor choice right now because nVidia seems to be having trouble getting WHQL-certified drivers out the door.

It doesn't matter how great your card is if it is hard to program for. Just think how great the support for it will be when the next big thing comes along.
__________________
Rock on.
Old 03-17-03, 07:18 AM   #24
Moose
Cheating is for losers!!!
 
Join Date: Jul 2002
Posts: 241

Quote:
Originally posted by rwolf
I think the FX is a poor choice right now because nVidia seems to be having trouble getting WHQL-certified drivers out the door.
The problem isn't getting them out the door. The problem is getting them to equal the speed of the R9700 without resorting to lowering the FP precision to 12 or 16 bits.
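
A minimal sketch (my own illustration, not Moose's) of what that precision drop trades away, assuming FX12 is 12-bit fixed point with 10 fractional bits and using the mantissa widths cited earlier in the thread (FP16 = 10 bits, FP32 = 23 bits):

[code]
/* Smallest representable step near 1.0 for each precision the NV30
 * can fall back to; coarser steps show up as banding in shader output. */
#include <math.h>
#include <stdio.h>

int main(void) {
    const char *names[] = { "FX12", "FP16", "FP32" };
    int frac_bits[]     = { 10, 10, 23 }; /* step near 1.0 = 2^-bits */
    for (int i = 0; i < 3; i++) {
        /* FX12 and FP16 share the same step near 1.0; FP16's advantage
           is dynamic range, not resolution, at that magnitude. */
        printf("%s: step near 1.0 = %.2e\n",
               names[i], pow(2.0, -frac_bits[i]));
    }
    return 0;
}
[/code]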