Is there anything nVidia isn't doing right right now?

PoorGuy
06-23-04, 09:22 PM
With the exception of hardware MPEG encoding/decoding, I can't think of anything nVidia is doing wrong to us consumers. They've delivered on every point from the paper launch of the GeForce 6800 series. Now we've seen consumer reports (from nV News posters) of the GeForce 6800 Ultra, non-Ultra, and GT. All glowing reports.

The obvious choice is the GeForce 6800 GT, and we've already seen posts of disappointment from ATi Radeon X800 PRO owners across the boards.

All the bellyaching about availability of nVidia cards is all but forgotten now. Image quality is on par with the competition, etc. etc., I could go on and on. Is there anything left to complain about?

jAkUp
06-23-04, 09:24 PM
The only problem I see with nV is their marketing. I still don't agree with it.

I also didn't like Jen's comment that "You should buy a 6800U because it's newer technology than an X800XT." LOL That has to be the worst marketing ploy I have ever heard in my life, not to mention the dumbest reason to upgrade to a new card.

Besides that, nV is doing well this round in my book :) And my 6800U rocks :D :D

Their midrange cards look good too; the 6800GT looks like an amazing card for the money.

Sazar
06-23-04, 09:34 PM
the nv40 is a positive step forward after the crap we endured (marketing- and product-wise) during the early nv3x days and subsequent missteps...

nv's marketing definitely needs a kick in the rear, but they have improved their AA, which was a big knock on them before, and they have improved their shader performance...

all in all, not bad given what we had seen before... availability is still an issue because the ultra was launched two months ago, BUT they are showing up...

Helanic Frost
06-23-04, 09:50 PM
the nv40 is a positive step forward after the crap we endured (marketing- and product-wise) during the early nv3x days and subsequent missteps.

The 3D engine
The rest of the GeForce FX 5900's engine remains unchanged from the GeForce FX 5800. This means the same 4-pixel-pipeline architecture with two texture units per pipe (4x2) that caused so much controversy earlier this year is still used. ATI, on the other hand, has implemented eight pixel pipelines with one texture unit per pipeline (8x1) in its RADEON 9700/9800 series. ATI is also proud to proclaim that it has one more vertex engine than NVIDIA.

With GeForce FX 5900 Ultra’s 450MHz clock frequency, this means that there may be some cases where the GeForce FX 5800 Ultra actually outperforms it. GeForce FX 5800 Ultra sports a fill-rate of 2 Gigapixels/sec (500MHz core clock x 4 pixel pipelines), versus 1.8 Gigapixels/sec (450MHz core x 4 pipes) in GeForce FX 5900 Ultra. In comparison, RADEON 9800 PRO boasts just over 3 Gigapixels/second. The GeForce FX cards will have an advantage over ATI in multi-texturing, as the 5800 Ultra’s theoretical fill-rate is 4 Gigatexels/sec and the 5900 Ultra’s is 3.6 Gigatexels/sec. Meanwhile the RADEON 9800 PRO’s peak fill rate is 3.04 Gigatexels/sec.
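
To make the fill-rate arithmetic in that excerpt concrete, here is a quick back-of-the-envelope sketch in Python (the clock speeds and pipe counts are the ones quoted above; the 380MHz RADEON 9800 PRO clock is implied by its 3.04 Gigatexel figure):

```python
# Theoretical fill rates: core clock (MHz) x pixel pipelines (x TMUs per pipe).
def gpixels_per_sec(clock_mhz, pipes):
    """Pixel fill rate in Gigapixels/sec."""
    return clock_mhz * pipes / 1000.0

def gtexels_per_sec(clock_mhz, pipes, tmus_per_pipe):
    """Texture fill rate in Gigatexels/sec."""
    return clock_mhz * pipes * tmus_per_pipe / 1000.0

cards = {
    "GeForce FX 5800 Ultra": (500, 4, 2),  # 4x2 config
    "GeForce FX 5900 Ultra": (450, 4, 2),  # 4x2 config
    "RADEON 9800 PRO":       (380, 8, 1),  # 8x1 config
}

for name, (clock, pipes, tmus) in cards.items():
    print(f"{name}: {gpixels_per_sec(clock, pipes):.2f} Gpixels/s, "
          f"{gtexels_per_sec(clock, pipes, tmus):.2f} Gtexels/s")
```

Running it reproduces the numbers in the excerpt: 2.00/4.00 for the 5800 Ultra, 1.80/3.60 for the 5900 Ultra, and 3.04/3.04 for the 9800 PRO, which is exactly the single-texturing deficit and multi-texturing advantage being described.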

If Nvidia had gone with 8x1 instead of 4x2, I feel the FX series would have demolished the 9700/9800 series. They made a bad choice based on how they thought upcoming games would utilize the architecture. They were wrong, and for the first time ever ATI made a gamble that paid off. Imagine if Nvidia had gone 8x1 instead of 4x2; we'd be in a very different market right now, and both of the behemoth cards we see now might not have seen the light of day this soon. Competition bred the current generation. I'm almost happy NVIDIA screwed up last generation now that we can see what the 6800 and X800 cards are capable of. I think progress might have been a lot slower had ATI not shaken NVIDIA up with the 9700/9800 cards.

Ruined
06-23-04, 10:07 PM
I also didn't like Jen's comment that "You should buy a 6800U because it's newer technology than an X800XT." LOL That has to be the worst marketing ploy I have ever heard in my life, not to mention the dumbest reason to upgrade to a new card.

Actually, for me that is a very valid reason. If the 6800 just offered a speed boost, odds are I wouldn't upgrade unless I got a great deal. The fact that it offers a bunch of new features makes it more fun to play with, and makes it feel more like you're getting something worth your money. Plus it feels like it will last longer.

Helanic Frost
06-23-04, 10:18 PM
Actually, for me that is a very valid reason. If the 6800 just offered a speed boost, odds are I wouldn't upgrade unless I got a great deal. The fact that it offers a bunch of new features makes it more fun to play with, and makes it feel more like you're getting something worth your money.

I agree. There are going to be at least five game titles that use SM 3.0 this year. ATI cards will play them just fine and at acceptable frame rates, I'm sure, but SM 3.0 does offer a performance gain for cards capable of using it. Right now the 6800s and X800s are so close, but in SM 3.0 games that gap could widen significantly. I truly believe NVIDIA has a product that will hold the performance crown for the next year.

SmuvMoney
06-23-04, 10:22 PM
The only fault I can think of is the lack of official WHQL drivers on its web site.

PaiN
06-23-04, 10:26 PM
imo there are only two probs...
1. The low supply and high demand of the high-performance cards (both nVidia & ATI) is making everyone nuts :screwy:
2. It's scary that cards are available... but basically users have to run beta drivers. I'd feel a lot more comfortable with a new set of "official" 6800-supporting drivers up.

Helanic Frost
06-23-04, 10:33 PM
imo there are only two probs...
1. The low supply and high demand of the high-performance cards (both nVidia & ATI) is making everyone nuts :screwy:
2. It's scary that cards are available... but basically users have to run beta drivers. I'd feel a lot more comfortable with a new set of "official" 6800-supporting drivers up.

I agree. It really baffles me. You'd think they would have had at least one WHQL-certified driver for the 6800, even if it didn't fully utilize the hardware performance-wise. Really glad I'm not buying just yet, as I want to upgrade my CPU and mobo first. Hopefully, when I do buy, I will see cards other than the reference design, packaged with the demos and drivers appropriate to the card.

DivotMaker
06-23-04, 11:00 PM
If Nvidia had gone with 8x1 instead of 4x2, I feel the FX series would have demolished the 9700/9800 series.

Sorry, I hate to disagree, but the biggest failing of the FX series was nVidia's complete miscalculation of the competitive product's DX9 PS 2.0 shader capabilities, not pipeline configuration.

mikechai
06-23-04, 11:01 PM
Yes, the 6800 non-Ultra should be available in a 256MB RAM variant!
128MB just ain't enough these days ...

Ruined
06-23-04, 11:48 PM
Sorry, I hate to disagree, but the biggest failing of the FX series was nVidia's complete miscalculation of the competitive product's DX9 PS 2.0 shader capabilities, not pipeline configuration.

I would say nVidia's biggest miscalculation was not being able to get FP16 set as the standard minimum precision for DX9 SM2.0. It's pretty obvious the FX series was designed with FP16 in mind as the standard precision for shaders, with FP32 just there as a bonus to be used now and then for really complex shaders.
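
For anyone curious what that precision gap actually looks like in numbers, here's a small illustrative sketch using Python/NumPy as a stand-in for the shader ALUs (FP24, the minimum DX9 SM2.0 ended up requiring, isn't a NumPy type, so only FP16 and FP32 are shown):

```python
import numpy as np

# Round the same values through FP16 and FP32 and compare the error.
values = np.array([1.0001, 100.07, 1024.5, 0.00012], dtype=np.float64)

for dtype, name in [(np.float16, "FP16"), (np.float32, "FP32")]:
    rounded = values.astype(dtype).astype(np.float64)
    rel_err = np.abs(rounded - values) / np.abs(values)
    print(f"{name} relative error: {rel_err}")

# FP16's 10-bit mantissa gives roughly 3 decimal digits of precision, so long
# chains of shader math drift visibly; FP32's 23-bit mantissa (~7 digits) does
# not. DX9 SM2.0 ultimately required at least FP24 by default, with FP16 only
# available as an optional partial-precision hint.
```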

Sazar
06-23-04, 11:51 PM
I thought this thread was about things nvidia is doing now... not 2-3 years ago :confused:

Ruined
06-23-04, 11:52 PM
I thought this thread was about things nvidia is doing now... not 2-3 years ago :confused:

True, they do seem to be doing everything right now, as the original poster stated, so there ain't much to talk about there ;)

Vapor Trail
06-24-04, 12:35 AM
They still need to get the Far Cry IQ problem cleared up, and I'm not buying that it's a problem with Far Cry so much as a special path being run so Nvidia's previous cards could remain competitive. The IQ is fine with the NV40 running the R300 path, so I don't believe this is unintentional.

I'd also like to see some official WHQL drivers released in time for my 6800GT delivery. :)

I'll reserve any other comments until I actually have a card in my system, but things are definitely looking up this round for Nvidia.

anzak
06-24-04, 12:38 AM
They still need to get the Far Cry IQ problem cleared up, and I'm not buying that it's a problem with Far Cry so much as a special path being run so Nvidia's previous cards could remain competitive. The IQ is fine with the NV40 running the R300 path, so I don't believe this is unintentional.

I'd also like to see some official WHQL drivers released in time for my 6800GT delivery. :)

I'll reserve any other comments until I actually have a card in my system, but things are definitely looking up this round for Nvidia.

I agree 100%. Only 9 more days... :dance:

Helanic Frost
06-24-04, 12:57 AM
Sorry, I hate to disagree, but the biggest failing of the FX series was nVidia's complete miscalculation of the competitive product's DX9 PS 2.0 shader capabilities, not pipeline configuration.

Many of the benchmarks the 9800 PRO won were by 5-10 fps. I believe an 8x1 pipeline configuration would have made a big difference, and so do many experts; it could have made up the 5-10 fps leads the ATI cards held in many circumstances. I've seen it discussed a lot in forums and on hardware review sites. It could have made up for poorer shader performance. I agree the shaders were not up to snuff, but there is more than one way to skin a cat when designing a card to accomplish a goal. You don't see NVIDIA using a similar configuration in NV40, and for good reason: NV40 has fantastic performance.

anzak
06-24-04, 01:56 AM
The FX 5800 and 5900 line only had 4 shader units vs. 8 on the 9700 and 9800 cards (one per pipeline). This is what led to the lacking PS performance.

2 shader units: FX 5200
4 shader units: 5600-5950 Ultra, 9550-9600XT
8 shader units: 9700-9800XT
12 shader units: X800 PRO
16 shader units: X800XT
24 shader units: 6800
32 shader units: 6800GT-6800 Ultra EE

Helanic Frost
06-24-04, 02:36 AM
The FX 5800 and 5900 line only had 4 shader units vs. 8 on the 9700 and 9800 cards (one per pipeline). This is what led to the lacking PS performance.

2 shader units: FX 5200
4 shader units: 5600-5950 Ultra, 9550-9600XT
8 shader units: 9700-9800XT
12 shader units: X800 PRO
16 shader units: X800XT
24 shader units: 6800
32 shader units: 6800GT-6800 Ultra EE

That's true, not disputing that. I think that's what led to ATI having better overall IQ too. I just feel an 8x1 pipeline setup would have given the FX series more fps and less of a performance hit using AA and AF.

ChrisRay
06-24-04, 03:03 AM
Many of the benchmarks the 9800 PRO won were by 5-10 fps. I believe an 8x1 pipeline configuration would have made a big difference, and so do many experts; it could have made up the 5-10 fps leads the ATI cards held in many circumstances. I've seen it discussed a lot in forums and on hardware review sites. It could have made up for poorer shader performance. I agree the shaders were not up to snuff, but there is more than one way to skin a cat when designing a card to accomplish a goal. You don't see NVIDIA using a similar configuration in NV40, and for good reason: NV40 has fantastic performance.


There were more problems with the FX line than just the pipe config. The FX 5900 had a register problem which made it extremely bandwidth-limited with high-precision shaders. Due to this register issue it was very difficult to write shaders that ran faster than on the R300.

The FX card had a lot of FP potential. It just had a severe register limitation.
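
A toy model of that register effect, purely illustrative: the commonly reported behaviour of NV3x was full per-pipe throughput with up to about two live FP32 temporaries, falling off as a shader needed more. The Python sketch below encodes that reported curve; the numbers are not measurements:

```python
# Toy model of the NV3x register-pressure penalty described above.
# Illustrative only: mirrors the commonly reported "full speed up to
# ~2 live FP32 temps (or ~4 FP16), then throughput falls off" behaviour.
def nv3x_relative_throughput(live_fp32_temps: int) -> float:
    if live_fp32_temps <= 2:
        return 1.0
    return 2.0 / live_fp32_temps  # degrades as register usage grows

for temps in range(1, 9):
    print(f"{temps} FP32 temps -> {nv3x_relative_throughput(temps):.2f}x relative speed")

# A part with enough registers for the shader stays at 1.0x across this whole
# range, which is why complex SM2.0 shaders were so hard to make fast on NV3x.
```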

zakelwe
06-24-04, 03:25 AM
The poor shader performance was only really an issue in 3DMark03; most games people played during that time frame were not affected.

What was a problem were the IQ issues and the type of AA in all those games compared to ATI. The 5800's 128-bit bus did not help either, until that was rectified by the 5900. Also, the 4x1 configuration of the lesser FX cards led to the GF4 being as good or better.....

But as for today's problems, there do not seem to be many, apart from some driver issues, which will be fixed; things will get better and faster. The underlying hardware seems very good.

Regards

Andy

dan2097
06-24-04, 05:22 AM
Helanic Frost: Can you imagine how hot a 5800 Ultra would be if they had changed it late on to an 8x1 :) Adding the extra pipelines would have taken a lot more transistors than removing the TMUs saved.

anzak: AFAIK the 5600/5700s only have 2 pixel shader units. They're a bit odd, acting as 4x1 under normal situations and 2x2 under pixel shading.

Your chart doesn't really do the r3xx/r4xx justice. You could say both of these architectures feature 2 shader units per pipe, albeit with one of the shader units having significantly reduced functionality. The 2nd shader unit on the 6800s also has to act as a texture addresser sometimes, while that functionality is handled by a discrete unit in the r3xx/r4xx.

Otherwise the 6800s would be significantly faster in PS2.0. But you can see how the 6800s have managed to be 8 times faster in PS2.0 vs. the NV3x; each shader unit on the NV40 is also more powerful than one on the NV3x, as it includes an extra maths stage IIRC.
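
The per-clock arithmetic behind that "8 times faster" figure, using the unit counts from the chart quoted above (a sketch of the forum reasoning, not a benchmark):

```python
# Per-clock PS2.0 throughput ratio implied by the shader-unit chart above.
nv35_units = 4 * 1    # FX 5900: 4 pipes, 1 full shader unit each
nv40_units = 16 * 2   # 6800 Ultra: 16 pipes, 2 shader units each

print(nv40_units / nv35_units)  # -> 8.0x per clock, before counting the
                                # extra maths stage per NV40 unit
```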

I think availability and the lack of officially released drivers on the Nvidia website are the biggest problems. It's a cunning tactic, as it means Nvidia will often manage to get their later beta drivers benchmarked against ATI's official drivers, but once the cards become widely available it's going to be a bit dodgy. Are the manufacturers providing drivers which support the 6800?

jbirney
06-24-04, 02:19 PM
NV "MAY" have an issue with PCIe:

http://www.techreport.com/reviews/2004q2/intel-9xx/index.x?pg=13

Again, a bit early to tell, but ATI did much better in this test. Maybe it's drivers? Maybe it's the way the program was written? Maybe it's due to the bridge chip? Again, this may not be anything....


NV's issues with Far Cry are a complete and utter joke. Sorry, but to me there is no way in heck a game with the TWIMTBP sticker on it should have any IQ issues at all. The whole idea of TWIMTBP is to remove the chance of issues like this and make the game play better for the user. I could understand a couple of weeks to fix the IQ issues, but it has been months now. Unacceptable.

Other than that, NV has hit a home run with these cards. Very impressed with the 6800s!

Arioch
06-24-04, 02:23 PM
Besides the pathetic product launch, I have to applaud both companies' efforts right now in general.