Originally Posted by jbirney
I am talking about 2003 (around May), when a few sites got a chance to benchmark Doom 3 early. NV sponsored the event and created custom demos. Catalyst Maker himself stated that ATI knew nothing of the test. In fact, their driver at the time (3.2 or 3.4, I forget which) was broken and did not use the 256MB on the 9800 Pros back then.
Thus, if it was OK for NV to have all the time in the world to work on Doom 3 last year while ATI had none, turnabout is fair play, no?
We have different timedemos recorded by a few different sites, and they all seem to back up ATI's demo, yet you still think they are biased?
Dave over at Beyond3D has a possible explanation:
Which seems plausible.
You also failed to mention this bit:
I'll just add Deano C's comments about the NV40 vs. the X800.
Most of the day-to-day work is done on P4 3.0GHz machines with ATI 9800 Pros. The movie would have been made on a P4 3.4GHz with an X800 (12 pipe) in it (that machine has a massive projector (we get to watch HS on a 10-foot screen Smile ) which we use for our meetings etc.). We have a bunch of NV40s to explore PS3.0, but the X800 (we currently only have the one) is a clear winner speed-wise, so it is used for demos etc.
And further on:
It's probably more our fault than nVidia's, to be honest. We built it pretty much on the 9700 and 9800, so it's not surprising it runs better on an X800. The NV40 path is nowhere near as optimized at the moment. We seem to encounter a fair few bugs on the NV40 that cause us to run slower code paths.
And Valve is pretty much in bed with ATI (the upcoming "ATI" levels, for example), so I'm guessing that the time Valve spent on optimizing any NV40 path is close to non-existent.
This is Bjorn querying some of Deano C's comments about the coding for the NV40 and X800.
If you think that Doom 3 was biased, what do you make of those comments?