05-18-03, 04:52 PM   #75

Quote:
Cutting out the majority of the data to be passed to the card in effect makes the obtained value irrelevant since it merely reflects how well the card can render a portion of the data rather than the entire scene.
There is no indication that the majority of the data is cut out. There is simply no way to say how much of a boost is given.

Quote:
Apparently you don't understand that the intent of the benchmark is to plug through all of the data and let the card's routines sort out how to handle it and then render it.
Since when is there a handbook on the intent of a benchmark? We already know that both NVIDIA and ATI optimize for this benchmark, and we know that ATI has access to the developer's version of it. And obviously Futuremark's intent is not the same as NVIDIA's or ATI's. NVIDIA (and ATI) want the benchmark to run as fast and as smoothly as possible on their cards without visibly corrupting image quality.

Quote:
Thus the scores one receives in a benchmark with the FX5900 is incongruous to the actual performance one will achieve in gameplay.
The scores one receives in a benchmark are never perfectly congruous with actual gaming performance. That was the major beef with 3DMark03 in the first place. The GeForce FX 5900 Ultra has had its performance well documented by several reviewers, and for the most part it performs extremely well in virtually all benchmarks and games, even compared to everything else that is out there.
jimmyjames123