nV News Forums > Hardware Forums > Benchmarking And Overclocking

Old 11-14-03, 07:33 AM   #49
Voudoun
Registered User
 
Voudoun's Avatar
 
Join Date: May 2003
Posts: 106
Default

I saw that and forgot to post it.

Very interesting, and completely reasonable, I thought. Driver reliability, when the drivers are being stuffed with more and more shaders, had occurred to me too.

I know they're doing this to cover up their current product's problems, but will these replacement shaders really be removed come NV40, even if it fares better against the R420 than NV3x did against R3xx? Won't it be easier to just leave them in, to either narrow a performance shortfall or widen a performance advantage? That's just speculation, I know, and I'm not claiming it's anything more than that, but given what's currently happening, can we really believe this is purely a short-term thing?

Voudoun
Voudoun is offline   Reply With Quote
Old 11-14-03, 08:10 AM   #50
Ruined
Registered User
 
Ruined's Avatar
 
Join Date: Jul 2003
Posts: 2,447
Default

I hope 3DMark keeps changing their code. It means more frequent driver updates for Nvidia users!
__________________
We're all in it together.

Intel Core 2 Quad Q6700 2.66GHz CPU | Intel G965WH mobo | 8GB (4x2GB) DDR2-667mhz CAS5 RAM (1066MHz FSB) | BFG GeForce 285 GTX OC 1GB | Dell E228WFP 22" DVI-HDCP LCD Monitor | 1TB Western Digital RE3 SATA2 Main Drive | 500GBx2 Western Digital RE3 SATA2 Scratch Drives in RAID0 | Western Digital RE3 1TB SATA2 Media Drive | External 2TB Western Digital MyBook Backup Drive | Adaptec eSATA 3.0gbps PCI-E interface | Sandisk External 12-in-1 Flash Card Reader | LG GGC-H20L HD DVD/BD reader, DVD writer | LG GGW-H20L HD DVD/BD reader, DVD/BD writer | Microsoft E4000 Ergonomic Keyboard | Logitech Trackman Wheel | Antec P182 ATX Case | Thermaltake ToughPower XT 850w modular PSU | KRK RP-8 Rokit Studio Monitors | Windows Vista Ultimate x64
Ruined is offline   Reply With Quote
Old 11-14-03, 08:37 AM   #51
jbirney
Registered User
 
jbirney's Avatar
 
Join Date: Jul 2002
Posts: 1,430
Default

Quote:
Originally posted by bkswaney
Yes, but AM3 is based off a real game; 3DM03 is not.
Or have I been led astray and AM3 is a sen?
Is it not based on the Krass engine?
Ok, so you want benchmarks based off real engines. Riddle me this:

1) The Kyro2 got its butt handed to it in 3DMark2001 by the GF2 MX.
2) In Max Payne, the Kyro2 usually had more than 2x the FPS of the same GF2 MX.
3) 3DMark2001 and Max Payne used the SAME ENGINE!

Doh!

Besides, look at UT2003 FPS scores and Splinter Cell FPS scores. Both are based off the same Unreal Engine, so both should have the same score, right? Nope, not even close. So if two games based off the same engine have different results, does the engine really matter? Nope.

I agree with you. Games matter. However, don't ever underestimate the power of 3DMark. Large OEMs and other big players stake a lot of money on how cards score in benchmarks, including 3DMark. I don't agree with them, but that's what they do. And if 3DMark were such a non-issue, then why does one IHV keep optimizing for it using three different methods?
jbirney is offline   Reply With Quote
Old 11-14-03, 10:22 AM   #52
oqvist
 
Join Date: Sep 2003
Posts: 246
Default

Well, the FX series so far is a totally lost generation. I can't believe nVidia screwed up so badly with the FX series that they felt forced to pull all the moves they have with it.

I will try to cover my ears and be as objective as I can when the NV40 comes out, and hope that they don't continue down the same route as they did with the previous FX cards...

Or else I'll just continue buying ATI cards and forget nVidia ever existed...

And about ATI's compiler: you can remove those optimizations with two mouse clicks in the control panel.

Whereas nVidia's driver engineers can't manage to do the same for the FX series, because the optimizations are so complex, I guess...
oqvist is offline   Reply With Quote
Old 11-14-03, 11:31 AM   #53
Sazar
Sayonara !!!
 
Sazar's Avatar
 
Join Date: Jan 2003
Location: Austin, Texas
Posts: 9,297
Default

Meh, one PR guy posting to contradict another.

Anyway, I wonder if Kyle is going to change his editorial in light of the (apparent) retraction by nVidia claiming that their compiler is in fact NOT messed with.

It's getting curiouser and curiouser (sry... can't remember what movie that's from).
Sazar is offline   Reply With Quote
Old 11-14-03, 11:32 AM   #54
legion88
WhatIfSports.com Junkie
 
Join Date: Jul 2002
Posts: 135
Default

Quote:
Originally posted by euan
I'm confused.

What is the difference between a synthetic test that does many various 3D operations (geometry, textures, shaders) and, say, a game fly-by or recorded demo?
In an ideal world, there would be no difference. But we don't live in an ideal world.

A properly designed test for graphics cards would not have any vendor-specific code. That is, the test won't have subroutines like "if card = ATI, then run routine X; if card = NVIDIA, then run routine Y". All the cards would use the same routines, so all the cards would be treated the same by the test.

In games, however, that is not the case. Back in the old days of 3dfx, game developers loved Glide for more reasons than its being simple to use. A developer could be confident that their Glide code would run on any 3dfx-based card from the various video card manufacturers, because they all used the same Glide drivers.

For Direct3D and OpenGL, that is not the case. A developer's OpenGL code is not guaranteed to run well on every OpenGL-capable card, because the various cards do not use the same OpenGL drivers; likewise, a developer's D3D code is not guaranteed to run well on every D3D-capable card, because the various cards do not use the same D3D drivers. It is not uncommon to see a D3D game crash often on one card while not crashing at all on another.

By necessity, developers have had to implement vendor-specific code in their programs, either for performance or to prevent "show-stopping" bugs.

So in game benchmarks, video card performance is not the only thing being measured. The game benchmark also measures how well the developer fine-tuned their game code for specific vendors.

A properly designed "synthetic" benchmark doesn't have vendor-specific code. All the cards are treated the same.
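As a rough sketch of the contrast legion88 draws — a game dispatching to vendor-tuned paths versus a synthetic test running one shared path — consider the following (all function and vendor names are invented for illustration; no real game or benchmark works exactly this way):

```python
# Hypothetical sketch: a game picks a tuned render path per IHV,
# while a properly designed synthetic benchmark runs the same
# routine on every card.

def render_generic(scene):
    # the one shared code path every card gets
    return f"generic path: {scene}"

def render_ati(scene):
    # vendor-tuned path, e.g. reordered shader work
    return f"ATI-tuned path: {scene}"

def render_nvidia(scene):
    return f"NVIDIA-tuned path: {scene}"

# games branch on the detected vendor...
GAME_PATHS = {"ATI": render_ati, "NVIDIA": render_nvidia}

def game_render(vendor, scene):
    return GAME_PATHS.get(vendor, render_generic)(scene)

def benchmark_render(vendor, scene):
    # ...while the synthetic test ignores it entirely
    return render_generic(scene)
```

The point of the sketch is the asymmetry: the game's behavior depends on which vendor branch was taken, so a game benchmark also measures the quality of that tuning, while the synthetic test runs identical code on every card, so only the hardware and driver differ between runs.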
__________________
With a Bush, a Dick, and a Colin everyone gets screwed

Why are you here? http://www.sincero.com

WhatIfSports.com
legion88 is offline   Reply With Quote
Old 11-14-03, 12:12 PM   #55
Malfunction
 
Malfunction's Avatar
 
Join Date: Jul 2003
Location: Lake Jackson, TX
Posts: 1,002
Default

I agree with what you said, legion88, although I can't shake the feeling that what everyone was worried about is going to happen, the more I read about the latest DX9 games being *delayed*... I mean, coming out.

The video card industry/game development becoming more like the console market:

At first, I was a little reserved about whether that was a good idea. Now I'm kind of happy about it. I think it might get features that are not normally used, like TruForm (as sxotty said in another post), implemented. It would come down to which card's features actually get used while providing terrific performance, and that card will be the victor. Why is that such a bad thing?

It reminds me of the console market of the past. One game I recall having on both the PS1 and the Sega Saturn was Resident Evil. (This is my example, so bear with me.) While the Saturn was *supposedly* inferior to the PS1, I still think a few games (Resident Evil in particular) looked and played better on the Saturn.

The Director's Cut on the PS1 meant jack, because I still think it looked better on the Saturn.

When they began weighing the Xbox against the competition, the first thing they pointed out was gameplay on each system with the same title. I think there is enough room for devs to showcase either IHV; how they go about it is what I'm curious about.

I would, however, enjoy seeing a lot of the ATi features that are not implemented in games be showcased soon. In my eyes, all this would do is benefit ATi, because their tech is superior at the moment, which would only force Nvidia to get on the ball.

My $.02...

Peace,


Last edited by Malfunction; 11-14-03 at 12:20 PM.
Malfunction is offline   Reply With Quote
Old 11-14-03, 01:23 PM   #56
NickSpolec
 
NickSpolec's Avatar
 
Join Date: Aug 2002
Location: Burning Inferno, Arizona
Posts: 371
Default

Unfortunately, the Saturn wasn't supposed to be a 3D machine. Sega originally planned it as a powerhouse 2D system (in fact, a home version of their System 32 arcade hardware), with just enough 3D power to do Model 1 games.

But then Sony showed off the PSX, and Sega rushed back into development. They added, in theory, more 3D functionality than the PSX had. But it was really a mess in conventional terms. If you had a good handle on the system (like Yu Suzuki), you could create pretty impressive stuff. But the common developer found the 3D programming far too complex and time-consuming, and the results lackluster compared to the ease of the PSX.
__________________
Snake's System:

- AthlonXP Mobile 2500+ (@2.5GHz, 1.850v)
- Albatron KX18D PRO (nForce2 Ultra @ 227FSB)
- 512MB OCZ Platinum PC3200 EL (@DDR454, CAS 2.5, 6-3-3)
- GeForce3 (@230c, 460m)
- Fortissimo III, Gamesurround 7.1
- POS Intel 56k modem (soon to get high speed, though)
- Maxtor DiamondPlus 9 120GB, ATA133 (8MB)
- Samsung DVD/CD-RW combo (52x32x52x16x)
- Lite-On LTR 16102B (16x8x40x)
NickSpolec is offline   Reply With Quote

Old 11-14-03, 01:42 PM   #57
Malfunction
 
Malfunction's Avatar
 
Join Date: Jul 2003
Location: Lake Jackson, TX
Posts: 1,002
Default

Quote:
Originally posted by NickSpolec
Unfortunately, the Saturn wasn't supposed to be a 3D machine. Sega originally planned it as a powerhouse 2D system (in fact, a home version of their System 32 arcade hardware), with just enough 3D power to do Model 1 games.

But then Sony showed off the PSX, and Sega rushed back into development. They added, in theory, more 3D functionality than the PSX had. But it was really a mess in conventional terms. If you had a good handle on the system (like Yu Suzuki), you could create pretty impressive stuff. But the common developer found the 3D programming far too complex and time-consuming, and the results lackluster compared to the ease of the PSX.
Ok, a lot of the Dreamcast games looked better than PS2 games. Soul Calibur is one I can think of off the top of my head. I won't deny that Sony made it easier for devs to code, that's for certain. However, they haven't always had the best presentation of a game title by allowing that, either. A double-edged sword, I suppose...

Peace,

Malfunction is offline   Reply With Quote
Old 11-14-03, 01:43 PM   #58
ChrisRay
Registered User
 
ChrisRay's Avatar
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default

Quote:
Originally posted by Malfunction
Ok, a lot of the Dreamcast games looked better than PS2 games. Soul Calibur is one I can think of off the top of my head. I won't deny that Sony made it easier for devs to code, that's for certain. However, they haven't always had the best presentation of a game title by allowing that, either. A double-edged sword, I suppose...

Peace,


Heh, played Grandia 2 on the Dreamcast, then played it on the PS2? It looks awful. The Dreamcast would at least anti-alias the game...

Speaking of Grandia 2: I love the PC version.
__________________
|CPU: Intel I7 Lynnfield @ 3.0 Ghz|Mobo:Asus P7P55 WS Supercomputer |Memory:8 Gigs DDR3 1333|Video:Geforce GTX 295 Quad SLI|Monitor:Samsung Syncmaster 1680x1080 3D Vision\/Olevia 27 Inch Widescreen HDTV 1920x1080

|CPU: AMD Phenom 9600 Black Edition @ 2.5 Ghz|Mobo:Asus M3n HT Deluxe Nforce 780A|Memory: 4 gigs DDR2 800| Video: Geforce GTX 280x2 SLI

Nzone
SLI Forum Administrator

NVIDIA User Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the members
ChrisRay is offline   Reply With Quote
Old 11-14-03, 02:04 PM   #59
NickSpolec
 
NickSpolec's Avatar
 
Join Date: Aug 2002
Location: Burning Inferno, Arizona
Posts: 371
Default

Quote:
Ok, a lot of the Dreamcast games looked better than PS2 games. Soul Calibur is one I can think of off the top of my head. I won't deny that Sony made it easier for devs to code, that's for certain. However, they haven't always had the best presentation of a game title by allowing that, either. A double-edged sword, I suppose...
Of course, the Dreamcast had some serious untapped power. I would estimate that no game for the DC consumed more than 75-80 percent of the Dreamcast's total power. Given an extra year of Sega-developed games, I'm sure we would have gotten close to the 100 percent mark.

Quote:
Heh, played Grandia 2 on the Dreamcast, then played it on the PS2? It looks awful. The Dreamcast would at least anti-alias the game...
Contrary to popular belief, the Dreamcast used no anti-aliasing.

The reason why no one really heard of jaggies (in the console market), or noticed them before the Playstation 2 hit, was not anti-aliasing on the Dreamcast, but the Dreamcast's flicker filter.

You should know about the flicker filter if you use TV-out on video cards. The default Dreamcast flicker filter was equal to a setting of 2 on GeForce hardware (when using the control panel); on ATI hardware... meh, one or two notches below full.

The Playstation 2 used no flicker filter (not in its first 6 months or so), which is why all the edges were so hard in all its games, while on the Dreamcast the edges were quite soft. In fact, you can still find PS2 games that use the default flicker-filter setting of 0 (zero). The Dreamcast, from the beginning, used the equivalent of 2 (and no game uses a different setting, that I know of, simply because the default was effective, if a little blurry).
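A flicker filter of the sort described above is basically a vertical blur: each scanline is blended with its neighbours so adjacent interlace fields differ less, trading a little sharpness for less flicker on a TV. A minimal sketch, treating scanlines as plain intensity values (the weights are made up for illustration, not the Dreamcast's actual coefficients):

```python
# Hypothetical sketch of a TV-out flicker filter: blend each scanline
# with the lines above and below it. Edge lines reuse their own value
# where a neighbour is missing.

def flicker_filter(lines, strength=0.25):
    """Return a vertically blurred copy of `lines`.

    lines: list of per-scanline intensities.
    strength: fraction taken from each neighbour (0 = filter off).
    """
    out = []
    n = len(lines)
    for i, v in enumerate(lines):
        above = lines[i - 1] if i > 0 else v
        below = lines[i + 1] if i < n - 1 else v
        # keep (1 - 2*strength) of the line, borrow the rest vertically
        out.append((1 - 2 * strength) * v + strength * (above + below))
    return out
```

A hard one-line edge like `[0, 100, 0]` comes out softened to `[25.0, 50.0, 25.0]` at the default strength, which is exactly the "quite soft" look described above; with `strength=0` the image passes through untouched, like the PS2's default.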
__________________
Snake's System:

- AthlonXP Mobile 2500+ (@2.5GHz, 1.850v)
- Albatron KX18D PRO (nForce2 Ultra @ 227FSB)
- 512MB OCZ Platinum PC3200 EL (@DDR454, CAS 2.5, 6-3-3)
- GeForce3 (@230c, 460m)
- Fortissimo III, Gamesurround 7.1
- POS Intel 56k modem (soon to get high speed, though)
- Maxtor DiamondPlus 9 120GB, ATA133 (8MB)
- Samsung DVD/CD-RW combo (52x32x52x16x)
- Lite-On LTR 16102B (16x8x40x)
NickSpolec is offline   Reply With Quote
Old 11-14-03, 02:51 PM   #60
Socos
Registered User
 
Join Date: Jul 2003
Location: Michigan
Posts: 137
Default

Quote:
Originally posted by ginfest
HB, I agree with some of what you are saying but have to question the above:
I have a 5900 and a 9800, and yes, the 9800 runs Max Payne and my other games very well. I run all my games at 1280x960 with 4xAA/8xAF. I had to put the 5900 back in to get Call of Duty running, and have since continued playing Max Payne without changing any settings. I run it with the console enabled to show FPS and haven't noticed a big difference. I don't have exact numbers, but both cards run it at the above settings at 60 FPS or better.
Anyway, if you're talking IQ, I'm not sitting here while playing this game and the other games I play saying "...s**t, that looks crappy compared to my 9800..." and so on. Yes, I know that comparing individual frames will show better AA on the 9800, but I'm talking about the game experience. And no, I'm not saying it doesn't count if you can't see it, just that the difference is not enough to ruin the game for me.
I suppose you could say it's just me. I wonder if others have run both cards recently and seen a noticeable difference, i.e. something that makes you say "WTF, I can't play this game like this..."?

My $0.02

Mike G
So what you are saying is that both cards play the games you play? Or that Call of Duty does not work on a 9800? You lost me there.

And you're probably right, most of the arguments made are BS. Basically it boils down to this: both cards are capable of playing all of today's games at decent frame rates with decent IQ.

I'm still gonna get a 9800XT!!
__________________
AMD 64 3000 + - 1 GB Kingston HyperX - Chaintech ZNF-150 MB - Audigy 2 Gamer ZS - 200GB SATA HD -
ATI X800 Pro OC 520/540 - 21" Cybervison monitor - Thermaltake Butterfly 450 watt PS - UFO Custom Case
Socos is offline   Reply With Quote

Copyright ©1998 - 2014, nV News.