nV News Forums > Hardware Forums > Benchmarking And Overclocking


Old 09-15-03, 12:26 AM   #133
Miester_V
Apprentice's Master
 
Join Date: May 2003
Location: U.S.A
Posts: 140

Whether you can tolerate the image distortions or not is not the point. Graphics benchmarks assume that optimal image quality is being used when testing for performance. By nVidia's logic, why couldn't they just render everything monochromatic, without pixel shaders and the other graphical features, in order to gain even more performance? That's because there are STANDARDS when benchmarking, and the only results that come out are overall performance, not the image quality setting, since that is supposed to be a fixed parameter when comparing two pieces of hardware. Yes, there are slight differences in rendering as a by-product of having different hardware and drivers, but intentional manipulation of the IQ should not be tolerated, either by the industry or by consumers.
__________________
OS: Win XP Pro
CPU: AMD XP 2400+
Mobo: Epox 8K7a
PSU: Mortec 300w
Memory: Crucial 2100 512MB DDR
VGA: ATI Radeon 9800 Pro Retail
Hard Drive: Maxtor 40GB 7200rpm
DVD ROM: Pioneer 16X
CD-RW: Plextor 16/10/40
Sound Card: Sound Blaster Audigy
Speakers: Boston Acoustics 7800 4.1

------------------------------------------------
It is my DUTY to not purchase any Microsoft products.
Old 09-15-03, 01:52 AM   #134
StealthHawk
Guest
 
Posts: n/a
Re: lol

Quote:
Originally posted by Shaitan
I totally agree with this. The only reason everyone is standing around and pointing fingers is there are no games to play to actually see the differences in action. IQ this and IQ that; I have looked at the screenshots and personally, there really aren't enough differences to make me go one way or the other from a visuals standpoint.
And as for seeing a game in action and actually playing it, if your eyes can follow the differences, I salute your omnipotent asses. LOL
That's what in-game video options and the Detonator driver Intellisample slider are for. If you want more performance, then turn down options. Lower quality shouldn't be forced on everyone.

Quote:
like everything else, they will fix it in the next card inline coming to a PC store near you. ;p
If their next product is NV38, it isn't likely that anything will be fixed. From what we've heard, NV38 is merely a respun NV35 core.
Old 09-15-03, 04:11 AM   #135
ishak540m
Registered User
 
 
Join Date: Jul 2003
Location: Santa Cruz, CA
Posts: 75

Look at this comparison!
__________________

Current system:
Cooler Master ATC-710-SX1 Silver Aluminum Case\2 Blue Cold Cathode Lights\4 COOLER MASTER TLF-R82 Neon L.E.D. Fans\Antec[TrueBlue] 480 watt Power Supply with 2 blue LED fans\Intel D875PBZLK 875P Canterwood Gigabit LAN 8X AGP\Intel Pentium 4 3.0GHz 512k 800MHz FSB | socket 478 w/ Hyper Threading Technology\Zalman CNPS5700D-CU Pure Copper CPU Cooler for Socket 478\2 sticks of Corsair DDR400 512MB PC-3200 C2 XMS (extreme memory speed series)\2 Seagate 80GB 7200rpm SERIAL ATA Barracudas\Radeon 9800 Pro 256 MB\LITE-ON 16x DVD 48x CD-ROM Drive\Lite On 52x24x52 CD-RW\Mitsumi 1.44 MB Floppy Drive\Cooler Master Round Floppy drive cable 18-inch with pull tabs\Creative Labs Sound Blaster Audigy 2\Logitech Z-680 5.1's\Mitsubishi DP930SB-BK 19" CRT Monitor\Microsoft Windows XP Professional
My Ibanez S2120XAV (Bridge PU - Steve's Special, Neck PU - Humbucker From Hell.)|AMP: Mesa Boogie Mark IV
My Taylor 814CE w/ Blender
Old 09-15-03, 04:16 AM   #136
jimbob0i0
ATI Geek
 
 
Join Date: Apr 2003
Posts: 268
Re: lol

Quote:
No vid card is futureproof, btw. When are gamers gonna wise up to this fact and just see this for the PR ploy that it is? nVidia has used the futureproof ploy since the GeForce 256. Funny how we still need to upgrade every 2nd card or so to get the max benefits the never-ending next-gen cards always allow.
That is indeed true, and a new vid card is an annual thing for me (occasionally 6 months if the card really warrants it). However, a gamer has the right to expect their top-of-the-line $500 card to last more than 6 months and be playable at high settings on a standard code path, without a 120MB driver replacing all the shaders for them.

In fact, I would go so far as to say that any top-of-the-line refresh part of a new core (i.e. the 9800 or 5900 - note I'm saying refresh, not the initial part) should be good for a year of gaming from purchase. Heck, on DX8 games my Ti4600 on one of my comps still kicks ass, and the 9700 Pro on my main rig is now doing the same for DX9 games a year after the latter's release - and even longer for the former. (And yes, I know those weren't refresh parts... I got on the bandwagon early in those generations and did not regret it that time.)
Old 09-15-03, 05:27 AM   #137
kmf
Registered User
 
 
Join Date: Aug 2003
Posts: 90

Someone please tell me that nVidia is going to cancel the NV38. It would be a complete waste of money, investment, etc., etc. I hope they're not seriously considering it! They'll go bankrupt. This would not be a wise decision.
Old 09-15-03, 06:34 AM   #138
Hank Lloyd
Registered User
 
Join Date: Aug 2003
Posts: 15

Quote:
totally agree with this. The only reason everyone is standing around and pointing fingers is there are no games to play to actually see the differences in action. IQ this and IQ that, I have looked at the screenshots and personally, there really isn`t enough differences to make me go 1 way or the other from a visuals standpoint.
I hate to beat a dead horse here, but it bears repeating. Go ahead and re-read any nVidia Cg/FX white paper that exists. They talked about how they went beyond the spec: "The way it's meant to be played," the huge 32-bit advantage over the inferior 24-bit one, the virtually limitless number of shaders the FX could handle as compared to the competition, etc. Then factor in everything else we know that nVidia has either said or done since then, and every last bit of it is at odds with their desire to give you that cinematic computing. I know it's all PR fluff, but honestly, there comes a point where they can go too far, and they reached that point a long time ago.

In all honesty, I can't imagine anybody with a clear-thinking brain who would consider buying any FX card.
Old 09-15-03, 08:45 AM   #139
aapo
Registered User
 
 
Join Date: May 2003
Location: Finland
Posts: 273
Re: Re: Re: Re: Bah

Quote:
Originally posted by StealthHawk
What you say is only true if a market analyst is keen enough to figure out what's going on. NVIDIA sure isn't going to admit their hardware is flawed.
I bet there are people who know far better than we do how things really are. And those peeps reside in the same place where the big money is.
__________________
no sig.
Old 09-15-03, 04:14 PM   #140
serAph
The Original Superfreak
 
 
Join Date: Jul 2003
Location: Atlanta, GA
Posts: 341

Quote:
Originally posted by 4q2
It's pretty obvious to me and almost everyone else here that Nvidia optimizes by lowering image quality until the benchmark bar next to their product's name is equal to or better than the competition's.

I never realized that the dawn of cinematic quality rendering had so much to do with how large a 2d bar graphic was.

My bad.
That's what corporations do, man - it's not just video cards, and... prepare yourself for this... it's not just nVidia either!

Now, welcome to reality. Enjoy your stay.

Old 09-15-03, 06:56 PM   #141
Sazar
Sayonara !!!
 
 
Join Date: Jan 2003
Location: Austin, Texas
Posts: 9,297

Quote:
Originally posted by serAph
That's what corporations do, man - it's not just video cards, and... prepare yourself for this... it's not just nVidia either!

Now, welcome to reality. Enjoy your stay.
Do you still believe Valve's claims were bogus, serAph?
Old 09-15-03, 09:14 PM   #142
Evan Lieb
Registered User
 
Join Date: Sep 2003
Posts: 38

Quote:
Originally posted by John Reynolds
I was just thinking about this situation myself earlier today. I'm getting tired of seeing the same people constantly spouting the same negative comment over 'n over. It's like the noise ratio has gone through the roof on almost every forum and it's getting fairly irritating.

That said, however, what's also annoying is how Nvidia just doesn't seem to get the message. They blatantly continue to lie to their customers and the gaming community about cheating (let's be honest, they've yet to admit any wrong-doing whatsoever, which is, to be frank, quite insulting to anyone's intelligence if you consider the level of evidence that's been gathered against them this summer), they offer up slides that supposedly represent their internal driver development process and the safeguards that're supposed to prevent overly aggressive optimizations, and yet the latest driver build that's offered to reviewers seems to be just more of the same-old. Will any of this ever stop? Will it continue to the point that Nvidia's competitors feel that they too must start crossing the line with their optimizations so that their products are no longer unfairly represented, leaving hardware reviewers with an almost impossible job of gathering and presenting clear and accurate information to their readers as they waste their time and effort trying to work around all the cheating (which, from the screenshot detection disclosure, seems to be becoming more insidious)?

This has got to stop. The conclusion I came to this morning was that while watching the same people constantly repeating themselves in their cyclic Nvidia bashing posts is annoying, it's more important that hardware reviewers attempt some collective effort to put an end to Nvidia's parade of cheating. Stop that, and perhaps the Nvidia bashing will stop too. Until then, people are going to continue harshly criticizing the company and no amount of bannings will stop it. And ban too much and you risk becoming another [H], and we both know that is hardly desirable.

Edit: So what should be done? Draft up a document that states that there will be no further reviews, articles, editorials, or even publishing of press releases from Nvidia until they can release a WHQLed set of drivers that do not manifest any cheats once thoroughly examined. This document, to be effective, would have to be signed by the major hardware sites (Anand, Tom, [H]), and as many of the smaller sites as possible (B3D, NVNews, Tech-Report, ET, 3DGPU, ad nauseam), and people would have to stick to their guns. But even as I type this I know it's a pipe dream, that those who make their livings from their sites would immediately inject their egos into the process, and it would fail.
While the bigger web sites certainly wouldn't stop coverage of NVIDIA completely (that's silly really, they're the largest desktop graphics outfit in the world), I do think that an agreement between the large, medium and small hardware web sites should be reached on how to deal with cheats, whether they're insignificant cheats or blatant cheating.
Old 09-15-03, 09:27 PM   #143
digitalwanderer
 
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944

Quote:
Originally posted by Evan Lieb
While the bigger web sites certainly wouldn't stop coverage of NVIDIA completely (that's silly really, they're the largest desktop graphics outfit in the world), I do think that an agreement between the large, medium and small hardware web sites should be reached on how to deal with cheats, whether they're insignificant cheats or blatant cheating.
What would you suggest?
__________________
[SIZE=1][I]"It was very important to us that NVIDIA did not know exactly where to aim. As a result they seem to have over-engineered in some aspects creating a power-hungry monster which is going to be very expensive for them to manufacture. We have a beautifully balanced piece of hardware that beats them on pure performance, cost, scalability, future mobile relevance, etc. That's all because they didn't know what to aim at."
-R.Huddy[/I] [/SIZE]
Old 09-15-03, 09:28 PM   #144
The Baron
Guest
 
Posts: n/a

Quote:
Originally posted by digitalwanderer
What would you suggest?
Better idea: define what constitutes a "cheat" and what constitutes a legitimate "optimization." Then have that conversation.


Powered by vBulletin® Version 3.7.1
Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.
Copyright 1998 - 2014, nV News.