nV News Forums > Hardware Forums > Benchmarking And Overclocking

Old 11-13-03, 12:50 AM   #229
phial
 
Join Date: May 2003
Posts: 73
Default

perhaps.. but since when are the competition's drivers any less stable than NVIDIA's? actually i have had more stability problems with my GF4 Ti than with my 8500, and ATI's drivers have only improved since those days, judging from what people say and from the help i've been giving people in the driver section at Rage3D

i don't think drivers can be used as a supporting argument lolol..

but anyway, no point in arguing about it, because we can throw "facts" back and forth all day.. as to what's true or not, who really knows? so many variables... John Doe #1's video card could be stable even after installing 25 driver releases on the same Windows installation, while John Doe #2's video card may crash after a fresh format. but like i said, stability isn't an issue the way you mean it, and hasn't been for quite a long time
__________________
[url=http://www.urbandictionary.com/define.php?term=nVidiot]im surrounded by nVidiot's[/url]
Old 11-13-03, 12:53 AM   #230
ChrisRay
Registered User
 
ChrisRay's Avatar
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default

Quote:
Originally posted by phial
THE MOST INSULTING thing you have ever heard? ohhh, poor thing, did i hurt your ego by making you realize that your couple-hundred-dollar purchase wasn't as wise as you originally thought?

i had a nicer, more detailed response for you, but the thread was so conveniently locked before i could hit reply yesterday


dude, chill, you don't owe NVIDIA a thing. it's a company, not a football team

obviously i have no idea why YOU bought an NVIDIA card (really, i don't, when a 9800 or 9700 would more than likely perform better.. perhaps you play a lot of old games). i was obviously generalizing, but that's because i HAVE talked to a LOT of NVIDIA fans and they all say the same thing:

"i don't care if they cheat and make it look worse, as long as my game goes faster"

why do we upgrade then, if IQ doesn't matter? go buy a used GF2 GTS and play in bilinear glory! anything over 85 FPS doesn't matter anyway, since most monitor refresh rates don't go higher than that, and most people can't tell the difference between 60 FPS and 85 FPS on top of that

if you personally had reasons to buy a 5900, then cool, awesome, man, glad you found something most people don't know about. perhaps you should share why you did? or was it just because of a sentimental attachment to high-school gaming and the name "NVIDIA"? kind of like when people hear the word "Ferrari" and go "oooo nice car"

again, this isn't directed at you, because i know nothing about you. it just made me laugh how insulted you got


ANYWAY, how come Futuremark isn't doing anything about this if it directly violates the white paper they released stating the guidelines? i mean, wtf... why even release it in the first place?
Notice your first post got deleted? I know exactly why I bought an Nvidia card.

Digital Vibrance
Super Sampling
AF affecting all edges.
Faster framebuffer readback (really important for PSX emulation)
And my newest reason: NVIDIA's FX drivers can render at a very high internal resolution, up to 4096x2048. This makes a big difference in PSX emulation if you want to use Pete's latest OpenGL plugin, because that plugin uses the pbuffer render-to-texture trick, which breaks AA; rendering internally at this resolution removes the need for AA, though. And guess what: my Radeon 9500 Pro does "not" work with this feature, because it can't render that high internally.
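(For anyone wondering why a high internal resolution can stand in for AA: scaling a higher-resolution render down to the display size is effectively ordered-grid supersampling, so edge pixels get averaged just as AA would average them. A toy sketch of the idea; the function and the tiny grayscale framebuffer here are invented purely for illustration:)

```python
def downsample_2x(pixels, w, h):
    """Average each 2x2 block of a w*h grayscale framebuffer."""
    out = []
    for y in range(0, h, 2):
        for x in range(0, w, 2):
            block = [pixels[(y + dy) * w + (x + dx)]
                     for dy in (0, 1) for dx in (0, 1)]
            out.append(sum(block) / 4)
    return out

# A hard vertical edge rendered at 4x4 ...
hi = [1, 1, 1, 0] * 4
# ... comes out of the 2x downsample with smoothed (gray) edge pixels,
# which is what AA would have produced at the 2x2 target size.
print(downsample_2x(hi, 4, 4))  # -> [1.0, 0.5, 1.0, 0.5]
```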


Thanks, and drive through. If I were a mod here, you would have been banned with your first flame-enticing post. Don't sit there making ridiculous accusations about why people do things, because you only make yourself look stupid. I'm glad you think it's funny. Some mods here most probably don't.
__________________
|CPU: Intel I7 Lynnfield @ 3.0 Ghz|Mobo:Asus P7P55 WS Supercomputer |Memory:8 Gigs DDR3 1333|Video:Geforce GTX 295 Quad SLI|Monitor:Samsung Syncmaster 1680x1080 3D Vision\/Olevia 27 Inch Widescreen HDTV 1920x1080

|CPU: AMD Phenom 9600 Black Edition @ 2.5 Ghz|Mobo:Asus M3n HT Deluxe Nforce 780A|Memory: 4 gigs DDR2 800| Video: Geforce GTX 280x2 SLI

Nzone
SLI Forum Administrator

NVIDIA User Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the members

Last edited by ChrisRay; 11-13-03 at 12:58 AM.
Old 11-13-03, 01:21 AM   #231
phial
 
Join Date: May 2003
Posts: 73
Default

personally i dislike NV AF.. and no, i'm not saying that just to contradict you. even at 8xAF, the mip-map lines still stand out as plain as day, and in one particular game a friend and i play (Anarchy Online, an MMORPG) you can actually see where it stops filtering altogether, because of the vast open landscapes and distances you have to travel through. ESPECIALLY when we jump in our vehicles and fly up about 100 feet.. terrain a mile ahead of us suddenly becomes fuzzy as hell because it's not within the radius of the filter. but again, that's personal preference, and i have to admit that for first-person shooters, where you can't see more than a couple hundred feet, NV AF does indeed look sharper

Last edited by MUYA; 11-13-03 at 02:52 AM.
Old 11-13-03, 02:31 AM   #232
Rogozhin
Registered User
 
Rogozhin's Avatar
 
Join Date: Jul 2002
Location: oregon
Posts: 826
Default

Nothing like some ad hominem attacks to make your day all rosy.

"AF affecting all edges."

ATI does 128-tap AF within the footprint that makes up 90% of the viewable area; nvidia's algo changes the footprint based on what the application determines is "best" for the game, and within this variable footprint only 64-tap AF is applied.

I'll take ATI, thank you.

rogo
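As an aside, the "128-tap" and "64-tap" figures being thrown around here are just probe-count arithmetic: each anisotropic probe is a bilinear fetch of 4 texels, and trilinear filtering doubles that across two mip levels. A rough sketch of that arithmetic (illustrative only; real hardware varies probe counts per pixel, and the function name is made up):

```python
def af_texel_fetches(aniso_ratio, trilinear=True):
    """Approximate texel fetches for one anisotropically filtered pixel.

    Each probe along the line of anisotropy is a bilinear fetch
    (4 texels); trilinear filtering doubles it across two mip levels.
    """
    texels_per_probe = 4 * (2 if trilinear else 1)
    return aniso_ratio * texels_per_probe

print(af_texel_fetches(16, trilinear=True))   # 128 -> "128-tap" AF
print(af_texel_fetches(16, trilinear=False))  # 64  -> "64-tap" AF
```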
Old 11-13-03, 02:36 AM   #233
ChrisRay
Registered User
 
ChrisRay's Avatar
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default

Quote:
Originally posted by Rogozhin
Nothing like some ad hominem attacks to make your day all rosy.

"AF affecting all edges."

ATI does 128-tap AF within the footprint that makes up 90% of the viewable area; nvidia's algo changes the footprint based on what the application determines is "best" for the game, and within this variable footprint only 64-tap AF is applied.

I'll take ATI, thank you.

rogo

That's fine. But ATI won't filter the entire screen in EverQuest, and to me it's important to see filtering affect the whole screen in that game, especially in the wide-open areas.

In some games this is completely unnoticeable. I never noticed filtering edges in NFS Hot Pursuit 2, Unreal Tournament 2003, or Neverwinter Nights. But in EverQuest and (funny the above poster should mention it) Anarchy Online, I have seen some major weird AF oddities on my 9500 Pro in wide-open areas.

In some games it's noticeable to me; in some it's not. Was this the primary reason I got an NVIDIA card? No. Just something I like about it.
Old 11-13-03, 02:44 AM   #234
goofer456
 
goofer456's Avatar
 
Join Date: Sep 2002
Posts: 180
Default

Quote:
Originally posted by Ruined
Perhaps because he prefers stability over speed increases in games that won't be released for some time? 3DMark03 is great, but an actual game that uses intensive DX9 shaders would make the 9800 Pro's PS 2.0 speed far more worthwhile.
Funny to see how old cows are dragged out of the canal (hope that comes across in English; it's a rough translation of a Dutch saying) whenever the blatant cheating by NVIDIA is exposed.

Catalyst drivers are as stable as ForceWare/Detonator drivers.

Get over it and use less worn-out arguments, which hold no truth whatsoever.
Old 11-13-03, 02:46 AM   #235
ChrisRay
Registered User
 
ChrisRay's Avatar
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default

Quote:
Originally posted by goofer456
Funny to see how old cows are dragged out of the canal (hope that comes across in English; it's a rough translation of a Dutch saying) whenever the blatant cheating by NVIDIA is exposed.

Catalyst drivers are as stable as ForceWare/Detonator drivers.

Get over it and use less worn-out arguments, which hold no truth whatsoever.
I still don't see why people regard ATI drivers as unstable. It really is rhetoric. I have found them to be just as stable as NVIDIA drivers. I have never had a game stop working or malfunction because of ATI drivers.
Old 11-13-03, 03:24 AM   #236
cthellis
Hoopy frood
 
Join Date: Oct 2003
Posts: 549
Default

First off, Chris, you're nothin' but net with me. ^_^ phial's being a yutz.

Quote:
Originally posted by ChrisRay
I still don't see why people regard ATI drivers as unstable. It really is rhetoric. I have found them to be just as stable as NVIDIA drivers. I have never had a game stop working or malfunction because of ATI drivers.
Sadly this is still such a HUGE thing for some people, and one of the biggest habits to break. I'll certainly point to areas of lackluster overall driver quality with ATi. (Of course I'll also point to bad nVidia cards to have owned, and periods when it was damn near stupid to use anything but 3dfx...) But I can't understand how people--most of the time people who really don't CARE about pursuing all the technical tidbits--have so much confidence in things they have no real knowledge about, and carry grudges for damn near decades. I have one friend who's very anti-ATi over issues he had with an ATi card (during that era when you'd be damn near stupid to own anything but 3dfx) and a 1st-gen Radeon (all issues another friend did NOT get with his card), and another friend who can't let go of problems he had with Western Digital drives from well before "gigabyte" was even thought about! o_O

Basically, it mystifies me. The tech industry is in such a constant state of flux that pre-forming opinions is highly illogical, as is ignoring new information because one "knows better already." Not to mention that pre-set opinions simply reinforce themselves, since it's EASY to find problems with X or Y if you're looking for them, and EASY to explain things away with others if you're looking to do THAT, too.

Eventually I just have to sigh and give up, but it's rather annoying that this attitude is so widespread. Hopefully people will wake up from their tech stupors eventually.

Old 11-13-03, 04:37 AM   #237
StealthHawk
Guest
 
Posts: n/a
Default

Quote:
Originally posted by Hellbinder
That NVIDIA statement is both complete hypocrisy and complete nonsense at the same time.

"An Optimization must accelerate more than just a benchmark"

It is plainly obvious what this means. There are numerous benchmarks out there, and they were out there long before this statement was made, so they are obviously talking about BENCHMARKS. It is clear NVIDIA knew what they were saying here: DO NOT OPTIMIZE FOR A BENCHMARK ONLY. Good statement. Everyone applauded and agreed.

Now NVIDIA has been caught "cheating" again. And they come back with this..

"An optimization must accelerate more than just a benchmark unless the application is just a benchmark."

making themselves out to be complete liars for the first statement, where they had openly acknowledged that benchmarks should receive no special optimizations directed only at them. Now they are "claiming" they meant just the opposite, and that it's "ok" to optimize for benchmarks.

I don't know what is sadder: that NVIDIA employees would stoop so low as to say such a thing, or that there are many people out there who will not only accept this statement but agree with them.
Oh come on. I've said since the beginning that there was very little chance NVIDIA ever meant they wouldn't optimize for synthetic benchmarks. What they always meant by the statement was that any optimization targeting a game would have to accelerate more than just its benchmark mode.

It is utterly naive to think that NVIDIA would ever stop optimizing for synthetic benchmarks, given that they've said several times that they would not stop.

Unless you can find me somewhere that NVIDIA explicitly clarified the statement to mean what the public has interpreted it to mean, I don't think you can reasonably hold NVIDIA responsible for it. Sure, they probably worded it ambiguously... but that's hardly the point.

We've known for some time now that NVIDIA has been breaking their own guidelines by changing IQ in UT2003 and 3DMark2001/3DMark03 anyway. Is anyone actually surprised that scores fell with this new patch? I think the interesting thing is why the Pixel Shader 2.0 test didn't lose any performance. Does anyone know if Futuremark changed the shader there to prevent detection?
Old 11-13-03, 04:42 AM   #238
StealthHawk
Guest
 
Posts: n/a
Default

Quote:
Originally posted by lukar
ChrisW said here that the score for NV3x now drops into the 4000+ range, not the 3xxx range like before. So I guess Futuremark approved some of the 'optimizations' by NVIDIA. I assume NVIDIA goes even further, something like what Gabe mentioned for the HL2 scenario.

The score the NV38 produces using the latest drivers in the patched 3DMark03, which is in the 4xxx range, was achieved using FP16. I think Futuremark allowed it, and that's why we see a 4xxx-range score and not 3xxx.
Hardly. The reason performance didn't drop as much this time is that the drivers actually perform better at an empirical level.
Old 11-13-03, 04:50 AM   #239
StealthHawk
Guest
 
Posts: n/a
Default

phial,

If I have to edit (delete) any more of your offensive posts, you're going to find yourself banned. You're the reason this thread was closed the first time. Since I had a ****ty week last week, I'm willing to show some mercy this time.

DO NOT PROVOKE OTHER MEMBERS. K THANKS.

People are obviously offended by your uncalled-for and unconstructive comments. If you're going to make them... just stop yourself... please. For your own sake.

Consider this a public warning. You may now return to your regularly scheduled programming. Hell, I won't even guarantee that you won't be banned later by me or by someone else.
Old 11-13-03, 05:24 AM   #240
ChrisW
"I was wrong", said Chris
 
Join Date: Jul 2002
Location: standing in the corner!
Posts: 620
Default

Quote:
Originally posted by vandersl
The more I think about it, the more I think the problem FutureMark has is that people can see their benchmark.

Their business model kind of depends on having a benchmark with eye candy - they want end-users to download it, use it, and care about their scores. That is what makes the benchmark a powerful tool in selling graphics chips (to both end users and OEMs).

However, it seems that since the output is 'visual', people insist that 'close enough' is acceptable. If they can't see a difference in the output, then they don't care. This attitude allows IHVs to cheat on the benchmark, making it invalid for the very purpose for which it was intended.

Other synthetics don't seem to be treated the same way. I've never seen anyone make this type of argument about any other benchmark with a quantitative output.

If 3DMark was just a benchmark that computed numbers as an output, instead of colors, would anyone really argue that generating incorrect but 'close' numbers is OK? Really, I'd like to know.
Excellent point (one I feel most people overlook). What you see in 3DMark03 (the eye candy) is only there so you have something nice to look at while the tests are running; the tests could just as well be performed with lots of random triangles on the screen. Half the test is what goes on behind the scenes. All the extra triangles used to create these "primitive" pictures are there only to simulate the number of triangles that will be needed in future games (as predicted by Futuremark). Yes, it's true that the images shown on screen could easily be drawn with far fewer triangles or much simpler shaders. That is not the point. The point is that these things are purposely more complicated than they have to be, to simulate the complexity that will show up in future games. It's hard to show an accurate simulation of how future games will look without inventing a time machine and traveling to the future to get those images. All Futuremark can do is predict the number of triangles or the complexity of the shaders, purposely make them overly complex by today's standards, and hope the results prove accurate in the future.

The point I'm making is that when a company like nVidia hacks a benchmark and says "see, it looks almost as good," what they are doing is nothing more than simplifying something that was purposely made complex.
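vandersl's thought experiment is easy to make concrete: a visual benchmark is effectively judged with a tolerance ("can my eye tell the difference?"), while a numeric benchmark would be judged exactly. A toy sketch of that distinction; the pixel values and tolerance below are invented purely for illustration:

```python
def visually_close(reference, output, tol=0.01):
    """'Looks the same' test: passes if every pixel is within tolerance."""
    return all(abs(r - o) <= tol for r, o in zip(reference, output))

# Pixels the full, prescribed shader would produce...
reference = [0.500, 0.250, 0.750]
# ...and the output of a simplified (cheating) shader.
shortcut = [0.500, 0.248, 0.752]

print(visually_close(reference, shortcut))  # True: "looks almost as good"
print(reference == shortcut)                # False: not the prescribed work
```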
__________________
AIW 9700 Pro | 1.3GHz CeleronT | 512MB PC133 SDRAM | ECS PCIPAT Mobo (Intel 815EP)
RadLinker/RadClocker


Powered by vBulletin® Version 3.7.1
Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.
Copyright 1998 - 2014, nV News.