nV News Forums > Hardware Forums > Benchmarking And Overclocking

Old 05-19-03, 04:42 PM   #229
Nv40
Agent-Fx
 
 
Join Date: Aug 2002
Location: everywhere
Posts: 2,216
Default

I found an interesting article, a good read for anyone interested in learning more about clipping planes and occlusion-culling techniques in game engines..


Geometry Culling in 3D Engines

Quote:

Efficient algorithms for determining the visible parts of 3D environments are a key to visualizing complex scenes at interactive rates. Visibility culling algorithms reduce the number of polygons sent down the rendering pipeline based on the simple principle that if something is not seen, it does not have to be drawn. In the following I detail the underlying concepts of the major visibility culling algorithms in use today. We will cover view frustum culling, occlusion culling and contribution culling and finally discuss what else can be used in environments in which this alone is not enough.

View frustum culling

Perhaps the most obvious form of culling is view frustum culling, which is based on the fact that only the objects in the current view volume need to be drawn. The view volume is usually defined by six planes, namely the front, back, left, right, top, and bottom clipping planes, which together form a truncated pyramid. The front and back clipping planes may be defined to lie at the viewer's position and at infinity, respectively. If a polygon is entirely outside the pyramid, it cannot be visible and can be discarded. If it is partially inside, it is clipped against the planes so that its outside parts are removed.

http://www.gamedev.net/reference/art...rticle1212.asp
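
The frustum test the article describes can be made concrete with a short sketch. This is an illustrative Python example, not code from the article; the plane values and the bounding-sphere test are assumptions chosen to keep it minimal:

```python
# View-frustum culling sketch: the frustum is six inward-facing planes;
# an object (here, its bounding sphere) is discarded as soon as it lies
# entirely on the outside of any one plane.

def signed_distance(plane, point):
    """Signed distance from point to plane (nx, ny, nz, d); positive = inside."""
    nx, ny, nz, d = plane
    x, y, z = point
    return nx * x + ny * y + nz * z + d

def sphere_in_frustum(planes, center, radius):
    """True if the sphere is at least partially inside all six planes."""
    return all(signed_distance(p, center) >= -radius for p in planes)

# Toy "frustum": an axis-aligned box spanning -10..10 on each axis,
# written as six inward-facing planes (illustrative values only).
planes = [
    (1, 0, 0, 10), (-1, 0, 0, 10),    # left, right
    (0, 1, 0, 10), (0, -1, 0, 10),    # bottom, top
    (0, 0, 1, 10), (0, 0, -1, 10),    # near, far
]

print(sphere_in_frustum(planes, (0, 0, 0), 1))     # fully inside  -> True
print(sphere_in_frustum(planes, (50, 0, 0), 1))    # fully outside -> False
print(sphere_in_frustum(planes, (10.5, 0, 0), 1))  # straddles a plane -> True
```

A sphere that straddles a plane is kept and, as the article says, the geometry would then be clipped against that plane.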

Last edited by Nv40; 05-19-03 at 05:03 PM.
Old 05-19-03, 05:14 PM   #230
muzz
 
 
Join Date: Feb 2003
Posts: 816
Default

O brother........
__________________
muzz
Old 05-19-03, 05:19 PM   #231
SlyBoots
Registered User
 
 
Join Date: Jul 2002
Location: La Grande, OR
Posts: 339
Talking

Quote:
Originally posted by muzz
O brother........
hehe, yea...even Ray Charles could see Joe's point
Old 05-19-03, 05:19 PM   #232
zakelwe
Registered User
 
Join Date: Dec 2002
Posts: 768
Default

Quote:
Originally posted by Joe DeFuria
Um, no. I said CAN and I mean CAN. Thanks for trying to put words in my mouth though. The fact that you believe I must mean "does" for there to be something to argue against, clearly displays that you have no grasp of the situation.

To be clear, "CAN increase performance" means "may increase performance, but certainly will NOT decrease performance."



It must be measurable to be an effective cheat indeed. It doesn't need to be measurable to be a cheat.



Perhaps if nVidia ever decides to release a driver that doesn't cheat (which we may never get...they may just find a way to disable the cheat when in free-look mode), we may never know.



nVidia is doing everything they can to get higher scores, which includes not rendering things they should be.



The scores WILL be higher...by how much is an unknown.



Not until nVidia releases drivers that can be verified not to have the cheat when in benchmark mode.



One does not have to prove that not drawing pixels is faster than drawing them. This is a given.

One does have to make measurements in order to prove how much this cheat is impacting scores. And NO ONE has made any claims in that respect.
What you mean is that nobody has bothered measuring it, though Dave Baumann suggested to me a way to test it:

Since not all of the old drivers have the problem, you can measure the increase in performance from the earlier drivers to the later drivers, comparing GT2 to GT3. If there is no extra gain from the cheat/bug in GT2, then both should gain by the same amount, because GT3 is not affected.

Unless all the drivers and all the games are affected by the cheat, you should see a discrepancy. I can tell you that with a GF4 card there is no improvement in GT2 in 44.03 (there is a decrease), and there is no improvement in 43.45 or 43.51 compared to the earlier 42-series drivers, which Futuremark approves of.
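
The comparison Andy describes can be put in code. The scores below are invented placeholders (NOT real 3DMark03 results), just to show the arithmetic: if GT3 is unaffected, the driver-to-driver gain on GT3 estimates the legitimate improvement, and any extra GT2 gain beyond that is the discrepancy to investigate:

```python
# Sketch of the driver-comparison test described above.
# All scores are invented placeholders, not real measurements.
old = {"GT2": 100.0, "GT3": 50.0}   # earlier, approved driver
new = {"GT2": 130.0, "GT3": 55.0}   # later driver under suspicion

def gain(test):
    """Relative speedup of the newer driver on one game test."""
    return new[test] / old[test] - 1.0

legit = gain("GT3")       # GT3 assumed unaffected -> legitimate gain (10%)
suspect = gain("GT2")     # GT2 gained 30%
extra = suspect - legit   # 20 points of gain that GT3 cannot explain
print(f"GT3 gain {legit:.0%}, GT2 gain {suspect:.0%}, discrepancy {extra:.0%}")
```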

So you are talking rubbish; in fact, that's all you do. You are too lazy to go out and test something for yourself.

Regards

Andy

Last edited by zakelwe; 05-19-03 at 05:35 PM.
Old 05-19-03, 05:26 PM   #233
John Reynolds
Registered User
 
Join Date: Aug 2002
Posts: 365
Default

Quote:
Originally posted by Nv40
I found an interesting article, a good read for anyone interested in learning more about clipping planes and occlusion-culling techniques in game engines..
Unfortunately, the section you quoted is completely irrelevant, since Nvidia is clipping objects fully within the view frustum.

Again, please don't take offense but watching someone such as yourself reach for anything to vindicate a favored IHV in this situation is truly saddening to me because it means a potentially frighteningly large section of the consumer market is woefully gullible, naive, and/or misinformed.
Old 05-19-03, 05:38 PM   #234
Clockwork
I was cured all right...
 
 
Join Date: Jul 2002
Location: The Korova Milk Bar
Posts: 115
Default

Quote:
Originally posted by zakelwe
What you mean is that nobody has bothered measuring it, though Dave Baumann suggested to me a way to test it:

Since not all of the old drivers have the problem, you can measure the increase in performance from the earlier drivers to the later drivers, comparing GT2 to GT3. If there is no extra gain from the cheat/bug in GT2, then both should gain by the same amount, because GT3 is not affected.

Unless all the drivers and all the games are affected by the cheat, you should see a discrepancy. I can tell you that with a GF4 card there is no improvement in GT2 in 44.03 (there is a decrease), and there is no improvement in 43.45 or 43.51 compared to the earlier 42-series drivers, which Futuremark approves of.

So you are talking rubbish; in fact, that's all you do. You are too lazy to go out and test something for yourself.

Regards

Andy


Go and

Some people seem to be missing out on 1 factor though.

Strangely, the image quality is back up in these drivers. Many people have alluded to the possibility that nVidia (in previous drivers) may have been lowering precision to gain performance in 3DMark. Now nVidia runs into a dilemma: they have to (and want to) raise precision back up to DX9 standards, but they also want to maintain performance. What do they do? They bump the quality back up and introduce the clipping planes to balance the processing load. The test looks better, and nVidia customers are happy that no performance was lost...

That's my take on the situation. Too bad they got caught..HAHAHAH.

__________________
| AMD Athlon 64 3500+ | 1GB Corsair XMS Extreme Memory PC3200 DDR | GIGABYTE GA-K8NSNXP-939 nForce3 Ultra | WD 120GB SATA | BFG 6800GT OC w/ Zalman VF700cu | Pioneer DVR-A07XLA 8x DVD+-R/RW | Aopen 1640Pro-A 16x DVD | Cooler Master Cavalier 1 CAV-T01-WWA case | Ultra X-Connect 500watt psu | Windows XP Professional w/ SP2 |Samsung 193p+ 19" LCD

Last edited by Clockwork; 05-19-03 at 05:42 PM.
Old 05-19-03, 06:17 PM   #235
Nv40
Agent-Fx
 
 
Join Date: Aug 2002
Location: everywhere
Posts: 2,216
Default

Quote:
Originally posted by John Reynolds
Unfortunately, the section you quoted is completely irrelevent since Nvidia is clipping objects fully within the view frustum.

Again, please don't take offense but watching someone such as yourself reach for anything to vindicate a favored IHV in this situation is truly saddening to me because it means a potentially frighteningly large section of the consumer market is woefully gullible, naive, and/or misinformed.
What I find really sad, Joe, is HOW fraudulent this PS1.4 benchmark is. I find it very hard to believe that you believe the things you say. You focus only on what Nvidia has done, but care nothing about the test itself: how unfair it is for Nvidia cards. Yet people like you, with some "knowledge", still defend the test. Everyone knows this: 3DMark is not apples vs. apples, but you point your finger at Nvidia for not playing apples vs. apples. You complain about Doom3 benchmarks, but are VERY pleased with 3DMark2003. Don't you see the pattern here? Take off your glasses, dude, and look at things from other points of view too. I understand all your points clearly, but what you don't want to see is the truth about 3DMark 2003 as a fair comparison between NVidia and ATI cards, and as a benchmark representative of the games of the future...

You already know that the test is already invalid, since one participant no longer agrees with its rules. But as you have said, "THE only thing that matters is the scores."

Wait... it also matters that both cards do things in exactly the same way... Oh, but that's impossible...
Who cares: "THE only thing that matters is the scores."

What is even funnier is that in the games of the future (Doom3), using the best engine of the coming years, Nvidia has the fastest cards... and trounces all ATI cards... psss... in the real world.

Last edited by Nv40; 05-19-03 at 06:30 PM.
Old 05-19-03, 06:21 PM   #236
RobHague
I like cheese.
 
 
Join Date: Mar 2003
Location: UK
Posts: 904
Thumbs up

My idea of a driver cheat would be an "optimization" that affects only one specific benchmark/test and nothing else, or something that reduces IQ below the minimum standard. I would have no problem with NVIDIA lowering precision as long as it stays above FP24 (the DX9 minimum).

In the case of this so-called clipping, that "optimization" will benefit NOTHING apart from 3DMark2003. It gives the appearance of a faster product when it will never actually run that well in any real game, present or future. It's all pre-calculated and of no use apart from artificially inflating scores - so yes, IMHO it's a cheat.

I'd like to see them remove this particular optimization and see how the scores are affected... maybe Futuremark can do a bit of arm twisting with the banning of the DetFX drivers... or have they done this already (sorry, I got about half way through the thread then skipped to the end - it was giving me a headache)?

Oh, as for a 'fair and decent' benchmark for future cards - how's AquaMark 3 shaping up?
__________________

There used to be a signature here, but now there isn't.

Last edited by RobHague; 05-19-03 at 06:27 PM.

Old 05-19-03, 06:29 PM   #237
Ady
...
 
 
Join Date: Nov 2002
Location: Australia
Posts: 502
Default

Quote:
Originally posted by Nv40
What I find really sad, Joe, is HOW fraudulent this PS1.4 benchmark is.
How unfair it is for Nvidia cards. Yet people like you, with some "knowledge", still defend the test. Everyone knows this: 3DMark is not apples vs. apples, but you point your finger at Nvidia for not playing apples vs. apples. You complain about Doom3 benchmarks, but are VERY pleased with 3DMark2003. Don't you see the pattern here? Take off your glasses, dude, and look at things from other points of view too. I understand all your points clearly, but what you don't want to see is the truth about 3DMark 2003 as a fair comparison between NVidia and ATI cards, and as a benchmark representative of the games of the future...

Funny how in the games of the future (Doom3), Nvidia has the fastest cards... trouncing all ATI cards... psss... in the real world.

wtf r u on? It was Nvidia's CHOICE not to develop a card with PS1.4 support. It's no one else's fault they were lagging behind in that area. You might as well say it's not fair to bench the NV30 because it only has a 128-bit memory interface. They can still do the exact same things with PS1.3; it just takes longer. No one forced them not to use PS1.4, or to use 128-bit memory.

You are a total fool, and you are proving you will conjure up anything at all to defend your beloved Nvidia.

As for your Doom 3 comment: look here for a benchmark comparison with high-quality settings. I'm not sure about you, but I know I want to be playing Doom 3 in High Quality.
__________________
Dying is not going to kill me.

Last edited by Ady; 05-19-03 at 06:32 PM.
Old 05-19-03, 06:38 PM   #238
DaveBaumann
Registered User
 
Join Date: Jan 2003
Posts: 98
Default

Quote:
What I find really sad, Joe, is HOW fraudulent this PS1.4 benchmark is.
How unfair it is for Nvidia cards. Yet people like you, with some "knowledge", still defend the test. Everyone knows this: 3DMark is not apples vs. apples, but you point your finger at Nvidia for not playing apples vs. apples. You complain about Doom3 benchmarks, but are VERY pleased with 3DMark2003. Don't you see the pattern here? Take off your glasses, dude, and look at things from other points of view too. I understand all your points clearly, but what you don't want to see is the truth about 3DMark 2003 as a fair comparison between NVidia and ATI cards, and as a benchmark representative of the games of the future...
This whole PS1.4 thing is tedious. PS1.4 is just as valid as any other shader model – so what if NVIDIA doesn't support it 'natively'? They still support it through backwards compatibility via PS2.0.

Let me ask you: were you complaining about how unfair the use of PS1.1 was in 3DMark2001? You see, NVIDIA were the only ones to 'natively' support PS1.1, via their register combiners; ATI were going through backwards compatibility from their PS1.4 hardware model – was that unfair?

This is just how these things work. PS1.4 is there, supported by Microsoft and by applications; it is just as valid to test as any other shader model, and 3DMark is not invalid because of it.

Quote:
What is even funnier is that in the games of the future (Doom3)
And this is what is slightly twisted about this whole situation – Doom3 will be quoted for years as the 'next generation engine', but in reality it uses old-generation technology. The game has one shader in there for the lighting model that uses about 9 instructions, AFAIK. Not only that: when you run it in NV30's default mode, you are not running at 'next generation' floating-point precision, but at 'prior generation' integer shader precision, which is not what the 'cinematic rendering' tagline that all the vendors are touting advertises.
Old 05-19-03, 06:38 PM   #239
Nv40
Agent-Fx
 
 
Join Date: Aug 2002
Location: everywhere
Posts: 2,216
Default

Quote:
Originally posted by Ady
wtf r u on? It was Nvidia's CHOICE not to develop a card with PS1.4 support. ... As for your Doom 3 comment: look here for a benchmark comparison with high-quality settings. I'm not sure about you, but I know I want to be playing Doom 3 in High Quality.
Now that's better: you are accepting that the tests are unfair.
Who is next?

As everyone already knows, NVidia has the dominant market share. Game developers will not be as generous to ATI as 3DMark2003 is when it comes to the real world -> games. They will develop games to get the best FPS and IQ possible out of both ATI and NVIDIA, not only one.

1600x1200, 4xAA + 8xAF
Radeon 9800 Pro: 17.3
NV35: 27.7

The difference between the medium and high settings is only aniso filtering. The only card playable at maximum IQ settings... nearly 2x the R350's performance!
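
As a quick arithmetic check on the two quoted framerates (taking the numbers above at face value), the implied gap works out closer to 1.6x than 2x:

```python
# Ratio of the two quoted 1600x1200, 4xAA + 8xAF results.
nv35 = 27.7
r9800pro = 17.3
print(f"{nv35 / r9800pro:.2f}x")   # -> 1.60x
```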

Quote:
When 4xFSAA is enabled, not a single card can hold a candle to the FX 5900 Ultra. **The FX 5800 Ultra is roughly on a level with the ATI cards.** An interesting note on the side: more RAM doesn't seem to yield any performance benefits in DOOM III, as the Radeon 9800 PRO 256 MB shows.

Last edited by Nv40; 05-19-03 at 06:51 PM.
Old 05-19-03, 06:45 PM   #240
ChrisRay
Registered User
 
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default

Quote:
Originally posted by digitalwanderer
WORD!!!!

I wish all you nVidiots would just either address that single issue or quit wasting our time.
I think it comes down to the fact that a lot of us Nvidiots simply don't care about 3DMark. Not to the point where this would upset me any.


I don't remember the last time I actually read the 3DMark section of a review. I skip them. Always.
__________________
|CPU: Intel I7 Lynnfield @ 3.0 Ghz|Mobo:Asus P7P55 WS Supercomputer |Memory:8 Gigs DDR3 1333|Video:Geforce GTX 295 Quad SLI|Monitor:Samsung Syncmaster 1680x1080 3D Vision\/Olevia 27 Inch Widescreen HDTV 1920x1080

|CPU: AMD Phenom 9600 Black Edition @ 2.5 Ghz|Mobo:Asus M3n HT Deluxe Nforce 780A|Memory: 4 gigs DDR2 800| Video: Geforce GTX 280x2 SLI

Nzone
SLI Forum Administrator

NVIDIA User Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the members
Powered by vBulletin® Version 3.7.1
Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.
Copyright ©1998 - 2014, nV News.