Go Back   nV News Forums > Hardware Forums > Benchmarking And Overclocking

Old 06-02-03, 08:49 PM   #97
gordon151
Registered User
 
Join Date: Mar 2003
Posts: 264
Default

Quote:
Originally posted by bkswaney
Sure they say they r there. But where is the PROOF?
As for the ignorant statement... Well, at least I do not buy everything
some of the sites come up with, like so many do.

I'm out of this post before I get booted from the forums
for saying what I really think.

So u guys fight about it.
OK, it's a very smart comment. MY BAD. Please forgive me, bkswaney.
__________________
||- A64 2800+ @ 2.0GHz -||- 1GB Buffalo DDR400 @ 225Mhz -||- PNY 5700 128MB -||
gordon151 is offline   Reply With Quote
Old 06-02-03, 08:49 PM   #98
bkswaney
Mr. Extreme!
 
bkswaney's Avatar
 
Join Date: Aug 2002
Location: SC
Posts: 3,421
Send a message via Yahoo to bkswaney
Default

Quote:
Originally posted by c4c
The short answer is that PowerVR does this on the fly and it's what makes their architecture cool. Nvidia's architecture does NOT do this on the fly, so by putting in clip planes they make it seem that their card performs better than it does.
OK.... So NV's card cannot do this... hmmm, maybe they should.

Well, at least FM fixed it. So now the 9800 and 5900 r very close in 3DM03.
bkswaney is offline   Reply With Quote
Old 06-02-03, 08:52 PM   #99
bkswaney
Mr. Extreme!
 
bkswaney's Avatar
 
Join Date: Aug 2002
Location: SC
Posts: 3,421
Send a message via Yahoo to bkswaney
Default

Quote:
Originally posted by gordon151
OK, it's a very smart comment. MY BAD. Please forgive me, bkswaney.
No problem.
I just like getting all the facts straight, not hearsay.
So they did cheat. Even though FM now says they did not.
Boy, what a mess.
bkswaney is offline   Reply With Quote
Old 06-02-03, 08:54 PM   #100
GlowStick
CoD4!
 
GlowStick's Avatar
 
Join Date: Feb 2003
Location: Florida
Posts: 5,786
Send a message via AIM to GlowStick
Default

Quote:
Originally posted by c4c
The short answer is that PowerVR does this on the fly and it's what makes their architecture cool. Nvidia's architecture does NOT do this on the fly, so by putting in clip planes they make it seem that their card performs better than it does.
Actually, all 3D video cards do that to an extent, and they also do it for the Z plane (which is objects behind a wall, etc.), just not as efficiently as a static one.

But PowerVR's idea is that they can do a better job of clipping in real time, and they actually did.
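For anyone wondering what a clip-plane test even looks like, here is a rough sketch in Python (purely an illustration, not nVidia's or PowerVR's actual code; all names are made up). A plane is stored as a normal plus an offset, and anything entirely on the negative side can be skipped before it is ever shaded. The fuss is over planes allegedly precomputed for 3DMark03's fixed camera path instead of computed per frame:

```python
# Generic clip-plane culling sketch (illustration only, not vendor code).
# A plane is (nx, ny, nz, d); a point p is "outside" when dot(n, p) + d < 0.

def signed_distance(plane, point):
    nx, ny, nz, d = plane
    x, y, z = point
    return nx * x + ny * y + nz * z + d

def cull(plane, bounding_points):
    """True if every corner of an object's bounds lies outside the plane,
    so the whole object can be skipped without rendering it."""
    return all(signed_distance(plane, p) < 0 for p in bounding_points)

# A real frustum culler tests six planes like this every frame; the alleged
# cheat was baking in extra planes that only match one known camera path.
plane = (0.0, 0.0, 1.0, 0.0)          # the half-space z >= 0 is "visible"
behind = [(0, 0, -1), (1, 1, -2)]     # entirely behind the plane: cullable
straddling = [(0, 0, -1), (0, 0, 2)]  # partly in front: must be kept
```

The dot-product test itself is cheap either way; the difference is whether the planes are derived from the live camera or hard-coded for a benchmark's known flight path.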
__________________
Intel i7-2600K, Corsair 8Gig, Corsair H100, Corsair 650D, Corsair HX750, ATi 6970, WD Caviar Black 2TB
Sony Vaio SB: i7, 8Gig, Intel 320 300gig
GlowStick is offline   Reply With Quote
Old 06-02-03, 08:56 PM   #101
muzz
 
muzz's Avatar
 
Join Date: Feb 2003
Posts: 816
Default

Quote:
Originally posted by bkswaney
OK.... So NV's card cannot do this... hmmm, maybe they should.

Well, at least FM fixed it. So now the 9800 and 5900 r very close in 3DM03.
Fixed what?
It wasn't broken in the first place.
NV was stupid enough to use 32-bit, not the DX standard....
They have NO ONE to blame but themselves for POOR scoring.

But as we see, they fixed that now, didn't they.
__________________
muzz
muzz is offline   Reply With Quote
Old 06-02-03, 09:01 PM   #102
Neural
Registered User
 
Join Date: Jun 2003
Posts: 9
Default

I've just been reading over everyone's thoughts, so I thought I would mention a few of my own. I know I'm new here, so flame away if you like.

First, part of me thinks detecting a benchmark and changing its default paths is "wrong", whether "wrong" means "cheat" or "optimization". I would sincerely like to know if someone here could explain how nVidia could hide the clip planes in their drivers?

*BUT* there may be another way of looking at this.

I think that whether we like it or not, all the big players in software and hardware think it is acceptable to optimize software code so that certain hardware runs better/faster. Intel has been doing the same thing (working to optimize code for their own products) for years, just as nVidia is trying to do, and for the same reasons. Intel even has specific names for their optimizations, i.e., SSE, SSE2, SSE3... and Intel would be blown out in most applications by AMD if it weren't for these optimizations, even in BENCHMARKS, where you can bet Intel is making sure their optimized instructions are implemented. While this does effectively assure that AMD will never control the market, I still find it hard to fault Intel for making its products work better, as long as Windows still looks like Windows and UT2K3 still plays like UT2K3. I notice a lot of people using P4's in here, so it stands to reason that most feel this is acceptable.
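The kind of hardware-specific optimization described above can be sketched like this (Python used as pseudocode, purely as an illustration; the function names and the capability check are made up, not any real CPUID or driver API). The key property is that both paths give the same answer, so only speed changes:

```python
# Illustrative sketch (not real driver/compiler code): software that picks a
# faster code path when the hardware advertises support for it, the way an
# SSE-aware program falls back to plain scalar math on older CPUs.

def has_simd_support():
    """Stand-in for a CPUID-style capability check (hypothetical)."""
    return True  # pretend the CPU advertises SSE

def dot_scalar(a, b):
    # Generic path: one multiply-add per element.
    total = 0.0
    for x, y in zip(a, b):
        total += x * y
    return total

def dot_simd(a, b):
    # "Optimized" path: process 4 elements per step, mimicking a 4-wide
    # SSE register. The result is identical; only the speed differs.
    total = 0.0
    tail = len(a) - len(a) % 4
    for i in range(0, tail, 4):
        total += (a[i] * b[i] + a[i + 1] * b[i + 1]
                  + a[i + 2] * b[i + 2] + a[i + 3] * b[i + 3])
    for i in range(tail, len(a)):  # leftover elements
        total += a[i] * b[i]
    return total

def dot(a, b):
    # Dispatch: same answer either way, so the "optimization" is fair.
    return dot_simd(a, b) if has_simd_support() else dot_scalar(a, b)
```

An optimization like this changes only how fast the answer arrives, which is Neural's point; a static clip plane changes what actually gets rendered, which is a different thing entirely.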

So would it be the same if nVidia optimized software for their architecture, IF everything still looked and played the same? This is all assuming that clip planes are not used, as that would of course be wrong.

On a personal note, I like my 9500np128 mostly for its better image quality, but 8 pipes at 375/642 also runs games pretty well. Hooray for competition!
Neural is offline   Reply With Quote
Old 06-02-03, 09:04 PM   #103
muzz
 
muzz's Avatar
 
Join Date: Feb 2003
Posts: 816
Default

UH OH, R3D is buckling under the strain of the ALMIGHTY one too.......

3dfx/FM/Rage3D gets hammered as a byproduct.....
__________________
muzz
muzz is offline   Reply With Quote
Old 06-02-03, 09:05 PM   #104
solofly
Registered User
 
Join Date: Jan 2003
Posts: 213
Default

It's good to be an nVidia fanboy... (hehe) Let's put it this way: nVidia just saved Futuremark from going under. Who would ATi compete with then, themselves? Yes, yes, but I don't want to hear your crying about who is right and who is wrong, pointing fingers and such. If you think life is fair, then you're wrong. The only justice in this world is death and nothing else, rich or poor. This is almost better than a movie...

Last edited by solofly; 06-03-03 at 12:28 AM.
solofly is offline   Reply With Quote

Old 06-02-03, 09:07 PM   #105
muzz
 
muzz's Avatar
 
Join Date: Feb 2003
Posts: 816
Default

I don't own Intel products........... I have NEVER bought an INTEL product....

And I never will, as long as someone else comes up with a halfway decent chipset for AMD.

Geeeeeeeeez, I hate 4-in-1 drivers..... but at least their IDE drivers work, unlike a certain co. that will remain nameless.
__________________
muzz
muzz is offline   Reply With Quote
Old 06-02-03, 09:08 PM   #106
solofly
Registered User
 
Join Date: Jan 2003
Posts: 213
Default

Quote:
Originally posted by muzz
I don't own Intel products........... I have NEVER bought an INTEL product....

And I never will, as long as someone else comes up with a halfway decent chipset for AMD.

Geeeeeeeeez, I hate 4-in-1 drivers..... but at least their IDE drivers work, unlike a certain co. that will remain nameless.
Never say never...
solofly is offline   Reply With Quote
Old 06-02-03, 09:10 PM   #107
reever2
Registered User
 
reever2's Avatar
 
Join Date: Apr 2003
Posts: 489
Default

Quote:
Originally posted by Neural
I've just been reading over everyone's thoughts, so I thought I would mention a few of my own. I know I'm new here, so flame away if you like.

First, part of me thinks detecting a benchmark and changing its default paths is "wrong", whether "wrong" means "cheat" or "optimization". I would sincerely like to know if someone here could explain how nVidia could hide the clip planes in their drivers?

*BUT* there may be another way of looking at this.

I think that whether we like it or not, all the big players in software and hardware think it is acceptable to optimize software code so that certain hardware runs better/faster. Intel has been doing the same thing (working to optimize code for their own products) for years, just as nVidia is trying to do, and for the same reasons. Intel even has specific names for their optimizations, i.e., SSE, SSE2, SSE3... and Intel would be blown out in most applications by AMD if it weren't for these optimizations, even in BENCHMARKS, where you can bet Intel is making sure their optimized instructions are implemented. While this does effectively assure that AMD will never control the market, I still find it hard to fault Intel for making its products work better, as long as Windows still looks like Windows and UT2K3 still plays like UT2K3. I notice a lot of people using P4's in here, so it stands to reason that most feel this is acceptable.

So would it be the same if nVidia optimized software for their architecture, IF everything still looked and played the same? This is all assuming that clip planes are not used, as that would of course be wrong.

On a personal note, I like my 9500np128 mostly for its better image quality, but 8 pipes at 375/642 also runs games pretty well. Hooray for competition!
But in the realm of processors we KNOW what is being optimized and what is not. We also know that both processors support, and can be optimized for, SSE. In benchmarks, if one board can be optimized one way and the other one can't, it's not a fair benchmark; in games it's a different story.
reever2 is offline   Reply With Quote
Old 06-02-03, 09:12 PM   #108
muzz
 
muzz's Avatar
 
Join Date: Feb 2003
Posts: 816
Default

Read the post; I am sure you can understand what is being said.
Their practices are very similar, IMO. Like I said, I would rip this POS out of my case if there were anything else I hated LESS....

Strongarm tactics blow, and TBO I really am tired of them doing it.
FM's credibility is now destroyed; I bet the pressure was intense and unbearable for a small outfit like them.
__________________
muzz
muzz is offline   Reply With Quote


Powered by vBulletin® Version 3.7.1
Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.
Copyright ©1998 - 2014, nV News.