
View Full Version : The Quadro Effect



RobHague
01-23-05, 10:29 AM
Hi, this is related to the 6800 btw, so I thought it was better in here. :thumbsup:

Has anyone in here played around with SoftQuadro?

The latest version of RivaTuner (15.3) lets you change/force the Device ID before the OS boots, so you can turn your GeForce (or whatever you have) into a Quadro/FireGL product and unlock any masked pipes there might be (as in the 6800NU and 6800LE). This is probably old news to some people, I guess.

But the thing is, as a Quadro conversion this works far too well. I read an NVIDIA announcement with interest that stated the Quadro FX 4000 line is different silicon to the GeForce. Well, someone is fibbing, because optimised drivers or not, you don't get 100%-600% increases in performance from 'better' drivers alone (not real performance, anyway).

I was playing around with this mod and ventured into the BIOS flashing method, mainly out of curiosity and boredom. After trying a standard FX 4000 BIOS from a PNY card, the card read as a Quadro, but performance was the same as the GeForce, despite the Quadro drivers installing fine. So much for the theory that the optimised drivers provide all the performance increase - the computer thought it was a Quadro, the drivers worked, but performance was unchanged...

I was considering giving up. I had the DELL Engineering Release BIOS for the FX 4000, but unfortunately using VGABIOS to verify it against my card always resulted in a black screen, so I didn't want to risk it.

Well, long story short, the temptation to "fiddle" was too great, so I modded it with RivaTuner to turn the 16 pipes back on (for some reason the engineering sample turns off 4 pipes by default) and altered the clocks. By some weird coincidence my card runs at 370MHz core as standard... the real FX 4000 runs at 375, so 5MHz up is nothing. I tried it out and flashed it, half expecting the screen to stay black and to have to blindly flash back the original BIOS, but to my surprise it booted up fine, displaying NV40GL... a good sign.

The drivers installed, this time working fully - all the options and panels showed up, and a quick test puts it at least 2x faster in SPECviewperf, as much as 600% in some tests. I'll run more tests later when I can find some other tools, as tweaks like this pique my interest.


It appears that the released BIOS on final cards has some sort of protection to stop it being used on a card whose resistors set its ID to something else, while the DELL BIOS appears to lack that... or overrides/forces its own.

Dunno what I'll do with it now, of course. I'm not a 'CAD' buff or anything, I dabble at best ;) so it might end up getting its old BIOS back once the novelty wears off, lol. But I thought my experiences were worth sharing. If anyone wants to know the exact tools and BIOS I used, I'll be happy to point them in that direction. It would be interesting to see what results other people get (especially with other 6xxx cards).

For now though, the obvious conclusion is that the Quadro cards are simply unlocked NV40s, while NV40s are simply NV40GLs with their professional features turned off. :rolleyes: I wish NVIDIA would drop the pretense that they are 'totally different', though.

EDIT:
I just made this chart from my SPECviewperf results... as you can see, there is quite a performance jump.

http://www.pdjkeelan.co.uk/shadowrealm/quad-chart.png

rewt
01-23-05, 10:46 AM
You're right they're not totally different. But they are different in quality.

It's like AMD selecting Athlon XP chips that pass under extreme situations and labeling them XP-M. They're the same chip, but with certain features unlocked.

ricercar
01-25-05, 04:29 AM
There are bonding options on Quadros that are different from the GeForce. Quadros come off a different factory line than the GeForce. You're fooling yourself if you want to believe you have created a Quadro when you buy a GeForce and change character display strings and timing in the BIOS.

Regardless of what the BIOS and drivers report, you can't put VTEC stickers on your Honda to make it go faster. Jus' don work dat way.

RobHague
01-25-05, 09:16 AM
You're fooling yourself if you want to believe you have created a Quadro when you buy a GeForce and change character display strings and timing in the BIOS.

Regardless of what the BIOS and drivers report, you can't put VTEC stickers on your Honda to make it go faster. Jus' don work dat way.

You totally sure about that?

BEFORE as 6800GT
---------- SUM_RESULTS\3DSMAX\SUMMARY.TXT
3dsmax-03 Weighted Geometric Mean = 15.78

---------- SUM_RESULTS\CATIA\SUMMARY.TXT
catia-01 Weighted Geometric Mean = 11.06

---------- SUM_RESULTS\ENSIGHT\SUMMARY.TXT
ensight-01 Weighted Geometric Mean = 11.15

---------- SUM_RESULTS\LIGHT\SUMMARY.TXT
light-07 Weighted Geometric Mean = 9.748

---------- SUM_RESULTS\MAYA\SUMMARY.TXT
maya-01 Weighted Geometric Mean = 18.96

---------- SUM_RESULTS\PROE\SUMMARY.TXT
proe-03 Weighted Geometric Mean = 14.52

---------- SUM_RESULTS\SW\SUMMARY.TXT
sw-01 Weighted Geometric Mean = 13.46

---------- SUM_RESULTS\UGS\SUMMARY.TXT
ugs-04 Weighted Geometric Mean = 4.607

AFTER as FX4000

---------- SUM_RESULTS\3DSMAX\SUMMARY.TXT
3dsmax-03 Weighted Geometric Mean = 31.11

---------- SUM_RESULTS\CATIA\SUMMARY.TXT
catia-01 Weighted Geometric Mean = 16.35

---------- SUM_RESULTS\ENSIGHT\SUMMARY.TXT
ensight-01 Weighted Geometric Mean = 16.66

---------- SUM_RESULTS\LIGHT\SUMMARY.TXT
light-07 Weighted Geometric Mean = 17.30

---------- SUM_RESULTS\MAYA\SUMMARY.TXT
maya-01 Weighted Geometric Mean = 30.23

---------- SUM_RESULTS\PROE\SUMMARY.TXT
proe-03 Weighted Geometric Mean = 34.73

---------- SUM_RESULTS\SW\SUMMARY.TXT
sw-01 Weighted Geometric Mean = 18.81

---------- SUM_RESULTS\UGS\SUMMARY.TXT
ugs-04 Weighted Geometric Mean = 29.51


The UGS-04 test is the most impressive: from 4.607 to 29.51. It's not just numbers either - the benchmark visibly runs faster (A LOT faster, actually).

You don't need to flash the BIOS, btw - RivaTuner can override the device ID for you with NVStrap; flashing is just a more 'complete' solution. And not just any Quadro FX 4000 BIOS will do - using one from a released card gave the results you described: it said it was a Quadro, but performance was the same. You need the special engineering release BIOS for it to work.
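For anyone who wants to sanity-check the jump, the per-viewset speedups can be worked out directly from the weighted geometric means posted above. A quick throwaway script (nothing here is official SPEC tooling, just the numbers from this post):

```python
# Speedup per SPECviewperf viewset, using the weighted geometric
# means posted above (6800GT before, softmodded FX4000 after).
before = {"3dsmax-03": 15.78, "catia-01": 11.06, "ensight-01": 11.15,
          "light-07": 9.748, "maya-01": 18.96, "proe-03": 14.52,
          "sw-01": 13.46, "ugs-04": 4.607}
after = {"3dsmax-03": 31.11, "catia-01": 16.35, "ensight-01": 16.66,
         "light-07": 17.30, "maya-01": 30.23, "proe-03": 34.73,
         "sw-01": 18.81, "ugs-04": 29.51}

for test in before:
    ratio = after[test] / before[test]
    print(f"{test:12s} {ratio:4.2f}x  (+{(ratio - 1) * 100:.0f}%)")
```

By these numbers the gains range from about 1.4x (sw-01) to about 6.4x (ugs-04), which lines up with the "100% to 600%" claim earlier in the thread.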

ricercar
01-25-05, 03:03 PM
Yeah, I'm totally sure about that. I used to work at NVIDIA on the Quadro line.

Quadros come from different fab lines than GeForces. They have different bonding options: the metal balls that stick out from the plastic chip package connect internally to different pads on the die for a Quadro than for a GeForce. Different connections, different logic. This is why changing the BIOS, PCI ID or strapping options won't change a GeForce into a Quadro.

superklye
01-25-05, 04:13 PM
And Quadros have hardware based AA that the GeFORCE line lacks, among other features.

RobHague
01-25-05, 04:24 PM
Yeah, I'm totally sure about that. I used to work at NVIDIA on the Quadro line.

Quadro come from different fab lines than Geforce. They have different bonding options. The metal balls that stick out from the plastic chip case connect inside to different pads on the die for Quadro than GeForce. Different connections, different logic. This is why changing BIOS, PCI ID or strapping options won't change a GeForce into a Quadro.

I'm sorry, I really disagree. Although I can't say what you did or didn't used to work on, I heard from an NVIDIA rep that the whole "distinction" of the GeForce and Quadro lines being "hardware differences" exists for marketing reasons only. Not only that... I'm seeing it with my own eyes right now.

They said the actual differences are that Quadro GPUs are hand-picked, and the cards (PCBs) are made by NVIDIA (not third parties who may cut corners) to certain specifications and last quite a bit longer than the average GeForce. After all, it's a BIG investment, and these are not cards that get replaced every 6 months.

I would like to hear your explanation, though, since this mod does nothing but make it say QUADRO on the screen and in the drivers: why does doing it (not just me, lots of other people) give a significant increase in performance (100% to 600%) in CAD/3D work? Not to mention that it enables certain "hardware only" features that only the Quadro is supposed to have in these applications.

Unless the NV40 is the same GPU, mine must have gone all Borg-like and started assimilating and redesigning itself…

And Quadros have hardware based AA that the GeFORCE line lacks, among other features.

Yes they do... I have these features now. And?

superklye
01-25-05, 04:38 PM
You can't have hardware AA enabled by a softmod. It's...oh, what's that word? Uh...um...ah yes! IMPOSSIBLE. This has been talked about time and time again...you cannot add HARDWARE supported features via software unless the hardware is already there.

That would be like flashing your SoundBlaster Live! 5.1 to an Audigy 2 ZS Platinum Pro (I don't even know if that's possible) and then assume you can decode DTS via hardware...unless you have that optical or coaxial output for DTS capabilities, YOU DON'T HAVE THEM. This is the same situation here.

RobHague
01-25-05, 04:41 PM
You can't have hardware AA enabled by a softmod. It's...oh, what's that word? Uh...um...ah yes! IMPOSSIBLE. This has been talked about time and time again...you cannot add HARDWARE supported features via software unless the hardware is already there.

That would be like flashing your SoundBlaster Live! 5.1 to an Audigy 2 ZS Platinum Pro (I don't even know if that's possible) and then assume you can decode DTS via hardware...unless you have that optical or coaxial output for DTS capabilities, YOU DON'T HAVE THEM. This is the same situation here.

It's quite possible, because the NV40 and NV40GL are the same GPU :banghead:. Why exactly would I lie about this? :bleh: I cannot prove it myself because I do not have the applications available right now to show you, but I can point you to some threads if you care to read through them.

6800 > Quadro FX questions
http://forums.guru3d.com/showthread.php?s=&threadid=115155&perpage=10&pagenumber=1
(you will have to look through this one a bit to find information relevant to what we are discussing, though)

To quote
"Maya viewports do not seem to benefit much in speed, only in quality (hardware overlays support makes paint effects and other artisan tools work better)."

The Original Hardware and BIOS mod...
http://newbietech.net/eng/qtoq/nvidia/6800/6800mod.php

The pre-15.3 release of RivaTuner with NVstrap included.
http://www.cgtalk.com/showthread.php?t=191214&page=1&pp=20&highlight=15.3

To quote the first post:
OpenGL works as expected. AA edges works in OpenGL!

Subtestube
01-25-05, 05:28 PM
For doubters, I suggest checking out this thread:

http://forums.guru3d.com/showthread.php?s=&threadid=123844

I'm fairly certain there are some diffs between a softQuad and a real one, but that doesn't stop the softQuad doing significantly better under Maxtreme using SPECapc. SPECapc is NOT a synthetic like SPECviewperf; it gives an indication of real-world speed boosts.

Cota
01-25-05, 05:52 PM
You can't have hardware AA enabled by a softmod. It's...oh, what's that word? Uh...um...ah yes! IMPOSSIBLE. This has been talked about time and time again...you cannot add HARDWARE supported features via software unless the hardware is already there.

That would be like flashing your SoundBlaster Live! 5.1 to an Audigy 2 ZS Platinum Pro (I don't even know if that's possible) and then assume you can decode DTS via hardware...unless you have that optical or coaxial output for DTS capabilities, YOU DON'T HAVE THEM. This is the same situation here.

Do a little research, pal: AA lines work on GeForce-to-Quadro mods. Your statement is sort of true in that you can't add hardware-supported features. The thing is, the feature is supported by the GeForce hardware; it's just not enabled.

superklye
01-25-05, 06:25 PM
Sorry, but I trust the guy that worked for nVIDIA more than anyone else on this topic.

RobHague
01-25-05, 06:38 PM
So what you are saying then, superklve, is that quite a number of people allover the web are taking part in this "fake" Quadro mod, and that you also think the benchmarks are fake... because some guy on a forum said "I used to work for NVIDIA so I know better"... I see :rolleyes:

Did you know I was Bill Gates' illegitimate son, btw?

superklye
01-25-05, 06:48 PM
What the hell is "superklve"? There's no v in my name, RubHague. :rolleyes:

RobHague
01-25-05, 06:50 PM
What the hell is "superklve"? There's no v in my name, RubHague. :rolleyes:

:eek: Very witty (EDIT: Oh you edited so nevermind). It was a 'y' that seems to have come out wrong of course but nevermind superklue i will get it right the next time. :thumbsup:

Side note: when I posted this and said "it would make an interesting discussion", this isn't quite what I had in mind. I meant more people trying it for themselves and posting results and such... ;)

jolle
01-25-05, 08:11 PM
This "hardware" AA thing...
I get a vague feeling it has its roots in HW wireframe acceleration and has somehow escalated from there.
I don't think a GeForce on regular ForceWare drivers will do HW acceleration of wireframe, or AA on wireframe, as it's not something you see in games.
But the Quadros have it enabled in their drivers, as it's common in their field of pro apps.

Or maybe I'm just confusing things myself. These are vague memories from ages ago, of meddling with a GeForce256 and wanting to mod it to a Quadro, but I wasn't about to go nuts with soldering and such.

noko
01-25-05, 11:32 PM
hmmm, I wonder how this softmod will work in TrueSpace?????

ricercar
01-26-05, 01:14 AM
Do a little research pal, AA lines work on geforce to quadro mods. Your statement is sort of true in that you can't add hardware supported features. Thing is that the feature is hardware supported on the geforce, its just not enabled.
Do a little research pal. Educate yourself.
http://www.nvidia.com/object/IO_20030630_7410.html
Technical Brief: Quadro vs. GeForce GPUs
Learn all about the differences between NVIDIA's consumer-level GeForce GPUs and workstation-class Quadro GPUs in the attached technical brief. (2MB PDF)

quite a number of people allover the web are taking part in this "fake" Quadro mod, and that you also think the benchmarks are fake?...
People allover [sic] the web are mistaken, engaging in wishful thinking they don't understand, trying to turn a $500 part into a $2000 part.

One can't change the gates in a silicon device after it comes from the fab. Certainly not with a BIOS or driver update. BIOS updates won't activate Quadro hardware features that are not available in the gates. Now if you want to decap a chip with a four million dollar tool and mod the gates, then you can do what you suggest is happening "allover" the web.

Any improved performance is coming from optimizations of the software & firmware. One can optimize BIOS and drivers for specific software applications, for example Quadro benchmarks. The hardware is still GeForce, which is different than Quadro.

No GeForce GPU can possibly deliver hardware accelerated antialiasing, clipping or lighting, or Quadro memory management. Neither does my NV35GL deliver NV35 gaming performance with a GeForce BIOS. Sure the boot screen claims NV35, but a true 5900 will blow it away in Doom or 3DMark. Quadros are HARDWARE optimized for CAD, and GeForce HARDWARE optimized for games, in the HARDWARE, not the drivers or firmware.

because some guy on a forum said "I used to work for NVIDIA so i know better"....I see ... I heard it from an NVIDIA rep that the whole "distinction" of the GeForce and Quadro line being "hardware differences" is for the reason of marketing only. Not only that.. im seeing it with my own eyes right now.
That guy was blowing smoke up your ass. He certainly doesn't work on Quadros well enough to understand them. Tell me his name. Maybe I worked with him.

Sorry to burst your bubble. What you're seeing with your eyes is a delusion that you desperately want to believe in because we all want "one up" on the man. Sure, improved benchmarks are spiffy, but a genuine test will reveal the hard facts. (Remember Futuremark vs NVIDIA in early 2003? Remember how the FX line really performs?) Try pro CAD applications on your GeForce head-to-head with a real Quadro. Maya for example. Motionbuilder. Gelato. Lend your modded GeForce to an honest evaluator with experience using a real Quadro. Seek truth, not ego.

It irks me when I see these threads and people believe in the GeForce-to-Quadro mod. It's bogus; it's impossible. But if you want to deceive yourself and be happy, more power to you. It's frustrating to try helping people who don't want to be helped.

superklye
01-26-05, 01:30 AM
Thanks ricercar. :)

Cota
01-26-05, 02:47 AM
Do a little research pal. Educate yourself.
http://www.nvidia.com/object/IO_20030630_7410.html
Technical Brief: Quadro vs. GeForce GPUs
Learn all about the differences between NVIDIA's consumer-level GeForce GPUs and workstation-class Quadro GPUs in the attached technical brief. (2MB PDF)


People allover [sic] the web are mistaken, taking part in wishful thinking that they don't understand to make a $500 part into a $2000 part.

One can't change the gates in a silicon device after it comes from the fab. Certainly not with a BIOS or driver update. BIOS updates won't activate Quadro hardware features that are not available in the gates. Now if you want to decap a chip with a four million dollar tool and mod the gates, then you can do what you suggest is happening "allover" the web.

Any improved performance is coming from optimizations of the software & firmware. One can optimize BIOS and drivers for specific software applications, for example Quadro benchmarks. The hardware is still GeForce, which is different than Quadro.

No GeForce GPU can possibly deliver hardware accelerated antialiasing, clipping or lighting, or Quadro memory management. Neither does my NV35GL deliver NV35 gaming performance with a GeForce BIOS. Sure the boot screen claims NV35, but a true 5900 will blow it away in Doom or 3DMark. Quadros are HARDWARE optimized for CAD, and GeForce HARDWARE optimized for games, in the HARDWARE, not the drivers or firmware.


That guy was blowing smoke up your ass. He certainly doesn't work on Quadros well enough to understand them. Tell me his name. Maybe I worked with him.

Sorry to burst your bubble. What you're seeing with your eyes is a delusion that you desperately want to believe in because we all want "one up" on the man. Sure, improved benchmarks are spiffy, but a genuine test will reveal the hard facts. (Remember Futuremark vs NVIDIA in early 2003? Remember how the FX line really performs?) Try pro CAD applications on your GeForce head-to-head with a real Quadro. Maya for example. Motionbuilder. Gelato. Lend your modded GeForce to an honest evaluator with experience using a real Quadro. Seek truth, not ego.

It irks me when I see these threads and people believe in the GeForce-to-Quadro mod. It's bogus; it's impossible. But if you want to deceive yourself and be happy, more power to you. It's frustrating to try helping people who don't want to be helped.


That doesn't change the fact that AA lines do work on the GeForce-to-Quadro mod. I agree that not all the functions may be enabled, but the example superkyle used is not correct.

But please don't take my word on it, try it yourself.

I wonder what Unwinder thinks about this...

BTW

"Neither does my NV35GL deliver NV35 gaming performance with a GeForce BIOS. "
Try softmodding it into a geforce and see what happens.

Unwinder
01-26-05, 09:34 AM
Well, if you are really an NV employee, then you're brave to make public comments on Quadro vs GeForce differences in an open forum. Thanks for doing that - I smell a pretty interesting discussion coming. However, I hate to say it, but the NV40 and NV40GL chips seem to have the same "hardware" differences as the 6800NU and 6800Ultra: nothing but a strap in reg C010 limiting the chip's caps. Try to address some plain and simple questions to make me think differently:

1) If you really work @ NV, have you ever peeked into the NV4x register reference and seen that bit 16 (the Quadro caps identification bit) of NV_PBUS_DEBUG_1 is software-overridable via strapping bit 0 of reg C020?
2) What is the purpose of leaving the ability to override the Quadro caps bit via _software_ if the chips are _physically_ different, as you say?
3) What is the purpose of making it overridable on NV4x if it was hardwired with a pull-up resistor on NV2x/NV3x?
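The mechanism Unwinder describes can be illustrated with a toy sketch. The two facts taken from his post are that bit 16 of NV_PBUS_DEBUG_1 reports Quadro caps and that strap bit 0 of reg C020 can override it in software; the helper function and everything else here are a hypothetical illustration, not NVIDIA's actual logic:

```python
# Toy model of a software-overridable strap bit. Per Unwinder's post,
# bit 16 of NV_PBUS_DEBUG_1 is the Quadro caps identification bit,
# forceable via strap bit 0 of reg 0xC020. The override semantics
# below are illustrative only.
QUADRO_CAPS_BIT = 1 << 16    # in NV_PBUS_DEBUG_1
STRAP_OVERRIDE_BIT = 1 << 0  # in reg 0xC020

def effective_debug1(hw_debug1: int, strap_c020: int) -> int:
    """Return NV_PBUS_DEBUG_1 as the driver would see it, with the
    Quadro caps bit forced on when the software strap is set."""
    if strap_c020 & STRAP_OVERRIDE_BIT:
        return hw_debug1 | QUADRO_CAPS_BIT
    return hw_debug1

# A GeForce (caps bit clear in hardware) with the strap set
# identifies as Quadro-capable - no gates changed, just a register.
geforce_debug1 = 0x00000000
print(hex(effective_debug1(geforce_debug1, strap_c020=1)))  # 0x10000
```

If the distinction really were burned into the gates, a purely software-writable override like this would serve no purpose, which is exactly the point of questions 2) and 3).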

Do a little research pal. Educate yourself.
http://www.nvidia.com/object/IO_20030630_7410.html
Technical Brief: Quadro vs. GeForce GPUs
Learn all about the differences between NVIDIA's consumer-level GeForce GPUs and workstation-class Quadro GPUs in the attached technical brief. (2MB PDF)

Sorry, but these "educational" whitepapers contain a lot of PR. Most of the "exclusive" features mentioned there (e.g. the unified depth/back buffer) are driver-based, and the rest (e.g. clipping planes and hardware antialiased lines) can easily be enabled by overriding straps, whether NV likes it or not.


No GeForce GPU can possibly deliver hardware accelerated antialiasing, clipping or lighting, or Quadro memory management.

Sorry, but it perfectly well can, once this strap is overridden.

And one more thing: are you absolutely sure that you are really an NV employee? :) What happened to the NVIDIA Worldwide Code of Ethics, according to which
Employees should also be very careful not to disclose such information to family, friends, or any person outside NVIDIA who could act on such information, even if the employee receives no benefit from their actions. Except for authorized spokespersons for NVIDIA, employees should not
communicate with the press or in public forums.
Has it been cancelled already? :)

BTW, may I ask your real name too?

Vik1dk
01-26-05, 09:43 AM
(popcorn) :)

Cota
01-26-05, 09:53 AM
Tnx unwinder :)

I've been trying the GeForce-to-Quadro mod since I had my GeForce3. I never had the chance to compare it to a real Quadro, but there was a huge improvement in scientific benchmarks.

I really have no use for a Quadro, so I just did the mod to see if it worked. There is tons of info about the mod, but also a lot of misinformation about how to get it done, like BIOS flashes and the like.

People who rant about IMPOSSIBLE mods, should take the time to try them themselves.

Claiming to be an NVIDIA employee/ex-employee, or "hey, I know someone who knows someone who is the girlfriend of the brother of the cousin of some guy that worked at NVIDIA", doesn't really give credibility to a statement.

And the next time you want to rant about it, at least google it before you post.

myshkinbob
01-26-05, 01:26 PM
Sorry to burst your bubble. What you're seeing with your eyes is a delusion that you desperately want to believe in because we all want "one up" on the man. Sure, improved benchmarks are spiffy, but a genuine test will reveal the hard facts. (Remember Futuremark vs NVIDIA in early 2003? Remember how the FX line really performs?) Try pro CAD applications on your GeForce head-to-head with a real Quadro. Maya for example. Motionbuilder. Gelato. Lend your modded GeForce to an honest evaluator with experience using a real Quadro. Seek truth, not ego.

It irks me when I see these threads and people believe in the GeForce-to-Quadro mod. It's bogus; it's impossible. But if you want to deceive yourself and be happy, more power to you. It's frustrating to try helping people who don't want to be helped.

You and superkyle are being really hard on RobHague here, even rude. There's no disputing that in previous product generations there were silicon differences between the consumer and workstation cards, like the GeForce4/Quadro4 generation. Some Quadro features weren't there on the GF4 chips, and that's what the PDF you linked refers to. It's probably also what the guy who used to work on the NV fab lines is referring to.

With the NV4x generation of chips, they all use the same core. I can understand you want proof of that, and we'll get to benchmarks later, but here is some visible proof, taken from a review of the FX 4000: they removed the HSF and found the core was labelled as a 6800 Ultra.

http://www.pcpop.com/pcpopimg/04/7/28-9-37-5-921428010.jpg

http://www.pcpop.com/pcpopimg/04/7/28-9-26-15-726243270.jpg

Now, you can see the cores are the same. Unwinder has told you that the only difference is a register value to lock/unlock Quadro features on NV4x cores.

RobHague linked (Here (http://forums.guru3d.com/showthread.php?s=&threadid=123844)) to a forum post of some BIOS-modded FX4000 benchmark results, which you seemed to dismiss as fake. Well, it was my post; I ran those benchmarks, and I'm telling you it's not faked - nobody is kidding themselves. If you had read the post and looked at the results, you'd have seen I compared against official SPECviewperf results for the FX4000, running on faster workstation systems. The modded FX4000 keeps up with the real FX4000s on every test; those tests cover a lot of pro CAD features across 8 different CAD applications, and all run with hardware line antialiasing.

But that is a synthetic test, you're right, though it's still useful for comparing workstation hardware results relative to each other. And if you'd read the post, you'd have seen I also benchmarked using the SPECapc 3ds max 6 scripts, which run on the actual 3ds max 6 application - a real-world application. There are no published FX4000 results for SPECapc max 6, only an FX3000 set. But if you take a look, you'll see the softmodded FX4000 outperforms the real FX3000 by around 30%, about as much as the real FX4000 should. SPECapc max 6 runs about 30 tests covering line antialiasing, dual-plane performance, blitting, shading performance, colour tests, particles, selections, transforms, sub-object selection, opacity, etc. In not one test did the modded FX4000 appear to lack any of the workstation card's features.

So if you're going to bash RobHague for trying to have an informative discussion, and for sharing what is really good news for anyone needing workstation-class performance from their consumer card, don't just say he lies, or that some guy told you it's impossible. Bring something to the discussion that can disprove what he told you, and now also my benchmark results in a real-world CAD application. We're all adults after all, so let's make it a civilised debate. :)

As it stands, the evidence suggests there is no physical difference between the NV40 and NV40GL cores, except for a register value or two. Consider your bubble burst.

Apologies if I've come across a bit insulting, but at least look at the facts before you dismiss people on these forums. I'm used to this place being a friendly board for talking about hardware, where you don't get jumped on for saying something that appears unlikely at first. Don't turn the place into Rage3D ;)

You need an ego check if you think throwing around insults about people deceiving themselves is 'helping people'. Sorry to get personal, but it's not helpful at all - that's being a know-it-all. I was actually just as cynical as you when I first read Rob's post, but rather than dismiss it, I thought I'd test it for myself and make some definitive benchmark comparisons, which is a lot more informative and helpful to me and to anyone else interested in the topic.

smthmlk.
01-26-05, 07:26 PM
Personally, I would trust a large group of people in the community, cards in hand, doing various tests over some publicity from NVIDIA's site or someone saying they used to work for NVIDIA. But that's just me :)

Keep up the good work - it's always interesting to see what's really under the hood and what's possible with these cards!