PDA

View Full Version : gf3 ti500


simmo
06-08-03, 11:02 AM
hi all,
can you tell is the gf3 ti500 128m any good and how does it compare to the gf4 cards, and which gf4 does it sit next to in performance.

cheers
simmo

de_telegraaf
06-08-03, 11:11 AM
GF3 = crap
Next in line = GF4 Ti4200 (as the GF4 MX is even more crap)

Get a FX5600 or better

Kev1
06-08-03, 12:30 PM
It's a great card. The GF3 Ti500 128 meg would play any game out today very well at 1024 x 768, no AA. On my setup in 3D Mark 2001 I score just under 10,000 3D Marks, and can play UT2003 at max settings with zero problems :)

I do agree however that if you're buying a video card today you should at least get a GF4 Ti series card, as they are fairly inexpensive now and getting cheaper by the day :thumbsup:

ragejg
06-08-03, 04:31 PM
If you're really specifying a *for real* gf3 ti500/128mb, then that is not only a great card, but a slight rarity... I could be wrong, but actual 128mb ti500's were limited to reviewer/previewer access in the US, and they were distributed sparsely overseas...

...Or do you mean the Gainward PowerPack Ti/500 @ 210/460 128mb? (a more common, but dang good card)...

Some cool things about GF3's:

1. They can put up with higher AGP frequencies (more OC headroom), i.e. newer Radeons crap out in the mid 70's or so, GF4's die out in the high 70's area... GF3's can put up with ONE HUNDRED MHZ... This comes in handy if you have a kt266a, kt333 or kt400 mobo and plan to OC... Hey, OC'ed nForce2's with CPUs runnin @ 2300mhz or so get 13k 3DMarks on these cards.

2. All GF3's really benefit from a little core/memory speed bump-up... and some can OC crazy great!

3. GF3's apparently take less of a hit from Anisotropic Filtering than GF4 cards...

4. 2XAA doesn't kill fps as much as it used to, and looks better than it ever has with the newer drivers. ...And Quincunx AA is improved as well...

5. Apparently, the GF3 core has an "unspoken-of" cache on its single vertex shader, vs the cache-less dual setup on the GF4. When the core speed is bumped up, VS performance is much beyond decent, well into "great"... (folks, feel free to correct me if I mis-illustrate these points)

6. Some may think the contrary, but I believe that "full DirectX 8" titles have yet to saturate the market... and I think they will... Yes, DX9 is easier for developers, but... um... don't they need to continue writing/learning/optimizing DX8 code for the Xbox? I feel that this may hold the DX8 standard around longer than many think... The GF3 factoid here is that this series of cards is *positively* able to manhandle DX8 code well enough to hold onto for a while.


... whew ...

goin back to #1, let's put it this way...
let's say you get a cheap ($80) ti4200 for your kt266a mobo with an XP1700+ TbredB... and with it in there, you can only up your FSB to 155... you get 2000/1950 in your Sandra memory scores, and get a 3DMark score of 11.2k or so...

... then you drag out an $80 Gainward ti200 128mb... throw it in, and BOOM... w00t? 182 FSB? holy crap! Much higher memory bandwidth (2500/2300 *oh yeah* you can feel it :D) then you OC the card and score a 9.9-11k 3DMark score...

I wanted the higher memory bandwidth (can you say bee-eff-nineteen-forty-two?), and a higher AGP freq...
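(For what it's worth, here's a back-of-the-envelope sketch of why those Sandra numbers scale with FSB like that. This assumes the usual 64-bit (8-byte) DDR memory bus; measured scores always land below the theoretical peak because of overhead.)

```python
# Theoretical peak DDR bandwidth: FSB clock x 2 transfers/clock x bus width.
# Sketch only; bus_bytes=8 assumes the common 64-bit memory bus.
def ddr_peak_mb_s(fsb_mhz, bus_bytes=8):
    return fsb_mhz * 2 * bus_bytes

print(ddr_peak_mb_s(155))  # 2480 MB/s peak -> Sandra reads ~2000
print(ddr_peak_mb_s(182))  # 2912 MB/s peak -> Sandra reads ~2500
```

The measured-to-peak ratio stays roughly constant, which is why the Sandra scores track the FSB almost linearly.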

GAWD I LOVE GF3'S... I just wonder which card has carried on the ti200's soul... 5600U mk2 maybe?

Kev1
06-08-03, 05:16 PM
I like ragejg :hug: :thumbsup: :dance: :bounce: :bounce:

digitalwanderer
06-08-03, 05:38 PM
GF3=yummy-good! (I liked my GF3 better than me 8500! :) )

Kev1
06-08-03, 05:59 PM
It's amazing how well my GF3 does in new games, and I only have the 64 meg version (which was a lot when I got the card but not now). Plus, I did not know half the info in ragejg's post, which I think helps explain why the GF3 still does so well :thumbsup:

I'll upgrade around Xmas time to some new video card, I have no idea what though. But until then I am very happy to use my GF3 :)

And when I upgrade, my Gainward GF3 Ti200 is going in my backup computer which has a Visiontek GF2 GTS in it now :D

Geforce4ti4200
06-08-03, 06:14 PM
so should I upgrade my ti4200 to a ti500? :confused:

ragejg
06-08-03, 07:24 PM
Geforce4ti4200: "so should I upgrade my ti4200 to a ti500?"

... If you have a kt266a and wanna go 180-200, heck yeah!! ;)

...If you like taking something OLDER and making it kick the buttock of something NEWER thru TWEAKING!! ;)

... the only issue I have with GF3's right now is I CAN'T FIND A WINDOWS GF3 BIOS FLASH UTILITY... I have a later rev Gainward classic GF3... on the same PCB as their ti200's, and I wanna flash the BIOS... I wanna use a good ti200 BIOS (I've heard memory timings were tightened up in the ti BIOSes), and can't find crap!! don't wanna use nvflash... the Gainward tool spoiled me but is only for GF4's and above... poo... I'll post my inquiries re: this in another thread...

Geforce4ti4200
06-08-03, 08:47 PM
errr, 180 FSB on a kt266? Your hard drive would crash and the PCI would be way outta spec; the chipset isn't capable of it either. My kt266 wouldn't do past 145 FSB, and I had a GeForce3 ti500 back then. Right now my kt400 is limited by my PC3200 RAM that won't do past 180 FSB :( and I tried 160 FSB at 80MHz AGP, so my ti4200 can handle 80MHz AGP
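(Both sides of this argument come down to divider arithmetic. A quick sketch, assuming the common 1/2 AGP and 1/4 PCI dividers for the 133 MHz FSB setting on boards of this era without an AGP/PCI clock lock; actual dividers vary by board and BIOS.)

```python
AGP_SPEC_MHZ, PCI_SPEC_MHZ = 66.6, 33.3  # nominal spec clocks

def bus_clocks(fsb_mhz, agp_div=2, pci_div=4):
    # No AGP/PCI lock on these chipsets: both buses scale with the FSB.
    return fsb_mhz / agp_div, fsb_mhz / pci_div

for fsb in (133, 160, 180):
    agp, pci = bus_clocks(fsb)
    print(f"{fsb} FSB -> AGP {agp:.1f} MHz, PCI {pci:.2f} MHz")
# At 160 FSB you get the 80 MHz AGP mentioned above, with PCI already at
# 40 MHz (well past the 33.3 MHz spec) -- which is what kills drives/NICs.
```

So "my ti4200 can handle 80MHz AGP" and "the PCI would be way outta spec" are both true at once: the graphics card tolerates the overclock while the PCI peripherals are the weak link.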

StealthHawk
06-08-03, 09:20 PM
Originally posted by ragejg
Some cool things about GF3's:

3. GF3's apparently take less of a hit from Anisotropic Filtering than GF4 cards...

Only true in D3D games, not in OGL games. Of course the absolute performance of a gf4 using AF in a D3D game is still significantly higher than that of a gf3.

4. 2XAA doesn't kill fps as much as it used to, and looks better than it ever has with the newer drivers. ...And Quincunx AA is improved as well...

How so? QCA still blurs as much as it ever did on a gf3... I've also never heard of FSAA performance on a gf3 increasing.

5. Apparently, the GF3 core has an "unspoken-of" cache on its single vertex shader, vs the cache-less dual setup on the GF4. When the core speed is bumped up, VS performance is much beyond decent, well into "great"... (folks, feel free to correct me if I mis-illustrate these points)

I don't know about that either. The gf4 or r8500 destroy the gf3 in VS performance, because both of those cards have 2 vertex shaders.

6. Some may think the contrary, but I believe that "full DirectX 8" titles have yet to saturate the market... and I think they will... Yes, DX9 is easier for developers, but... um... don't they need to continue writing/learning/optimizing DX8 code for the Xbox? I feel that this may hold the DX8 standard around longer than many think... The GF3 factoid here is that this series of cards is *positively* able to manhandle DX8 code well enough to hold onto for a while.

Well, many people argue that DX8 never took much of a foothold because it just wasn't flexible enough, so the extra hassle of shader programming wasn't really worth it. DX9 is easier to program in, and offers more advantage. If we see DX9 games, they will have DX8 fallbacks, of course, so either way you win. The gf3 sadly does not have good VS performance though, so in any future games that stress the VS, it will not do nearly as well in comparison to better cards.

ragejg
06-08-03, 09:33 PM
kt266a, not kt266...

It's a kr7a-133r... the kr7a variants are all over the place right now; tons of ppl are unloading them for $45-$60 so they can get nForce2's...

Yes, 180fsb... no corruption... running off the onboard IDE no less, I haven't even tried the RAID channel yet...

I bought certain componentry for this system keeping high bus speeds in mind (I learned most o' this from a good guy named [OC]_This @ amdforums):

-SBLive 5.1's can take high FSBs, Audigys cannot. That's why the Audigy is in my HTPC now...
-Maxtor's 7.2k drives put up with high PCI speeds better than, let's say, a WD400BB... which is now in my HTPC...
-I had several NICs, and went with the SMC I had cuz it does not crap out early either...
-Samsung PC2700 CAS2.5 runs real nice @ 2.85V, CAS 2.5-3-6-3, 2T cmd...
-PowerMagic 450W PSU... yes, cheap brand, but I've bought a few and I must say, the rail strength on the 450's and 525's is definitely competitive with much more expensive PSUs... problem is, www.amamax.com sometimes will accidentally ship you an X-Case brand PSU which positively is crap... (but works great on ECS K7S5A's, ironically)...
- And lastly, a nice JIUCB 1700+... @ 1.725V it does 1800mhz stably... and temps? I dunno, but the wet-sand mirror-finish lap job on my no-name Cu HSF, coupled with an upgrade to a LOUD 7k Delta, gave me CPU temps of 33 idle, 39-41 load... no, those aren't my case temps... my case somehow cools right... I thought "wha?" at first as well, but then I took the side cover off and felt for hot, non-moving air... none :):)...


and to boot (maybe "for the record" would be better there, hee hee), doin' this and gettin' a hard-fought 10k with buttloads o' bandwidth was much more fun than piecing my "stocker" Gigabyte together and getting the same score...

Kev1
06-08-03, 09:49 PM
So the high score on 3D Mark 2001 for a GF3 is around 13,000 :eek: Kind of amazing :)

I still think its a good video card.

ragejg
06-08-03, 10:52 PM
mr. stealthhawk....

thanks for your clarifications on some of my misunderstandings...

I'll attempt to explain:

Re: AF: I will not dispute that the AF performance is not on par with the GF4 series, but it is more usable than on, say, a Radeon 8500 - 9200...

Re: AA: I don't understand the tech of AA fully, but I've heard my fair share of folks saying that QAA looks much better on the 40-series drivers than it did around 27.XX or so... myself included... Or was it teh dr0gz? :D

Re: VS: well, I looked up some 10k GF3 scores, and some 10k GF4 scores... the Futuremark test shows the advantage of two vertex shaders (heh... now, where did I get that "cache" tidbit from?). Dang, hope I didn't completely misreport that... rrr... lemme do some digging... I ain't tryin' to BS nobody...

UPDATE: I cannot find anything to back up my "cache on VS of GF3" claim, so I retract my statement. Sorry, I heard a few discuss that on a forum a while ago, and I suppose I shoulda listened better/obtained a URL.

ALSO: Aren't VS-stressing games gonna be DX9 games (as VS will be doing more of what PS did)? And aren't most of the full DX8 games yet to be released going to be using BOTH shaders, possibly creating enough balance for that gen. of cards to handle properly?

ragejg
06-08-03, 10:56 PM
btw here's what I been readin' for the past 1/2 hr... an old ExtremeTech article... I'd sure like to see the same results on 40.72 drivers!
http://www.extremetech.com/article2/0,3973,486101,00.asp

digitalwanderer
06-08-03, 11:19 PM
...I loved me GF3 and all, but when I got my GF4 ti4400 it blew it out of the water. Nowhere near as big a jump as the GF4 to 9700 Pro, but definitely superior to the GF3 in every respect!

-AA/AF was much faster & better IQ
-Quincunx looked good compared to the Quincunx on the GF3!
-All me games played better, and it handled me 25% OCing just as spiffy as me GF3. (My 9700 Pro will only let me get away with a 20% OC, but it's well worth it. ;) )

The GF4ti is a better card than the GF3, period.

Am I to understand that you are trying to argue this gentleman into upgrading to a GF3 over a GF4ti? If so, that's some bad advice you're giving out there friend. :(

ragejg
06-08-03, 11:50 PM
- Some people want to spend ONLY $70 right now... not $79 plus shipping...

- A lot of people looking for a "fresh off the mainstream hump" card are running "fresh off the mainstream hump" motherboards, ie: kt266a's & kt333's. Chances are, they have anywhere from a 1.2 duron to a TBredB 1700... and they OC a little. I was (in a previous post) just illustrating a point to consider if you have a mobo of this class.

oh, about teh ti4200 to ti500 upgrade? Sorry, I meant more sarcasm than what I put down... heh... :bleh: :D

sbp
06-09-03, 12:31 AM
Originally posted by Geforce4ti4200
so should I upgrade my ti4200 to a ti500? :confused:

No, that would be a downgrade. The GF4 Ti4200 performs better.

Other points: speaking as someone who has owned both a GF3 Ti500 and a GF4 Ti4600, I didn't see much difference in the AA.
Accuview was supposed to move where the subpixel samples were taken.

Quincunx http://www.pcabusers.com/forums/images/icons/drunk.gif

StealthHawk
06-09-03, 03:04 AM
Originally posted by ragejg
mr. stealthhawk....

thanks for your clarifications on some of my misunderstandings...

I'll attempt to explain:

Re: AF: I will not dispute that the AF performance is not on par with the GF4 series, but it is more usable than on, say, a Radeon 8500 - 9200...

Hmm, the r8500-r9200 should have much better AF performance than the gf3-gf4. ATI's strength over nvidia was typically their much higher AF performance, although with the gfFX and latest drivers the AF performance of nvidia wins. The gf3-gf4 have much more usable FSAA than the r8500-r9200, because the nvidia cards use MSAA compared to the SSAA of the r8500-r9200. I think that's what you're thinking of.

Re: AA: I don't understand the tech of AA fully, but I've heard my fair share of folks saying that QAA looks much better on the 40-series drivers than it did around 27.XX or so... myself included... Or was it teh dr0gz? :D

I'm not sure if it looks better with newer drivers or not, but I just fired up Wolfenstein ET (love it) with the 43.45 drivers and my gf3, and it still blurs the textures :( Of course some people like KILER like the gf3 QCA blur, and despise how the gf4 QCA no longer blurs textures; to each his own ;)

ALSO: Aren't VS-stressing games gonna be DX9 games (as VS will be doing more of what PS did)? And aren't most of the full DX8 games yet to be released going to be using BOTH shaders, possibly creating enough balance for that gen. of cards to handle properly?

In DX9 PS got a lot more powerful than they were in DX8. I'm not sure why VS would replace PS. VS are used for geometry while PS are used for effects on textures and stuff. Eventually I'm sure the functionality of VS and PS will merge, but that will be in the future.

The thing about shaders is this: the DX9 generation has more VS power than the DX8 generation, and the PS on the DX9 generation are faster than the PS on the DX8 generation. Again, the playability of the gf3 in future games will mainly be determined by what route developers take, DX8 or DX9. You have hugely CPU-limited games like Comanche 4, which play pretty much the same on all cards. But in more graphics-card-limited games you see the gf4 pulling away from the gf3, and the r9700+/gfFX5800+ pulling away from the gf4.
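(The MSAA-vs-SSAA point above can be summed up with a toy cost model; this is an illustration, not measured data. 4x SSAA runs the pixel pipeline for every subsample, while 4x MSAA shades once per pixel and only multiplies the coverage/z work.)

```python
def relative_shading_cost(samples, mode):
    # SSAA shades every subsample; MSAA shades once per pixel
    # (the extra samples cost bandwidth/z work, not shader work).
    return samples if mode == "ssaa" else 1

print(relative_shading_cost(4, "ssaa"))  # 4x the pixel-shading work
print(relative_shading_cost(4, "msaa"))  # ~1x the pixel-shading work
```

Which is why MSAA on the gf3/gf4 is so much more playable than SSAA on the r8500-era Radeons, even when the raw fill rates are similar.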

ragejg
06-09-03, 07:20 AM
Why you always makin me learn, stealthhawk??

:cool: :cool: :D :D