View Full Version : RF filter mod to improve image quality on GeForce4?

10-12-02, 11:20 PM
Hi all,

I performed this hack (http://www.geocities.com/porotuner/imagequality.html) on my GF2 GTS a year ago and recently purchased a GF4 Ti4200. After reinstalling Windows XP, drivers, and settings, with resolutions and refresh rates set the same as on the GF2 GTS, the image quality difference was staggering: my GF2 GTS was significantly better than this new GF4 card.

I looked over the card, and the layout of the inductors is different (on the back, I think), but there are so many that I'm not sure which ones to remove. If someone on this forum can help me, or point me to a site or source with this information updated for the GF4, I would greatly appreciate it. Insight on how to mod the GF3 would also be appreciated, as I plan to help out a few friends with this mod.

BTW: I have a PNY Ti4200 64 MB with DVI and S-Video out, if that helps.


10-17-02, 07:13 PM
The TV-out was far better on the GeForce2 as well (unhacked in any way). I wonder if such a hack exists for the TV-out on the GeForce4.

10-18-02, 12:29 AM
TV-out quality is a function of the encoder chip used by the manufacturer. I'm sure there are GeForce4s with encoders as good as any GeForce2's. You just have to look for 'em.

I personally don't know which encoders are good and which aren't, though, so I can't say any more than that.

10-18-02, 01:03 PM
Unfortunately, it is more complicated than that.

My PNY GeForce4 Ti4600 is supposed to have the very best TV encoder available (I can't remember which one right this second; I am very sick and out of my head, but suffice to say I researched the hell out of it).

Different video card PCB designs seem to both suffer from and cause EMI differently.

That's the main reason ASUS video cards used to not be based on the reference designs: the ASUS GeForce3 Deluxe cards had much better TV-out not because of the chip they used, but because they were designed better, so as not to cause interference and noise on the TV-out...

It's a shame they (ASUS) used the reference design for their GF4s...

I wonder how the image quality of Triplex cards, with the silver-coated PCB that's supposed to cut down on EMI, would be.

10-18-02, 04:34 PM
Well, then, in what way is the image quality worse? It's usually obvious when it's EMI (an unstable image, text not as clear as it could be, and a generally blurry image can all be attributed to EMI).

10-18-02, 07:17 PM
I've noticed IQ is worse at resolutions above 1280x1024 compared to my old GF2 GTS with the hack, on my 21" Sony G500 at the same refresh rates.

Text is blurrier on the desktop, lines do not appear as "crisp" and the overall image is not as sharp.

I can honestly say the same of my GF2 GTS before the mod: it gets progressively worse as resolution increases and refresh rate decreases. But after the mod, with a desktop resolution of 1920x1440 at 75 Hz, everything was so much better. Quality-wise, it felt like 1024x768 at 120 Hz.
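To see why the filter matters more at high resolutions, it helps to estimate the analog video bandwidth involved. Here's a rough back-of-the-envelope sketch (an illustration, not from the posts above; the ~25% blanking overhead is a rule-of-thumb approximation, and exact timings vary by mode):

```python
def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.25):
    """Rough pixel clock estimate in MHz: active pixels per frame times
    refresh rate, scaled by a typical ~25% blanking overhead."""
    return width * height * refresh_hz * blanking_overhead / 1e6

# 1920x1440 at 75 Hz needs a pixel clock somewhere around 260 MHz, so the
# on-card RF filter's cutoff has to sit well above that to keep edges sharp.
print(round(pixel_clock_mhz(1920, 1440, 75)))
```

The point of the estimate: a low-pass filter that is transparent at 1024x768 can visibly soften the signal once the pixel clock climbs past its cutoff, which is why removing or bypassing the filter components helps most at high resolutions.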

My PNY GF4 Ti4200 is not a bad card at all. Overall it's fast and works fine, but facing the desktop daily for work and play, I would prefer something with much higher IQ. It's too bad higher-IQ cards do not offer the same performance for the same amount of money.

BTW: I'm only using the SVGA connector and don't use the DVI or S-Video outputs; I noticed the thread was starting to lean towards TV-out. I've also posted this request on xbitlabs, and hopefully one of their hardware-hacking gurus can help.