
View Full Version : What contributes to the geforce 4 ti4400 looking better than anything after it?



kevJ420
05-16-07, 04:27 PM
This has been on my mind for the past 4 years, since I first played a game on the fx 5900 ultra.

It seems like every pre-DX9 game looks much, much better on a ti4400 I had a long ass time ago. I was wondering why this is.

The af still isn't as good on the 8800gtx as it was on GF3/4, and the overall image quality doesn't look as good on cards after the 3/4 series.

why is that? what the hell happened?

I've been dying to know.

I'd really like a return to that image quality.

DISCLAIMER: **** like shader precision and shadows are irrelevant to the image quality I'm talking about, so try to refrain from saying "how could half-life 2 look better on a 3/4 series than it does an on an 8800gtx?"

|MaguS|
05-16-07, 04:32 PM
What contributes to the geforce 4 ti4400 looking better than anything after it?

You being a complete moron.

Tr1cK
05-16-07, 04:34 PM
Kev, you're nuts. The 8800 series has the best AF ever on a consumer level card.

ViN86
05-16-07, 04:38 PM
:wtf:

kevJ420
05-16-07, 04:44 PM
I'll give my own suggestions. I don't know if they're right, so someone needs to confirm these for me.

color compression forced w/ AA (I noticed ATI's overall image quality went downhill starting with the 9700 pro, which was ATI's 1st card with CC, just like how nvidia's IQ went downhill starting with the first card they used CC on, which was the geforce FX)

the af got ****ed up starting with the fx series

different fog types, perhaps.

driver code removed for certain features; maybe the driver selects worse texture compression, e.g. defaults to DXT3 instead of DXT5 or no TC (not as likely as the other suggestions listed)

worse signal quality
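For context on the texture-compression suggestion above: DXT block sizes are fixed by the S3TC format, so a toy calculation (plain Python; the function name is my own) shows DXT3 and DXT5 cost exactly the same memory. A driver preferring DXT3 over DXT5 wouldn't even save space; it would just reproduce smooth alpha gradients worse. Forcing any DXT format where the app asked for uncompressed textures is lossy for color, though, so this suggestion is at least the right kind of thing to suspect.

```python
# Toy storage math for DXT ("S3TC") texture compression.
# Block sizes are fixed by the format: DXT1 = 8 bytes per 4x4 texel
# block, DXT3 and DXT5 = 16 bytes per 4x4 block.

BYTES_PER_BLOCK = {"DXT1": 8, "DXT3": 16, "DXT5": 16}

def texture_bytes(width, height, fmt):
    """Return the storage size of one mip level in the given format."""
    if fmt == "RGBA8":                  # uncompressed, 4 bytes per texel
        return width * height * 4
    blocks_x = (width + 3) // 4         # round up to whole 4x4 blocks
    blocks_y = (height + 3) // 4
    return blocks_x * blocks_y * BYTES_PER_BLOCK[fmt]

for fmt in ("RGBA8", "DXT1", "DXT3", "DXT5"):
    print(fmt, texture_bytes(1024, 1024, fmt))
```

Note that DXT3 and DXT5 come out identical in size (1 MiB for a 1024x1024 level vs 4 MiB uncompressed); they differ only in how the alpha channel is encoded.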

Tr1cK
05-16-07, 04:52 PM
:rolleyes:

ViN86
05-16-07, 05:15 PM
"Living Ten Years Ago..."

The Autobiography of kevJ420....

seriously dude, all you talk about is old ****. Saturn... geforce4... who the hell cares anymore? the worst part is that when those things were released you were probably constantly talking about the NES and Atari. and i bet 10 years from now youll talk about the 8 series. get with the times, ****in hell. :|

RicHSAD
05-16-07, 05:17 PM
What contributes to the geforce 4 ti4400 looking better than anything after it? I think the answer is you and only you. If you really want to get back to that level of quality, just find an old PC, put the card in and enjoy your high level of quality.

kevJ420
05-16-07, 06:03 PM
"Living Ten Years Ago..."

The Autobiography of kevJ420....

seriously dude, all you talk about is old ****. Saturn... geforce4... who the hell cares anymore? the worst part is that when those things were released you were probably constantly talking about the NES and Atari. and i bet 10 years from now youll talk about the 8 series. get with the times, ****in hell. :|

No, not really. I've never played a pre-Lynx Atari and I've never loved the NES all that much. There are people today (myself excluded) who constantly talk about the NES.

Plus, the Saturn has over 200 games, I haven't played them all, and I still love to play Nights; on top of that, I already made an A on every course many, many years ago. I don't play any other games after I've beaten them.

I owned my 1st saturn in 1995, when it was new. I wasn't interested in the NES at all then, simply b/c it didn't offer good graphics.

Anyways, relative to the thread, why did the image quality start going downhill with the FX series and the 6/7 series, and now the 8 series?

the Geforce 3/4 series really does look superior.

I just want that image quality to be reproduced on all future products, but I guess it won't.

Madpistol
05-16-07, 06:09 PM
This guy reminds me of someone else whose name also starts with "kev"...

post back if you have any hints. ;)

No offense dude, but the 8800 GTX has the best image quality of any graphics card thus far. The poor IQ of earlier cards has been fixed and far improved upon. I have no clue where you're going with this, but I still have a Ti 4200 for my old system, and guess what... it doesn't even come close to the 8800.

Slammin
05-16-07, 06:35 PM
It would be hard to even compare the gfx of the ti4400 days to today's gfx, much less the technology used to present them. I mean, the polygon count has gone up so much that anyone making a comparison, especially one based on their own memory or perception of what they remember seeing so many years ago, is, frankly, a little bit nuts.

Kev, I hope you find this thread healing, because it does seem you have a lot of pent-up pain.

We're there for ya pal!

Belarnion
05-16-07, 06:43 PM
AF:

GeForce Ti 4xxx-series max AF (8x)
http://www.ixbt.com/video2/images/nv40-r420-6/gf4ti4600-tf-q-af08.png

GeForce 8xxx-series max AF (16x)
http://www.zive.cz/files/computer/obrazky/g80/obrazky/d3daf/8800gtx_af16xHQ.png

This proves, beyond any doubt, that G8x-series AF quality is way better than GeForce 4xxx-series AF quality.


Antialiasing quality has objectively never been better.
G80 16xQ AA http://www.ixbt.com/video2/images/g80-3/farcry_gf8800_16xq.png
http://www.ixbt.com/video2/images/g80-3/hl2_gf8800_tass_16xq.png

DACs are better than they used to be. Color precision is better.

I think you suffer from the placebo effect. You should try to capture a picture of what you see... ;)
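For reference, the "8x"/"16x" in those screenshots is the maximum anisotropy ratio the hardware will chase. A rough sketch of the selection rule, following the approximation in the EXT_texture_filter_anisotropic spec (the function name is my own; real chips use cheaper, angle-dependent approximations, which is exactly the difference the tunnel shots expose):

```python
import math

def anisotropy_taps(dudx, dvdx, dudy, dvdy, max_aniso):
    """Approximate the sample count AF uses for one pixel, per the
    EXT_texture_filter_anisotropic scheme: measure the pixel's footprint
    in texture space along screen x and y, and let the ratio of the long
    axis to the short axis decide how many probes to take."""
    px = math.hypot(dudx, dvdx)          # footprint length along screen x
    py = math.hypot(dudy, dvdy)          # footprint length along screen y
    p_max = max(px, py)
    p_min = max(min(px, py), 1e-9)       # avoid division by zero
    return min(math.ceil(p_max / p_min), max_aniso)

# A floor seen nearly edge-on: the footprint is ~20x longer one way.
print(anisotropy_taps(1.0, 0.0, 0.0, 20.0, 16))   # G80-class card: 16
print(anisotropy_taps(1.0, 0.0, 0.0, 20.0, 8))    # GF4 tops out at 8
```

The GF4's angle dependence isn't in this formula at all; it comes from how the hardware cheapens the p_max/p_min estimate per surface angle. G80's estimate is much closer to the ideal round pattern, which is what the flower-shaped test images show.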

darkrider01
05-16-07, 07:13 PM
10 year old glasses?

rhink
05-16-07, 08:02 PM
color compression forced w/ AA (I noticed ATI's overall image quality went downhill starting with the 9700 pro, which was ATI's 1st card with CC, just like how nvidia's IQ went downhill starting with the first card they used CC on, which was the geforce FX)

Lossless color compression, genius. It doesn't affect quality at all, only performance. Oh yeah, right, you're the guy who claimed no compression is "lossless". Hard to convince someone who's impervious to facts.
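To make the point concrete: "lossless" means decode(encode(x)) == x for every input, so the picture on screen cannot change; only the bytes moved across the memory bus do. A toy run-length codec over a scanline shows the defining property (this is not the proprietary scheme NV or ATI actually use, just an illustration of the same idea of exploiting runs of identical color):

```python
# Toy run-length codec over a scanline of 32-bit pixels. Framebuffer
# color compression exploits the same kind of redundancy (cleared or
# flat regions) to save bandwidth, never to change the pixels.

def rle_encode(pixels):
    """Collapse runs of identical values into (count, value) pairs."""
    runs, i = [], 0
    while i < len(pixels):
        j = i
        while j < len(pixels) and pixels[j] == pixels[i]:
            j += 1
        runs.append((j - i, pixels[i]))
        i = j
    return runs

def rle_decode(runs):
    """Expand (count, value) pairs back into the original pixel list."""
    out = []
    for count, color in runs:
        out.extend([color] * count)
    return out

scanline = [0xFF0000FF] * 50 + [0x00FF00FF] * 14   # mostly flat colors
encoded = rle_encode(scanline)
assert rle_decode(encoded) == scanline             # bit-exact round trip
print(len(scanline), "pixels ->", len(encoded), "runs")
```

Because the round trip is bit-exact, a card with color compression enabled and one without it produce byte-identical framebuffers; only the transfer cost differs.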

kevJ420
05-16-07, 09:14 PM
My eyesight was 20/20 in 2002 when I had the ti4400.

So maybe it's my eyesight.

It could possibly be a false memory; it's just that what I remember seeing is graphics so much purer than what's offered today.

The 2 games I remember in particular were both OpenGL.

They were RtCW and JK2. I played them in like 5/2002 on a visiontek ti4400, and they looked so good.

Then I got a system w/ a ****ty, underrated ATI 9700 pro 4 months later (9/2002), and the 2 games looked like crap compared to what my (possibly) false memory remembered of the ti4400 that I had unfortunately sold.

In 2003, I ditched the 9700 pro and got a system w/ an FX 5900, which also looked worse than the ti4400 I had, though nowhere near as ****ty as the 9700 pro.

Then I ditched the 5900 ultra and had a 6800 gt put into the same machine.

It didn't look a whole hell of a lot better than the FX.

7 series is the same as 6 series, so I know what that's like b/c I had a 6800gt.

So now I have an 8800gtx, and while the AF is improved @ HQ, it's not as good as I perhaps falsely remember it being on the ti 4400.

A Diamond Monster 3D II (3dfx Voodoo2; "3Dfx" back then) I had was arguably the best video card I've ever had for its time, and it was in the worst PC of any that've been in my household. But it was really the best PC as a whole b/c the Voodoo2 was so great.

I had no choice but to buy it. About a second after the copyright screen in Turok on a RIVA 128, as soon as I saw the Iguana logo and mascot, I was terrified, and I was like, "My parents were dumb enough to spend ~$3500 on a PC with this piece of **** in it?" I literally went up to my room and punched a hole in the wall (that was the 2nd and last time I damaged the wall in my room). After the Voodoo2 was installed, I would have just thrown the RIVA 128 in the trash can, had it been my own PC and had the Monster 3D II been a 2d/3d combo card. The thing that was so amazing was that the Voodoo2 was perfect in speed, IQ, and also compatibility, while the RIVA 128 had no advantages under any given situation.

The Voodoo 5 had the best IQ out of any card, shader model notwithstanding, and it didn't have color compression w/ its true FSAA, which I hope is reinstated this year w/ G90, of course with color compression not forced along with it.

OK, i'm stepping off the soap box.

kevJ420
05-16-07, 09:18 PM
AF:

GeForce Ti 4xxx-series max AF (8x)
http://www.ixbt.com/video2/images/nv40-r420-6/gf4ti4600-tf-q-af08.png

GeForce 8xxx-series max AF (16x)
http://www.zive.cz/files/computer/obrazky/g80/obrazky/d3daf/8800gtx_af16xHQ.png

This proves, beyond any doubt, that G8x-series AF quality is way better than GeForce 4xxx-series AF quality.


Antialiasing quality has objectively never been better.
G80 16xQ AA http://www.ixbt.com/video2/images/g80-3/farcry_gf8800_16xq.png
http://www.ixbt.com/video2/images/g80-3/hl2_gf8800_tass_16xq.png

DACs are better than they used to be. Color precision is better.

I think you suffer from the placebo effect. You should try to capture a picture of what you see... ;)

There are some not-completely-smooth transitions in the 8800 gtx one, and the blue is jaggy, not smooth, not completely round.

AA has been superior before. The voodoo 5 had superior and true fsaa.

Belarnion
05-17-07, 04:26 AM
There's some not completely smooth transitions in the 8800 gtx one, and the blue is jaggy, not smooth, not completely round.

AA has been superior before. The voodoo 5 had superior and true fsaa.
Take a look at the GeForce 4 Ti 4600 AF in this link, then. http://www.ixbt.com/video2/nv40-rx800-6.shtml

There is some very-not-smooth transitions and jagginess is teh present. Or do you think it's round?

evox
05-17-07, 04:40 AM
:wtf:

grey_1
05-17-07, 06:35 AM
kev...you alright man. :lol:

Treason
05-17-07, 07:08 AM
Anything nVidia = excellent image quality in his eyes.

SLippe
05-17-07, 07:33 AM
Why can't I ban based on stupidity! :headexplode:

911medic
05-17-07, 08:38 AM
What contributes to the geforce 4 ti4400 looking better than anything after it?
C'mon now guys, cut him some slack!

I think the GF4 Ti4400 is quite an attractive card! Cute PCB, slim & trim cooling solution, it's so...svelte! A real looker! Everything after blew up into a big, dual-slot cooler fatty, like a girl hitting her 30's.

Good eye, Kev!

rhink
05-17-07, 09:29 AM
The Voodoo 5 had the best IQ out of any card

Other than its complete lack of anisotropic filtering that made its textures a blurry mess, you mean?

The Voodoo 5 had the best IQ out of any card, shader model notwithstanding, and it didn't have color compression w/ its true FSAA, which I hope is reinstated this year w/ G90, of course with color compression not forced along with it.

Lossless color compression. For about the 15th time: it has no effect on image quality.

about a second after the copyright screen in Turok on a RIVA128, as soon as I saw the Iguana logo and mascot, I was terrified and i was like, "My parents were dumb enough to spend ~$3500 on a PC with this piece of **** in it?" I literally when up to my room and punched a hole in the wall

:rolleyes:

Blacklash
05-17-07, 08:13 PM
nVidia did abandon HQ AF for aggressively optimized AF for a time (thank you ATi :rolleyes:) and came roaring back with the G80 series. There really is technically no comparison. IMO you are engaging in a seriously exaggerated romantic recollection about the 4400. I still have a ti 4600 and it does not touch G80 AF.

I was one of the harshest and most vocal critics of nVidia's AF with the 7800|7900 series and changed my tune when the G80 hit. My main complaint was that you didn't really have an option for true HQ AF. It was all half-arsed and optimized even when you attempted to defeat it. Thankfully, getting rid of it also killed shimmering in some titles.

Right now nVidia has the best IQ and performance available in both DirectX and OGL. I am very glad I bought my 8800GTX and will be keeping it longer than I have any video card since I started building my own rigs back in '94.

kevJ420
05-17-07, 10:08 PM
nVidia did abandon HQ AF for aggressively optimized AF for a time (thank you ATi :rolleyes:) and came roaring back with the G80 series. There really is technically no comparison. IMO you are engaging in a seriously exaggerated romantic recollection about the 4400. I still have a ti 4600 and it does not touch G80 AF.

I was one of the harshest and most vocal critics of nVidia's AF with the 7800|7900 series and changed my tune when the G80 hit. My main complaint was that you didn't really have an option for true HQ AF. It was all half-arsed and optimized even when you attempted to defeat it. Thankfully, getting rid of it also killed shimmering in some titles.

Right now nVidia has the best IQ and performance available in both DirectX and OGL. I am very glad I bought my 8800GTX and will be keeping it longer than I have any video card since I started building my own rigs back in '94.

I agree with you 100%. I thought that nvidia shouldn't do something just b/c ATI does it. What pissed me off was that there was no 3rd party like 3dfx to go to; they both had ****tering af. Nvidia also probably wouldn't have taken out the 32-bit z-buffer, and maybe not the w-buffer either, had ATI not taken it out.

They wouldn't have even worried about a "transistor budget" if ATI hadn't invented that stupid ****.

I did, however, admit that my memory is apparently deceiving me and my eyesight is worse, so we should all disregard my comments.

Also, one person here actually didn't notice that the 7800gtx's af ****tered until I told them that it sucked.

Thanks for participating in this thread=)