Comments on my GeForce 3 image quality

noko
01-12-03, 09:04 PM
How do these images look to you guys/gals? It's an IQ thing: are they good, fair, poor, or who cares? Either way, please explain your reasoning. This is on my GF3 Ti200. Just some general comments would help.

Alley Serious Sam (http://home.cfl.rr.com/noko/alley.jpg)
Alley2 Serious Sam (http://home.cfl.rr.com/noko/alley2.jpg)
Alley3 Serious Sam (http://home.cfl.rr.com/noko/alley3.jpg)

The benchmark on Serious Sam's Alley, with the IQ you see, is 45 fps.

Suburbs (http://home.cfl.rr.com/noko/suburbs.jpg)
Suburbs2 (http://home.cfl.rr.com/noko/suburbs2.jpg)

The benchmark on Suburbs, with the IQ you see, is 40 fps.

QuakeIII (http://home.cfl.rr.com/noko/QuakeIII.jpg)
QuakeIII2 (http://home.cfl.rr.com/noko/QuakeIII2.jpg)
QuakeIII3 (http://home.cfl.rr.com/noko/quakeiii3.jpg)

Quake III ver 1.31 demo, with the quality you see, is 50.5 fps.

These are all at 1152x864, so make sure your browser is not shrinking them. Nvidia has been improving something that I think blew right past all of us without anyone noticing. Please comment.

Kruno
01-12-03, 09:06 PM
It is very good IQ for a GeForce 3.
The framerate is a bit low, but the IQ is great for an Nvidia card.

Kev1
01-12-03, 10:36 PM
Your screenies look great, but I'd swear my setup still looks better. I just found a way to tweak my setup so I have just about crystal-clear clarity in 2D and 3D, with no slowdowns.

The way mine is set up is as follows:

Relisys TE995 19" Monitor
Det. 41.09
RivaTuner settings, under the Desktop tab:

Brightness +27
Contrast -14
Gamma 0.66

DV (Digital Vibrance): Med

I have also played with the contrast/brightness on the monitor, in combination with RivaTuner, to get an awesome picture. It's a lot of work and experimenting, but once you get it just where you like it, you can write down all the settings and then it will be no problem.
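For the curious, here is roughly what those three sliders do to each colour channel. This is just my own sketch in Python of the usual brightness/contrast/gamma curve; the slider scales are made up for illustration and are not RivaTuner's actual internals.

# Rough sketch of a driver-style colour curve: contrast scales around
# mid-grey, brightness is a flat offset, gamma bends the curve.
# Slider ranges are invented for illustration, not RivaTuner's.
def colour_curve(value, brightness=0, contrast=0, gamma=1.0):
    """Map one 8-bit channel value (0-255) through the adjustments."""
    v = value / 255.0                                # normalise to 0..1
    v = (v - 0.5) * (1.0 + contrast / 100.0) + 0.5   # contrast about mid-grey
    v = v + brightness / 255.0                       # brightness offset
    v = min(max(v, 0.0), 1.0)                        # clamp before gamma
    v = v ** (1.0 / gamma)                           # gamma < 1 darkens
    return round(v * 255)

# The settings above: brightness +27, contrast -14, gamma 0.66
for x in (0, 64, 128, 192, 255):
    print(x, "->", colour_curve(x, brightness=27, contrast=-14, gamma=0.66))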

Lots of people just plug in a video card and that's that. They do not realize an excellent picture can be had with a bit of effort.

I do agree your fps in Q3 seems quite a bit low. At 1024x768, 32-bit color, I get well over 100 fps. Heck, in UT2003 I get 111 fps in the flyby and 49 fps in the botmatch. But my system is a bit tweaked :)

It will be tweaked more soon as I am updating to the following:

MSI K7N2-L nForce2 mobo
512 MB Corsair PC3200 CAS2 memory
AXP TBred 1800+ CPU (AIUGA stepping)
Thermalright SLK-800 HS w/ 46 CFM YS Tech fan
Antec 430W TruePower PSU
Keeping my GF3 Ti200 for now :)

With enough CPU power and memory, a GF3 series card will ROCK
:D

noko
01-12-03, 10:58 PM
Thanks for the feedback and the settings. I will try them out. Yeah, I can get more FPS if I drop the AA a notch; I was going for the best IQ at frame rates playable for me. Also, if I drop the resolution from 1152x864 to 1024x768, my frame rates are like the ones you quote. You're right about fine-tuning for the best IQ: contrast and brightness, as well as gamma, can make a huge difference. After I get some more feedback I will give you another setting to play around with. It is kind of like turbocharging your Nvidia card, which most people would laugh at.

Once again thanks for your feedback.

Kev1
01-12-03, 11:54 PM
Plus, depending on your monitor and setup, your ideal settings may be different from mine.

I did not realize you were using AA. I have AA disabled in D3D and OGL. I do use AF in OGL at 2x or 4x; neither causes much of a performance hit.

I know using any AA on a GF3 will slow things down. I don't mind the jaggies too much, as I rarely notice them anywhere. However, I know we all have different tastes :)

Have fun tweaking and post back how everything goes!

noko
01-13-03, 06:02 PM
Those images were all 16-bit with max anisotropic filtering and 4x AA. The QuakeIII images were also using the 9-tap Gaussian filter for AA. If you use 32-bit textures on an Nvidia card while running at 16-bit, the IQ is superb. This is something that ATI does not have.

For those who have a Ti4600 or an overclocked Ti4200: you can have both max AA and AF with good FPS at high resolutions, and it may even perform on par with a Radeon 9700 Pro at 4x AA and 8x AF, maybe better. Don't assume that 16-bit looks terrible: on an ATI card, yes, but on an Nvidia card it doesn't have to.
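To make it concrete, 16-bit normally means RGB565: 5 bits of red, 6 of green, 5 of blue. Here is a toy Python sketch of the conversion, my own illustration rather than anything either vendor's hardware actually does; it shows why an undithered 16-bit grey ramp bands so visibly.

# Toy 32-bit -> 16-bit (RGB565) quantisation, with optional ordered
# dithering. My own sketch of the general idea, not real driver code.
BAYER_2X2 = [[0, 2], [3, 1]]  # classic 2x2 ordered-dither matrix

def to_rgb565(r, g, b, x=0, y=0, dither=False):
    """Quantise one 8-bit-per-channel pixel down to packed RGB565."""
    if dither:
        # Nudge by screen position so the lost low bits average out
        # over neighbouring pixels instead of producing flat bands.
        t = BAYER_2X2[y % 2][x % 2] * 2
        r = min(r + t, 255)
        g = min(g + t // 2, 255)  # green keeps 6 bits, needs less help
        b = min(b + t, 255)
    return (r >> 3) << 11 | (g >> 2) << 5 | (b >> 3)

# A smooth 256-step grey ramp survives as far fewer distinct 16-bit
# values, which is exactly where the visible banding comes from:
print(len({to_rgb565(v, v, v) for v in range(256)}), "levels left")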

Kruno
01-13-03, 10:54 PM
Originally posted by noko
Those images were all 16-bit with max anisotropic filtering and 4x AA. The QuakeIII images were also using the 9-tap Gaussian filter for AA. If you use 32-bit textures on an Nvidia card while running at 16-bit, the IQ is superb. This is something that ATI does not have.

For those who have a Ti4600 or an overclocked Ti4200: you can have both max AA and AF with good FPS at high resolutions, and it may even perform on par with a Radeon 9700 Pro at 4x AA and 8x AF, maybe better. Don't assume that 16-bit looks terrible: on an ATI card, yes, but on an Nvidia card it doesn't have to.

I wouldn't comment on ATI's 16-bit if you don't know anything about it. It renders at 32bpp internally, leading to 22bpp 3dfx-style colour that makes older games in 16bpp look far better than on nVidia's cards. :)
What's more, the Radeon 9700 Pro outguns nVidia's cards in image quality in every way possible. :)
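For anyone who never owned a Voodoo, the "22bpp 3dfx style" idea is a scan-out filter: the framebuffer stays dithered 16-bit, and each pixel is blended with a neighbour on the way to the screen, reconstructing shades 16-bit cannot store. A loose Python sketch of the concept follows; it is my own illustration, not 3dfx's or ATI's actual filter.

# Loose sketch of a 3dfx-style post-filter. The framebuffer is 16-bit
# (RGB565); on scan-out each pixel is averaged with its right-hand
# neighbour, so dither patterns melt back into in-between shades.
# Concept illustration only, not anyone's real hardware.
def rgb565_to_rgb888(p):
    """Expand one packed RGB565 pixel to three 8-bit channels."""
    return ((p >> 11) << 3, ((p >> 5) & 0x3F) << 2, (p & 0x1F) << 3)

def post_filter(scanline):
    """Blend each 16-bit pixel with its neighbour during scan-out."""
    out = []
    for i, p in enumerate(scanline):
        q = scanline[min(i + 1, len(scanline) - 1)]
        p8, q8 = rgb565_to_rgb888(p), rgb565_to_rgb888(q)
        # The average can land between 16-bit levels: "extra" precision.
        out.append(tuple((a + b) // 2 for a, b in zip(p8, q8)))
    return out

# Two greys one 16-bit step apart blend into a shade neither can hold:
print(post_filter([0x8410, 0x8C51]))  # -> includes (132, 132, 132)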

StealthHawk
01-13-03, 10:56 PM
Originally posted by K.I.L.E.R
I wouldn't comment on ATI's 16-bit if you don't know anything about it. It renders at 32bpp internally, leading to 22bpp 3dfx-style colour that makes older games in 16bpp look far better than on nVidia's cards. :)

Uh, source? Are you talking about the way 32-bit is now forced on in 16-bit mode to allow the use of FSAA, or what?

Kruno
01-13-03, 11:13 PM
Originally posted by StealthHawk
Uh, source? Are you talking about the way 32-bit is now forced on in 16-bit mode to allow the use of FSAA, or what?

Nah, I'm talking about Santa and his lost wife. :p ;) :D
Source = some set of drivers on my HD. :p

noko
01-14-03, 05:44 AM
Well, on my last ATI card (a Radeon 64 VIVO), 16-bit was not even remotely as good as on my Nvidia card. A whole different ballpark, you might say. In addition, you are talking about unreleased drivers magically making 16-bit good? Anyway, post some images of 16-bit on your ATI card :). Show or be quiet is the word.

Kruno
01-14-03, 07:32 AM
Originally posted by noko
Well, on my last ATI card (a Radeon 64 VIVO), 16-bit was not even remotely as good as on my Nvidia card. A whole different ballpark, you might say. In addition, you are talking about unreleased drivers magically making 16-bit good? Anyway, post some images of 16-bit on your ATI card :). Show or be quiet is the word.

I have a Radeon 9700 Pro, BTW. :) I heard about the original Radeon's 16bpp IQ. :)
Stop being a lazy ass and look for yourself.
I don't need to prove crap to you. :)
STFU about things you know nothing about, or go learn about it. :)

Nephilim
01-14-03, 10:53 AM
Originally posted by K.I.L.E.R
I have a Radeon 9700 Pro, BTW. :) I heard about the original Radeon's 16bpp IQ. :)
Stop being a lazy ass and look for yourself.
I don't need to prove crap to you. :)
STFU about things you know nothing about, or go learn about it. :)

Uhh...no. Why don't you back your own statement up?

Kruno
01-14-03, 11:04 AM
Originally posted by Nephilim
Uhh...no. Why don't you back your own statement up?

Uhm, no. Why doesn't he/she back up his/her original statement?

You either take my word for it or go **** yourself. Like I said, the drivers are on my HD.
Don't give me **** about "you expect me to trust *insert name here*, I don't even know *insert name here*".

Like I said, I don't have to prove **** all to anyone.

Nephilim
01-14-03, 11:25 AM
That's fine. Don't expect anyone to believe you either.

Kruno
01-14-03, 11:39 AM
Originally posted by Nephilim
That's fine. Don't expect anyone to believe you either.

Don't give a ****. My experiences are my own. BTW I have just added you to my ignore list.

EDIT by volt: watch where you're stepping, bro, it may not feel pleasant.

Nephilim
01-14-03, 11:43 AM
Hahaha...for what? Because you got nasty with me?

Man.

Nephilim
01-14-03, 11:54 AM
Wow...who pissed in your Cheerios?

Nephilim
01-14-03, 12:14 PM
I just love it when someone supposedly uses the ignore feature, and yet fails to ignore the person.

lmao

netape
01-14-03, 01:53 PM
Killer, stop being a :lame:r. Just back up your statement or go to Rage3D. ;) :jammin:

darkmiasma
01-14-03, 03:11 PM
K.I.L.E.R. -

you come up with the dumbest **** to post in here ... all of your posts are either blatant lies or absolutely useless bull**** ...

why don't you make an attempt to contribute instead of just trolling ...

- mike

saturnotaku
01-14-03, 03:30 PM
And you wonder why mods get pissed off at you. :rolleyes:

Grow the flange up.

volt
01-14-03, 03:56 PM
KILER: You are getting on my nerves already, and not only mine.
You made more fruitful posts before you bought the 9700.

Check your PM :angel2:

Bigus Dickus
01-14-03, 04:22 PM
K*I.L_e-R does have a point though.

noko made the claim that nVidia's 16-bit IQ is superior to ATi's 16-bit IQ.

"Don't assume that 16-bit looks terrible: on an ATI card, yes..."

Notice the present-tense form of "looks," suggesting that is the case on modern ATi cards?

noko is the one making bold claims. k-I>L.E^R simply said "you're wrong." Well, not simply... there was plenty of BS to follow. His point stands, though... if noko is the one saying ATi's 16-bit IQ is "terrible," then noko is the one who needs to back up his claim.

Nephilim
01-14-03, 04:26 PM
As far as I'm concerned, they both do.

Noko wasn't being offensive about it though.

noko
01-14-03, 05:28 PM
Killer is not welcome at Rage3D; that is probably why he taunts people here. The one person who could disprove my claim is being pure lame. My claim still stands: Nvidia has better 16-bit quality. :angel: