
View Full Version : Poor Image Quality in 169.21.........with GTS 512


holdenb
01-04-08, 12:17 PM
as the title says really

I have just upgraded from an ATI x1950 XT to a new GTS 512

I installed the latest 169.21 WHQLs and although my framerate has improved loads, the image quality is rubbish!!!

I have to use 16xAF just to get it looking as good as the ATI card. Now I know ATI has always had the reputation of being better for IQ, but this seems too poor. My 8800GTX IQ is fine (although it's still on the 163's)

Is this a known issue for this card with this driver???

SJA06
01-04-08, 01:42 PM
Have you, by any chance, got some of the optimizations ticked or intellisample set to high performance rather than high quality?

holdenb
01-04-08, 06:20 PM
Have you, by any chance, got some of the optimizations ticked or intellisample set to high performance rather than high quality?

sadly nope..................not as easy as that

SeriTonin
01-04-08, 08:27 PM
take the card back, get a 3870, your problem's solved.

duralisis
01-05-08, 01:19 PM
Bugger you ATI troll :).

Just post some screenshots and we can compare. The 8800GTS 512 is the same chip as the 8800GT I have, and I haven't had any IQ problems with it. Other than some minor gamma differences, IQ has been the same for me ever since the GF7 series.

I do however think there are quite a few bugs in the beta drivers since 169.21. The profile overrides have been broken ever since and only the global profile works, there are a bunch of odd game bugs (like the QuakeWars threaded renderer not working anymore), and I've seen a couple of posts regarding banding and OpenGL not respecting "High Quality" settings for texture filtering.

Since you got the 512 you might be out of luck for rolling back, but if there's an IQ problem, we should be able to tell the difference between 169.09 and the later drivers between our two cards.

GeniusPr0
01-05-08, 02:38 PM
try 169.28 betas or omegas 169.21

Elvin Presler
01-06-08, 12:57 AM
I hate to say it, but ATI cards really do have quite a bit better IQ. If you've just "upgraded" from ATI to Nvidia, it's going to look...blurry and bland for a while until you get used to it. Not trying to start anything, just stating the simple truth.

Even my old 9800 Pro looks better than my 7800GS in DVD or playing a game that both cards can handle. Obviously the 7800GS blows the 9800 Pro away in performance, but it does NOT look as good.

GeniusPr0
01-06-08, 12:14 PM
I hate to say it, but ATI cards really do have quite a bit better IQ. If you've just "upgraded" from ATI to Nvidia, it's going to look...blurry and bland for a while until you get used to it. Not trying to start anything, just stating the simple truth.

Even my old 9800 Pro looks better than my 7800GS in DVD or playing a game that both cards can handle. Obviously the 7800GS blows the 9800 Pro away in performance, but it does NOT look as good.

The 7 series IQ sucks; the 8800 IQ is much better and is on par with ATI's.

duralisis
01-06-08, 12:16 PM
Screenshots, please. You can't just say "the ATI xxxx looked so much better" when all the evidence (thousands of reviews and screenshot comparisons) is to the contrary. The description of a blurry and dull image leads me to believe it's your monitor rather than the chip itself.

andy_nv
01-06-08, 02:00 PM
I think it's more a matter of taste. With Nvidia every effect (HDR, water, textures...) kind of blends in, you get a cohesive picture that some find blurry. With ATI you can tell the effects apart and that makes IQ on Radeons look more vivid, almost cartoony to me.

Slammin
01-06-08, 03:34 PM
I hate to say it, but ATI cards really do have quite a bit better IQ. If you've just "upgraded" from ATI to Nvidia, it's going to look...blurry and bland for a while until you get used to it. Not trying to start anything, just stating the simple truth.

Even my old 9800 Pro looks better than my 7800GS in DVD or playing a game that both cards can handle. Obviously the 7800GS blows the 9800 Pro away in performance, but it does NOT look as good.



You're not starting anything. I said the same thing when I went from the 9800 to the 6800 and then to the 7800. My ATI card had better IQ than both, but Nvidia really got it together in the IQ dept with the 8800 series.

Elvin Presler
01-06-08, 05:24 PM
I was going to say, it was the same with my 6800 (I went 9800 Pro, 6800 GT, 7800 GS [RMA "upgrade" for the 6800 GT]). I can't wait to get an 8800+ then if they have improved the IQ that much.

As for screenshots, I can never see the difference in screenshots, but while I'm playing, the difference is pretty clear. DVD playback is hands down better on my old ATI card; don't know about an 8800.

holdenb
01-07-08, 12:17 PM
Screenshots, please. You can't just say "the ATI xxxx looked so much better" when all the evidence (thousands of reviews and screenshot comparisons) is to the contrary. The description of a blurry and dull image leads me to believe it's your monitor rather than the chip itself.


well that statement is obviously just stupid.....

it's the same monitor!!!

Anyway, I am not some fanboy slagging off ATI or Nvidia, I'm just asking if anyone else has noticed this.
I have no real allegiance as I own 8 computers with a variety of Nvidia/ATI hardware.

I mainly asked 'cos my 8800GTX IQ looks fine, but my GTS is a disappointment.

I'm not gonna post pics 'cos it still wouldn't illustrate the difference I am seeing......but then I'm not setting out to prove anything

duralisis
01-10-08, 10:33 AM
It's not about an allegiance to ATI or NVIDIA, it's simple troubleshooting.

You're complaining of a perceived decrease in image quality, but you say a screenshot won't show us the problem, so by simple deductive reasoning, it has to be a problem AFTER the GPU and framebuffer. That leaves a few possibilities:

1. Display
2. Cable
3. Port (DVI/VGA)
4. Filter components (VGA)

If you are using VGA, then it's a common problem: perceived sharpness differs among graphics cards, some better than others, and board component quality (the RF filters and RAMDAC) matters greatly.

With a VGA-connected LCD there is also a normal blur due to the digital-to-analog and then back-to-digital conversion that happens at the display. Each pixel is interpolated from the four surrounding elements to match the colour. This differs from DVI, where there is no interpolation at all.

If you're using DVI-D/I, then you should be getting an exact 1:1 image output from what your card sees in the framebuffer. The only time I've heard of blurry DVI output is with some HDTVs and certain timing methods.

Now you may be unable to see the difference in a screenshot, but several of us know exactly what to look for: what might be causing artifacting during motion, and how to spot IQ issues (which do exist from driver to driver). Overall, though, it's not a problem specific to the card. Generally these things are minor and may include aliasing, mipmap distances, banding, and shader bugs.
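The VGA vs DVI point above can be sketched in a few lines of toy Python (this is an illustration, not anything from the thread; the function names and the single blur coefficient are made up). The idea: the DAC/ADC round trip on a VGA-connected LCD resamples each scanline, so a display pixel ends up as a weighted mix of neighbouring framebuffer pixels, while DVI carries the values through 1:1.

```python
def vga_resample(scanline, phase=0.3):
    """Stand-in for the analog reconstruction + re-digitising step:
    sample the scanline at slightly offset positions, so each output
    pixel is a linear mix of two neighbouring source pixels."""
    out = []
    for i in range(len(scanline)):
        left = scanline[i]
        right = scanline[min(i + 1, len(scanline) - 1)]
        out.append((1 - phase) * left + phase * right)
    return out

def dvi_passthrough(scanline):
    """DVI: an exact 1:1 copy of the framebuffer values."""
    return list(scanline)

edge = [0, 0, 0, 255, 255, 255]   # a hard black-to-white edge
print(dvi_passthrough(edge))       # edge preserved exactly
print(vga_resample(edge))          # edge smeared: grey appears before the step
```

Running it shows the DVI path returns the edge untouched, while the "VGA" path turns the last black pixel grey, which is the kind of softness being described.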

mythy
01-10-08, 10:38 AM
Actually the major thing about the 8800 series is its outstanding IQ over ATI's 1900 and 2900 series :p Particularly in texture filtering.

holdenb
01-10-08, 02:53 PM
It's not about an allegiance to ATI or NVIDIA, it's simple troubleshooting.

You're complaining of a perceived decrease in image quality, but you say a screenshot won't show us the problem, so by simple deductive reasoning, it has to be a problem AFTER the GPU and framebuffer. That leaves a few possibilities:

1. Display
2. Cable
3. Port (DVI/VGA)
4. Filter components (VGA)

If you are using VGA, then it's a common problem: perceived sharpness differs among graphics cards, some better than others, and board component quality (the RF filters and RAMDAC) matters greatly.

With a VGA-connected LCD there is also a normal blur due to the digital-to-analog and then back-to-digital conversion that happens at the display. Each pixel is interpolated from the four surrounding elements to match the colour. This differs from DVI, where there is no interpolation at all.

If you're using DVI-D/I, then you should be getting an exact 1:1 image output from what your card sees in the framebuffer. The only time I've heard of blurry DVI output is with some HDTVs and certain timing methods.

Now you may be unable to see the difference in a screenshot, but several of us know exactly what to look for: what might be causing artifacting during motion, and how to spot IQ issues (which do exist from driver to driver). Overall, though, it's not a problem specific to the card. Generally these things are minor and may include aliasing, mipmap distances, banding, and shader bugs.





blah blah.........

again you're not following me........

I'm sure a screen grab WOULD illustrate my point.......

I was actually saying that I can't be bothered to re-install the X1950 XT with its stupid Catalyst Control Centre & bloated .NET Framework rubbish.
I say again, I am not here to lock horns with fanboys.....unfortunately all I am hearing is soundbites from posters about the 8800's better IQ than ATI etc etc

I repeat, I own pretty much all the high-end cards from the last few generations and really don't care which one I use.........But when I swapped from the x1950 to the GTS, performance went through the roof (x4 ish) but IQ at default driver settings was worse

mythy
01-10-08, 03:37 PM
Something is simply wrong with your drivers/settings. The 8800 should be a MASSIVE, noticeable improvement in textures :p I went from a 1900XT to an 8800 Ultra and it was night and day, and my GT is no different... I would wipe the drivers and try again.

Hapatingjaky
01-10-08, 04:50 PM
Here, this is easy. What titles are you playing that you don't like the image quality in? If you're refusing to take a screenshot, then one of us can load up the game, take a screenshot and compare it to yours.

holdenb
01-11-08, 07:47 AM
Here, this is easy. What titles are you playing that you don't like the image quality in? If you're refusing to take a screenshot, then one of us can load up the game, take a screenshot and compare it to yours.



Well, I would need someone with an x1950 XT.......I think the effect of the poor textures is particularly noticeable in 3DMark03 (no fancy lighting or bump mapping to confuse the issue)

So later I will post a shot from the 1st test of 3DMark03.....if someone wants to post a default pic from a Radeon x1950

Roscoe
02-07-08, 07:25 PM
I can do that for you...

currently running a X1950 Pro Cat 8.1, default settings

... just downloading 3dmark03...

I'm waiting on an Inno 8800GT 512mb being delivered next week.... turning back to the Dark Side for the first time since I replaced my GeF4200 with a Radeon 9700 ;)


edit: damnit! all the Gamershell zips of 3dmark03 are corrupted! waiting on a slower d/l from guru3d...

edit2: ok, here we go...

http://www.brooker.eclipse.co.uk/Web/3dmark/3dmark03 score.jpg
http://www.brooker.eclipse.co.uk/Web/3dmark/shot0010.bmp

now it's time I went to bed!

Drax
02-08-08, 01:22 AM
Well, what it may also be is this stupid move to not allow overriding of anti-aliasing settings. It usually goes to enhance instead, even if you have it set to override. I get a lot of shimmering in games where this happens, and overall IQ is always worse when using in-game AA or "enhance the application setting" AA versus override. In WoW and LotR Online I get a lot of sparkly artifacts around the edges of distant characters unless I have it set to 8xQ or higher.

Try changing the name of the .exe file you use to run whatever it is you're running and see if that improves things. That worked for WoW for me (though you can't have post-processing glow effects and forced AA at the same time, which is probably why they force enhance mode in that case).

Back when they allowed more freedom with AA settings, there was no question my 8800 was -at least- as good as my old ATI X800XT PE.