Fairly simple question.....


EazyDuz14
06-23-06, 01:59 PM
I was wondering. I can tweak to an extent, but I have never really messed with resolution and AA.

I have a 17" flat screen monitor with a "native" resolution of 1280x1024. What are the advantages of running at this res?
Also, what's better out of these two choices:

Higher res and lower AA
More AA and lower res


I can't decide; I seem to think the second one is the best choice though: better frame rates and sharper images.

A lot of people use 1280x1024 (in games which support it) and NO AA! Is that better than 1024x768 and 4x AA+?

Please reply with as much detail as you can (I'm really interested).

Thanks.

whiskeychaser
06-23-06, 02:08 PM
I can give you the simple answer. If you have the hardware to enable you to run at your native resolution, then that is where you should be. Native resolution = how many actual pixels your monitor has, so the sharpest picture is achieved when running at native resolution. Once you can run your favorite games at native resolution, then crank up the AA and AF to the levels that your CPU/video card can support. Good luck!
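To put some very rough numbers on the res-versus-AA question, here is a quick Python-style sketch. It is only back-of-the-envelope arithmetic, and it assumes 4x AA costs roughly four samples per pixel, which real hardware doesn't always pay in full:

# rough sample-count comparison for the two options in the original post
native_no_aa = 1280 * 1024        # 1,310,720 pixels at native res, no AA
lower_4x_aa  = 1024 * 768 * 4     # 3,145,728 samples at 1024x768 with ~4x multisampling

print(native_no_aa)                # 1310720
print(lower_4x_aa)                 # 3145728
print(lower_4x_aa / native_no_aa)  # ~2.4x more raw samples for the low-res + 4x AA option

By that crude count, native res with no AA isn't automatically the more expensive option, which is one more reason to prefer it on a 1280x1024 panel.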

flukester
06-23-06, 03:27 PM
I was wondering. I can tweak to an extent, but I have never really messed with resolution and AA.

I have a 17" flat screen monitor with a "native" resolution of 1280x1024. What are the advantages of running at this res?
Also, what's better out of these two choices:

Higher res and lower AA
More AA and lower res


I can't decide; I seem to think the second one is the best choice though: better frame rates and sharper images.

A lot of people use 1280x1024 (in games which support it) and NO AA! Is that better than 1024x768 and 4x AA+?

Please reply with as much detail as you can (I'm really interested).

Thanks.


AA is so insignificant these days. If you are running at 12x10 then, depending on your card, I'd pump up AF and leave AA very low or off. I typically run everything with AF and no AA. Jaggies, depending on your game, are insignificant. AF however makes things look a lot better: AA rounds edges, AF gives depth to the graphics you see. When you throw HDR into the game, AA becomes even more senseless, like in Oblivion. nVidia 'could' put out a fixed driver to run AA and HDR together, I've read this, but they don't care, and most nVidia gamers probably wouldn't care anyhow because AA doesn't really add any true value to the game.

Wasn't sure if that last guy's answer was what you wanted, so this is my personal experience.

einstein_314
06-23-06, 04:16 PM
AA is so insignificant these days. If you are running at 12x10 then, depending on your card, I'd pump up AF and leave AA very low or off. I typically run everything with AF and no AA. Jaggies, depending on your game, are insignificant. AF however makes things look a lot better: AA rounds edges, AF gives depth to the graphics you see. When you throw HDR into the game, AA becomes even more senseless, like in Oblivion. nVidia 'could' put out a fixed driver to run AA and HDR together, I've read this, but they don't care, and most nVidia gamers probably wouldn't care anyhow because AA doesn't really add any true value to the game.

Wasn't sure if that last guy's answer was what you wanted, so this is my personal experience.
I wouldn't say AA is insignificant... Maybe it's just me, but I can't stand ANY game without at least 2x AA. And this is on my 24" LCD at 1920x1200 resolution.

So, to answer the original question: native resolution is the number of physical pixels your LCD has. To run at a lower resolution, the monitor has to stretch each image pixel across more than one physical pixel, which results in a grainy or slightly blurry picture. If your video card can handle it, I would recommend always running at the native resolution. As for AA, I tend to put AF at 8x and then play with the AA setting until it is playable and smooth.
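As a quick sketch of why that looks soft on this particular panel (purely illustrative arithmetic, nothing card-specific):

# 1024x768 stretched onto a 1280x1024 grid is a non-integer scale,
# so most image pixels land across two physical pixels and get blended
scale_x = 1280 / 1024   # 1.25
scale_y = 1024 / 768    # ~1.33, and the aspect ratio shifts too (5:4 panel vs 4:3 image)
print(scale_x, scale_y)

That blending across pixel boundaries is where the slightly fuzzy look at non-native resolutions comes from.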

For me, I need at least 2x AA, so sometimes I have to lower graphical settings (like texture res and shadow detail) in the game so that I can use AA. This may not be the case for you though. If you can stand not having any AA, go for that.

EazyDuz14
06-23-06, 04:55 PM
Thanks for the replies.
Oh, also, why use 8x AF as well as high texture settings in a game? It's the equivalent of using AA in the CP and AA in the game settings; there is no need. I was just playing the Prey demo and I tried high in-game texture settings with app-controlled AF, then 8x AF, and it made no difference!

I also tried the demo at 1280x1024 with 2x AA and yeah, images look a lot sharper. Man, that game's like real life! It ran surprisingly well for a brand new title, although I was only running around in some house. But still, my FPS never dropped below 30 (well, maybe when you look in that mirror).
My 6600GT isn't as good as it once was; better cards are taking over. I'm going to pay big money for a nice DX10 card though :)

EazyDuz14
06-24-06, 04:37 AM
I can't seem to get Condemned to work very well either. Same with FEAR.
It doesn't matter what settings I have, but sometimes when I look at certain things my FPS drops into the 20s. For example, in Condemned (if anyone has it), on the metro level (level 2), there is fencing all along the right hand side as you walk down the track. If I have the torch on and look at the fence, my FPS goes really low; when I'm not looking at it, my FPS is 40-80+. It also happened in FEAR when I'm in a large, well-lit room.
I'm going to install the new Nvidia drivers (91.13, I think?) because they have some new control panel thing.
Does anyone else get this in FEAR or Condemned?

Please reply.

|MaguS|
06-24-06, 04:41 AM
Condemned and FEAR are very intensive games; they require a great PC to run at high settings. Make sure you are not running AA with them and the game settings aren't set too high.

You have a 6600GT correct? I wouldn't expect to play either game at a high frame rate.

EazyDuz14
06-24-06, 04:49 AM
You have a 6600GT correct? I wouldn't expect to play either game at a high frame rate.

Well, I managed to complete FEAR with the following important settings:

In the control panel:

4x AA
App-controlled AF

In game:
All of the computer section on highest
Lighting effects: highest
Shadows: none
Shaders: low
Pixel doubling: no
Res: 1024x768

Everything else was on med/high, and in some parts of the game I had an FPS of 60-120. But in some parts, one of which I remember was standing on the edge of an elevator shaft, the FPS drops to 20 or around that area. I don't get why, and nothing I can do really helps. Same goes for Condemned. Must be a poorly optimized game...

|MaguS|
06-24-06, 05:01 AM
Honestly, I don't see why you are running with any AA; sure, the game looks a bit more jagged without it, but AA is such a huge performance killer for a 6600GT. You should just have it off and up the resolution and settings.

My roommate has a 6600GT and never uses AA and is able to play a game like Oblivion fine at 1280x1024.

EazyDuz14
06-24-06, 05:30 AM
My card can handle AA fine though. I have tried running at 1280x1024 with no AA in Condemned, but I still get a huge FPS drop when I turn to look at the fence. Turning the torch on kills some FPS too. Why does it do this?

EazyDuz14
06-25-06, 05:01 AM
What would you say this setup is (low, med, or high settings)?

CP:
2xQ AA
App-controlled AF

In game:

All of the computer section on high
High lighting effects
Effects on high
Shadows off
1024x768

Texture quality is on 4x AF

These are the largest FPS killers in Condemned.
Would you say these are low settings, medium, or high?

|MaguS|
06-25-06, 05:30 AM
High settings. Even though your resolution is low, you are putting lighting effects on high, which causes a lot of performance drop.

It's like running HDR in Oblivion: even at 1024x768 it is still a huge performance loss.

EazyDuz14
06-25-06, 05:40 AM
High settings. Even though your resolution is low, you are putting lighting effects on high, which causes a lot of performance drop.

It's like running HDR in Oblivion: even at 1024x768 it is still a huge performance loss.

Cool :) I'm quite pleased with your reply, as my card is over a year old now; I guess it's still classed as high! I was expecting med/low really. I tried using 1280x1024 with no AA; I thought it would perform better than 4x AA at 1024x768, but it didn't.
I don't know if you have got/played the game, but it runs in widescreen. I couldn't stand that, so I added this to the config file:

"Widescreen" "4.3"

to make it use the full screen. This however kills a lot of FPS (about 15), so I couldn't run shadows as well as full screen. It was either shadows, lighting and the letterbox effect, or full screen with no shadows but high lighting.
So I guess it's even better considering I'm playing it in full screen, as when people say "I play Condemned with high shadows etc..." a lot of them mean with the letterbox effect, which I can't stand.

|MaguS|
06-25-06, 05:44 AM
The 6600GT isn't a bad card, but it's not a card you want to push many shader-intensive things through. My roommate plays Oblivion at 1024x768 with slightly higher than auto-config settings, but he also hasn't tweaked his ini (he won't let me touch it, lol).

EazyDuz14
06-25-06, 05:49 AM
The 6600GT isn't a bad card, but it's not a card you want to push many shader-intensive things through. My roommate plays Oblivion at 1024x768 with slightly higher than auto-config settings, but he also hasn't tweaked his ini (he won't let me touch it, lol).


Maybe mine is slightly more powerful because I added a 20 MHz overclock to the 3D and memory clock settings.

EazyDuz14
06-25-06, 05:51 AM
The 6600GT isn't a bad card, but it's not a card you want to push many shader-intensive things through. My roommate plays Oblivion at 1024x768 with slightly higher than auto-config settings, but he also hasn't tweaked his ini (he won't let me touch it, lol).


Maybe mine is slightly more powerful because I added a 20 MHz overclock to the 3D and memory clock settings.

Oops, my browser wouldn't post my message. Isn't there a way to delete all of these?

|MaguS|
06-25-06, 06:00 AM
Hehehe

20 MHz isn't a noticeable OC, I believe, but I have no real way of testing it. The 7 series works differently and my roommate's card can't OC at all (even a 1 MHz OC fails).
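As a rough sketch of why a 20 MHz bump gets lost in the noise (assuming roughly stock 6600GT clocks of about 500 MHz core and 900-1000 MHz effective memory; check your own card's actual clocks):

# percentage gain from a +20 MHz bump on assumed stock clocks
core_stock = 500               # MHz core, assumed
mem_stock  = 900               # MHz effective memory, assumed
print(20 / core_stock * 100)   # ~4% on the core
print(20 / mem_stock * 100)    # ~2% on the memory

A few percent at best, which is usually smaller than normal run-to-run FPS variation, so you'd struggle to notice it.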

EazyDuz14
06-25-06, 06:02 AM
Hehehe

20 MHz isn't a noticeable OC, I believe, but I have no real way of testing it. The 7 series works differently and my roommate's card can't OC at all (even a 1 MHz OC fails).

Maybe I should put it back to defaults then. I did mean 20 MHz each, not in total. I let Nvidia run its optimizer for the safest settings and put Condemned on, but I saw yellow flickering, so I instantly stopped the game and put it back. So much for safe optimization :(

|MaguS|
06-25-06, 06:04 AM
The automatic overclocking isn't accurate at all... It's just good for getting an estimate, not a permanent solution.

Blacklash
06-25-06, 06:06 AM
Any time you are having FPS issues in a game try disabling or greatly lowering shadows. Many games are coded to do them exclusively on the CPU.

|MaguS|
06-25-06, 06:25 AM
Any time you are having FPS issues in a game try disabling or greatly lowering shadows. Many games are coded to do them exclusively on the CPU.

What? Most shadows are rendered on the GPU; that's why, when a game has a driver bug, shadows are usually the first thing to break. I can't even think of a single 3D-accelerated game that renders shadows on the CPU.