nV News Forums > Linux Support Forums > NVIDIA Linux

05-06-04, 05:46 PM   #1
sxvinc
Registered User
Join Date: May 2004
Posts: 3

GeForce FX 5200 and antialias performance

Hi,

I have a GeForce FX 5200 in a 2.4 GHz P4 running RedHat 7.2 (XFree 4.1). I am running an OpenGL application.

I am a newcomer to graphics and OpenGL in particular. I naively expected antialiasing operations to be done on the graphics card, but I find that when I enable antialiasing it drives the CPU load up substantially; the stronger the antialiasing mode I choose, the higher the corresponding CPU load. Without antialiasing my test app runs at about 1% total CPU (test app plus X). Turning on 8 bit multisample drives this up to about 65%, most of it used by the test app.

My particular question is whether getting a GeForce FX 5600 card will help, since it has Intellisample and the FX 5200 doesn't.

A general question: how is the antialiasing done? Obviously it cannot all be done in software, or all cards would have the same functionality, and they do not (e.g. GeForce MX vs. GeForce FX). Why isn't everything done on the card? What are the responsibilities of the software and of the hardware when it comes to antialiasing?

Thank you for your help.

charlie

[BTW, my ultimate concern is to minimize CPU usage as opposed to maximizing frame rate. (Although these are related)]
05-07-04, 05:54 AM   #2
Thunderbird
Join Date: Jul 2002
Location: Netherlands, Europe
Posts: 2,105

Re: GeForce FX 5200 and antialias performance

Personally I don't have much OpenGL programming experience, so I don't know which NVIDIA FSAA mode this corresponds to. I may be off base, but you might be requesting an FSAA mode that isn't supported by your card, or NVIDIA may be using different algorithms to get similar FSAA results. Have you already experimented with the __GL_FSAA_MODE environment variable to see whether those settings have a similar impact on CPU usage?
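A quick sketch of how that experiment could be run: the NVIDIA GL driver reads __GL_FSAA_MODE once at process startup, so it can be set per-invocation on the command line. The app name below is a placeholder for the poster's test application, and the mode numbers are examples; the meaning of each value depends on the GPU, as listed in the driver's README.

```shell
# __GL_FSAA_MODE is a per-process override read by the NVIDIA GL driver
# at startup (0 = FSAA off; higher values select stronger modes per the
# driver README's table). "./testapp" is a placeholder name.
APP=./testapp

if [ -x "$APP" ]; then
    __GL_FSAA_MODE=0 "$APP"   # baseline: antialiasing off
    __GL_FSAA_MODE=4 "$APP"   # a multisample mode, for comparison
else
    # Without the app present, just demonstrate that the variable
    # propagates to a child process the way the driver would see it.
    env __GL_FSAA_MODE=4 sh -c 'echo "__GL_FSAA_MODE=$__GL_FSAA_MODE"'
fi
```

Comparing CPU usage (e.g. in top) between the two invocations should show whether the driver's own FSAA modes behave differently from whatever mode the application requests through OpenGL.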
05-07-04, 06:29 AM   #3
SuLinUX
Join Date: Sep 2003
Location: UK
Posts: 847

Re: GeForce FX 5200 and antialias performance

Antialiasing is done on the GPU; how much of the chip is devoted to AA is up to the hardware designers. The drivers also have a say in how it's done: the 5336 driver is rather better at antialiasing than the 4620. Getting a card with more memory bandwidth/fillrate greatly decreases the load.

I found that the 4620 is rather poorer at AA, but general performance with AA off is better than with 5336, about 10 fps more in fact.
__________________
AthlonXP 2600+ / nForce2 Asus A7N8X-X / PNY GeForce FX5900 Ultra / 1024Mb Samsung Ram /nForce Sound / Hansol 920D Plus 19" monitor / Lite-On 32x12x40 / 2x Maxtor HD 40Gb/80Gb / nVidia 7174 driver / Gnome 2.10.1 / Kernel 2.6.11.9 / Slackware 10.0