05-06-04, 05:46 PM   #1
sxvinc
Registered User
 
Join Date: May 2004
Posts: 3
Default GeForce FX 5200 and antialias performance

Hi,

I have a GeForce FX 5200 in a 2.4 GHz P4 running RedHat 7.2 (XFree 4.1). I am running an OpenGL application.

I am a newcomer to graphics and to OpenGL in particular. I naively expected antialiasing to be done entirely on the graphics card, but I find that when I enable antialiasing, CPU usage goes up substantially, and the better the antialiasing mode I choose, the higher the CPU usage. Without antialiasing my test app runs at about 1% total CPU (test app plus X); turning on 8 bit multisample drives this up to about 65%, most of it used by the test app.
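For reference, here is a simplified sketch of how I understand a multisample visual gets requested through GLX (using the GLX_ARB_multisample tokens); my real test app's setup may differ, and the sample count of 4 below is just an example:

/* Sketch: request a GLX visual with multisample antialiasing.
 * Assumes the GLX_ARB_multisample extension; older headers may
 * not define the tokens, so they are defined here if missing. */
#include <stdio.h>
#include <GL/glx.h>

#ifndef GLX_SAMPLE_BUFFERS_ARB
#define GLX_SAMPLE_BUFFERS_ARB 100000
#define GLX_SAMPLES_ARB        100001
#endif

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }

    /* RGBA, double-buffered visual with one multisample buffer
     * and 4 samples per pixel (example value). */
    int attribs[] = {
        GLX_RGBA,
        GLX_DOUBLEBUFFER,
        GLX_SAMPLE_BUFFERS_ARB, 1,
        GLX_SAMPLES_ARB,        4,
        None
    };

    XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
    if (!vi) {
        fprintf(stderr, "no multisample visual available\n");
        XCloseDisplay(dpy);
        return 1;
    }

    printf("got multisample visual 0x%lx\n", (unsigned long)vi->visualid);

    /* ... the window and GLX context would be created with this visual;
     * the extension also allows glEnable(GL_MULTISAMPLE_ARB) to toggle
     * multisampling at run time ... */

    XFree(vi);
    XCloseDisplay(dpy);
    return 0;
}

I believe the NVIDIA Linux driver can also force FSAA for every GL app through the __GL_FSAA_MODE environment variable, so the mode may be coming from the driver rather than from the app itself; I mention it only in case it affects where the CPU time is going.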

My specific question is whether getting a GeForce FX 5600 would help, since it has Intellisample and the FX 5200 does not.

A more general question: how is the antialiasing actually done? Obviously it cannot all be done in software, or every card would offer the same antialiasing functionality, and they do not (e.g. GeForce MX vs. GeForce FX). So why isn't everything done on the card? What are the respective responsibilities of the software and the hardware when it comes to antialiasing?

Thank you for your help.

charlie

[BTW, my ultimate goal is to minimize CPU usage rather than to maximize frame rate, although the two are related.]