Force AntiAlias not working in Lightwave 3D

11-05-04, 06:46 AM
Hi there,

I have two Nvidia boards, an Asus FX5200 and an Asus Ti4600 that both have the same problem.
I use Lightwave 3D to visualise designs, and I often take screenshots to keep track of my progress. Lightwave uses OpenGL, so I want to enable the anti-aliasing settings of the video cards to make the screenshots look better. There are no options within Lightwave itself to turn AA on or off, so I tried overriding the 'application preference' switch in the drivers of both cards. Unfortunately, neither card then actually shows antialiased lines, on any quality setting.

I have an ATi 9600 as well, and that one performs AA in Lightwave as expected, although after a while it crashes the system, so I'd rather use the Nvidia boards. So AA clearly can work in Lightwave. One thing I noticed in the ATi settings is that AA is only available up to a certain resolution, so I tried bringing the resolution way down, but that did not help. Frankly, I don't know what to do. I updated both machines to the latest official ForceWare release, with a proper uninstall/reinstall procedure.

The systems both run Win XP Pro on Pentium 4 2.2 GHz machines with MSI motherboards and more than 1.0 GB of RAM. They are otherwise stable and fast.

11-05-04, 10:57 AM
You need a real Nvidia Quadro FX card for hardware OpenGL AA in DCC apps like Lightwave.

11-05-04, 11:00 AM
So it just *won't* do it even though it's capable?
How annoying...

11-05-04, 11:29 AM
Go to Guru3D and get RivaTuner; you can SoftQuadro your 5200.

11-07-04, 09:22 AM
Excellent, I'll try that, thanks!

Mr. Tinker
01-21-05, 08:53 AM
I can get OpenGL AA in LW with my 6800. You just have to make a profile.
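Since the original goal was cleaner screenshots rather than real-time AA, one software fallback (not mentioned in the thread, just a sketch of the same idea the hardware uses) is supersampling by hand: capture the viewport at double resolution, then downsample with a 2x2 box filter. A minimal Python sketch, assuming the screenshot is already available as nested lists of RGB tuples (the function name and data layout are hypothetical, not from any Lightwave or driver API):

```python
def downsample_2x(pixels):
    """2x2 box-filter downsample of an RGB image.

    `pixels` is a list of rows, each a list of (r, g, b) tuples,
    with even width and height. Averaging each 2x2 block mimics
    what ordered-grid supersampling AA does in hardware: render
    at 2x resolution, then average the samples per output pixel.
    """
    out = []
    for y in range(0, len(pixels), 2):
        row = []
        for x in range(0, len(pixels[y]), 2):
            block = [pixels[y][x], pixels[y][x + 1],
                     pixels[y + 1][x], pixels[y + 1][x + 1]]
            # Integer-average each channel over the four samples.
            row.append(tuple(sum(c[i] for c in block) // 4
                             for i in range(3)))
        out.append(row)
    return out
```

A hard black/white edge at 2x comes out as an intermediate grey at 1x, which is exactly the smoothing effect the poster was after in the screenshots.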