nV News Forums (http://www.nvnews.net/vbulletin/index.php)
-   NVIDIA Linux (http://www.nvnews.net/vbulletin/forumdisplay.php?f=14)
-   -   AIGLX and CPU consumption (http://www.nvnews.net/vbulletin/showthread.php?t=80229)

Rino 11-12-06 05:26 PM

AIGLX and CPU consumption
 
I noticed that AIGLX consumes a lot of CPU (about 30% when I move a window, sometimes up to 100% when I rotate the cube very fast). AIGLX is said to be designed not to consume that much CPU power. What causes such high CPU consumption? Do you have any plans to optimize AIGLX support? I have a Xelo FX5200, Fedora Core 6, and the 1.0.9629 drivers (Livna's package; same result with the nVidia generic tarball package). I tried AIGLX with an integrated Intel graphics card and it works perfectly, consuming almost no CPU power.

Thanks in advance for any answers.
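
A quick way to narrow down where the time goes is to check whether direct rendering is active and to watch the X server's CPU share while dragging a window. A hedged diagnostic sketch (glxinfo ships in the distro's mesa-utils/glx-utils package; the exact output strings are from memory):

$ glxinfo | grep "direct rendering"
direct rendering: Yes
$ glxinfo | grep -i texture_from_pixmap
    GLX_EXT_texture_from_pixmap, ...
$ top -b -d 1 -n 10 | grep Xorg

If it is the Xorg process itself that eats the CPU while a window moves, the compositing path is probably falling back to software somewhere.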

lloeki 11-13-06 02:03 AM

Re: AIGLX and CPU consumption
 
with nvidia, you don't need AIGLX. nvidia has its own implementation of tex_from_pixmap in GLX. thus on an nvidia card, using AIGLX makes tex_from_pixmap run in software, whereas with intel cards, the driver being open source, it may be accelerated even under AIGLX. using GLX with nvidia makes tex_from_pixmap accelerated again, currently with a few caveats (see the black windows topics).

to nvidia forum mods/devs: this should be made clear once and for all. you should really make a sticky with a clear title explaining that AIGLX != nvidia tex_from_pixmap. that would clear up a lot of confusion in the feedback, as more and more people assume that ( compiz & !XGL ) => AIGLX, even with nvidia.
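
To make lloeki's direct-versus-indirect distinction concrete: the same compositor can be started either way. A hedged sketch, assuming a compiz build of that era (flag and plugin names vary between distros, so treat the exact invocation as an assumption):

$ compiz --replace gconf &                        # on the nvidia GLX stack: direct rendering
$ compiz --replace --indirect-rendering gconf &   # on an AIGLX stack: indirect, via the X server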

Rino 11-13-06 03:26 AM

Re: AIGLX and CPU consumption
 
Quote:

Originally Posted by lloeki
with nvidia, you don't need AIGLX. nvidia has its own implementation of tex_from_pixmap in GLX. thus on an nvidia card, using AIGLX makes tex_from_pixmap run in software, whereas with intel cards, the driver being open source, it may be accelerated even under AIGLX. using GLX with nvidia makes tex_from_pixmap accelerated again, currently with a few caveats (see the black windows topics).

OK, so how can I get all these fancy effects without AIGLX? I've read somewhere that nVidia will support AIGLX rather than XGL. I don't really care much which technology I use; I just want it to do the job.

a7v 11-13-06 03:57 AM

Re: AIGLX and CPU consumption
 
Being a sceptic with experience running computers with flaky graphics cards, I haven't tried out 3D-accelerated desktops yet, but you should probably start by reading HOWTO: Compiz with NVIDIA Graphics Drivers. There is probably also a section on the same subject in the README that came with your nvidia drivers (installed at /usr/share/doc/nvidia-glx/README.txt.gz for me, but probably somewhere else in other distros).
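
For reference, a hedged sketch of where that README tends to live and the kind of xorg.conf lines the HOWTO walks through (paths and option names are from memory and differ between driver versions and distros):

$ zless /usr/share/doc/nvidia-glx/README.txt.gz     # packaged driver, as in the path above
$ less /usr/share/doc/NVIDIA_GLX-1.0/README.txt     # the nvidia-installer's usual location
$ grep -iE 'composite|addargbglxvisuals' /etc/X11/xorg.conf
        Option "AddARGBGLXVisuals" "True"
        Option "Composite" "Enable"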

ioannis 11-13-06 07:32 AM

Re: AIGLX and CPU consumption
 
Quote:

Originally Posted by Rino
I noticed that AIGLX consumes a lot of CPU (about 30% when I move a window, sometimes up to 100% when I rotate the cube very fast). AIGLX is said to be designed not to consume that much CPU power. What causes such high CPU consumption? Do you have any plans to optimize AIGLX support? I have a Xelo FX5200, Fedora Core 6, and the 1.0.9629 drivers (Livna's package; same result with the nVidia generic tarball package). I tried AIGLX with an integrated Intel graphics card and it works perfectly, consuming almost no CPU power.

Thanks in advance for any answers.


(Correct me if I'm wrong.) By installing the nvidia driver (>= 9625) you are using nvidia's implementation of the GLX_EXT_texture_from_pixmap extension (provided by their libglx.so). You do need Xorg 7.1, which happens to be AIGLX-enabled (that's the native libglx.so, which the nvidia driver replaces), and that, in my opinion, is what causes the confusion.
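
One way to see whose libglx the server actually loaded is the X log. A hedged sketch (the log path and message wording may differ between setups):

$ grep -i glx /var/log/Xorg.0.log | head
(II) Loading extension GLX
(II) NVIDIA GLX Module  1.0.9629  ...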

Rino 11-13-06 01:28 PM

Re: AIGLX and CPU consumption
 
Quote:

Originally Posted by ioannis
(Correct me if I'm wrong.) By installing the nvidia driver (>= 9625) you are using nvidia's implementation of the GLX_EXT_texture_from_pixmap extension (provided by their libglx.so). You do need Xorg 7.1, which happens to be AIGLX-enabled (that's the native libglx.so, which the nvidia driver replaces), and that, in my opinion, is what causes the confusion.

$ locate libglx.so
/usr/lib/xorg/modules/extensions/libglx.so
/usr/lib/xorg/modules/extensions/nvidia/libglx.so
/usr/lib/xorg/modules/extensions/nvidia/libglx.so.1.0.9629

$ rpm -qf /usr/lib/xorg/modules/extensions/libglx.so \
> /usr/lib/xorg/modules/extensions/nvidia/libglx.so \
> /usr/lib/xorg/modules/extensions/nvidia/libglx.so.1.0.9629
xorg-x11-server-Xorg-1.1.1-47.fc6.i386
xorg-x11-drv-nvidia-1.0.9629-1.lvn6.i386
xorg-x11-drv-nvidia-1.0.9629-1.lvn6.i386

You may be right. But AIGLX works, and it didn't work with the previous version of the nVidia proprietary driver. Hm...
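
Given the two libglx.so files in that listing, which one the server ends up loading usually comes down to the module search path the package configures. A hedged guess at how to check (the ModulePath mechanism is an assumption about how the Livna package wires things up):

$ grep -i modulepath /etc/X11/xorg.conf
        ModulePath  "/usr/lib/xorg/modules/extensions/nvidia"
        ModulePath  "/usr/lib/xorg/modules"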

ioannis 11-13-06 05:45 PM

Re: AIGLX and CPU consumption
 
Quote:

Originally Posted by Rino
You may be right. But AIGLX works, and it didn't work with the previous version of the nVidia proprietary driver. Hm...

I'm not sure what you mean by "AIGLX works". How can you test this while using nvidia's closed driver package? I haven't tried restoring Xorg's libglx to see whether it works with the nvidia driver, though, which might be what you are saying.
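
One hedged way to tell which server-side GLX a client is actually talking to is the vendor string glxinfo reports (the exact strings are from memory): nvidia's libglx identifies itself as NVIDIA, while Xorg's own AIGLX-capable libglx reports SGI.

$ glxinfo | grep "server glx vendor"
server glx vendor string: NVIDIA Corporation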

I find the following helpful for understanding the subject: Communication between Xgl and Xorg. It mainly talks about Xgl, but it also shows what things look like with AIGLX. The whole confusion comes from the fact that the nvidia devs provide their own accelerated indirect GLX implementation via their libglx.so.

To the nvidia devs: what's the reason behind a custom implementation of libglx.so? It looks very redundant to me. You must have your reasons.

lloeki 11-14-06 11:34 AM

Re: AIGLX and CPU consumption
 
Quote:

their own accelerated indirect GLX implementation via their libglx.so
because it's not indirect. it's GLX, not AIGLX.

please, please, please, people: properly distinguish between these really different things:
- GLX_EXT_texture_from_pixmap: the extension needed for an X-rendered window to become a texture that OpenGL can handle
- XGL: one solution to implementing the above, where two X servers run and one relays things to the other
- AIGLX: another solution, where things are rendered indirectly offscreen and then put on screen, which induces an overhead
- GLX: the usual OpenGL+X solution, where OpenGL output is rendered directly on the X screen. this is nvidia's way of implementing tex_from_pixmap.

Quote:

To the nvidia devs: what's the reason behind a custom implementation of libglx.so?
with regard to tex_from_pixmap: because of the above

anybody, feel free to correct me if I'm wrong about any of that

