Problems with XRender + XDBE
I'm writing an app that draws data onto the root window of X at 25 FPS.
If I use only the DBE extension (double buffering), it works fine, consuming 0 to 0.7 percent CPU.
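For reference, here's roughly what the DBE path looks like. This is a minimal sketch rather than my actual code (error checking and the XdbeQueryExtension call are omitted; compile with -lX11 -lXext):

Code:
#include <unistd.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xdbe.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    Window root = DefaultRootWindow(dpy);

    /* Allocate a back buffer for the root window; XdbeBackground
       clears it to the window background on every swap. */
    XdbeBackBuffer back =
        XdbeAllocateBackBufferName(dpy, root, XdbeBackground);
    GC gc = XCreateGC(dpy, back, 0, NULL);

    for (;;) {
        /* Draw the new frame into the back buffer only. */
        XFillRectangle(dpy, back, gc, 10, 10, 100, 100);

        /* Make it visible in one atomic swap. */
        XdbeSwapInfo swap = { root, XdbeBackground };
        XdbeSwapBuffers(dpy, &swap, 1);
        XFlush(dpy);

        usleep(40000); /* ~25 FPS */
    }
}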
Sometimes, though, for reasons completely unknown to me, the exact same operation takes 25% to 35% CPU!
Now I've decided to use XRender to draw translucent blocks instead of solid ones. Without double buffering (i.e., no DBE code), this works and takes about 10% of my CPU time. But again, I need the double buffer, and now XRender totally flips out and takes 100% of my CPU, even if the block I'm rendering is as small as 5 pixels! If I render to the front buffer instead of the back buffer, the buffer swap alone still leads to about 60% CPU usage.
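The XRender part boils down to something like this (again just a sketch, not my actual code; it reuses dpy and back from the snippet above and additionally links with -lXrender):

Code:
#include <X11/Xlib.h>
#include <X11/extensions/Xdbe.h>
#include <X11/extensions/Xrender.h>

/* Fill one translucent block into the DBE back buffer. */
void draw_translucent_block(Display *dpy, XdbeBackBuffer back)
{
    /* A Picture wraps a drawable for RENDER operations; the DBE back
       buffer is an ordinary drawable, so this is legal. */
    XRenderPictFormat *fmt =
        XRenderFindVisualFormat(dpy, DefaultVisual(dpy, DefaultScreen(dpy)));
    Picture dst = XRenderCreatePicture(dpy, back, fmt, 0, NULL);

    /* 50% translucent red; RENDER colors are premultiplied,
       16 bits per channel. */
    XRenderColor red = { 0x7fff, 0x0000, 0x0000, 0x7fff };

    /* PictOpOver blends over whatever is already in the buffer. */
    XRenderFillRectangle(dpy, PictOpOver, dst, &red, 10, 10, 100, 100);

    XRenderFreePicture(dpy, dst);
}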
Ok, so combining RENDER and DBE doesn't work at all!
With the 'nv' driver, though, it works very well! Using only DBE, the CPU usage is indistinguishable from the 'nvidia' driver on the occasions when that one operates correctly (0-0.7% CPU). Using DBE + XRender, I get a reasonable CPU usage of ~15%. That still sucks, but it at least makes some sense if XRender is not hw-accelerated there.
I would be happy to help out fixing this issue. It's sad that the nvidia driver fails at something the nv driver is able to achieve.