I have a similar setup and similar problems. Here is my setup:
GF 7800 GT PCIe (2 displays)
GF 6200 PCIe (1 display)
Debian amd64 testing + X.org 7.1 from unstable + 9625 driver
I have Xinerama up and running, but it is quite slow (I guess X falls back to software rendering when spanning two cards). I tried compiz, hoping it might use some hardware acceleration, but I got the same error about RandR. It works with TwinView on a single card, but not with all three heads.
The problems persist when I use only two screens on the same card, with Xinerama instead of TwinView: although X.org's log says 'RandR enabled', xrandr -q tells me RandR is not available. I'm not sure whether this extension is simply not available with Xinerama (yet), or whether this is a bug.
However, compiz seems to use RandR only to determine the screen refresh rate, so I removed every XRR* call, set the refresh rate to a fixed value, and ignored the error. My hacked compiz works with TwinView but segfaults with Xinerama.
The code where compiz segfaults is in addScreen:
glXMakeCurrent (dpy, s->output, s->ctx);
currentRoot = s->root;
glExtensions = (const char *) glGetString (GL_EXTENSIONS);
// print some debug output..
fprintf( stderr, "Got Extensions on %d: %s \n", screenNum, glExtensions );
When I start compiz with --indirect-rendering, glGetString returns NULL, which is not good. Without --indirect-rendering, ltrace reports the segfault right after glXMakeCurrent.
Is this a problem in compiz not handling multihead correctly (TwinView appears as a single screen, since this code is only called once)? Or is there some OpenGL-is-only-supported-on-one-screen limitation, and if so, what can I do about it?
I also noticed that gtk-window-decorator exits with a 'RenderBadPicture' error, but I haven't looked into that one yet.