Old 02-15-11, 01:02 PM   #1
LubosD
Registered User
 
Join Date: Jan 2005
Location: Czech Republic
Posts: 451
VDPAU via libva 1.0.8 crashing

Hi, I'm trying to use VDPAU in VLC, which goes through libva (VA-API). After a Debian update to libva 1.0.8, VLC crashes during playback when hardware decoding kicks in; it used to work fine with libva 1.0.1.

Code:
Program received signal SIGSEGV, Segmentation fault.
[Switching to Thread 0x7fffca2a0700 (LWP 21214)]
0x00007fffed201e68 in XQueryExtension () from /usr/lib/libX11.so.6
(gdb) bt
#0  0x00007fffed201e68 in XQueryExtension () from /usr/lib/libX11.so.6
#1  0x00007fffed1f6062 in XInitExtension () from /usr/lib/libX11.so.6
#2  0x00007fffeb94a712 in XextAddDisplay () from /usr/lib/libXext.so.6
#3  0x00007fffc9972821 in ?? () from /usr/lib/libvdpau_nvidia.so
#4  0x00007fffc990285d in vdp_imp_device_create_x11 () from /usr/lib/libvdpau_nvidia.so
#5  0x00007fffc9e93e90 in __vaDriverInit_0_31 () from /usr/lib/dri/nvidia_drv_video.so
#6  0x00007fffcde11242 in vaInitialize () from /usr/lib/libva.so.1
#7  0x00007fffce03e8e2 in Open (i_codec_id=<value optimized out>) at vaapi.c:144
#8  vlc_va_NewVaapi (i_codec_id=<value optimized out>) at vaapi.c:490
#9  0x00007fffce03b3b5 in ffmpeg_GetFormat (p_codec=<value optimized out>, pi_fmt=0x7fffcd3f6c94) at video.c:1177
#10 0x00007fffcd173cfd in ?? () from /usr/lib/libavcodec.so.52
#11 0x00007fffcd173dbc in ?? () from /usr/lib/libavcodec.so.52
#12 0x00007fffcd273e80 in avcodec_decode_video2 () from /usr/lib/libavcodec.so.52
#13 0x00007fffcd273f0f in avcodec_decode_video () from /usr/lib/libavcodec.so.52
#14 0x00007fffce03ab06 in DecodeVideo (p_dec=0x1feeb00, pp_block=<value optimized out>) at video.c:550
#15 0x00007ffff7924147 in DecoderDecodeVideo (p_dec=0x1feeb00, p_block=0x7fffd00ddd60) at input/decoder.c:1466
#16 0x00007ffff7924fa9 in DecoderProcessVideo (p_dec=0x1feeb00, p_block=<value optimized out>) at input/decoder.c:1815
#17 DecoderProcess (p_dec=0x1feeb00, p_block=<value optimized out>) at input/decoder.c:2003
#18 0x00007ffff79251fb in DecoderThread (p_this=<value optimized out>) at input/decoder.c:892
#19 0x00007ffff796ff34 in thread_entry (data=<value optimized out>) at misc/threads.c:58
#20 0x00007ffff76ca8ba in start_thread (arg=<value optimized out>) at pthread_create.c:300
#21 0x00007ffff722e02d in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:112
#22 0x0000000000000000 in ?? ()

From the stack trace I can't tell who the real culprit is: libva doing something nasty, libvdpau_nvidia calling libX11 the wrong way, or libX11 crashing instead of returning an error...
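
In case it helps narrow it down, something like this minimal standalone init program (just a sketch I put together from the vaInitialize() path in the trace, the file name and build line are only examples) should hit the same crash if the problem is below VLC. If it only happens from VLC's decoder thread, it may not reproduce, which would also be useful to know.

Code:
#include <stdio.h>
#include <X11/Xlib.h>
#include <va/va.h>
#include <va/va_x11.h>

int main(void)
{
    /* Open the X display, as VLC's vaapi.c does before calling vaInitialize() */
    Display *x11 = XOpenDisplay(NULL);
    if (!x11) {
        fprintf(stderr, "cannot open X display\n");
        return 1;
    }

    VADisplay va = vaGetDisplay(x11);   /* libva-x11 entry point */
    int major = 0, minor = 0;

    /* The backtrace dies inside this call:
       vaInitialize -> nvidia_drv_video -> libvdpau_nvidia -> libX11 */
    VAStatus st = vaInitialize(va, &major, &minor);
    if (st != VA_STATUS_SUCCESS) {
        fprintf(stderr, "vaInitialize failed: %s\n", vaErrorStr(st));
        XCloseDisplay(x11);
        return 1;
    }

    printf("VA-API %d.%d initialized OK\n", major, minor);
    vaTerminate(va);
    XCloseDisplay(x11);
    return 0;
}
Built with something like: gcc repro.c -o repro -lva -lva-x11 -lX11. If that segfaults the same way, VLC is out of the picture.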

That being said, could the NVIDIA devs please take a look at this and decide who is to blame?