nV News Forums > Linux Support Forums > NVIDIA Linux
12-29-09, 04:38 PM   #1
seaweed
Registered User

Join Date: Sep 2009
Posts: 57

vdp_output_surface_get_bits_native

Hi, I am developing an app to decode/render video with VDPAU. While decoding and rendering work flawlessly, when I call vdp_output_surface_get_bits_native to dump the decoded data to a malloc'ed buffer, it crashes. If I enable VDPAU_TRACE and VDPAU_NVIDIA_DEBUG, it hangs on that function call without printing any information to the console. Here is the line that causes the crash:


vdp_st = pVDPAUCntx->vdp_output_surface_get_bits_native( output_surface, NULL, (void *) pRendererSession->getPicCbackParam->pData, (uint32_t *) pPictureInfo->stride );

Prior to calling this, I malloc pRendererSession->getPicCbackParam->pData with a size of videoWidth * videoHeight * 4 (the native RGB format for the video surface is VDP_RGBA_FORMAT_B8G8R8A8). I also set pPictureInfo->stride[0] through pPictureInfo->stride[3] to videoWidth.


Does anybody have any clue why this might cause the crash? I would appreciate it if anybody could help.

Thanks
01-04-10, 12:30 PM   #2
Stephen Warren
Moderator

Join Date: Aug 2005
Posts: 1,327

Re: vdp_output_surface_get_bits_native

Are you sure that videoWidth/videoHeight (and hence stride) exactly match the surface size that VDPAU internally allocated?

Note that when you allocate a surface in VDPAU, the implementation is free to round up the actual surface size to meet implementation requirements. You should use VdpOutputSurfaceGetParameters to retrieve the exact allocated width/height. The put/get bits functions' default size (i.e. when rect is NULL) is the entire allocated surface size, not the user-requested surface size.

Also, pitch should be width * pixel_size (i.e. width * 4), not just width; it's bytes, not pixels.

Failing that, it's probably best to run under a debugger and check the address that VDPAU is accessing when the segfault occurs. You may be able to trace back from that, based on your pointers and pitch, to figure out why it's attempting to write that location.

This isn't an issue, but for that format, you only need to set pitch[0] to a valid value; there's only a single plane.
01-06-10, 01:31 PM   #3
seaweed
Registered User

Join Date: Sep 2009
Posts: 57

Re: vdp_output_surface_get_bits_native

Quote:
Originally Posted by Stephen Warren View Post
Are you sure that videoWidth/videoHeight (and hence stride) exactly match the surface size that VDPAU internally allocated?
This was not the issue in my case but I added this extra check.

Quote:
Originally Posted by Stephen Warren View Post
Note that when you allocate a surface in VDPAU, the implementation is free to round up the actual surface size to meet implementation requirements. You should use VdpOutputSurfaceGetParameters to retrieve the exact allocated width/height. The put/get bits functions' default size (i.e. when rect is NULL) is the entire allocated surface size, not the user-requested surface size.

Also, pitch should be width * pixel_size (i.e. width * 4), not just width; it's bytes, not pixels.
Great, it was the pitch value being in pixels that caused the issue. I made this change and it's working perfectly.

Is it possible to get your direct email address? We are an emerging OEM company working with EVGA/NVIDIA, and I have some basic questions regarding OpenGL integration with VDPAU.