Preferred way to do texture updates for a video player app?

Terje_Wiesener
Join Date: 4 Feb 12
Posts: 2
Posted: Tue, 2012-02-07 15:59

Hi,

I am (as many others are) working on a movie player app for Android based on FFmpeg.

I currently decode video to YUV420P with FFmpeg and feed the frames into OpenGL textures, which a simple shader converts to RGB. I am using OpenGL ES 2.0, and I am testing the code on an HTC Desire running CyanogenMod 7.
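The per-pixel math such a conversion shader performs is, for full-range BT.601 video, roughly the following (a sketch in C; the post does not show the actual shader, and the exact coefficients depend on the source's color matrix and range):

```c
/* Full-range BT.601 YUV -> RGB, the per-pixel math a typical conversion
 * shader performs. This is an illustrative sketch, not the author's
 * shader. y, u, v are in [0,255]; results are clamped to [0,255]. */
static unsigned char clamp255(float x) {
    if (x < 0.0f) return 0;
    if (x > 255.0f) return 255;
    return (unsigned char)(x + 0.5f);
}

void yuv_to_rgb(unsigned char y, unsigned char u, unsigned char v,
                unsigned char *r, unsigned char *g, unsigned char *b) {
    float fy = (float)y;
    float fu = (float)u - 128.0f;   /* center chroma around zero */
    float fv = (float)v - 128.0f;
    *r = clamp255(fy + 1.402f * fv);
    *g = clamp255(fy - 0.344136f * fu - 0.714136f * fv);
    *b = clamp255(fy + 1.772f * fu);
}
```

In the GLSL version the same arithmetic is usually expressed as a mat3 multiply on normalized [0,1] samples, which compiles to a handful of MADs per fragment.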

Now, the issue I am having is that texture access is very slow when using GL_LUMINANCE for the textures: rendering a 640x480 frame with GL_LUMINANCE for the Y, U and V channels typically takes around 50 ms.

If I use GL_RGBA instead of GL_LUMINANCE and pack four pixels into one texel (with half the width and height), the same frame suddenly only takes around 22 ms to render with the same shader. The problem with this approach is that extracting the correct (sub)pixel coordinates during rendering requires some complex operations, which pushes the total time into the 80-100 ms range. I believe the problem here is that some of the required operations cause branching (mod()) and dependent texture reads.
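The addressing the shader has to reproduce for this packing can be sketched as follows (illustrative C, assuming each RGBA texel holds a 2x2 block of luminance pixels, which matches the half-width/half-height layout described above; the channel order is an assumption):

```c
/* 2x2 packing: four neighboring luminance pixels stored in one RGBA
 * texel at half the width and half the height. Pixel (x,y) lands in
 * texel (x/2, y/2), channel (y%2)*2 + (x%2). The mod() in the channel
 * selection is the operation blamed above for branching in the shader. */
void packed_index(int x, int y, int *tx, int *ty, int *channel) {
    *tx = x / 2;
    *ty = y / 2;
    *channel = (y % 2) * 2 + (x % 2);   /* 0 = r, 1 = g, 2 = b, 3 = a */
}
```

Selecting one of the four channels by a computed index is what turns into either a branch or a dependent dot-product trick in the fragment shader.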

One other approach I have tried is to upscale the U and V planes to the same resolution as the Y plane and interleave them into a sparsely filled 32bpp buffer to get better texture performance. However, I cannot find an efficient way to do the interleaving in software: my naive implementation took around 40 ms per frame, and the most optimized version I could come up with still hovers around 25 ms just for the interleaving.
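A naive version of that interleaving loop looks roughly like this (a sketch, not the author's implementation; it assumes a [Y,U,V,unused] byte order and nearest-neighbor chroma upscaling):

```c
#include <stdint.h>
#include <stddef.h>

/* Naive interleave of YUV420P planes into one 32bpp buffer laid out as
 * [Y,U,V,0] per pixel, with U and V nearest-neighbor upscaled to the Y
 * resolution. `out` must hold w*h*4 bytes; w and h are assumed even. */
void interleave_yuv420(const uint8_t *y, const uint8_t *u, const uint8_t *v,
                       uint8_t *out, int w, int h) {
    int cw = w / 2;                       /* chroma plane width */
    for (int row = 0; row < h; row++) {
        for (int col = 0; col < w; col++) {
            size_t ci = (size_t)(row / 2) * cw + (col / 2);
            uint8_t *p = out + ((size_t)row * w + col) * 4;
            p[0] = y[(size_t)row * w + col];
            p[1] = u[ci];
            p[2] = v[ci];
            p[3] = 0;                     /* unused channel */
        }
    }
}
```

The per-pixel chroma index computation is what makes this loop slow; the usual optimizations (processing two pixels per chroma sample, writing 32-bit words, or using NEON) all amount to hoisting that work out of the inner loop.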

Can I expect better performance for the LUMINANCE pixel format if I use power-of-2 textures?

I have read that the Adreno 200 supports "streaming textures", but I cannot find any more info on how to use this functionality. Do you have any pointers?

I have also seen some mentions of native YUV pixel format support (perhaps through the GL_OES_EGL_image_external extension?), but I am struggling to find any documentation on this feature as well. Where is this documented?

I even looked into the media samples, and while there seems to be a renderer available there that can do YUV in hardware, it appears you need to build and link against private Android platform headers to use it. I guess this would mean ABI problems when it comes to deployment.

Many other threads in this forum have raised a similar question, but there seems to be no definitive answer, so I am asking again.

In short, what is the preferred way to do YUV-to-RGB conversion on Android 2.2 with the Adreno 200? What are some key factors to consider?

 

Mark_Feldman Moderator
Join Date: 4 Jan 12
Posts: 58
Posted: Mon, 2012-02-13 08:00


In Android 2.3, Google introduced support for the YV12 and NV21 formats for streaming video.

These formats are available to Android developers. When used as textures, they are converted to the format of the EGL surface (RGB/RGBA).

 


To stream YUV video and convert it to RGB:

Use the GL_OES_EGL_image_external extension.

On Android, allocate an android_native_buffer_t in the YUV format and fill it with YUV data. Then create an EGLImage using eglCreateImageKHR, with the target EGL_NATIVE_BUFFER_ANDROID and this buffer. Then use glEGLImageTargetTexture2DOES() to attach that EGLImage to the texture target GL_TEXTURE_EXTERNAL_OES.

That texture can then be used for rendering. When rendering is done, the EGLImage's buffer can be updated with new YUV data and rendered again for the next frame. If you tie these buffers to the output of video decoding, you get optimal performance.
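The steps above can be sketched roughly as follows (pseudocode in C style; allocating the YUV native buffer goes through platform-specific code on these Android versions, so the allocation call and attribute list here are illustrative, not a public NDK API):

```
// Pseudocode sketch of the EGLImage streaming path described above.

buffer = allocate_android_native_buffer(width, height, HAL_PIXEL_FORMAT_YV12);
fill_with_yuv(buffer);                        // copy a decoded frame in

EGLImageKHR image = eglCreateImageKHR(display, EGL_NO_CONTEXT,
                                      EGL_NATIVE_BUFFER_ANDROID,
                                      (EGLClientBuffer)buffer, NULL);

glBindTexture(GL_TEXTURE_EXTERNAL_OES, tex);
glEGLImageTargetTexture2DOES(GL_TEXTURE_EXTERNAL_OES, image);

// render loop: draw with `tex`, then refill `buffer` with the next
// decoded frame and draw again -- no glTexImage2D upload per frame
```

Note that the fragment shader sampling this texture must declare `#extension GL_OES_EGL_image_external : require` and use a `samplerExternalOES` uniform instead of `sampler2D`.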


Opinions expressed in the content posted here are the personal opinions of the original authors, and do not necessarily reflect those of Qualcomm Incorporated or its subsidiaries (“Qualcomm”). The content is provided for informational purposes only and is not meant to be an endorsement or representation by Qualcomm or any other party. This site may also provide links or references to non-Qualcomm sites and resources. Qualcomm makes no representations, warranties, or other commitments whatsoever about any non-Qualcomm sites or third-party resources that may be referenced, accessible from, or linked to this site.