Forums - MediaCodec vs OpenMAX as implementation interface

MediaCodec vs OpenMAX as implementation interface
ketan
Join Date: 13 Oct 16
Posts: 6
Posted: Mon, 2016-10-31 09:15

Hi,

Based on a review of the samples available in the Android NDK, it appears that in order to use the hardware decoders (OMX.xxx) we can use either the MediaCodec or the OpenMAX AL interface.

My questions are:

1. Is this assumption correct?

2. Is either interface preferred for decoding video + audio from MP4 or other streaming video formats?

3. Does either approach increase the overhead of syncing video and audio? I will be doing some processing on each video frame.

The assumption is that I will be running on a Snapdragon 820+ and will leverage the DSP for decoding and OpenGL for video rendering.

Regards 

Ketan

ketan
Join Date: 13 Oct 16
Posts: 6
Posted: Wed, 2016-11-02 12:15

As usual, there is nothing relevant on the Qualcomm site. This is the last place to get any information.

For anyone else looking for an answer, there is some information on Stack Overflow: http://stackoverflow.com/questions/40361649/mediacodec-ndk-vs-openmax-fo...

swinston
Join Date: 8 May 15
Posts: 9
Posted: Fri, 2017-01-06 16:01

Hi Ketan,

Here are the answers to your questions.

1.) Yes, MediaCodec and OpenMAX AL will give you the hardware decoders equally well.
2.) Use the right library for your needs.
3.) Technically there is no issue with syncing video and audio in OpenMAX, because in Google's implementation OpenMAX doesn't handle audio rendering; you do. Thus, keep your timing in line (relatively easy) and it will work.
 

This is unfortunately an area that hasn't received a lot of attention from Google. There is no single officially supported way of playing media within the NDK; there are actually several. Google's OpenMAX implementation supports almost none of what the OpenMAX standard is supposed to cover. You can get full implementations from third parties, but in general, expect that if you want to display MPEG-TS files you're going to use OpenMAX, and MediaCodec for everything else.

There was a nice presentation I saw QC give a while ago on how to recompile the Android system to produce QC libraries you can package with your application for full OpenMAX support, but I can no longer find that presentation.
Please note that if you use OpenMAX, you have to remember that it's not an audio renderer; you will have to take the decoded audio and play it via OpenSL ES to get something working. Like I said, there really isn't one standard here (yet).

I personally use OpenMAX as I find it easier to get access to the frame data.

ketan
Join Date: 13 Oct 16
Posts: 6
Posted: Mon, 2017-01-09 07:01

Hi Winston,

Thanks for the reply. 

I ended up going with the NDK's MediaCodec route along with OpenSL ES for audio. Syncing worked out fine as long as I could get decode and playback done within budget. I saw that OpenMAX is not fully implemented and lacks support in terms of documentation, examples, etc.

Regards

Ketan

swinston
Join Date: 8 May 15
Posts: 9
Posted: Wed, 2017-01-11 15:40

Hi Ketan,

I fully agree that there's an extreme lack of documentation and support for a lot of media playback, especially in the OpenMAX world.  I actually used to serve on the Khronos working groups for OpenSL ES and OpenMAX AL.  The advantages of using OpenMAX are actually pretty phenomenal.  Unfortunately, Google isn't providing a complete implementation, so in this case it really falls down.

What I've done in my projects is to use OpenMAX for our HLS player, as MPEG-TS isn't supported by MediaCodec (yeah, that's incomplete too).  Then for all other media types I use MediaCodec and OpenSL ES (which, I'll also note, isn't completely supported either).  What's sad is that the different levels of support even amongst different NDK versions have created a situation where it's not easy to create sample code.

Such is the world we live in.

Thanks,
Steve

ketan
Join Date: 13 Oct 16
Posts: 6
Posted: Thu, 2017-01-12 06:41

Hi,

I agree completely. Nice to know that you have a closer association and experienced it first-hand.

I spent a long time in the OpenGL and OpenCL world; the latter has similar struggles, which force you to select vendor APIs. I will describe my final approach for others' benefit.

On Android, using MediaCodec or the NDK's MediaCodec is the best approach to leverage hardware decoders. In most cases it will provide the best decoder available on the platform. OpenMAX is used mostly by hardware vendors to provide decoders, but it is almost useless at a higher level. My library is implemented in C++, so I used the NDK's MediaCodec (if run on a separate thread, the thread should be attached to the JVM) and OpenSL ES to play audio (you need to understand the device play queue and the callback approach). Image processing is C++ and OpenGL.

Hope this helps other people.

Ketan

maricel.eri
Join Date: 9 Sep 19
Posts: 3
Posted: Thu, 2019-10-03 01:23

Hello!

I am decoding video and audio on the Qualcomm Snapdragon 820Am Automotive Development Platform.

Currently, we use the NDK MediaCodec for this. However, we are having performance issues when trying to decode a TS file with 1440x1080 resolution.

My questions are:

1) By using the NDK MediaCodec and specifying the following as decoders

  OMX.qcom.video.decoder.mpeg2; OMX.qcom.video.decoder.avc

when calling AMediaCodec_createCodecByName, are we already using the hardware decoders of the Qualcomm Snapdragon?

2) Is there a way to confirm that the Snapdragon hardware decoder is being used during decoding?

Thanks in advance for your answers.
 
Regards,
 
Eri
ketan
Join Date: 13 Oct 16
Posts: 6
Posted: Thu, 2019-10-03 07:10

1. If you are using the NDK (C++) on Android then you will get the hardware decoders optimal for your format. You can always review `logcat -s Media*` or similar filters to confirm.

2. I did not use OMX as it is implemented by the drivers under the hood. Using NDK is sufficient.

3. I have done the same (ran multiple filters on each frame). This may be a bigger question, and I would recommend doing a cost/benefit analysis before you dig into it. Check the link below. Also, Stack Overflow is better for getting a response; I never got any response here. Once you do the decoding, you will have to spend time writing a layer that manages frames (raw + processed video, audio) and plays them appropriately, keeping in mind the FPS of the original video and the audio bit rate. I went with OpenGL and OpenAL to manage frame processing and audio. All of this for 560p video took 5-8 msec.

Hope this helps

Ketan

https://stackoverflow.com/questions/49760884/custom-player-using-ndk-c-mediacodec-starvation-buffering-in-decoder

maricel.eri
Join Date: 9 Sep 19
Posts: 3
Posted: Fri, 2019-10-04 04:15

Hello Ketan!

Thanks a lot for the response.

Will check on it.

stokleygary
Join Date: 28 Feb 21
Posts: 3
Posted: Sun, 2021-02-28 17:05

The choice of decoding interface makes no difference to me, but I am interested in the synchronization of audio and video, because I need to process each frame and find a suitable solution for this.


Opinions expressed in the content posted here are the personal opinions of the original authors, and do not necessarily reflect those of Qualcomm Incorporated or its subsidiaries (“Qualcomm”). The content is provided for informational purposes only and is not meant to be an endorsement or representation by Qualcomm or any other party. This site may also provide links or references to non-Qualcomm sites and resources. Qualcomm makes no representations, warranties, or other commitments whatsoever about any non-Qualcomm sites or third-party resources that may be referenced, accessible from, or linked to this site.