Android MediaCodec: encoding and decoding simultaneously

I'm trying to use the GPU to apply effects to the frames of a video, and then re-encode the processed frames into a new output video.

To improve performance, I implemented the following pipeline:

There are three different threads, each with its own OpenGL context. The contexts are configured so that they can share textures between them.

Thread 1 extracts frames from the video and stores them as textures in GPU memory, similar to this example.

Thread 2 uses a modified version of GPUImage to process the textures, which also outputs textures in GPU memory.

Finally, thread 3 writes the textures received from thread 2 to a new video file, similar to the method described here.

Queues between threads 1 and 2, and between threads 2 and 3, maintain frame order. Once a texture has been processed/written, it is manually deleted from GPU memory.
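The queue-based hand-off between the stages can be sketched roughly like this. Texture IDs are modeled as plain integers, class and method names are illustrative (they are not from the original post), and the actual `glDeleteTextures` call is only indicated by a comment:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class FramePipeline {
    // Bounded queues preserve frame order and provide back-pressure:
    // a full queue blocks the producer instead of letting it run ahead.
    private final BlockingQueue<Integer> decodedTextures = new ArrayBlockingQueue<>(4);
    private final BlockingQueue<Integer> filteredTextures = new ArrayBlockingQueue<>(4);

    public void onFrameDecoded(int textureId) throws InterruptedException {
        decodedTextures.put(textureId);   // blocks if the filter stage falls behind
    }

    public int takeForFiltering() throws InterruptedException {
        return decodedTextures.take();
    }

    public void onFrameFiltered(int textureId) throws InterruptedException {
        filteredTextures.put(textureId);  // blocks if the encode stage falls behind
    }

    public int takeForEncoding() throws InterruptedException {
        return filteredTextures.take();
    }

    public void releaseTexture(int textureId) {
        // In the real pipeline: GLES20.glDeleteTextures(1, new int[]{textureId}, 0);
    }
}
```

Note that FIFO order is only guaranteed because each stage is a single thread; with multiple workers per stage, frames could be re-enqueued out of order.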

The point of this design is to decouple the stages, so that overall throughput is bounded only by the slowest of the three threads.

Problem:

The final video is about 90% black frames; only some of the frames are correct.

I checked the results of extraction and processing individually, and both work as expected. Also note that the three components described above work fine together when run on a single thread.

I tried synchronizing threads 1 and 3: after adding a 100 ms sleep to thread 1, the resulting video is fine, with perhaps 1 or 2 black frames. It seems to me that the decoder and encoder instances cannot work at the same time.

I will edit this post with any other details required.

Answer:

Sharing textures between OpenGL ES contexts requires extra care. The implementation in Grafika's "show + capture camera" activity is currently broken; see this issue for details. The fundamental problem is that you essentially need to issue a memory barrier when the texture is updated. In practical terms, that means issuing glFinish() on the producer side, re-binding the texture on the consumer side, and doing all of this inside synchronized blocks.
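The hand-off protocol just described can be sketched as a small helper class. This is a structural sketch only, with names of my own invention: the GL calls themselves (glFinish() on the producer, glBindTexture() on the consumer) are injected as Runnables so that the synchronization structure is visible and testable off-device:

```java
public class SharedTextureHandoff {
    private final Object fence = new Object();
    private int readyTextureId = -1;

    // Producer thread: after rendering into the texture, flush and publish
    // inside the same synchronized block.
    public void publish(int textureId, Runnable glFinishCall) {
        synchronized (fence) {
            glFinishCall.run();          // e.g. GLES20.glFinish() — the memory barrier
            readyTextureId = textureId;
            fence.notifyAll();
        }
    }

    // Consumer thread: wait for a published texture, then re-bind it in this
    // context before sampling from it.
    public int acquire(Runnable glRebindCall) throws InterruptedException {
        synchronized (fence) {
            while (readyTextureId < 0) {
                fence.wait();
            }
            glRebindCall.run();          // e.g. glBindTexture(GL_TEXTURE_2D, id)
            int id = readyTextureId;
            readyTextureId = -1;
            return id;
        }
    }
}
```

The glFinish() is the expensive part: it stalls the producer's GPU pipeline every frame, which is one reason the single-thread approach below tends to be faster.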

Your life will be simpler (and more efficient) if you can do all of the GLES work on a single thread. In my experience, having more than one GLES context active at a time is unwise, and you'll save yourself some trouble by finding an alternative approach.

You'd probably want something more like this:

> Thread #1 reads the file and feeds frames into a MediaCodec decoder. The decoder sends its output to a SurfaceTexture Surface.
> Thread #2 has the GLES context. It created the SurfaceTexture that thread #1 sends output to. It processes the images and renders the output onto the Surface of a MediaCodec encoder.
> Thread #3, which created the MediaCodec encoder, sits waiting for the encoded output. As output is received, it is written to disk. Note that the use of MediaMuxer can stall; see this blog post for more information.
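The wiring for the layout above might look roughly like the sketch below. This is an assumption-laden outline, not code from the answer: the formats and the OES texture ID are taken as already prepared, the method runs on the GLES thread (#2), and it requires an Android device, so it is not runnable as-is on a desktop JVM:

```java
import android.graphics.SurfaceTexture;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;
import java.io.IOException;

public class PipelineSetup {
    /**
     * Wires decoder -> SurfaceTexture -> (GLES filtering) -> encoder input Surface.
     * oesTextureId must be a GL_TEXTURE_EXTERNAL_OES texture created on the GLES
     * thread that will later call updateTexImage(). Formats are assumed populated.
     */
    public static Surface[] wire(MediaFormat decoderFormat, MediaFormat encoderFormat,
                                 int oesTextureId) throws IOException {
        // Encoder first: the GLES thread renders filtered frames onto its input Surface.
        MediaCodec encoder = MediaCodec.createEncoderByType(
                encoderFormat.getString(MediaFormat.KEY_MIME));
        encoder.configure(encoderFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        Surface encoderInput = encoder.createInputSurface();

        // SurfaceTexture is created (and later consumed) on this GLES thread; only
        // its producer endpoint, the Surface, is handed to the decoder.
        SurfaceTexture decoderOutput = new SurfaceTexture(oesTextureId);
        Surface decoderSurface = new Surface(decoderOutput);

        MediaCodec decoder = MediaCodec.createDecoderByType(
                decoderFormat.getString(MediaFormat.KEY_MIME));
        decoder.configure(decoderFormat, decoderSurface, null, 0);

        return new Surface[] { decoderSurface, encoderInput };
    }
}
```

Grafika's DecodeEditEncodeTest-style examples follow this same shape, with the decode loop, the GLES blit, and the encoder drain driven from their respective threads.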

In all cases, the only communication between threads (and between processes) is through a Surface. The SurfaceTexture and MediaCodec instances are created and used from a single thread; only the producer endpoint (the Surface) is passed around.

Flow control is a potential trouble spot: SurfaceTextures will drop frames if you feed them too quickly. Depending on the circumstances, it may make sense to combine threads #1 and #2.
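A common way to avoid dropped frames is to block the decode loop until the consumer has taken the previous frame, in the style of the awaitNewImage pattern in Grafika's OutputSurface. The class and method names below are illustrative, not from any Android API:

```java
public class FrameSyncObject {
    private final Object lock = new Object();
    private boolean frameAvailable = false;

    // Called from the producer side, e.g. the onFrameAvailable callback.
    public void signalFrame() {
        synchronized (lock) {
            frameAvailable = true;
            lock.notifyAll();
        }
    }

    // Called from the consumer; returns false on timeout so the caller can
    // decide whether to drop, retry, or abort instead of stalling forever.
    public boolean awaitFrame(long timeoutMs) throws InterruptedException {
        synchronized (lock) {
            long deadline = System.currentTimeMillis() + timeoutMs;
            while (!frameAvailable) {
                long remaining = deadline - System.currentTimeMillis();
                if (remaining <= 0) {
                    return false;        // timed out: no new frame arrived
                }
                lock.wait(remaining);
            }
            frameAvailable = false;       // consume the frame
            return true;
        }
    }
}
```

With this in place, the decoder only releases its next output buffer to the Surface after awaitFrame() has returned for the previous one, so the SurfaceTexture never has to drop a frame.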
