Processing frames output by MediaCodec and displaying the updated frames on Android

I am working on an image-processing project. I receive a raw H.264 video stream in real time and decode it with MediaCodec. I have successfully displayed the decoded video on a TextureView or SurfaceView. Now I want to take each frame, process it with OpenCV4Android, and then display the updated frame on screen. I know OpenCV has a sample project that demonstrates how to process video frames coming from the phone camera, but I am wondering what to do when I have a different video source.
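For reference, here is a rough sketch of the kind of decoder setup I mean (simplified, not my exact code). It assumes the TextureView's SurfaceTexture is already available, and the MIME type and dimensions are placeholders from my own stream parser:

```java
// A minimal sketch: configure an H.264 decoder to render straight into a TextureView's
// Surface. Assumes onSurfaceTextureAvailable() has already fired; width/height come from
// the stream parser.
import android.graphics.SurfaceTexture;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;

public class DecoderSetup {
    public static MediaCodec createDecoder(SurfaceTexture st, int width, int height)
            throws java.io.IOException {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
        // Passing a Surface here means decoded frames never enter app memory; they are
        // composited directly by the TextureView.
        decoder.configure(format, new Surface(st), null, 0);
        decoder.start();
        return decoder;
    }
}
```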

I also have some questions about TextureView:

> What is the role of onSurfaceTextureUpdated() in SurfaceTextureListener? If I call getBitmap() in that callback, does that mean I get every frame of the video? And what about SurfaceTexture.OnFrameAvailableListener?

> Can a hidden TextureView be used as an intermediary, so that its frames are extracted for processing and then rendered back to another surface, such as an OpenGL ES texture, for display?

Answer:

The various examples in Grafika that use the camera as input also work with a video stream as the input. Either way, you send the video frames to a Surface.
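As a sketch of that idea (simplified from the Grafika pattern; the texture id and listener are placeholders), the decoder's output Surface can be backed by a SurfaceTexture bound to an OpenGL ES external texture, so every decoded frame arrives as a texture you can filter with a fragment shader:

```java
// Sketch: create a decoder output Surface backed by a SurfaceTexture that feeds a
// GL_TEXTURE_EXTERNAL_OES texture. Assumes `texId` was created with glGenTextures on a
// thread that has an EGL context current.
import android.graphics.SurfaceTexture;
import android.view.Surface;

public class DecoderGlOutput {
    public static Surface createOutputSurface(int texId,
            SurfaceTexture.OnFrameAvailableListener listener) {
        SurfaceTexture st = new SurfaceTexture(texId);
        // The listener fires once per decoded frame; call st.updateTexImage() on the GL
        // thread to latch the frame into the external texture before drawing.
        st.setOnFrameAvailableListener(listener);
        return new Surface(st);   // pass this Surface to MediaCodec.configure()
    }
}
```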

If you want to process the video frames in software, rather than on the GPU, things get more difficult. You either have to receive the frames on a Surface and copy them into a memory buffer, probably performing a color conversion from RGB to YUV in the process, or you have to get the YUV buffer output from MediaCodec. The latter is tricky because several different formats are possible, including Qualcomm's proprietary tiled format.
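A rough sketch of the ByteBuffer route, assuming API 21+ where getOutputImage() is available to hide the vendor-specific plane layouts behind row and pixel strides:

```java
// Sketch: read YUV output from a decoder that was configured without a Surface.
import android.media.Image;
import android.media.MediaCodec;
import android.media.MediaFormat;

public class YuvOutput {
    // Call this in the output-draining part of your decode loop.
    public static void drainOne(MediaCodec decoder) {
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int outIndex = decoder.dequeueOutputBuffer(info, 10_000);
        if (outIndex >= 0) {
            int colorFormat = decoder.getOutputFormat(outIndex)
                    .getInteger(MediaFormat.KEY_COLOR_FORMAT);   // device-dependent layout
            Image image = decoder.getOutputImage(outIndex);      // flexible YUV view
            // image.getPlanes()[0..2] expose Y/U/V with strides; copy them into an OpenCV
            // Mat here before releasing the output buffer.
            image.close();
            decoder.releaseOutputBuffer(outIndex, false);        // false = nothing rendered
        }
    }
}
```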

About TextureView:

> onSurfaceTextureUpdated() is called whenever the TextureView receives a new frame. You can use getBitmap() to get every frame of the video, but you need to throttle the video playback to match your filter speed - if you fall behind, TextureView will drop frames.

> You could create a "hidden TextureView" by putting other View elements on top of it, but that would be silly. TextureView uses a SurfaceTexture to convert the video frames into OpenGL ES textures and then renders them as part of drawing the View UI; the bitmap data is retrieved with glReadPixels(). You can just use those elements directly yourself. The bigflake ExtractMpegFramesTest demonstrates this.
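A rough sketch of the getBitmap() approach described above, with a hypothetical processFrame() hook standing in for the OpenCV work (e.g. org.opencv.android.Utils.bitmapToMat plus your filter):

```java
// Sketch: grab each frame from a TextureView that already shows the decoded video.
import android.graphics.Bitmap;
import android.graphics.SurfaceTexture;
import android.view.TextureView;

public class FrameGrabber implements TextureView.SurfaceTextureListener {
    private final TextureView textureView;

    public FrameGrabber(TextureView textureView) {
        this.textureView = textureView;
        textureView.setSurfaceTextureListener(this);
    }

    @Override public void onSurfaceTextureAvailable(SurfaceTexture st, int w, int h) { }
    @Override public void onSurfaceTextureSizeChanged(SurfaceTexture st, int w, int h) { }
    @Override public boolean onSurfaceTextureDestroyed(SurfaceTexture st) { return true; }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture st) {
        // Called once per new frame; getBitmap() copies the current frame into app memory,
        // so any slow processing here makes TextureView drop frames.
        Bitmap frame = textureView.getBitmap();
        processFrame(frame);   // hypothetical OpenCV processing
    }

    private void processFrame(Bitmap frame) { /* convert to a Mat and filter */ }
}
```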
