Converting a Bitmap to a WebRTC VideoFrame

Submitted by 我怕爱的太早我们不能终老 on 2020-06-26 12:06:23

Question


I'm working on a WebRTC based app for Android using the native implementation (org.webrtc:google-webrtc:1.0.24064), and I need to send a series of bitmaps along with the camera stream.

From what I understand, I can derive from org.webrtc.VideoCapturer, do my rendering on a separate thread, and deliver video frames to the observer; however, it expects them in YUV420 format, and I'm not sure my conversion is correct.
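For reference, the YUV420 (I420) conversion itself can be done in plain Java. The sketch below is not from the question's CustomCapturer.java (whose code is not shown here); the class and method names are mine. It uses the common BT.601 integer approximation with 2x2 chroma subsampling, producing the three planes an I420 buffer expects:

```java
// Minimal ARGB -> I420 conversion sketch (BT.601 integer approximation,
// studio-swing output). Assumes even width/height, since I420 chroma planes
// are 2x2 subsampled. Input pixels are packed 0xAARRGGBB ints, e.g. as
// returned by Bitmap.getPixels().
public final class I420Converter {

    /** Returns planes: [0] = Y (w*h bytes), [1] = U, [2] = V (w/2*h/2 each). */
    public static byte[][] argbToI420(int[] argb, int width, int height) {
        byte[] y = new byte[width * height];
        byte[] u = new byte[width * height / 4];
        byte[] v = new byte[width * height / 4];

        for (int row = 0; row < height; row++) {
            for (int col = 0; col < width; col++) {
                int p = argb[row * width + col];
                int r = (p >> 16) & 0xFF;
                int g = (p >> 8) & 0xFF;
                int b = p & 0xFF;

                // Luma for every pixel.
                y[row * width + col] =
                        (byte) (((66 * r + 129 * g + 25 * b + 128) >> 8) + 16);

                // Chroma once per 2x2 block (sampled at the top-left pixel).
                if ((row & 1) == 0 && (col & 1) == 0) {
                    int ci = (row / 2) * (width / 2) + (col / 2);
                    u[ci] = (byte) (((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128);
                    v[ci] = (byte) (((112 * r - 94 * g - 18 * b + 128) >> 8) + 128);
                }
            }
        }
        return new byte[][] {y, u, v};
    }
}
```

A quick sanity check on the coefficients: a white pixel (0xFFFFFFFF) maps to Y = 235 with U = V = 128, and a black pixel maps to Y = 16.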

This is what I currently have: CustomCapturer.java

Are there any examples I can look at for doing this kind of thing? Thanks.
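For what it's worth, the usual shape of such a capturer with the public org.webrtc classes looks roughly like the sketch below. It assumes JavaI420Buffer and the nested VideoCapturer.CapturerObserver as they appear around the 1.0.24xxx artifacts (later releases moved the observer to a top-level org.webrtc.CapturerObserver), so verify the exact signatures against your version; only the frame-delivery plumbing is shown, with the Bitmap-to-I420 conversion assumed to happen elsewhere.

```java
import android.content.Context;
import org.webrtc.JavaI420Buffer;
import org.webrtc.SurfaceTextureHelper;
import org.webrtc.VideoCapturer;
import org.webrtc.VideoFrame;

// Sketch of a VideoCapturer that pushes pre-converted I420 planes to WebRTC.
public class BitmapCapturer implements VideoCapturer {
    private CapturerObserver observer;

    @Override
    public void initialize(SurfaceTextureHelper helper, Context context,
                           CapturerObserver observer) {
        this.observer = observer;
    }

    /** Call from your render thread with the converted Y/U/V planes. */
    public void pushI420(byte[] y, byte[] u, byte[] v, int width, int height) {
        // allocate() uses stride == width for Y and width/2 for chroma when the
        // dimensions are even, so a straight put() fills each plane correctly.
        JavaI420Buffer buffer = JavaI420Buffer.allocate(width, height);
        buffer.getDataY().put(y);
        buffer.getDataU().put(u);
        buffer.getDataV().put(v);
        VideoFrame frame = new VideoFrame(buffer, 0 /* rotation */, System.nanoTime());
        observer.onFrameCaptured(frame);
        frame.release();  // the observer retains the buffer if it still needs it
    }

    @Override public void startCapture(int width, int height, int fps) {}
    @Override public void stopCapture() throws InterruptedException {}
    @Override public void changeCaptureFormat(int width, int height, int fps) {}
    @Override public void dispose() {}
    @Override public boolean isScreencast() { return false; }
}
```

The key contract is that every frame handed to onFrameCaptured must eventually be released; releasing it immediately after the call is fine because the observer adds its own reference while it processes the buffer.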

Source: https://stackoverflow.com/questions/51494654/converting-a-bitmap-to-a-webrtc-videoframe
