Question
I'm working on a WebRTC-based app for Android using the native implementation (org.webrtc:google-webrtc:1.0.24064), and I need to send a series of bitmaps along with the camera stream.
From what I understand, I can subclass org.webrtc.VideoCapturer, do my rendering on a separate thread, and deliver video frames to the observer; however, it expects them in YUV420 format, and I'm not sure my conversion is correct.
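Roughly, the structure I have in mind is the sketch below. The org.webrtc types are the real ones from around this library version (note that in 1.0.24064, CapturerObserver is still the interface nested inside VideoCapturer; it later moved to a top-level org.webrtc.CapturerObserver). renderNextBitmap is just a placeholder for my own drawing code, and YuvUtil.bitmapToI420 is sketched further down:

```java
import android.content.Context;
import android.graphics.Bitmap;
import android.os.SystemClock;

import org.webrtc.SurfaceTextureHelper;
import org.webrtc.VideoCapturer;
import org.webrtc.VideoFrame;

import java.util.concurrent.TimeUnit;

public class CustomCapturer implements VideoCapturer {
    // Nested VideoCapturer.CapturerObserver in 1.0.24064;
    // top-level org.webrtc.CapturerObserver in later releases.
    private CapturerObserver observer;
    private Thread renderThread;
    private volatile boolean running;

    @Override
    public void initialize(SurfaceTextureHelper helper, Context context,
                           CapturerObserver observer) {
        this.observer = observer;
    }

    @Override
    public void startCapture(int width, int height, int framerate) {
        running = true;
        observer.onCapturerStarted(true);
        renderThread = new Thread(() -> {
            long frameIntervalMs = 1000 / framerate;
            while (running) {
                // Placeholder for the app's own bitmap rendering.
                Bitmap bitmap = renderNextBitmap(width, height);
                long timestampNs =
                        TimeUnit.MILLISECONDS.toNanos(SystemClock.elapsedRealtime());
                VideoFrame frame = new VideoFrame(
                        YuvUtil.bitmapToI420(bitmap), 0 /* rotation */, timestampNs);
                observer.onFrameCaptured(frame);
                frame.release();
                SystemClock.sleep(frameIntervalMs); // crude frame pacing
            }
        });
        renderThread.start();
    }

    @Override
    public void stopCapture() throws InterruptedException {
        running = false;
        if (renderThread != null) {
            renderThread.join();
        }
        observer.onCapturerStopped();
    }

    @Override
    public void changeCaptureFormat(int width, int height, int framerate) {}

    @Override
    public void dispose() {}

    @Override
    public boolean isScreencast() {
        return false;
    }

    private Bitmap renderNextBitmap(int width, int height) {
        // Placeholder: produce the next bitmap to send.
        return Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
    }
}
```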
This is what I currently have: CustomCapturer.java
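(The attached file isn't reproduced here. As an illustration of the conversion step I'm unsure about, a CPU-side ARGB-to-I420 pass using integer BT.601 studio-swing coefficients might look like the following, assuming JavaI420Buffer is available in this version and the frame dimensions are even so the 2x2 chroma subsampling lines up:)

```java
import android.graphics.Bitmap;

import org.webrtc.JavaI420Buffer;

import java.nio.ByteBuffer;

final class YuvUtil {
    /** ARGB_8888 Bitmap -> I420 using integer BT.601 studio-swing coefficients. */
    static JavaI420Buffer bitmapToI420(Bitmap bitmap) {
        int width = bitmap.getWidth();
        int height = bitmap.getHeight();
        int[] argb = new int[width * height];
        bitmap.getPixels(argb, 0, width, 0, 0, width, height);

        JavaI420Buffer buffer = JavaI420Buffer.allocate(width, height);
        ByteBuffer dataY = buffer.getDataY();
        ByteBuffer dataU = buffer.getDataU();
        ByteBuffer dataV = buffer.getDataV();
        int strideY = buffer.getStrideY();
        int strideU = buffer.getStrideU();
        int strideV = buffer.getStrideV();

        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int color = argb[y * width + x];
                int r = (color >> 16) & 0xFF;
                int g = (color >> 8) & 0xFF;
                int b = color & 0xFF;
                // One luma sample per pixel.
                int lum = ((66 * r + 129 * g + 25 * b + 128) >> 8) + 16;
                dataY.put(y * strideY + x, (byte) clamp(lum));
                // One chroma pair per 2x2 block (the 4:2:0 subsampling).
                if ((y & 1) == 0 && (x & 1) == 0) {
                    int u = ((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128;
                    int v = ((112 * r - 94 * g - 18 * b + 128) >> 8) + 128;
                    dataU.put((y / 2) * strideU + x / 2, (byte) clamp(u));
                    dataV.put((y / 2) * strideV + x / 2, (byte) clamp(v));
                }
            }
        }
        return buffer;
    }

    private static int clamp(int v) {
        return v < 0 ? 0 : (v > 255 ? 255 : v);
    }
}
```

(An alternative that avoids the per-pixel Java loop would be to draw each bitmap onto the surface backing the SurfaceTextureHelper and let WebRTC handle the texture-to-YUV conversion itself, but I'd like to understand whether the manual conversion above is correct.)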
Are there any examples I can look at for doing this kind of thing? Thanks.
Source: https://stackoverflow.com/questions/51494654/converting-a-bitmap-to-a-webrtc-videoframe