I would like to use a custom video source to live stream video via the WebRTC Android implementation. If I understand correctly, the existing implementation only supports front and back facing cameras on Android devices.
There are two possible solutions to this problem:

1. Implement a custom VideoCapturer and create a VideoFrame from the byte[] stream data in the onReceive handler. There is actually a very good existing example, FileVideoCapturer, which implements VideoCapturer; a minimal skeleton of this approach is sketched right after this list.
2. Create a VideoFrame from an NV21Buffer, which is built from our byte array stream data, and then hand it to our previously created VideoSource to capture the frame; see the example below the sketch.
Example for the second approach:

```java
// NV21Buffer, VideoFrame and VideoSource come from org.webrtc.
public void onReceive(byte[] videoBuffer, int size, int width, int height) {
    // WebRTC expects a monotonic timestamp in nanoseconds.
    long timestampNS = TimeUnit.MILLISECONDS.toNanos(SystemClock.elapsedRealtime());
    // Wrap the raw NV21 bytes; no release callback is needed here.
    NV21Buffer buffer = new NV21Buffer(videoBuffer, width, height, null);
    VideoFrame videoFrame = new VideoFrame(buffer, 0 /* rotation */, timestampNS);
    // Push the frame into the VideoSource that was created earlier.
    videoSource.getCapturerObserver().onFrameCaptured(videoFrame);
    videoFrame.release();
}
```
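The example above assumes that videoSource was already created. A rough sketch of how such a source and its track might be created and attached follows; the factory and peerConnection objects and the "video0"/"stream0" ids are placeholders, and PeerConnectionFactory.initialize(...) is assumed to have been called beforehand.

```java
import java.util.Collections;
import org.webrtc.PeerConnection;
import org.webrtc.PeerConnectionFactory;
import org.webrtc.VideoSource;
import org.webrtc.VideoTrack;

class VideoSourceSetup {
    // Creates a VideoSource that is fed manually via onFrameCaptured and
    // exposes it to the peer connection as a track. Identifiers are placeholders.
    static VideoSource createAndAttachVideoSource(PeerConnectionFactory factory,
                                                  PeerConnection peerConnection) {
        // false = this source is not a screencast.
        VideoSource videoSource = factory.createVideoSource(false);

        VideoTrack videoTrack = factory.createVideoTrack("video0", videoSource);
        peerConnection.addTrack(videoTrack, Collections.singletonList("stream0"));

        return videoSource;
    }
}
```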