Custom video source for WebRTC on Android

Submitted by 橙三吉。 on 2020-07-05 10:45:48

Question


Overview

I would like to use a custom video source to live-stream video via the WebRTC Android implementation. If I understand correctly, the existing implementation only supports the front- and back-facing cameras on Android phones. The following classes are relevant in this scenario:

  • Camera1Enumerator.java
  • VideoCapturer.java
  • PeerConnectionFactory
  • VideoSource.java
  • VideoTrack.java

Currently, to use the front-facing camera on an Android phone, I do the following:

// false = capture to byte buffers via the Camera1 API rather than to textures
CameraEnumerator enumerator = new Camera1Enumerator(false);
VideoCapturer videoCapturer = enumerator.createCapturer(deviceName, null);
// false = this source is not a screencast
VideoSource videoSource = peerConnectionFactory.createVideoSource(false);
// Connect the capturer to the source through its CapturerObserver
videoCapturer.initialize(surfaceTextureHelper, this.getApplicationContext(), videoSource.getCapturerObserver());
VideoTrack localVideoTrack = peerConnectionFactory.createVideoTrack(VideoTrackID, videoSource);
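
Note that these steps only wire the capturer to the source; frames do not start flowing until capture is started. As a sketch of the remaining step (the resolution and frame rate below are illustrative values, not from the original post):

// Begin delivering frames; 1280x720 at 30 fps is an illustrative choice
videoCapturer.startCapture(1280, 720, 30);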

My scenario

I have a callback handler that receives the video buffer as a byte array from the custom video source:

public void onReceive(byte[] videoBuffer, int size) {}

How would I send this byte-array buffer as a video stream? I'm not sure about the solution, but I think I would have to implement a custom VideoCapturer.

Existing questions

This question might be relevant, though I'm not using the libjingle library, only the native WebRTC Android package.

Similar questions/articles:

  • one for the iOS platform, but unfortunately the answers there didn't help me
  • one for the native C++ platform
  • an article about a native implementation

Answer 1:


There are two possible solutions to this problem:

  1. Implement a custom VideoCapturer and create a VideoFrame from the byte[] stream data in the onReceive handler. There is actually a very good example of FileVideoCapturer, which implements VideoCapturer; a sketch of this approach follows the code below.
  2. Simply construct a VideoFrame from an NV21Buffer created from the byte-array stream data. Then we only need the previously created VideoSource to capture this frame. Example:
public void onReceive(byte[] videoBuffer, int size, int width, int height) {
    // Capture timestamp in nanoseconds, as expected by VideoFrame
    long timestampNS = TimeUnit.MILLISECONDS.toNanos(SystemClock.elapsedRealtime());
    // Wrap the raw NV21 bytes; the last argument is an optional release callback
    NV21Buffer buffer = new NV21Buffer(videoBuffer, width, height, null);

    // rotation = 0: the frame is already upright
    VideoFrame videoFrame = new VideoFrame(buffer, 0, timestampNS);
    // Forward the frame into the VideoSource created earlier
    videoSource.getCapturerObserver().onFrameCaptured(videoFrame);

    // Drop our reference so the buffer can be recycled
    videoFrame.release();
}
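
For option 1, the following is a minimal sketch of what such a custom VideoCapturer could look like, assuming the external source delivers NV21-encoded frames at a known, fixed resolution. The class name ByteArrayVideoCapturer and its constructor parameters are hypothetical, chosen here only for illustration:

import android.content.Context;
import android.os.SystemClock;
import java.util.concurrent.TimeUnit;
import org.webrtc.CapturerObserver;
import org.webrtc.NV21Buffer;
import org.webrtc.SurfaceTextureHelper;
import org.webrtc.VideoCapturer;
import org.webrtc.VideoFrame;

// Hypothetical capturer that is fed NV21 frames by an external byte[] source
public class ByteArrayVideoCapturer implements VideoCapturer {
    private CapturerObserver capturerObserver;
    private final int width;
    private final int height;

    // Assumes the source's resolution is known up front
    public ByteArrayVideoCapturer(int width, int height) {
        this.width = width;
        this.height = height;
    }

    @Override
    public void initialize(SurfaceTextureHelper surfaceTextureHelper, Context applicationContext,
                           CapturerObserver capturerObserver) {
        // The observer is the bridge into the VideoSource created by PeerConnectionFactory
        this.capturerObserver = capturerObserver;
    }

    // Called from the external source's callback with one NV21-encoded frame
    public void onReceive(byte[] videoBuffer, int size) {
        long timestampNS = TimeUnit.MILLISECONDS.toNanos(SystemClock.elapsedRealtime());
        NV21Buffer buffer = new NV21Buffer(videoBuffer, width, height, null);
        VideoFrame videoFrame = new VideoFrame(buffer, 0 /* rotation */, timestampNS);
        capturerObserver.onFrameCaptured(videoFrame);
        videoFrame.release();
    }

    @Override
    public void startCapture(int width, int height, int framerate) {
        // Push-based source: nothing to start, just report success
        capturerObserver.onCapturerStarted(true);
    }

    @Override
    public void stopCapture() {
        capturerObserver.onCapturerStopped();
    }

    @Override
    public void changeCaptureFormat(int width, int height, int framerate) {
        // Not needed for a push-based source; frames arrive in whatever format the source sends
    }

    @Override
    public void dispose() {}

    @Override
    public boolean isScreencast() {
        return false;
    }
}

The essential point is that the CapturerObserver handed to initialize is the bridge into the VideoSource; forwarding frames through onFrameCaptured is all a capturer really needs to do, which is why option 2 works without a capturer class at all.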


Source: https://stackoverflow.com/questions/61160558/custom-video-source-for-webrtc-on-android
