Question
Overview
I would like to use a custom video source to live-stream video via the WebRTC Android implementation. If I understand correctly, the existing implementation only supports the front- and back-facing cameras on Android phones. The following classes are relevant in this scenario:
- Camera1Enumerator.java
- VideoCapturer.java
- PeerConnectionFactory
- VideoSource.java
- VideoTrack.java
Currently, to use the front-facing camera on an Android phone, I'm doing the following:
CameraEnumerator enumerator = new Camera1Enumerator(false);
VideoCapturer videoCapturer = enumerator.createCapturer(deviceName, null);
VideoSource videoSource = peerConnectionFactory.createVideoSource(false);
videoCapturer.initialize(surfaceTextureHelper, this.getApplicationContext(), videoSource.getCapturerObserver());
VideoTrack localVideoTrack = peerConnectionFactory.createVideoTrack(VideoTrackID, videoSource);
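After that, capture still has to be started and the track added to a peer connection. A minimal sketch of that remaining wiring (the peerConnection instance, the 640x480@30 capture format, and the "ARDAMS" stream id are example values, not my actual setup):
// Sketch only: peerConnection, the capture format and the stream id below are illustrative
videoCapturer.startCapture(640, 480, 30); // width, height, fps
peerConnection.addTrack(localVideoTrack, Collections.singletonList("ARDAMS")); // java.util.Collections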
My scenario
I have a callback handler that receives the video buffer as a byte array from the custom video source:
public void onReceive(byte[] videoBuffer, int size) {}
How would I be able to send this byte array buffer? I'm not sure about the solution, but I think I would have to implement a custom VideoCapturer?
Existing questions
This question might be relevant, though I'm not using the libjingle library, only the native WebRTC Android package.
Similar questions/articles:
- for the iOS platform, but unfortunately the answers didn't help me
- for the native C++ platform
- an article about the native implementation
Answer 1:
There are two possible solutions to this problem:
- Implement a custom VideoCapturer and create VideoFrames from the byte[] stream data in the onReceive handler. There actually exists a very good example, FileVideoCapturer, which implements VideoCapturer. (A minimal sketch of such a capturer is included after the example below.)
- Simply construct a VideoFrame from an NV21Buffer, which is created from our byte array stream data. Then we only need to use our previously created VideoSource to capture this frame. Example:
public void onReceive(byte[] videoBuffer, int size, int width, int height) {
    // Timestamp in nanoseconds, as expected by VideoFrame
    long timestampNS = TimeUnit.MILLISECONDS.toNanos(SystemClock.elapsedRealtime());
    // Wrap the NV21 byte array; no release callback needed
    NV21Buffer buffer = new NV21Buffer(videoBuffer, width, height, null);
    // Rotation is 0 here; adjust if your source is rotated
    VideoFrame videoFrame = new VideoFrame(buffer, 0, timestampNS);
    // Feed the frame to the VideoSource created earlier
    videoSource.getCapturerObserver().onFrameCaptured(videoFrame);
    videoFrame.release();
}
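For the first option, here is a minimal sketch of what such a capturer could look like. It assumes the incoming byte[] data is in NV21 format; the class name ByteArrayVideoCapturer and the way onReceive is hooked up are illustrative choices, not part of the WebRTC API:
import android.content.Context;
import android.os.SystemClock;
import java.util.concurrent.TimeUnit;
import org.webrtc.CapturerObserver;
import org.webrtc.NV21Buffer;
import org.webrtc.SurfaceTextureHelper;
import org.webrtc.VideoCapturer;
import org.webrtc.VideoFrame;

// Sketch: forwards NV21 byte[] frames from a custom source to WebRTC
public class ByteArrayVideoCapturer implements VideoCapturer {
    private CapturerObserver capturerObserver;
    private int width;
    private int height;

    @Override
    public void initialize(SurfaceTextureHelper surfaceTextureHelper, Context applicationContext,
                           CapturerObserver capturerObserver) {
        this.capturerObserver = capturerObserver;
    }

    @Override
    public void startCapture(int width, int height, int framerate) {
        this.width = width;
        this.height = height;
        capturerObserver.onCapturerStarted(true);
        // Start or subscribe to the custom video source here
    }

    // Call this from the custom source's callback with NV21 data
    public void onReceive(byte[] videoBuffer, int size) {
        if (capturerObserver == null) {
            return;
        }
        long timestampNs = TimeUnit.MILLISECONDS.toNanos(SystemClock.elapsedRealtime());
        NV21Buffer buffer = new NV21Buffer(videoBuffer, width, height, null);
        VideoFrame frame = new VideoFrame(buffer, 0, timestampNs);
        capturerObserver.onFrameCaptured(frame);
        frame.release();
    }

    @Override
    public void stopCapture() throws InterruptedException {
        // Stop or unsubscribe from the custom source here
        capturerObserver.onCapturerStopped();
    }

    @Override
    public void changeCaptureFormat(int width, int height, int framerate) {
        this.width = width;
        this.height = height;
    }

    @Override
    public void dispose() {}

    @Override
    public boolean isScreencast() {
        return false;
    }
}
Such a capturer is created directly (instead of via a CameraEnumerator) and then initialized and started exactly like the camera capturer in the question.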
Source: https://stackoverflow.com/questions/61160558/custom-video-source-for-webrtc-on-android