Current best practice to stream live video in web browser?

时光取名叫无心 2021-01-31 22:05

We develop an IP camera product which streams H.264/MPEG4/MJPEG video via RTSP/UDP. It has a web interface; currently we use the VLC Firefox plugin to allow viewing of the live stream in the browser. What is the current best practice for streaming live video to a web browser?

2 Answers
  • 2021-01-31 22:55

    There are many methods you can use that don't require transcoding.

    WebRTC

    If you're already using RTSP, you're most of the way toward sending your streams via WebRTC.

    WebRTC uses SDP for declaring streams, and RTP for the transport of these streams. There are some other layers you need for setting up the WebRTC call, but none of these require particularly expensive computation. Most (all?) WebRTC clients will support H.264 decoding, many with hardware acceleration in-browser.

    The easiest way to get started with WebRTC is to implement a browser-to-browser client first. Then, you can go a layer deeper with your own implementation.

    WebRTC is the route I recommend to you. NAT traversal (in most cases) and P2P connectivity are built-in, so your customers won't have to remember IP addresses. Simply provide signalling services and your customers can connect directly to their cameras at home from wherever. Provide TURN servers, and they'll be able to connect even if both ends are firewalled. If you don't wish to provide such services, they're lightweight and can run directly on the camera in a mode like you have today.
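
    To give a sense of the browser side, here is a minimal receive-only viewer sketch. The WebSocket signalling endpoint, its message shape, and the STUN server are assumptions; any channel that can exchange SDP and ICE candidates between the browser and the camera will do.

    // Receive-only WebRTC viewer (sketch). Assumes the camera (or a small
    // gateway in front of it) answers our SDP offer over a WebSocket.
    const pc = new RTCPeerConnection({
      iceServers: [{ urls: 'stun:stun.l.google.com:19302' }],
    });

    // Render the incoming camera track in a <video id="live" autoplay> element.
    pc.ontrack = (event) => {
      document.getElementById('live').srcObject = event.streams[0];
    };

    const signalling = new WebSocket('wss://camera.example/signal'); // placeholder

    // Forward our ICE candidates to the camera as they are gathered.
    pc.onicecandidate = (event) => {
      if (event.candidate) {
        signalling.send(JSON.stringify({ candidate: event.candidate }));
      }
    };

    // Apply the camera's answer and its ICE candidates.
    signalling.onmessage = async (msg) => {
      const { sdp, candidate } = JSON.parse(msg.data);
      if (sdp) await pc.setRemoteDescription(sdp);
      if (candidate) await pc.addIceCandidate(candidate);
    };

    // Once signalling is up, offer a receive-only video session.
    signalling.onopen = async () => {
      pc.addTransceiver('video', { direction: 'recvonly' });
      const offer = await pc.createOffer();
      await pc.setLocalDescription(offer);
      signalling.send(JSON.stringify({ sdp: pc.localDescription }));
    };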

    Fragmented MP4 over HTTP Progressive with <video> tag

    This method is much simpler than WebRTC, but entirely different from what you're doing now. You can take your H.264 stream and wrap it directly in an MP4 container without transcoding. Then it can be played in a <video> tag on a page. You'll have to integrate an appropriate muxing library into your code, but here's an FFmpeg example that outputs to STDOUT, which you'd pipe to clients:

    ffmpeg \
      -i YOUR_CAMERA_HERE \
      -vcodec copy \
      -acodec copy \
      -f mp4 \
      -movflags frag_keyframe+empty_moov \
      -
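
    On the server (or camera) side, those same ffmpeg arguments can be driven from a tiny HTTP endpoint that pipes STDOUT straight to the client. This is only a sketch assuming Node.js; the port, path, and RTSP URL are placeholders.

    // Minimal Node.js endpoint piping fragmented MP4 from ffmpeg to the browser.
    const http = require('http');
    const { spawn } = require('child_process');

    http.createServer((req, res) => {
      if (req.url !== '/live.mp4') {
        res.writeHead(404);
        res.end();
        return;
      }

      res.writeHead(200, { 'Content-Type': 'video/mp4' });

      const ffmpeg = spawn('ffmpeg', [
        '-i', 'rtsp://192.168.1.10/stream1',         // placeholder camera URL
        '-vcodec', 'copy',
        '-acodec', 'copy',
        '-f', 'mp4',
        '-movflags', 'frag_keyframe+empty_moov',
        '-',                                         // write to STDOUT
      ]);

      ffmpeg.stdout.pipe(res);                       // stream straight to the client
      req.on('close', () => ffmpeg.kill('SIGKILL')); // stop ffmpeg when the viewer leaves
    }).listen(8080);

    The page then only needs <video src="/live.mp4" autoplay muted></video> to show the live feed.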
    

    Others...

    In your case, there's no added benefit to DASH. DASH is intended for utilizing file-based CDNs for streaming. You control the server, so there's no point in writing out files or handling HTTP requests in a file-like manner. While you can certainly use DASH with H.264 streams without transcoding, I think it's a waste of your time.

    HLS is much the same. Your stream is compatible with HLS, but HLS has been falling out of favor due to its lack of codec flexibility. DASH and HLS are essentially the same mechanism: write a bunch of media segments to a CDN and create a playlist or manifest indicating where they are (a minimal example playlist is shown below).
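
    For illustration, a live HLS media playlist is just a small text manifest listing the most recent segments; the segment names and durations below are made up:

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:6
    #EXT-X-MEDIA-SEQUENCE:120
    #EXTINF:6.0,
    segment120.ts
    #EXTINF:6.0,
    segment121.ts
    #EXTINF:6.0,
    segment122.ts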

  • 2021-01-31 22:58

    Well, I had to do the same thing a while back on a Raspberry Pi 3. We transcoded the stream on the fly with ffmpeg on the Pi and used https://github.com/phoboslab/jsmpeg (which plays MPEG-1 video) to stream it, then played it in the browser/Ionic app.

    var canvas = document.getElementById('video-canvas');
    this.player = new JSMpeg.Player(this.button.url, { canvas: canvas });
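
    For context, the Pi-side transcode step described above could look roughly like this. It is a sketch only: the 'ws' package, port, bitrate, and camera URL are assumptions, and audio is dropped for simplicity. JSMpeg decodes MPEG-1 video in an MPEG-TS container, so that is what ffmpeg is asked to produce.

    // Transcode the camera feed to MPEG-1/MPEG-TS and fan it out over WebSockets.
    const { spawn } = require('child_process');
    const { WebSocket, WebSocketServer } = require('ws');

    const wss = new WebSocketServer({ port: 8084 });

    const ffmpeg = spawn('ffmpeg', [
      '-i', 'rtsp://192.168.1.10/stream1', // placeholder camera URL
      '-f', 'mpegts',
      '-codec:v', 'mpeg1video',
      '-b:v', '1000k',
      '-r', '25',
      '-an',                               // no audio, keeps the Pi load down
      '-',                                 // write MPEG-TS to STDOUT
    ]);

    // Broadcast each MPEG-TS chunk to every connected JSMpeg player.
    ffmpeg.stdout.on('data', (chunk) => {
      for (const client of wss.clients) {
        if (client.readyState === WebSocket.OPEN) client.send(chunk);
      }
    });

    The player snippet above would then point this.button.url at something like ws://<pi-address>:8084.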
    

    We were managing up to 4 concurrent streams on our Pis, with a delay of roughly 2-5 seconds.

    But once we moved to React Native, we used the React Native VLC wrapper on the phones instead.
