live-streaming

Live audio streaming container formats

Posted by 萝らか妹 on 2020-02-20 06:10:34
Question: When I start receiving a live audio (radio) stream (e.g. MP3 or AAC), I think the received data are not a raw bitstream (i.e. the raw encoder output) but are always wrapped in some container format. If this assumption is correct, then I guess I cannot start decoding from an arbitrary place in the stream, but have to wait for some sync byte. Is that right? Is it usual to have sync bytes? Is there any header following the sync byte, from which I can guess the used codec, number
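A minimal sketch of what such a sync-byte search can look like, assuming the stream is AAC wrapped in ADTS framing (a common case for Icecast/SHOUTcast-style radio streams); MP3 frames start with a similar 0xFF sync pattern. The buffer handling and field names here are illustrative only:

    // Scan a received chunk for the 12-bit ADTS sync word (0xFFF) and read a few
    // header fields that identify the stream (AAC profile, sample rate, channels).
    const ADTS_SAMPLE_RATES = [96000, 88200, 64000, 48000, 44100, 32000,
                               24000, 22050, 16000, 12000, 11025, 8000];

    function findAdtsFrame(buf) {
      for (let i = 0; i + 7 <= buf.length; i++) {
        // 0xFFF sync word, and the two layer bits must be 0 for ADTS
        if (buf[i] === 0xff && (buf[i + 1] & 0xf6) === 0xf0) {
          const profile = ((buf[i + 2] >> 6) & 0x03) + 1;            // AAC object type
          const sampleRate = ADTS_SAMPLE_RATES[(buf[i + 2] >> 2) & 0x0f];
          const channels = ((buf[i + 2] & 0x01) << 2) | ((buf[i + 3] >> 6) & 0x03);
          return { offset: i, profile, sampleRate, channels };
        }
      }
      return null; // no frame header in this chunk yet; wait for more data
    }

Decoding can then start at the returned offset rather than at the first byte received.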

Not able to show live camera RTSP streaming with Angular

Posted by ↘锁芯ラ on 2020-01-30 02:44:15
Question: I'm developing a web application with Angular, and I need to add a window that shows a live RTSP stream. After searching, I found that this can be done with the JSMpeg JS library. On the server side I found this Node.js example:

    Stream = require('node-rtsp-stream')
    stream = new Stream({
      name: 'name',
      streamUrl: 'rtsp_url',
      wsPort: 9999,
      ffmpegOptions: { // options: ffmpeg flags
        '-stats': '',  // an option with no necessary value uses a blank string
        '-r': 30       // options with required values specify the value
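For the browser side, a minimal sketch of how JSMpeg typically attaches to the WebSocket that node-rtsp-stream opens (the hostname, port and element id are placeholders matching the wsPort above):

    <canvas id="video-canvas"></canvas>
    <script src="jsmpeg.min.js"></script>
    <script>
      // JSMpeg connects to the WebSocket relay and decodes the MPEG1 stream
      // that node-rtsp-stream/ffmpeg produces from the RTSP source.
      var canvas = document.getElementById('video-canvas');
      var player = new JSMpeg.Player('ws://your-server:9999', { canvas: canvas });
    </script>

In an Angular component, the same player would normally be created in ngAfterViewInit, using a ViewChild reference to the canvas instead of document.getElementById.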

Appending parameters to each m3u8 and ts file while playing live stream

Posted by 限于喜欢 on 2020-01-24 19:23:12
Question: I am using video.js in a live streaming environment and using nginx secure URLs to protect the stream. See here for the details - https://www.nginx.com/blog/securing-urls-secure-link-module-nginx-plus/ The algorithm works fine and the player is able to detect when the live.m3u8 file becomes available. However, when playing the stream, I just get a spinning wheel. On the JS console, I see that the sub-playlist URL, e.g. live_109.m3u8, does not have the required md5 hash and expiry timestamp and
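One common way to attach the token to every playlist and segment request is the beforeRequest hook exposed by videojs-http-streaming (videojs.Vhs.xhr; older videojs-contrib-hls builds expose it as videojs.Hls.xhr). A minimal sketch, assuming the hash and expiry generated by the backend are already available to the page and that the query-parameter names match your nginx secure_link configuration (the values below are placeholders):

    // Append the secure-link parameters to every request VHS makes (m3u8 and ts).
    var token = { md5: 'SECURE_LINK_HASH', expires: '1589999999' }; // placeholders

    videojs.Vhs.xhr.beforeRequest = function (options) {
      var sep = options.uri.indexOf('?') === -1 ? '?' : '&';
      options.uri += sep + 'md5=' + token.md5 + '&expires=' + token.expires;
      return options;
    };

    var player = videojs('my-player');
    player.src({ src: '/hls/live.m3u8?md5=' + token.md5 + '&expires=' + token.expires,
                 type: 'application/x-mpegURL' });

The hook must be installed before the source is set so that it applies to the very first sub-playlist request.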

Is it possible to allow many users to make live broadcasts using YouTube API and a single account?

Posted by 末鹿安然 on 2020-01-17 04:05:53
Question: I want to make a web application and I want to use the YouTube API to allow my users to make live broadcasts. Is it necessary that my users log in with their Google/YouTube accounts to use live streaming, or is it possible to let them use this function without bothering them with this detail?

Answer 1: In order to create the Live Event and Live Stream objects required for a livestream on YouTube, the user making those requests must be authenticated. From the Docs: Your application must have authorization
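So each broadcaster goes through the OAuth consent flow for their own channel, and the application only stores the resulting tokens. A minimal sketch with the Node.js googleapis client, assuming you have already obtained and stored a refresh token for the user (CLIENT_ID, CLIENT_SECRET, REDIRECT_URI and userRefreshToken are placeholders):

    const { google } = require('googleapis');

    // OAuth2 client acting on behalf of the individual user (not one shared account)
    const oauth2Client = new google.auth.OAuth2(CLIENT_ID, CLIENT_SECRET, REDIRECT_URI);
    oauth2Client.setCredentials({ refresh_token: userRefreshToken });

    const youtube = google.youtube({ version: 'v3', auth: oauth2Client });

    async function createBroadcast() {
      // Creates the Live Broadcast object on the authenticated user's channel
      const res = await youtube.liveBroadcasts.insert({
        part: ['snippet', 'status', 'contentDetails'],
        requestBody: {
          snippet: { title: 'My broadcast', scheduledStartTime: new Date().toISOString() },
          status: { privacyStatus: 'unlisted' },
          contentDetails: { enableAutoStart: true },
        },
      });
      return res.data;
    }

A corresponding liveStreams.insert call (and a liveBroadcasts.bind call to join the two) is made with the same per-user credentials.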

Creating live video streams dynamically

Posted by ♀尐吖头ヾ on 2020-01-16 06:08:07
Question: I'm familiar with publishing/subscribing to predefined live video feeds on Adobe Flash Media Server. How can I allow users to create new streams dynamically? Meaning, instead of providing fixed feeds, users click on "Create Feed", enter a name, and then anyone would be able to publish/subscribe to that feed. I'm not looking for source code (though obviously that would be nice). Rather, I'd like to understand what I need to do at a high level to get this to work.

Answer 1: Answering my own question:

Can RTSP (Real Time Streaming Protocol) be used to send live video stream from iPhone to a media server?

Posted by 雨燕双飞 on 2020-01-14 02:36:06
Question: I am new to iOS and multimedia development, and I am working on an application that will capture video from the iPhone's camera and send the live stream to a media server. In this link a person asked a question on Stack Overflow saying that his application was rejected by Apple because he didn't use Apple's HLS (HTTP Live Streaming) method for receiving the live stream. But my case is different, as I am not receiving a live stream on the iPhone. I have to send the live video to a media

How do I use JavaScript to automatically switch to a backup live stream if the primary fails in JW Player?

Posted by 余生颓废 on 2020-01-06 18:36:31
Question: I am attempting to write a JavaScript function that will enable an instance of JW Player to automatically switch from a primary live HLS stream to a backup live HLS stream in the event of an error (e.g. the primary encoder goes down). What I have so far:

    <div id="myElement">Loading the player...</div>

    var playerInstance = jwplayer("myElement");
    playerInstance.setup({
      file: "http://server/primary/playlist.m3u8",
      width: 640,
      height: 360,
      title: 'Basic Video Embed',
      description: 'work damn you',
    });
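A minimal sketch of one way to add the failover, assuming a reasonably recent JW Player (7+); the backup playlist URL is a placeholder, and the player's error event triggers a reload with the backup source:

    var primary = "http://server/primary/playlist.m3u8";
    var backup  = "http://server/backup/playlist.m3u8"; // placeholder backup stream

    var playerInstance = jwplayer("myElement");
    playerInstance.setup({ file: primary, width: 640, height: 360 });

    playerInstance.on('error', function () {
      // The primary stream failed: load the backup playlist and resume playback.
      playerInstance.load([{ file: backup }]);
      playerInstance.play();
    });

Depending on the failure mode, listening for setupError as well, or probing the backup URL before reloading, may also be needed.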

Send captured images from Python server to JavaScript client

Posted by 混江龙づ霸主 on 2020-01-06 14:44:12
Question: I am now trying to build a server on a Raspberry Pi that sends live-stream image data to a browser. The server side is written in Python with Tornado, while the client side is written in HTML and JavaScript. Both use WebSocket. (I am a beginner at JavaScript.) These are the codes. Server side:

    class WSHandler(WebSocketHandler):
        def initialize(self, camera):
            self.camera = camera
            cv.SetCaptureProperty(self.capture, cv.CV_CAP_PROP_FRAME_WIDTH, 480)
            cv.SetCaptureProperty(self.capture, cv.CV_CAP_PROP_FRAME_HEIGHT
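For the client half the question mentions, a minimal sketch of the JavaScript side that displays frames pushed over the WebSocket, assuming the Tornado handler sends each captured frame as a binary JPEG message (the URL and element id are placeholders):

    // Show each binary JPEG frame received from the Tornado WebSocket handler.
    var ws = new WebSocket('ws://raspberrypi.local:8888/ws'); // placeholder URL
    ws.binaryType = 'blob';

    var img = document.getElementById('live-view'); // <img id="live-view"> in the page
    ws.onmessage = function (event) {
      var url = URL.createObjectURL(event.data);
      img.onload = function () { URL.revokeObjectURL(url); }; // free the previous frame
      img.src = url;
    };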