live555

RTSP/RTMP Video Streaming Client iOS [closed]

Submitted by 此生再无相见时 on 2019-12-03 03:21:40
Question (closed as off-topic on Stack Overflow): I'm in need of an open source solution/library to stream RTSP/RTMP to an iOS application. I need to build an app that connects to a media server and opens the provided video stream. I believe there have to be libraries out there, but I have yet to find one that is open source, compiles, actually works, and runs

Live555 WebRTC integration

Submitted by Anonymous (unverified) on 2019-12-03 02:38:01
Question: In the past I worked on WebRTC with a C server (Janus) that I know very well; now I want to add WebRTC capability to Live555 so that I can stream video through it. The problem is that I'm somewhat overwhelmed by the lack of documentation in live555 (worse than Janus). To summarize, live555 lacks: 1. an SDP parser/builder; 2. an HTTP signaling parser/builder. My question is whether I have correctly identified everything that is missing or needs to be done. If by any chance someone can give me some hints. Answer 1: Live555 makes a demo that is available on http://webrtc

How to add RTSP support to MPlayer

Submitted by Anonymous (unverified) on 2019-12-02 23:57:01
http://www.live555.com/mplayer/ Follow these steps: Download the source code of the "LIVE555 Streaming Media" library, unpack it, and move the "live/" directory to "/usr/local/", "/usr/local/lib/", or "/usr/lib/". (Note: if you do not do this, you will need to specify the location of the "live/" directory later.) Download the latest MPlayer source code. If you moved the "live/" directory to "/usr/local/", "/usr/local/lib/", or "/usr/lib/" in the first step, run: cd MPlayer*; ./configure (the live555 directory will be found automatically); otherwise run: cd MPlayer*; ./configure --extra-cflags=-I<path-to-LIVE555-Streaming-Media-library-directory>. Now build and install MPlayer: make; make install (MPlayer and the LIVE555 Streaming Media library must be built with the same gcc version.) Source: cnblogs. Author: 麦壳饼. Link: https://www.cnblogs.com/MysticBoy

MPlayer ww version SVN-r37370-ffmpeg n2.5.3 gcc 4.5.1

Submitted by Anonymous (unverified) on 2019-12-02 23:57:01
This started when I moved to a new workstation: by chance I found the related code on an old hard drive, and I was unhappy using PotPlayer. The ww build of MPlayer has not been updated since 2015... The god is not at our side, so the pilgrimage road is long; why not become the god myself? There are still 700-odd revisions to go before reaching the latest MPlayer SVN version... Once those revisions build cleanly, the various third-party libraries and GCC will be updated once more; until then, please be patient... updates will come at irregular intervals. Based on r37356, the last 2015 release of the MPlayer ww build, this update brings it to SVN-r37370 with FFmpeg n2.5.3 and gcc 4.5.1 (FFmpeg updated from n2.5 to n2.5.3). Download link: https://pan.baidu.com/s/1aQyKFy7fFlSHrpXNTFOD4w (extraction code: hfp8). Libraries and build environment: glib_2.28.1, pkg-config_0.23-3, libintl-8; compiler flags: -O3 -s -mms-bitfields -march=i386 -mtune=i686; mingw-full-20160707: libtool-2.4.6, autoconf-2.69, automake-1.15, m4-1.4.17, binutils-2.26.1, gmp-6.1.1, mpfr-3.1.4, mpc-1.0.3; mingw-full-20160704:

How to write a Live555 FramedSource to allow me to stream H.264 live

Submitted by 杀马特。学长 韩版系。学妹 on 2019-12-02 19:41:21
I've been trying to write a class that derives from FramedSource in Live555 that will allow me to stream live data from my D3D9 application to an MP4 or similar. What I do each frame is grab the backbuffer into system memory as a texture, then convert it from RGB -> YUV420P, then encode it using x264, then ideally pass the NAL packets on to Live555. I made a class called H264FramedSource that derived from FramedSource basically by copying the DeviceSource file. Instead of the input being an input file, I've made it a NAL packet which I update each frame. I'm quite new to codecs and streaming,
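For reference, here is a minimal sketch of what such a FramedSource subclass typically looks like, following live555's DeviceSource pattern. This is not the poster's code: the NalUnit struct, the class name, and the deliverNal() entry point are hypothetical stand-ins for whatever the x264 encoder thread produces; fTo, fMaxSize, afterGetting() and the event-trigger calls are the standard live555 API.

// Minimal sketch (not the poster's actual code) of a live H.264 FramedSource,
// following the DeviceSource pattern. NalUnit, the class name and deliverNal()
// are hypothetical stand-ins for the x264 encoder's output path.
#include "FramedSource.hh"
#include <GroupsockHelper.hh> // for gettimeofday()
#include <cstring>

struct NalUnit { const unsigned char* data; unsigned size; }; // data must stay valid until delivered

class H264LiveFramedSource : public FramedSource {
public:
    static H264LiveFramedSource* createNew(UsageEnvironment& env) {
        return new H264LiveFramedSource(env);
    }

    // Called from the encoder thread whenever a new NAL unit is ready.
    void deliverNal(NalUnit const& nal) {
        fPendingNal = nal;
        // Wake the live555 event loop so the frame gets delivered downstream.
        envir().taskScheduler().triggerEvent(fEventTriggerId, this);
    }

protected:
    H264LiveFramedSource(UsageEnvironment& env) : FramedSource(env) {
        fPendingNal.data = NULL;
        fPendingNal.size = 0;
        fEventTriggerId = envir().taskScheduler().createEventTrigger(deliverFrame0);
    }
    virtual ~H264LiveFramedSource() {
        envir().taskScheduler().deleteEventTrigger(fEventTriggerId);
    }

    virtual void doGetNextFrame() {
        // If a NAL unit is already waiting, deliver it now; otherwise wait
        // for deliverNal() to trigger the event.
        if (fPendingNal.data != NULL) deliverFrame();
    }

private:
    static void deliverFrame0(void* clientData) {
        static_cast<H264LiveFramedSource*>(clientData)->deliverFrame();
    }

    void deliverFrame() {
        if (!isCurrentlyAwaitingData()) return; // downstream is not ready yet

        unsigned size = fPendingNal.size;
        if (size > fMaxSize) {                  // truncate if it does not fit
            fNumTruncatedBytes = size - fMaxSize;
            size = fMaxSize;
        } else {
            fNumTruncatedBytes = 0;
        }
        std::memcpy(fTo, fPendingNal.data, size);
        fFrameSize = size;
        gettimeofday(&fPresentationTime, NULL);
        fPendingNal.data = NULL;
        fPendingNal.size = 0;

        FramedSource::afterGetting(this);       // tell live555 the frame is ready
    }

    EventTriggerId fEventTriggerId;
    NalUnit fPendingNal;
};

If the output of such a source is then wrapped in an H264VideoStreamDiscreteFramer, each delivered NAL unit should be passed without the 0x00000001 Annex-B start code.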

RTSP/RTMP Video Streaming Client iOS [closed]

Submitted by 你离开我真会死。 on 2019-12-02 17:48:17
I'm in need of an open source solution/library to stream RTSP/RTMP to an iOS application. I need to build an app that connects to a media server and opens the provided video stream. I believe there have to be libraries out there, but I have yet to find one that is open source, compiles, actually works, and runs on iOS 5+, iPhone 4+. I do not have a preference; RTMP or RTSP will suffice, preferably whichever takes the least amount of work. I have RTSP working on the Android side, but nothing for iOS yet. This is what I already know from today's research: RTSP seems possible using Live555/FFMPEG
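The excerpt ends by noting that RTSP seems possible using Live555/FFmpeg. For illustration, here is a minimal sketch of opening an RTSP stream with FFmpeg's libavformat (plain C++, not iOS-specific; the URL is a placeholder and a reasonably recent FFmpeg is assumed):

// Minimal sketch: opening an RTSP stream with FFmpeg's libavformat and reading
// compressed packets. The URL is a placeholder; error handling is minimal.
extern "C" {
#include <libavformat/avformat.h>
}
#include <cstdio>

int main() {
    avformat_network_init();

    AVDictionary* opts = NULL;
    av_dict_set(&opts, "rtsp_transport", "tcp", 0); // interleave RTP over the RTSP TCP connection

    AVFormatContext* fmt = NULL;
    if (avformat_open_input(&fmt, "rtsp://example.com/stream", NULL, &opts) < 0) {
        std::fprintf(stderr, "could not open RTSP stream\n");
        return 1;
    }
    av_dict_free(&opts);

    if (avformat_find_stream_info(fmt, NULL) < 0) {
        std::fprintf(stderr, "could not read stream info\n");
        return 1;
    }

    AVPacket* pkt = av_packet_alloc();
    while (av_read_frame(fmt, pkt) >= 0) {
        // pkt->data now holds a compressed (e.g. H.264) packet that a software
        // decoder such as libavcodec -- or a hardware decoder -- would consume.
        av_packet_unref(pkt);
    }

    av_packet_free(&pkt);
    avformat_close_input(&fmt);
    avformat_network_deinit();
    return 0;
}

Forcing rtsp_transport to tcp is also the usual workaround when a network blocks the UDP media ports.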

Live555 framework generation error

Submitted by 谁说胖子不能爱 on 2019-12-01 20:21:46
Question: I'm trying to build the framework for the Live555 library. I got the library file from here, as per this answer on SO, and I've tried multiple times to generate it as per that answer. It simply gives the following error: /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/g++ -c -Iinclude -I../UsageEnvironment/include -I../groupsock/include -I. -DBSD=1 -O2 -DSOCKLEN_T=socklen_t -DHAVE_SOCKADDR_LEN=1 -D_LARGEFILE_SOURCE=1 -D_FILE_OFFSET_BITS=64 -fPIC -arch armv7

Has anyone succeeded in streaming RTSP packets over TCP through the live555 libraries?

Submitted by 眉间皱痕 on 2019-12-01 14:27:15
Has anyone succeeded in streaming RTSP packets over TCP through the live555 libraries? I tried searching the web a lot but didn't find anything useful; I tried all the solutions provided, but Wireshark shows that UDP packets are being streamed. Answer: Yes, send the appropriate SETUP request and you will be able to stream over TCP. In the case of live555, subclass the "OnDemandServerMediaSubsession" class. Source: https://stackoverflow.com/questions/14811574/has-anyone-succeedded-in-streaming-rtsp-packet-using-tcp-through-live-555-librar
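On the client side of the same problem, live555's RTSPClient exposes a streamUsingTCP flag on sendSetupCommand(). A small sketch in the style of the testRTSPClient demo (continueAfterSETUP here is a trimmed placeholder handler):

// Sketch of the client-side fix: when issuing SETUP with live555's RTSPClient,
// pass True for streamUsingTCP so that RTP/RTCP are interleaved over the RTSP
// TCP connection instead of going out over UDP.
#include "liveMedia.hh"

void continueAfterSETUP(RTSPClient* rtspClient, int resultCode, char* resultString) {
    // testRTSPClient creates the data sink for the subsession here.
    if (resultCode != 0) {
        rtspClient->envir() << "SETUP failed: " << resultString << "\n";
    }
    delete[] resultString;
}

void setupSubsessionOverTCP(RTSPClient* rtspClient, MediaSubsession* subsession) {
    rtspClient->sendSetupCommand(*subsession, continueAfterSETUP,
                                 False /*streamOutgoing*/,
                                 True  /*streamUsingTCP: RTP-over-RTSP interleaving*/);
}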

Has anybody successfully ported live555 to Android?

Submitted by 安稳与你 on 2019-11-30 05:30:58
I've been trying to build live555 according to this guide: https://github.com/boltonli/ohbee/tree/master/android/streamer/jni as well as using some other guides, all to no avail. If someone has succeeded in porting live555 to Android, can you please tell me how? Answer: I successfully built the project as follows:
git clone https://github.com/boltonli/ohbee.git
cd ohbee/android/streamer
android update project --path . --name "streamer" --target "android-15"
cp lib/jnix.jar libs/ # This is the only trick
ant debug
The jar was in the lib/ directory rather than libs/. If that doesn't

Live555: X264 Stream Live source based on “testOnDemandRTSPServer”

Submitted by 浪尽此生 on 2019-11-30 05:26:38
I am trying to create an RTSP server that streams the OpenGL output of my program. I had a look at "How to write a Live555 FramedSource to allow me to stream H.264 live", but I need the stream to be unicast, so I had a look at testOnDemandRTSPServer. Using the same code fails. To my understanding, I need to provide memory in which I store my H.264 frames so the on-demand server can read them on demand. H264VideoStreamServerMediaSubsession.cpp:
H264VideoStreamServerMediaSubsession* H264VideoStreamServerMediaSubsession::createNew(UsageEnvironment& env, Boolean reuseFirstSource) { return new
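For the unicast case in testOnDemandRTSPServer, the usual pattern is to subclass OnDemandServerMediaSubsession and override createNewStreamSource() and createNewRTPSink(). A rough sketch, reusing the hypothetical H264LiveFramedSource from the FramedSource question above as a stand-in for whatever live source holds the encoded frames:

// Rough sketch of a unicast, on-demand subsession for a live H.264 source.
// H264LiveFramedSource stands in for a custom FramedSource (see the
// FramedSource question above); everything else is standard live555 API.
#include "liveMedia.hh"

class H264LiveServerMediaSubsession : public OnDemandServerMediaSubsession {
public:
    static H264LiveServerMediaSubsession* createNew(UsageEnvironment& env,
                                                    Boolean reuseFirstSource) {
        return new H264LiveServerMediaSubsession(env, reuseFirstSource);
    }

protected:
    H264LiveServerMediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource)
        : OnDemandServerMediaSubsession(env, reuseFirstSource) {}

    // Called on each client SETUP: wrap the live source in a discrete framer
    // so live555 can packetize individual NAL units.
    virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                                unsigned& estBitrate) {
        estBitrate = 2000; // kbps; a rough estimate for the sketch
        FramedSource* liveSource = H264LiveFramedSource::createNew(envir());
        return H264VideoStreamDiscreteFramer::createNew(envir(), liveSource);
    }

    virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                      unsigned char rtpPayloadTypeIfDynamic,
                                      FramedSource* /*inputSource*/) {
        return H264VideoRTPSink::createNew(envir(), rtpGroupsock,
                                           rtpPayloadTypeIfDynamic);
    }
};

In practice getAuxSDPLine() usually also needs to be overridden for a live source, so that the SPS/PPS NAL units make it into the SDP; the common trick is to start the sink briefly and wait until its auxSDPLine() becomes non-NULL.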