live555

Live555 OpenRTSP Client : How to delete current rtsp client properly and start new One

亡梦爱人 Submitted on 2019-12-07 23:48:00
Question: When my openRTSP client loses its connection to the server, I dispose of the old client and related state, then re-create a new client. The new client sends the OPTIONS and DESCRIBE requests successfully but fails after that: I cannot create the session and subsessions, so I get access-violation errors. How do I reset the old openRTSP client properly so that I get a "brand new" RTSPClient? My current way of resetting the old client is to modify the "shutdown" method in playCommon; I did not send a TEARDOWN... ... void ResetOurClient(){ if (env != NULL) { env->taskScheduler().unscheduleDelayedTask(sessionTimerTask); env

Building 64bit Live555 with Visual Studio 2013

假装没事ソ Submitted on 2019-12-06 11:44:14
I am trying to build the components of Live555 with Visual Studio 2013 (64-bit) on Windows 7. I have tried editing win32config and the *.mak files without success. I've been searching the internet for a few hours and trying all kinds of things at the command prompt. For some reason the VS2013 x64 command prompt is still building 32-bit static libs and I can't figure out why. If anyone has any good ideas, that would be fantastic! Ravi Chauhan: The win32config file that you get from the .tar.gz file requires substantial editing to make it compatible with recent SDK and MSVC++ releases. This is a version
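The likely reason the x64 prompt still produces 32-bit libraries is that the generated .mak files invoke the compiler through the TOOLS32 variable in win32config (e.g. "$(TOOLS32)\bin\cl"), so the path in that file, not the command prompt, decides the architecture. A hedged sketch of the edits; the variable names approximate the stock win32config, and the paths are assumptions for a default VS2013 install:

```
# win32config -- sketch of edits for a 64-bit VS2013 build (paths are assumptions)
TOOLS32      = C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC

# The stock file points at $(TOOLS32)\bin\cl, the 32-bit compiler.
# Point the tools at the x64 cross toolchain instead:
C_COMPILER   = "$(TOOLS32)\bin\x86_amd64\cl"
LINK         = "$(TOOLS32)\bin\x86_amd64\link" -out:
LIBRARY_LINK = "$(TOOLS32)\bin\x86_amd64\lib" -out:

# Drop the obsolete msvcirt.lib from LINK_OPTS_0; it no longer ships with MSVC:
LINK_OPTS_0  = $(linkdebug)
```

After editing, re-run genWindowsMakefiles and nmake from the x64 tools prompt so that rc.exe and the SDK libraries are also resolved as 64-bit.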

What does Elementary Stream mean in Terms of H264

我与影子孤独终老i Submitted on 2019-12-05 05:02:31
Question: I read what an elementary stream is on Wikipedia. A tool I am using, Live555, demands an "H.264 Video Elementary Stream File". So when exporting a video from a video application, do I have to choose specific settings to generate an "elementary stream"? Answer 1: If you're using ffmpeg you could use something similar to the following: ffmpeg -f video4linux2 -s 320x240 -i /dev/video0 -vcodec libx264 -f h264 test.264 You'll have to adapt the command line for the file type you're exporting the video from. This generates a file containing H.264 access units where each access unit consists of one

Relaying RTSP with live555 on CentOS 6.5

不想你离开。 Submitted on 2019-12-05 02:48:34
Installing live555: wget http://www.live555.com/liveMedia/public/live555-latest.tar.gz ; tar -zxvf live555-latest.tar.gz ; cd live ; ./genMakefiles linux-64bit # note: this argument comes from the suffix of a config.<suffix> file in the current directory ; make. When the build finishes, a proxyServer directory is generated. Relaying RTSP: cd proxyServer ; ./live555ProxyServer <source RTSP URL> &, for example: ./live555ProxyServer rtsp://192.168.0.188:554/stream/main & # after running the command, it prints the relayed stream address. You can test the relayed address with VLC or another stream player. A similar RTSP server is EasyDarwin. Source: https://www.cnblogs.com/dch0/p/11899474.html

Using Live555 to Stream Live Video from an IP camera connected to an H264 encoder

夙愿已清 Submitted on 2019-12-04 12:01:23
Question: I am using a custom Texas Instruments OMAP-L138 based board that basically consists of an ARM9-based SoC and a DSP processor. It is connected to a camera lens. What I'm trying to do is capture a live video stream, which is sent to the DSP processor for H.264 encoding and then sent over uPP in packets of 8192 bytes. I want to use the testH264VideoStreamer supplied by Live555 to live-stream the H.264-encoded video over RTSP. The code I have modified is shown below: #include <liveMedia.hh> #include <BasicUsageEnvironment.hh> #include <GroupsockHelper.hh> #include <stdio.h> #include <unistd.h>

OpenCV/FFMpeg image capture problems

核能气质少年 Submitted on 2019-12-04 07:17:34
I'm trying to capture images from an IP camera in real time. The stream works perfectly well in VLC, but OpenCV's cvQueryFrame() seems to jumble and corrupt the incoming images to the point of being unrecognizable. Again, capturing from a file works fine, but not from a live stream. In case it makes a difference, I'm using an rtsp connection URL; I've also tried this with two different camera models (different brands), and the problem remains. Besides, the codec (I'm assuming) is outputting several errors of the following kind: Error at MB: 1746 and concealing 6000 DC, 6000 AC, 6000 MV errors. What can I

How to write a Live555 FramedSource to allow me to stream H.264 live

[亡魂溺海] Submitted on 2019-12-03 07:08:19
Question: I've been trying to write a class that derives from FramedSource in Live555 that will allow me to stream live data from my D3D9 application to an MP4 or similar. What I do each frame is grab the back buffer into system memory as a texture, then convert it from RGB to YUV420P, then encode it with x264, then ideally pass the NAL packets on to Live555. I made a class called H264FramedSource that derives from FramedSource, basically by copying the DeviceSource file. Instead of the input being an