I'm trying to capture the H264 stream from a locally installed Logitech C920 camera at /dev/video0 with the GStreamer 1.0 v4l2src element.
v4l2-ctl --list-formats shows that the camera is capable of delivering video in the H264 format:
# v4l2-ctl --list-formats
ioctl: VIDIOC_ENUM_FMT
...
Index : 1
Type : Video Capture
Pixel Format: 'H264' (compressed)
Name : H.264
...
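If it helps, v4l2-ctl can also list the frame sizes and frame intervals the camera advertises for each format, which is useful for ruling out an unsupported resolution/framerate combination:
# v4l2-ctl -d /dev/video0 --list-formats-ext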
But the pipeline
# gst-launch-1.0 -vvv v4l2src device=/dev/video0 ! video/x-h264, width=800, height=448, framerate=30/1 ! fakesink
keeps giving me a not-negotiated (-4) error:
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-h264, width=(int)800, height=(int)448, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-h264, width=(int)800, height=(int)448, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-h264, width=(int)800, height=(int)448, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-h264, width=(int)800, height=(int)448, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2809): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming task paused, reason not-negotiated (-4)
Execution ended after 67687169 ns.
Any help is appreciated!
Is GStreamer mandatory for your needs? I also have lots of problems with the Logitech C920 in H264 mode and GStreamer. But I managed to use VLC as an RTSP server to stream H264 from the C920:
$ cvlc -v v4l2:///dev/video0:chroma="H264":width=1024:height=570:fps=30 \
--sout="#rtp{sdp=rtsp://:8554/live}"
Then I can connect with another VLC to the URI rtsp://localhost:8554/live
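For example (assuming you are viewing on the same machine; otherwise replace localhost with the server's address):
$ vlc rtsp://localhost:8554/live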
If GStreamer is mandatory for you, I only managed to use it with a capture utility that you can find here: https://github.com/csete/bonecam - directory "capture"
You have to compile it, but if you have some programming skills it should be very easy, as there is only one C file and a script to help. Just pass "host" as a parameter to the script:
# Get the bonecam/capture content or git clone the directory, and then
$ cd bonecam/capture
$ ./build host
You can use the "capture" utility with something like this:
$ v4l2-ctl -d /dev/video0 --set-fmt-video=width=1024,height=570,pixelformat=1
$ v4l2-ctl -d /dev/video0 --set-parm=30
$ ./bonecam/capture/capture -d /dev/video0 -c 100000 -o | \
gst-launch -e filesrc location=/dev/fd/0 ! legacyh264parse ! rtph264pay ! udpsink host=10.0.0.42 port=5000
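On the receiving machine (10.0.0.42 here), a GStreamer 0.10 pipeline along the following lines should be able to depayload and display the stream; the caps are my assumption of what rtph264pay sends with its default payload type, and ffdec_h264 requires the gst-ffmpeg plugins:
$ gst-launch udpsrc port=5000 caps="application/x-rtp, media=video, clock-rate=90000, encoding-name=H264, payload=96" ! rtph264depay ! ffdec_h264 ! xvimagesink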
If you do not like having to specify the number of frames to get (the "-c" parameter of "capture"), there is a fork of this utility that you can find here: https://github.com/DeLaGuardo/bonecam
I know there is a plugin categorized as "bad", called uvch264, for GStreamer 0.10 that should work with the C920. But I do not know about GStreamer 1.0, and I could not test it.
UPD: don't forget to add --rtsp-timeout=-1 to the cvlc command line, like this:
$ cvlc -v v4l2:///dev/video0:chroma="H264":width=1024:height=570:fps=30 \
--sout="#rtp{sdp=rtsp://:8554/live}" --rtsp-timeout=-1
Without this option streaming only lasts for 60 seconds by default.
I have been trying to do the same thing and I got the same error. I believe I was using GStreamer 1.0.6.
What I found, possibly even thanks to Fergal Butler's answer, was the following page:
http://kakaroto.homelinux.net/2012/09/uvc-h264-encoding-cameras-support-in-gstreamer/
Here Youness Alaoui describes the uvch264_src element he made to bring H264 camera support to GStreamer.
He describes the port to GStreamer 1.0 as pending in his article. So over the last week I've been looking into this. It turns out that it has now been ported to GStreamer 1.0, but only in a developer release (Version 1.1.2).
You can get version 1.1.2 here:
http://gstreamer.freedesktop.org/src/
It's called "uvch264src" now, and it's a part of gst-plugins-bad. I think it is also present in version 1.1.1 but I haven't really looked into that.
I had a bit of a hard time getting it installed, I think mostly due to conflicts with GStreamer 1.0 packages already installed on my PC (so my own fault). But note that it has dependencies on libgudev-1.0-dev and libusb-1.0-0-dev, so install these packages first - it took me a while to work out that it was those two I was missing.
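As a rough sketch of the install (the package names are Debian/Ubuntu ones and the tarball name is an assumption; you will probably also need matching gstreamer and gst-plugins-base 1.1.2 builds installed first):
$ sudo apt-get install libgudev-1.0-dev libusb-1.0-0-dev
$ tar xf gst-plugins-bad-1.1.2.tar.xz && cd gst-plugins-bad-1.1.2
$ ./configure
$ make
$ sudo make install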
Here is a pipeline I got to work which uses uvch264:
gst-launch-1.0 uvch264src device=/dev/video0 name=src auto-start=true src.vfsrc ! video/x-raw, format=YUY2, width=160, height=90, framerate=5/1 ! xvimagesink src.vidsrc ! queue ! video/x-h264, width=800, height=448, framerate=30/1 ! h264parse ! avdec_h264 ! xvimagesink
If you don't want to use the preview video (from the vfsrc pad) just hook src.vfsrc straight up to a fakesink. I should also mention that even though this pipeline is working for me, I get a lot of warnings about "Got data flow before segment event". So obviously I'm not doing something right, but I'm not sure what.
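For example, something like this (just swapping the preview sink for fakesink in the pipeline above; otherwise untested) should work:
gst-launch-1.0 uvch264src device=/dev/video0 name=src auto-start=true src.vfsrc ! video/x-raw, format=YUY2, width=160, height=90, framerate=5/1 ! fakesink src.vidsrc ! queue ! video/x-h264, width=800, height=448, framerate=30/1 ! h264parse ! avdec_h264 ! xvimagesink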
Anyway, after all of that messing about getting 1.1.2 and uvch264src completely installed and working, I decided to give v4l2src a quick go again. And it turns out that v4l2src supports H264 properly after all :/. (See the short answer.)
Short Answer:
So the short answer to your question is that if you are happy to install 1.1.2 from source you'll be able to do exactly what you want in the same way you've been trying to do it. You shouldn't need uvch264src. I've tested your pipeline and it worked fine with my installation. I've also tried this simple pipeline, to display the video on-screen, and it worked fine for me as well:
gst-launch-1.0 -e v4l2src device=/dev/video0 ! video/x-h264, width=800, height=448, framerate=30/1 ! avdec_h264 ! xvimagesink sync=false
I don't believe v4l2src supports h264 at the moment. See here:
http://www.oz9aec.net/index.php/gstreamer/473-using-the-logitech-c920-webcam-with-gstreamer
and here:
http://kakaroto.homelinux.net/2012/09/uvc-h264-encoding-cameras-support-in-gstreamer/
Try using videoconvert to automatically convert the video to a format understood by the video sink:
gst-launch-1.0 -vvv v4l2src device=/dev/video0 ! videoconvert ! ...
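For instance, a minimal raw-video pipeline along these lines (note that videoconvert handles raw frames rather than H264, so this sidesteps the H264 caps question; autovideosink picks a suitable display sink automatically):
gst-launch-1.0 -vvv v4l2src device=/dev/video0 ! videoconvert ! autovideosink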
I've also got a Logitech C920 camera, and have used the following pipeline to record H.264 video from the camera:
gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-h264,width=1280,height=720,framerate=30/1 ! mpegtsmux ! filesink location=output.ts
This asks the camera to produce H.264 data, which I then mux into a MPEG transport stream container, and write to disk. I can play the resulting file successfully with Totem.
The above pipeline records at 720p. The camera can also record at 1080p if you change the requested format to width=1920,height=1080.
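For example, the 1080p variant of the same pipeline would be:
gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-h264,width=1920,height=1080,framerate=30/1 ! mpegtsmux ! filesink location=output.ts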
Source: https://stackoverflow.com/questions/15787967/capturing-h-264-stream-from-camera-with-gstreamer