Question
I can't understand how to create a .mjpeg file. As far as I understand, it is simply a series of JPEG files. I searched online for a way to combine them into a single file, but didn't find any information. Some people said that one just needs to create a mini-server that shows one image after another.
I'm trying to use the following application, git://git.ideasonboard.org/uvc-gadget.git, to test UVC, and one of its options is a path to an MJPEG file. It's not clear to me whether it is possible to create an MJPEG file at all.
I would appreciate any help on how to create an MJPEG file so I can use it with the above-mentioned application.
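For what it's worth, the premise is correct: an MJPEG stream is just JPEG images written back to back, with no container around them. A minimal sketch (assuming you already have same-resolution frames named frame_000.jpg, frame_001.jpg, ... whose names sort in playback order; the names are only illustrative) would be:
# Build an MJPEG stream by concatenating JPEG frames back to back.
# Assumes all frames share one resolution and glob in the intended order.
cat frame_*.jpg > mystream.mjpeg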
Answer 1:
I had a difficult time searching for the same thing. It's especially misleading to read through mencoder's man page, since it supports various movie containers but not the UVC payload format.
This seemed to work for me to record an MJPEG byte stream from a webcam on Ubuntu 16.04:
gst-launch-1.0 v4l2src device=/dev/video0 ! 'image/jpeg,width=1280,height=720,framerate=30/1' ! \
filesink buffer-size=0 location=mystream.mjpeg
where 1280x720 at 30 fps is what guvcview says my webcam supports.
Source: link
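To sanity-check the recording, a hedged option is to decode it with GStreamer's JPEG parser and decoder (this assumes the jpegparse and jpegdec elements are installed; playback speed may be off, since the raw stream carries no timing information):
# Decode the captured stream frame by frame and display it.
gst-launch-1.0 filesrc location=mystream.mjpeg ! jpegparse ! jpegdec ! videoconvert ! autovideosink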
Edit: Later I learned about v4l2-ctl:
v4l2-ctl -d /dev/video0 --list-formats-ext # identify a proper resolution/format
v4l2-ctl --set-fmt-video=width=1280,height=720,pixelformat=1 # pick the MJPEG format found above (index from --list-formats, or a fourcc such as MJPG)
v4l2-ctl --stream-mmap=1 --stream-count=30 --stream-to=mystream.mjpeg # capture 30 frames to mystream.mjpeg
When the stream-count is set to 1, it makes a regular JPEG file that can be viewed with xdg-open. Otherwise, run file mystream.mjpeg to confirm the output has a proper resolution and frame count.
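Because the stream is nothing more than back-to-back JPEG frames, another way to inspect it is to split it into individual images again (a sketch assuming ffmpeg is installed; the frame_%03d.jpg pattern is only illustrative):
# Decode the raw MJPEG stream and write each frame out as a numbered JPEG.
ffmpeg -f mjpeg -i mystream.mjpeg -q:v 2 frame_%03d.jpg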
Getting this data to actually work with uvc-gadget -i could be much more involved, since it possibly requires the appropriate patches, kernel configuration, and debugging. So far I have only gotten the uncompressed format to work in isochronous mode on my Raspberry Pi Zero. Hopefully you're further along.
Source: https://stackoverflow.com/questions/45703234/how-to-create-mjpeg