Question
I have written code that captures video from the built-in webcam of my Mac using ffmpeg.
On my local machine the code works fine. However, when I built a Docker container for my code and tried to run it, I got the following error:
error: Command failed: ffmpeg -f avfoundation -framerate 30 -i "0" -target pal-vcd -vf scale=640x480 -flags +global_header -f segment -segment_time 10 -segment_list ../out.csv -segment_format_options movflags=+faststart -reset_timestamps 1 -strftime 1 %Y%m%d-%H%M%S.mp4
ffmpeg version git-2016-05-25-9591ca7 Copyright (c) 2000-2016 the FFmpeg developers
built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04.3)
configuration: --extra-libs=-ldl --enable-gpl --enable-libass --enable-libfdk-aac --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-nonfree --enable-openssl
libavutil 55. 24.100 / 55. 24.100
libavcodec 57. 43.100 / 57. 43.100
libavformat 57. 37.100 / 57. 37.100
libavdevice 57. 0.101 / 57. 0.101
libavfilter 6. 46.100 / 6. 46.100
libswscale 4. 1.100 / 4. 1.100
libswresample 2. 0.101 / 2. 0.101
libpostproc 54. 0.100 / 54. 0.100
Unknown input format: 'avfoundation'
- As far as I understand from this log, the Docker container doesn't have access to local devices. According to this discussion, "Docker - a way to give access to a host USB or serial device?", I need to use the --device flag to pass my device's path (a rough example of that flag follows this list). However, according to this answer, that is not possible?
- In addition to what I mentioned above, I couldn't get my webcam's path in the first place. What is the correct path for a MacBook webcam?
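On a Linux host that flag would be used roughly like this; the /dev/video0 path and the my-ffmpeg-image name are only placeholders, and that device node does not exist on macOS, which is exactly my problem:

# Linux host only: pass the host's V4L2 webcam node into the container (placeholder image name)
docker run --rm --device /dev/video0:/dev/video0 my-ffmpeg-image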
My question is: how can I access my Mac's built-in camera to record from a Docker container?
Answer 1:
I'm not a Docker expert, but there are several problems I can see:
- Docker on Mac runs a Linux virtual machine (HyperKit/xhyve), which then runs the containers. This means the containers aren't running on the bare-metal kernel, and their kernel doesn't have direct access to the hardware. You can of course also use Parallels, VMware Fusion, or VirtualBox to run the Linux system instead.
- It depends on the specific model of Mac, but many of the more recent models use PCIe based cameras, not USB.
So to make it work, you would need to:
- Unload the macOS driver for your camera.
- Pass through the raw USB device to the virtual machine running Linux, if it is indeed a USB model.
- Get Linux to load the driver for the camera.
- Pass the camera from the host Linux system through to the container (a sketch of this last step follows below).
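If you somehow got through all of that, the last step would be ordinary Linux device passthrough, and inside the container ffmpeg would have to read the camera with the v4l2 input instead of avfoundation (which only exists on macOS). A rough sketch, assuming the camera shows up as /dev/video0 inside the Linux VM and your image has ffmpeg installed (both are assumptions):

# Hypothetical: run only after a Linux driver has claimed the camera inside the VM
docker run --rm --device /dev/video0:/dev/video0 your-ffmpeg-image \
  ffmpeg -f v4l2 -framerate 30 -video_size 640x480 -i /dev/video0 \
         -f segment -segment_time 10 -reset_timestamps 1 -strftime 1 %Y%m%d-%H%M%S.mp4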
PCIe passthrough is currently not possible on macOS hosts at all, so depending on your hardware, it might be completely infeasible.
If it's working without Docker, trying to force it into a container seems far more complicated and error prone than necessary.
Answer 2:
I think this answer addresses your question: https://stackoverflow.com/a/64634921/838712
It explains how to make the macOS camera available inside a Linux Docker container. ffmpeg should then be able to use that device without any problem.
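One common way to do this, which may or may not be exactly what the linked answer describes, is to capture the camera with ffmpeg on the macOS host and stream it into the container over the network, where a second ffmpeg instance picks it up. A minimal sketch of that pattern, assuming an image with ffmpeg installed and port 5000 for the stream (the image name, the port, and the codec settings are all assumptions, not details taken from the linked answer):

# 1) In the container: listen for the incoming stream (start this side first)
docker run --rm -p 5000:5000 -v "$PWD:/out" my-ffmpeg-image \
  ffmpeg -f mpegts -i "tcp://0.0.0.0:5000?listen" -c copy /out/capture.mp4

# 2) On the macOS host: capture from AVFoundation and push the stream to the published port
ffmpeg -f avfoundation -framerate 30 -i "0" -vf scale=640:480 \
  -c:v libx264 -preset ultrafast -tune zerolatency -f mpegts tcp://127.0.0.1:5000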
Source: https://stackoverflow.com/questions/62556566/how-to-access-builtin-webcam-from-a-docker-container-using-ffmpeg