Question
I'm creating an application that will allow users to upload video files that will then be put through some processing.
I have two containers:
- Nginx container that serves the website where users can upload their video files.
- Video processing container that has FFmpeg and some other processing stuff installed.

What I want to achieve: I need container 1 to be able to run a bash script on container 2.
One possibility as far as I can see is to make them communicate over HTTP via an API. But then I would need to install a web server in container 2 and write an API which seems a bit overkill. I just want to execute a bash script.
Any suggestions?
Answer 1:
You have a few options, but the first two that come to mind are:
- In container 1, install the Docker CLI and bind mount /var/run/docker.sock (you need to specify the bind mount from the host when you start the container). Then, inside the container, you should be able to use docker commands against the bind-mounted socket as if you were executing them from the host (you might also need to chmod the socket inside the container to allow a non-root user to do this).
- You could install SSHD on container 2, and then ssh in from container 1 and run your script. The advantage here is that you don't need to make any changes inside the containers to account for the fact that they are running in Docker and not on bare metal. The downside is that you will need to add the SSHD setup to your Dockerfile or the startup scripts.
Most of the other ideas I can think of are just variants of option (2), with SSHD replaced by some other tool.
Also be aware that Docker networking can be a little strange (at least on Mac hosts), so you need to make sure the containers are on the same Docker network and can communicate over it.
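A minimal sketch of putting both containers on one user-defined bridge network; the network, container, and image names here are assumptions:

```shell
# Create a user-defined bridge network; containers attached to it can
# reach each other by container name (Docker's built-in DNS).
docker network create app-net

# Start both containers on that network (names and images are placeholders).
docker run -d --name web    --network app-net -p 8080:80 my-nginx-image
docker run -d --name worker --network app-net my-ffmpeg-image

# From inside "web", the other container is now reachable as the
# hostname "worker", e.g. for the SSH option:  ssh someuser@worker
```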
Answer 2:
Running a docker command from a container is not straightforward and not really a good idea (in my opinion), because:
- You'll need to install Docker on the container (and do Docker-in-Docker stuff).
- You'll need to share the Unix socket, which is not a good thing if you have no idea what you're doing.
So, this leaves us two solutions:
- Install SSH on your container and execute the command through SSH.
- Share a volume and have a process that watches for something to trigger your batch.
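The shared-volume idea can be sketched with a tiny polling watcher; the paths, file naming, and processing step here are assumptions (in real use the directory would be a Docker volume mounted in both containers):

```shell
# Directories on the shared volume (assumed paths for this sketch).
WATCH_DIR="${WATCH_DIR:-/tmp/shared/jobs}"
DONE_DIR="${DONE_DIR:-/tmp/shared/done}"
mkdir -p "$WATCH_DIR" "$DONE_DIR"

process_jobs() {
  for job in "$WATCH_DIR"/*.job; do
    [ -e "$job" ] || continue            # glob matched nothing
    # Container 2 would run ffmpeg on the file named by "$job" here.
    echo "processed $(basename "$job")" >> "$DONE_DIR/log.txt"
    mv "$job" "$DONE_DIR/"               # never process the same job twice
  done
}

# Container 1 drops a trigger file after each upload:
touch "$WATCH_DIR/video123.job"

# Container 2 would run this in a loop, e.g.:
#   while true; do process_jobs; sleep 2; done
process_jobs
```

A tool like inotifywait can replace the polling loop if you want the batch to trigger immediately.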
Answer 3:
I believe docker exec -it <container_name> <command> should work, even inside the container.

You could also try mounting docker.sock in the container you want to execute the command from:

docker run -v /var/run/docker.sock:/var/run/docker.sock ...
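Combining the two ideas above, a sketch of running a script in the processing container from the web container via the mounted socket; the container, image, and script names are assumptions:

```shell
# Start the processing container as usual.
docker run -d --name video-worker my-ffmpeg-image

# Start the web container with the host's Docker socket mounted; the
# docker CLI must also be installed inside this image.
docker run -d --name web \
  -v /var/run/docker.sock:/var/run/docker.sock \
  my-nginx-image

# From inside "web", this runs a script in the worker exactly as you
# would from the host:
docker exec video-worker bash /opt/scripts/process.sh /uploads/video.mp4
```

Note that anything with access to the socket effectively has root on the host, so treat this as trusted-code-only.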
Answer 4:
It was mentioned here before, but a reasonable, semi-hacky option is to install SSH in both containers and then use ssh to execute commands on the other container:
# install SSH, if you don't have it already
sudo apt install openssh-server
# start the ssh service
sudo service ssh start
# or start the daemon directly
sudo /usr/sbin/sshd -D &
Assuming you don't want to always be root, you can add a default user (in this case, 'foobob'):
useradd -m --no-log-init --system --uid 1000 -s /bin/bash -g sudo -G root foobob
# change the password
echo 'foobob:foobob' | chpasswd
Do this on both the source and target containers. Now you can execute a command from container_1 to container_2.
# obtain container-id of target container using 'docker ps'
ssh foobob@<container-id> << "EOL"
echo 'hello bob from container 1' > message.txt
EOL
You can automate the password with ssh-agent, or go a bit more hacky with sshpass (install it first using sudo apt install sshpass):
sshpass -p 'foobob' ssh foobob@<container-id>
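A cleaner alternative to embedding the password is key-based authentication; a sketch, run on container 1 and assuming the foobob user from above exists on container 2:

```shell
# Generate a key pair once, without a passphrase.
ssh-keygen -t ed25519 -N "" -f ~/.ssh/id_ed25519

# Copy the public key to the target container (asks for foobob's
# password this one time).
ssh-copy-id foobob@<container-id>

# Subsequent logins need no password:
ssh foobob@<container-id> 'echo hello from container 1 > message.txt'
```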
Answer 5:
I wrote a Python package specifically for this use case.
Flask-Shell2HTTP is a Flask extension that converts a command-line tool into a RESTful API with a mere 5 lines of code.
Example Code:
from flask import Flask
from flask_executor import Executor
from flask_shell2http import Shell2HTTP
app = Flask(__name__)
executor = Executor(app)
shell2http = Shell2HTTP(app=app, executor=executor, base_url_prefix="/commands/")
shell2http.register_command(endpoint="saythis", command_name="echo")
shell2http.register_command(endpoint="run", command_name="./myscript")
It can be called easily like:
$ curl -X POST -H 'Content-Type: application/json' -d '{"args": ["Hello", "World!"]}' http://localhost:4000/commands/saythis
You can use this to create RESTful micro-services that execute pre-defined shell commands/scripts with dynamic arguments asynchronously and fetch the result.
It supports file upload, callback functions, reactive programming, and more. I recommend you check out the examples.
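Since the commands run asynchronously, my understanding of the extension's API (treat the exact response shape and URL here as assumptions) is that the POST returns a key identifying the invocation, which you then poll for the report:

```shell
# Submit a job to the pre-registered "run" endpoint; the JSON response
# contains a key for this invocation.
curl -X POST -H 'Content-Type: application/json' \
     -d '{"args": ["/uploads/video.mp4"]}' \
     http://localhost:4000/commands/run

# Poll for the result using that key (assumed URL shape):
curl 'http://localhost:4000/commands/run?key=<key-from-response>'
```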
Source: https://stackoverflow.com/questions/59035543/how-to-execute-command-from-one-docker-container-to-another