How to run multiple Python scripts and an executable file using Docker?

清歌不尽 2021-01-06 02:19

I want to create a container that contains two Python packages as well as a package consisting of an executable file.


Here's my main package (dockerize

3 Answers
  • 2021-01-06 02:30

    Best practice is to launch these as three separate containers. That's doubly true since you're taking three separate applications, bundling them into a single container, and then trying to launch three separate things from them.

    Create a separate Dockerfile in each of your project subdirectories. These can be simpler, especially for the one that just contains a compiled binary:

    # execproject/Dockerfile
    FROM ubuntu:18.04
    WORKDIR /app
    COPY . ./
    CMD ["./gowebapp"]
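For the Python projects, the per-directory Dockerfile can be similarly small. This is a sketch only: the requirements.txt file and the __main__.py entry point are assumptions about your layout, not something shown in the question.

```dockerfile
# pythonic_project1/Dockerfile (hypothetical layout)
FROM python:3.9-slim
WORKDIR /app
# Install dependencies first so this layer is cached across code changes
COPY requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt
COPY . ./
CMD ["python3", "__main__.py"]
```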
    

    Then in your docker-compose.yml file, have three separate stanzas to launch the containers:

    version: '3'
    services:
      pythonic_project1:
        build: ./pythonic_project1
        ports:
          - 8008:8008
        environment:
          PY2_URL: 'http://pythonic_project2:8009'
          GO_URL: 'http://execproject:8010'
      pythonic_project2:
        build: ./pythonic_project2
      execproject:
        build: ./execproject
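Inside pythonic_project1, those environment variables are read the usual way. A minimal sketch — the fallback URLs are hypothetical defaults for running outside Compose, not part of the original answer:

```python
import os

# URLs injected by docker-compose; the defaults are hypothetical
# fallbacks for local development outside of Compose.
PY2_URL = os.environ.get("PY2_URL", "http://localhost:8009")
GO_URL = os.environ.get("GO_URL", "http://localhost:8010")
```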
    

    If you really can't rearrange your Dockerfiles, you can at least launch three containers from the same image in the docker-compose.yml file:

    services:
      pythonic_project1:
        build: .
        workdir: /app/pythonic_project1
        command: ./__main__.py
      pythonic_project2:
        build: .
        workdir: /app/pythonic_project1
        command: ./__main__.py
    

    There are several good reasons to structure your project with multiple containers and images:

    • If you roll your own shell script and use background processes (as other answers have), it just won't notice if one of the processes dies; here you can use Docker's restart mechanism to restart individual containers.
    • If you have an update to one of the programs, you can update and restart only that single container and leave the rest intact.
    • If you ever use a more complex container orchestrator (Docker Swarm, Nomad, Kubernetes) the different components can run on different hosts, and each requires a smaller block of CPU/memory resources on a single node.
    • If you ever use a more complex container orchestrator, you can individually scale up components that are using more CPU.
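The restart behavior from the first bullet is a one-line addition per service in the compose file. A sketch — on-failure is one of several available policies:

```yaml
services:
  pythonic_project1:
    build: ./pythonic_project1
    restart: on-failure   # Docker restarts only this container if it exits non-zero
```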
  • 2021-01-06 02:31

    As mentioned in the documentation, there can be only one CMD in a Dockerfile; if there are more, the last one overrides the others and takes effect. A key point of using Docker is to isolate your programs, so at first glance you might want to move them to separate containers and have them talk to each other through a shared volume or a Docker network. But if you really need them to run in the same container, putting them in a shell script and replacing the last CMD with one that runs that script will launch them alongside each other:

    #!/bin/bash
    
    # Run the first script in the background; exec only the last one,
    # so it becomes the container's main process (exec combined with &
    # has no effect on a backgrounded command).
    python3 /path/to/script1.py &
    exec python3 /path/to/script2.py
    

    Add COPY run.sh . to the Dockerfile and RUN chmod a+x run.sh to make it executable; the CMD should then be CMD ["./run.sh"]

  • 2021-01-06 02:38

    Try it via an entrypoint script:

    ENTRYPOINT ["/docker_entrypoint.sh"]
    

    docker_entrypoint.sh

    #!/bin/bash
    
    set -e
    
    # Background the first script; exec the second so it becomes
    # the container's main process (exec with & has no effect).
    python3 not__main__.py &
    exec python3 __main__.py
    

    The & symbol runs the first script as a background process.
