How to run multiple Python scripts and an executable file using Docker?

清歌不尽 2021-01-06 02:19

I want to create a container that contains two Python packages as well as a package consisting of an executable file.


Here's my main package (dockerize

3 Answers
  •  太阳男子
    2021-01-06 02:30

    Best practice is to launch these as three separate containers. That's doubly true since you're taking three separate applications, bundling them into a single container, and then trying to launch three separate things from them.

    Create a separate Dockerfile in each of your project subdirectories. These can be simpler, especially for the one that just contains a compiled binary:

    # execproject/Dockerfile
    FROM ubuntu:18.04
    WORKDIR /app
    COPY . ./
    CMD ["./gowebapp"]
    
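    For the Python subprojects, a minimal Dockerfile could look like the following sketch. The file names (`requirements.txt`, `__main__.py`) and base image are assumptions; adjust them to match your actual layout:

    ```dockerfile
    # pythonic_project1/Dockerfile (sketch; file names are assumptions)
    FROM python:3.9-slim
    WORKDIR /app
    # Install dependencies first so this layer is cached between builds
    COPY requirements.txt ./
    RUN pip install --no-cache-dir -r requirements.txt
    COPY . ./
    EXPOSE 8008
    CMD ["python", "__main__.py"]
    ```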

    Then in your docker-compose.yml file have three separate stanzas to launch the containers:

    version: '3'
    services:
      pythonic_project1:
        build: ./pythonic_project1
        ports:
          - 8008:8008
        environment:
          PY2_URL: 'http://pythonic_project2:8009'
          GO_URL: 'http://execproject:8010'
      pythonic_project2:
        build: ./pythonic_project2
      execproject:
        build: ./execproject
    
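    Those environment variables are how `pythonic_project1` locates its peer services. A minimal sketch of consuming them from Python; the variable names and defaults mirror the compose file above, while the `fetch` helper and its endpoint are hypothetical:

    ```python
    import os
    from urllib.request import urlopen

    # Service URLs injected by docker-compose; the defaults mirror the compose file.
    PY2_URL = os.environ.get("PY2_URL", "http://pythonic_project2:8009")
    GO_URL = os.environ.get("GO_URL", "http://execproject:8010")

    def fetch(base_url, path="/"):
        """Hypothetical helper: GET a path from a peer service and return the body."""
        with urlopen(base_url + path) as resp:
            return resp.read()
    ```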

    If you really can't rearrange your Dockerfiles, you can at least launch three containers from the same image in the docker-compose.yml file:

    services:
      pythonic_project1:
        build: .
        working_dir: /app/pythonic_project1
        command: ./__main__.py
      pythonic_project2:
        build: .
        working_dir: /app/pythonic_project2
        command: ./__main__.py
    
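    This variant assumes a single root Dockerfile that copies both subprojects under /app, roughly like this sketch (directory names are assumptions):

    ```dockerfile
    # Dockerfile at the repository root (sketch)
    FROM python:3.9-slim
    WORKDIR /app
    COPY pythonic_project1/ ./pythonic_project1/
    COPY pythonic_project2/ ./pythonic_project2/
    # No CMD needed: docker-compose supplies `command:` per container.
    ```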

    There are several good reasons to structure your project with multiple containers and images:

    • If you roll your own shell script and use background processes (as other answers have), it just won't notice if one of the processes dies; here you can use Docker's restart mechanism to restart individual containers.
    • If you have an update to one of the programs, you can update and restart only that single container and leave the rest intact.
    • If you ever use a more complex container orchestrator (Docker Swarm, Nomad, Kubernetes) the different components can run on different hosts and require a smaller block of CPU/memory resource on a single node.
    • If you ever use a more complex container orchestrator, you can individually scale up components that are using more CPU.
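    For the first point, each service can opt into Docker's supervision with a restart policy; a minimal compose fragment (a sketch):

    ```yaml
    services:
      pythonic_project1:
        build: ./pythonic_project1
        restart: on-failure   # Docker restarts the container if its process exits nonzero
    ```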
