Huge files in Docker containers

Asked by 深忆病人 · 2021-02-07 10:07

I need to create a Docker image (and consequently containers from that image) that uses large files (containing genomic data, thus reaching ~10 GB in size).

How am I supposed to handle this? Am I supposed to include them in the container (such as COPY large_folder large_folder_in_container)? Is there a better way of referencing such files?

2 Answers
  • 2021-02-07 10:24

    Is there a better way of referencing such files?

    If you already have some way to distribute the data, I would use a "bind mount" to attach the data directory on the host to the containers:

    docker run -v /path/to/data/on/host:/path/to/data/in/container <image> ...
    

    That way you can change the image and you won't have to re-download the large data set each time.
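
    For example, with the genomic data already on the host, a run might look like this (the paths and image name here are hypothetical):

    # Mount the host directory read-only; nothing is copied into the image,
    # so rebuilding the image never touches the 10 GB of data
    docker run --rm -v /data/genomes:/genomes:ro my-genomics-image ls -lh /genomes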

    If you wanted to use the registry to distribute the large data set, but want to manage changes to the data set separately, you could use a data volume container with a Dockerfile like this:

    FROM tianon/true
    COPY dataset /dataset
    VOLUME /dataset
    

    From your application container you can attach that volume using:

    docker run -d --name dataset <data volume image name>
    docker run --volumes-from dataset <image> ...
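
    Putting it together, a minimal end-to-end sketch might look like this (the image names are hypothetical):

    # Build the data-only image from the Dockerfile above
    docker build -t genomic-dataset .
    # Create the data container; tianon/true exits immediately,
    # but volumes can still be shared from a stopped container
    docker run -d --name dataset genomic-dataset
    # Any number of application containers can attach the same data
    docker run --rm --volumes-from dataset ubuntu du -sh /dataset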
    

    Either way, I think Docker volumes (https://docs.docker.com/engine/tutorials/dockervolumes/) are what you want.

  • 2021-02-07 10:26

    Am I supposed to include them in the container (such as COPY large_folder large_folder_in_container)?

    If you do so, that would include them in the image, not the container: you could launch 20 containers from that image, and the actual disk space used would still be 10 GB.
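
    You can verify this yourself; docker system df -v reports how much of each image's size is shared (the image name below is hypothetical):

    # Start several containers from the same 10 GB image
    docker run -d --name c1 genomic-image sleep infinity
    docker run -d --name c2 genomic-image sleep infinity
    # The image layers are stored once on disk; each container adds
    # only a thin writable layer, as the SHARED SIZE column shows
    docker system df -v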

    If you were to make another image from your first image, the layered filesystem would reuse the layers from the parent image, and the new image would still be "only" 10 GB.
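
    For instance, if you build a child image on top of it (the Dockerfile and image names below are hypothetical), docker history shows that the 10 GB layer is inherited rather than duplicated:

    # Dockerfile for the derived image
    FROM genomic-image
    RUN echo "extra tooling" > /info.txt

    # Build it and inspect the layers: the 10 GB COPY layer has the
    # same digest as in the parent, so it is stored only once on disk
    docker build -t genomic-image:child .
    docker history genomic-image:child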
