How do I cache steps in GitHub actions?

野趣味  2021-01-30 10:54

Say I have a GitHub Actions workflow with 2 steps:

  1. Download and compile my application's dependencies.
  2. Compile and test my application.

My dependencies rarely change, so I would like to cache the output of step 1 and reuse it in later runs instead of rebuilding it every time.

6 Answers
  •  鱼传尺愫
    2021-01-30 11:06

    I'll summarize the two options:

    1. Caching
    2. Docker

    Caching

    You can add a step in your workflow that caches directories. When that step runs, it checks whether the directory you specified was saved on a previous run; if so, it restores it, and if not, it does nothing. In later steps you then check whether the cached data is present and skip the expensive work when it is. For example, say you compile a dependency that is large and rarely changes. You could add a cache step near the beginning of your workflow, followed by a step that builds the contents of the directory only if they aren't already there. The first run won't find the files, but subsequent runs will, and your workflow will run faster.
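    A minimal sketch of that pattern, assuming a made-up build script, a made-up lock file for the cache key, and an install prefix of ~/.local (the path, key, and cache-hit output of the cache action are the relevant pieces):

    ```yaml
    jobs:
      build:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v3

          # Restore the dependency directory if a previous run saved it.
          - name: Cache compiled dependencies
            id: cache-deps
            uses: actions/cache@v3
            with:
              path: ~/.local
              key: deps-${{ runner.os }}-${{ hashFiles('**/deps.lock') }}

          # Only rebuild the dependencies when the cache missed.
          - name: Build dependencies
            if: steps.cache-deps.outputs.cache-hit != 'true'
            run: ./build-deps.sh --prefix "$HOME/.local"

          - name: Compile and test the application
            run: ./build-and-test.sh
    ```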

    Behind the scenes, GitHub uploads a zip of the directory to GitHub's own cloud storage. Caches that haven't been used for a week are purged, as is anything over the 2GB limit.

    One drawback of this technique is that it saves only directories. So if you installed into /usr/bin, you would have to cache that, which would be awkward. You should instead install into $HOME/.local and add that directory to your PATH, for example by appending it to the $GITHUB_PATH file (the older echo ::set-env / ::add-path workflow commands have since been deprecated).
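    For instance, a step along these lines (the bin directory is just an assumption about where your tools end up) makes the directory visible to every later step in the job:

    ```yaml
          - name: Add ~/.local/bin to the PATH for later steps
            run: echo "$HOME/.local/bin" >> "$GITHUB_PATH"
    ```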

    Docker

    Docker is a little more complex, and it means you need a Docker Hub account and have two things to manage instead of one. But it's far more powerful: instead of saving just a directory, you save an entire computer! You write a Dockerfile containing all of your dependency setup, such as apt-get and pip install lines or even long compilations. Then you build that Docker image and publish it on Docker Hub. Finally, you set your tests to run on that image instead of on, say, ubuntu-latest. From then on, instead of installing dependencies, the workflow just downloads the image.
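    For example, the test job can declare a pre-built image as its container; the image name below is a placeholder for whatever you publish to Docker Hub:

    ```yaml
    jobs:
      test:
        runs-on: ubuntu-latest
        # Run every step of this job inside the pre-built dependency image.
        container: your-dockerhub-user/app-deps:latest
        steps:
          - uses: actions/checkout@v3
          - name: Compile and test the application
            run: ./build-and-test.sh
    ```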

    You can automate this further by keeping that Dockerfile in the same GitHub repo as the project. You then write a job whose steps pull the latest image, rebuild only the layers that changed, and push the result to Docker Hub, followed by a job that "needs" the first one and runs inside the image. That way the workflow both updates the Docker image when needed and uses it.
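    A rough sketch of that two-job layout; the image name, secret names, and build script are all placeholders, and the docker CLI calls are one of several ways to do the rebuild-and-push step:

    ```yaml
    jobs:
      update-image:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v3
          - name: Log in to Docker Hub
            run: echo "${{ secrets.DOCKERHUB_TOKEN }}" | docker login -u "${{ secrets.DOCKERHUB_USER }}" --password-stdin
          - name: Rebuild only the changed layers and push
            run: |
              docker pull your-dockerhub-user/app-deps:latest || true
              docker build --cache-from your-dockerhub-user/app-deps:latest \
                           -t your-dockerhub-user/app-deps:latest .
              docker push your-dockerhub-user/app-deps:latest

      test:
        needs: update-image
        runs-on: ubuntu-latest
        container: your-dockerhub-user/app-deps:latest
        steps:
          - uses: actions/checkout@v3
          - run: ./build-and-test.sh
    ```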

    The downside is that your dependencies live in one file, the Dockerfile, while the tests live in the workflow, so it's not all in one place. Also, if downloading the image takes longer than building the dependencies, this is a poor choice.


    I think each approach has upsides and downsides. Caching is only good for fairly simple things, like compiling into ~/.local. If you need something more extensive, Docker is the more powerful option.
