PyPI is slow. How do I run my own server?

别那么骄傲 2020-12-12 17:18

When a new developer joins the team, or Jenkins runs a complete build, I need to create a fresh virtualenv. I often find that setting up a virtualenv with Pip and a large number of requirements takes a very long time, with most of that time spent downloading packages from PyPI. Is there a way to run my own server or cache on the local network to speed this up?

5 Answers
  • 2020-12-12 17:41

    Set up your local server, then edit the local machine's hosts file so the PyPI hostname resolves to your local server instead of the real one, bypassing normal DNS. Remove that line from the hosts file when you're done.

    Or, I suppose, you could find the index URL in pip's configuration and change it there.
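
    For illustration, the two variants might look like this; the IP address is a made-up placeholder for your internal mirror:

    # /etc/hosts -- resolve the PyPI hostnames to the local mirror
    10.0.0.5    pypi.org files.pythonhosted.org

    # or, without touching DNS, point pip at the mirror directly
    # ($HOME/.pip/pip.conf)
    [global]
    index-url = http://10.0.0.5/simple/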

  • 2020-12-12 17:50

    I recently installed devpi into my development team's Vagrant configuration so that its package cache lives on the host's file system. This lets each VM run its own devpi-server daemon, which virtualenv/pip use as their index-url. When the VMs are destroyed and reprovisioned, the packages don't have to be downloaded over and over: each developer downloads them once, and the cache persists on the host's file system.

    We also have an internal PyPI index for our private packages; currently it's just a directory served by Apache. Eventually I'm going to convert that to a devpi proxy server as well, so our build server will also maintain a package cache for our Python dependencies in addition to hosting our private libraries. This adds another buffer between our development environment, production deployments, and the public PyPI.

    This is the most robust solution I've found for these requirements so far.
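
    As a rough sketch of that setup (the paths and port are only examples, and the exact bootstrap commands vary between devpi versions -- check the devpi docs):

    # on each VM: install devpi and start a caching server whose
    # data directory lives on the shared host file system
    $ pip install devpi-server
    $ devpi-server --serverdir /vagrant/.devpi --port 3141

    # point pip at devpi's caching mirror of PyPI ($HOME/.pip/pip.conf)
    [global]
    index-url = http://localhost:3141/root/pypi/+simple/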

  • 2020-12-12 17:57

    Do you have a shared filesystem?

    If so, I would use pip's download-cache setting. It's pretty simple: make a folder called pip-cache, in /mnt for example.

    mkdir /mnt/pip-cache
    

    Then each developer puts the following lines into their pip config file (Unix: $HOME/.pip/pip.conf, Windows: %HOME%\pip\pip.ini):

    [global]
    download-cache = /mnt/pip-cache
    

    pip still checks PyPI for the latest version, then checks whether that version is already in the cache. If it is, pip installs it from the cache; if not, it downloads the package, stores it in the cache, and installs it. Each package is therefore downloaded only once per new version.
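
    If you'd rather not edit every developer's config file, the same setting can also be supplied through an environment variable. Note this assumes an older pip that still has the download-cache option; recent pip versions cache downloads automatically and have removed it:

    # one-off
    $ PIP_DOWNLOAD_CACHE=/mnt/pip-cache pip install -r requirements.txt

    # or set it once in the shell profile
    $ export PIP_DOWNLOAD_CACHE=/mnt/pip-cache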

  • 2020-12-12 18:01

    While it doesn't solve your PyPI problem, handing built virtualenvs to developers (or deployments) can be done with Terrarium.

    Use terrarium to package up, compress, and save virtualenvs. You can store them locally or even on S3. From the documentation on GitHub:

    $ pip install terrarium
    $ terrarium --target testenv --storage-dir /mnt/storage install requirements.txt
    

    After building a fresh environment, terrarium will archive and compress the environment, and then copy it to the location specified by storage-dir.

    On subsequent installs for the same requirement set that specify the same storage-dir, terrarium will copy and extract the compressed archive from /mnt/storage.

    To display exactly how terrarium will name the archive, you can run the following command:

    $ terrarium key requirements.txt more_requirements.txt
    x86_64-2.6-c33a239222ddb1f47fcff08f3ea1b5e1
    
  • 2020-12-12 18:01

    Take a look at David Wolever's pip2pi. You can set up a cron job to keep a company- or team-wide mirror of the packages you need, then point pip at your internal mirror.
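
    Roughly, the workflow looks like this (the directory, hostname, and schedule are placeholders; see the pip2pi README for the exact commands):

    # build/refresh a PyPI-style "simple" index from a requirements file
    $ pip install pip2pi
    $ pip2pi /var/www/packages/ -r requirements.txt

    # refresh it nightly via cron
    0 2 * * *  pip2pi /var/www/packages/ -r /srv/ci/requirements.txt

    # point pip at the mirror served by your web server ($HOME/.pip/pip.conf)
    [global]
    index-url = http://pypi.internal.example.com/simple/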
