I'm thinking about putting the virtualenv for a Django web app I am making inside my git repository for the app. It seems like an easy way to keep deploys simple and easy.
It's not a good idea to include any environment-dependent component or setting in your repo, since one of the key reasons for using a repo is sharing it with other developers. Here is how I would set up my development environment on a Windows PC (say, Win10).
Open PyCharm and, on the first page, choose to check out the project from your source control system (in my case, GitHub).
In PyCharm, navigate to Settings, choose "Project Interpreter", and add a new virtual environment; you can call it "venv".
Choose the base Python interpreter, which is located at C:\Users\{user}\AppData\Local\Programs\Python\Python36 (make sure you choose the appropriate version of Python based on what you have installed).
Note that PyCharm will create the new virtual environment and copy the Python binaries and required libraries into the venv folder inside your project folder.
Let PyCharm complete its scanning, as it needs to rebuild/refresh your project skeleton.
Exclude the venv folder from your git interactions (add venv/ to the .gitignore file in your project folder).
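For reference, the relevant .gitignore entry is just one line; the other entries below are common Python ignores you may already have:
# .gitignore
venv/
*.pyc
__pycache__/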
Bonus: If you want people to easily (well, almost easily) install all the libraries your software needs, you can use
pip freeze > requirements.txt
and commit requirements.txt to your repo so people can use the following command to install all the required libraries at once:
pip install -r requirements.txt
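For reference, requirements.txt is just a plain list of pinned packages, one per line; the names and versions below are only illustrative:
Django==1.11.7
gunicorn==19.7.1
psycopg2==2.7.3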
Storing the virtualenv directory inside git will, as you noted, allow you to deploy the whole app by just doing a git clone (plus installing and configuring Apache/mod_wsgi). One potentially significant issue with this approach is that on Linux the full path gets hard-coded in the venv's activate, django-admin.py, easy_install, and pip scripts. This means your virtualenv won't entirely work if you want to use a different path, perhaps to run multiple virtual hosts on the same server. I think the website may actually work with the paths wrong in those files, but you would have problems the next time you tried to run pip.
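To see what that hard-coding looks like, open .env/bin/activate on Linux; the virtualenv records the absolute location it was created in (the path and user name below are just an example):
# excerpt from .env/bin/activate
VIRTUAL_ENV="/home/lyle/mysite/.env"
export VIRTUAL_ENV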
The solution, already given, is to store enough information in git so that during the deploy you can create the virtualenv and do the necessary pip installs. Typically people run pip freeze to get the list, then store it in a file named requirements.txt. It can be loaded with pip install -r requirements.txt. RyanBrady already showed how you can string the deploy statements into a single line:
# before 15.1.0
virtualenv --no-site-packages --distribute .env &&\
source .env/bin/activate &&\
pip install -r requirements.txt
# after deprecation of some arguments in 15.1.0
virtualenv .env && source .env/bin/activate && pip install -r requirements.txt
Personally, I just put these in a shell script that I run after doing the git clone or git pull.
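A minimal sketch of such a script, assuming the layout from the commands above (the file name deploy.sh is arbitrary):
#!/bin/sh
# deploy.sh - recreate the virtualenv and install dependencies after git clone/pull
set -e
virtualenv .env
. .env/bin/activate
pip install -r requirements.txt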
Storing the virtualenv directory also makes it a bit trickier to handle pip upgrades, as you'll have to manually add/remove and commit the files resulting from the upgrade. With a requirements.txt file, you just change the appropriate lines in requirements.txt and re-run pip install -r requirements.txt. As already noted, this also reduces "commit spam".
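For instance, to upgrade a single dependency you edit its pinned line (version numbers here are only illustrative) and re-run the install:
# in requirements.txt, change e.g.
#   Django==1.4.5  ->  Django==1.5.1
pip install -r requirements.txt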
I think one of the main problems is that the virtualenv might not be usable by other people. The reason is that it always uses absolute paths, so if your virtualenv was, for example, in /home/lyle/myenv/, it will assume the same path for everyone else using this repository (it must be exactly the same absolute path). You can't presume that other people use the same directory structure as you.
Better practice is for everybody to set up their own environment (with or without virtualenv) and install the libraries there. That also makes your code more usable across platforms (Linux/Windows/Mac), not least because virtualenv is installed differently on each of them.
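A quick sketch of what "setting up their own environment" looks like on each platform (the directory name venv is just a convention):
virtualenv venv
# Linux/Mac
source venv/bin/activate
# Windows (cmd)
venv\Scripts\activate
pip install -r requirements.txt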