I have developed my entire project (Django, Python) on Windows, and all of the PaaS providers out there run Linux.
VirtualEnv on Linux:
VirtualEnv_dir/
    bin
Unless you use some Windows-specific libraries or an alternate Python implementation (like IronPython), there is nothing to worry about.
Many people (myself included) use Windows for development and Linux for production deployment, and they use virtualenv for exactly this purpose; it is designed to make your environment portable.
You don't push the entire virtualenv to Linux.
Once you have your virtual environment ready and your code is working, you should freeze the requirements for your application:
pip freeze > requirements.txt
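The resulting file is just a plain-text list of pinned packages, one per line. A hypothetical excerpt for a Django project might look like this (the package names and version numbers are purely illustrative):

Django==1.5.1
South==0.8.2
psycopg2==2.5.1

Keep this file alongside your code so it travels with the project.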
On your target operating system, create an empty virtual environment:
virtualenv --no-site-packages prod_env
In recent versions of virtualenv, --no-site-packages is the default.
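If you want to confirm what your copy of virtualenv does, check its version and simply drop the flag on recent releases; the result is the same either way:

virtualenv --version
virtualenv prod_env    # on recent versions, system site packages are already excluded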
Next, populate the environment with your requirements file from development:
source prod_env/bin/activate
pip install -r requirements.txt
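As a quick sanity check, you can freeze the freshly populated production environment and compare it against the file you brought over from development; installed.txt here is just a hypothetical scratch file name:

pip freeze > installed.txt
diff requirements.txt installed.txt    # ideally prints nothing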
When your requirements change, simply regenerate the requirements.txt file and run pip install -r requirements.txt again in production.
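Put together, a typical update cycle looks something like the sketch below (Unix-style comments are used for readability). The commit and pull steps assume you keep requirements.txt under version control with git, which is an assumption of this example rather than a requirement:

# on the Windows development machine, after installing or upgrading packages
pip freeze > requirements.txt
git add requirements.txt
git commit -m "Update requirements"
git push

# on the Linux production machine
git pull
source prod_env/bin/activate
pip install -r requirements.txt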
In some situations your production systems don't have access to the Internet to download packages, so the pip install trick doesn't work. For these scenarios you can set up your own private PyPI server and push your packages there. An added bonus of going this route is that you can create and push private packages and install them using the normal setuptools utilities.
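As a sketch of that setup, once your private server is running you point pip at it with --index-url; the URL below is purely hypothetical:

pip install --index-url https://pypi.internal.example.com/simple/ -r requirements.txt

You can also set index-url once in pip's configuration file instead of passing it on every command.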
Once you have decided which process works for you, automate it in your deployment scripts, generally with hooks into your source code management system. Some people prefer a separate release engineering process (with a release manager, which is a person, not a program).
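As an illustration only, the automated step can be as simple as a small shell script run by hand or from a post-receive hook; the project path, branch, and service name below are hypothetical placeholders, and the migrate command applies to Django 1.7+ (older projects would use syncdb and, commonly, South):

#!/bin/sh
set -e
cd /srv/myproject                         # hypothetical checkout location
git pull origin master                    # fetch the release
. prod_env/bin/activate                   # use the production virtualenv
pip install -r requirements.txt           # sync dependencies
python manage.py migrate --noinput        # apply database migrations
python manage.py collectstatic --noinput  # gather static files
sudo service myproject restart            # hypothetical service name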