How to manage local vs production settings in Django?

别跟我提以往 2020-11-22 15:00

What is the recommended way of handling settings for local development and the production server? Some of them (like constants, etc) can be changed/accessed in both, but some of them (like paths to static files) need to remain different, and hence should not be overwritten every time the new code is deployed.

22 Answers
  • 2020-11-22 15:20

    Two Scoops of Django: Best Practices for Django 1.5 suggests using version control for your settings files and storing the files in a separate directory:

    project/
        app1/
        app2/
        project/
            __init__.py
            settings/
                __init__.py
                base.py
                local.py
                production.py
        manage.py
    

    The base.py file contains common settings (such as MEDIA_ROOT or ADMIN), while local.py and production.py have site-specific settings:

    In the base file settings/base.py:

    INSTALLED_APPS = (
        # common apps...
    )
    

    In the local development settings file settings/local.py:

    from project.settings.base import *
    
    DEBUG = True
    INSTALLED_APPS += (
        'debug_toolbar', # and other apps for local development
    )
    

    In the production settings file settings/production.py:

    from project.settings.base import *
    
    DEBUG = False
    INSTALLED_APPS += (
        # other apps for production site
    )
    

    Then when you run Django, add the --settings option:

    # Running django for local development
    $ ./manage.py runserver 0:8000 --settings=project.settings.local
    
    # Running django shell on the production site
    $ ./manage.py shell --settings=project.settings.production
    

    The authors of the book have also put up a sample project layout template on Github.
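    Passing --settings on every command can get tedious; a common variant (a sketch, not from the book) is to default DJANGO_SETTINGS_MODULE in manage.py so the local settings are used unless the environment says otherwise:

    ```python
    import os

    # Sketch: default to the local settings module unless DJANGO_SETTINGS_MODULE
    # is already set (as it would be on the production server). The module path
    # project.settings.local matches the layout above.
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "project.settings.local")
    settings_module = os.environ["DJANGO_SETTINGS_MODULE"]
    ```

    On production you would export DJANGO_SETTINGS_MODULE=project.settings.production in the server environment; setdefault leaves an existing value untouched.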

  • 2020-11-22 15:21

    My solution to that problem is also somewhat of a mix of some solutions already stated here:

    • I keep a file called local_settings.py that has the content USING_LOCAL = True in dev and USING_LOCAL = False in prod
    • In settings.py I do an import on that file to get the USING_LOCAL setting

    I then base all my environment-dependent settings on that one:

    DEBUG = USING_LOCAL
    if USING_LOCAL:
        # dev database settings
    else:
        # prod database settings
    

    I prefer this to having two separate settings.py files to maintain: I find it easier to keep my settings structured in a single file than spread across several. This way, when I update a setting, I don't forget to do it for both environments.

    Of course, every method has its disadvantages, and this one is no exception. The problem here is that I can't overwrite the local_settings.py file when I push my changes into production, meaning I can't just copy all files blindly, but that's something I can live with.
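    A minimal sketch of this flag pattern (the ALLOWED_HOSTS values are illustrative, not from the answer):

    ```python
    # settings.py sketch: local_settings.py contains only USING_LOCAL = True/False.
    try:
        from local_settings import USING_LOCAL
    except ImportError:
        USING_LOCAL = False  # default to production behaviour if the file is missing

    DEBUG = USING_LOCAL
    if USING_LOCAL:
        ALLOWED_HOSTS = ["localhost", "127.0.0.1"]   # dev settings go here
    else:
        ALLOWED_HOSTS = ["www.example.com"]          # prod settings go here
    ```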

  • 2020-11-22 15:21

    For most of my projects I use the following pattern:

    1. Create settings_base.py where I store settings that are common for all environments
    2. Whenever I need a new environment with specific requirements, I create a new settings file (e.g. settings_local.py) that inherits the contents of settings_base.py and overrides/adds the proper settings variables (from settings_base import *)

    (To run manage.py with a custom settings file, use the --settings option: manage.py <command> --settings=settings_you_wish_to_use — note that this is a Python module path, not a file name.)
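    The inherit-and-override mechanics boil down to Python's from module import * semantics; here is a self-contained sketch using a stand-in module in place of a real settings_base.py file:

    ```python
    import sys
    import types

    # Stand-in for settings_base.py, so the example runs without real files.
    base = types.ModuleType("settings_base")
    base.DEBUG = False
    base.INSTALLED_APPS = ["django.contrib.admin", "django.contrib.auth"]
    sys.modules["settings_base"] = base

    # This is effectively the body of settings_local.py:
    local = {}
    exec("from settings_base import *\nDEBUG = True", local)
    ```

    Everything defined in the base module is pulled in, and DEBUG is then overridden for this environment.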

  • 2020-11-22 15:23

    Instead of settings.py, use this layout:

    .
    └── settings/
        ├── __init__.py  <= not versioned
        ├── common.py
        ├── dev.py
        └── prod.py
    

    common.py is where most of your configuration lives.

    prod.py imports everything from common, and overrides whatever it needs to override:

    from __future__ import absolute_import # optional, but I like it
    from .common import *
    
    # Production overrides
    DEBUG = False
    #...
    

    Similarly, dev.py imports everything from common.py and overrides whatever it needs to override.

    Finally, __init__.py is where you decide which settings to load, and it's also where you store secrets (therefore this file should not be versioned):

    from __future__ import absolute_import
    from .prod import *  # or .dev if you want dev
    
    ##### DJANGO SECRETS
    SECRET_KEY = '(3gd6shenud@&57...'
    DATABASES['default']['PASSWORD'] = 'f9kGH...'
    
    ##### OTHER SECRETS
    AWS_SECRET_ACCESS_KEY = "h50fH..."
    

    What I like about this solution is:

    1. Everything is in your versioning system, except secrets
    2. Most configuration is in one place: common.py.
    3. Prod-specific things go in prod.py, dev-specific things go in dev.py. It's simple.
    4. You can override stuff from common.py in prod.py or dev.py, and you can override anything in __init__.py.
    5. It's straightforward python. No re-import hacks.
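    If you would rather keep even __init__.py under version control, one variant (my assumption, not part of the answer above) is to read the secrets from environment variables instead. A sketch, where the variable name DJANGO_SECRET_KEY is illustrative:

    ```python
    import os

    # Sketch: pull secrets from the environment so no secret lives in the repo.
    # The setdefault fallback exists only so local development needs no setup;
    # production would export a real value before starting the server.
    os.environ.setdefault("DJANGO_SECRET_KEY", "insecure-dev-only-key")
    SECRET_KEY = os.environ["DJANGO_SECRET_KEY"]
    ```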
  • 2020-11-22 15:26

    The problem with most of these solutions is that your local settings are applied either before or after the common ones, so it is impossible to have both of the following at the same time:

    • the env-specific settings define the addresses for the memcached pool, and the main settings file then uses that value to configure the cache backend
    • the env-specific settings add or remove apps/middleware relative to the default ones

    One solution can be implemented using "ini"-style config files with the ConfigParser class. It supports multiple files, lazy string interpolation, default values and a lot of other goodies. Once a number of files have been loaded, more files can be loaded and their values will override the previous ones, if any.

    You load one or more config files, depending on the machine address, environment variables and even values in previously loaded config files. Then you just use the parsed values to populate the settings.

    One strategy I have successfully used has been:

    • Load a default defaults.ini file
    • Check the machine name, and load all files matching the reversed FQDN, from the shortest match to the longest (so I loaded net.ini, then net.domain.ini, then net.domain.webserver01.ini, each one possibly overriding values from the previous). This also accounts for developers' machines, so each developer could set up a preferred database driver, etc. for local development
    • Check if there is a "cluster name" declared, and in that case load cluster.cluster_name.ini, which can define things like database and cache IPs

    As an example of something you can achieve with this, you can define a "subdomain" value per environment, which is then used in the default settings (as hostname: %(subdomain)s.whatever.net) to define all the necessary hostnames and cookie settings Django needs to work.

    This is as DRY as I could get; most (existing) files had just 3 or 4 settings. On top of this I had to manage customer configuration, so an additional set of configuration files (with things like database names, users and passwords, the assigned subdomain, etc.) existed, one or more per customer.

    You can scale this as low or as high as necessary: you just put in the config files the keys you want to configure per environment, and once a new setting is needed, put the previous value in the default config and override it where necessary.

    This system has proven reliable and works well with version control. It was used for a long time to manage two separate clusters of applications (15 or more separate instances of the Django site per machine), with more than 50 customers, where the clusters changed size and membership depending on the mood of the sysadmin...
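    A minimal sketch of the layering with Python's configparser (file contents are inlined with read_string here; in practice you would pass an ordered list of paths to cfg.read(), which skips missing files and lets later ones override earlier ones):

    ```python
    from configparser import ConfigParser

    # defaults.ini: lazy %(...)s interpolation builds the hostname from a subdomain.
    defaults = """
    [django]
    subdomain = www
    hostname = %(subdomain)s.whatever.net
    """

    # net.domain.webserver01.ini: loaded later, overrides only the subdomain.
    host_specific = """
    [django]
    subdomain = dev01
    """

    cfg = ConfigParser()
    cfg.read_string(defaults)
    cfg.read_string(host_specific)

    hostname = cfg.get("django", "hostname")  # interpolated on access
    ```

    Because interpolation happens when the value is read, the overridden subdomain feeds into the default hostname template, which is exactly the cross-file override described above.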

  • 2020-11-22 15:29

    In settings.py:

    try:
        from local_settings import *
    except ImportError:
        pass
    

    You can override whatever you need in local_settings.py; it should then stay out of your version control. But since you mention copying, I'm guessing you use none ;)
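    One caveat: a bare except ImportError also hides import errors raised *inside* local_settings.py (e.g. if it imports a missing third-party package). A slightly safer sketch:

    ```python
    have_local = True
    try:
        from local_settings import *  # noqa: F401,F403
    except ImportError as e:
        # Only swallow the error when local_settings itself is absent;
        # re-raise if something inside the file failed to import.
        if "local_settings" not in str(e):
            raise
        have_local = False
    ```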
