For some time now, my unit tests have been taking longer than expected to run. I have tried to debug this a couple of times without much success, as the delays occur before my tests even begin to run. This has affected my ability to do anything remotely close to test-driven development (maybe my expectations are too high), so I want to see if I can fix this once and for all.
When I run a test, there is a 70 to 80 second delay between the start of the command and the point where the tests actually begin. For example, if I run the tests for a small module (using time python manage.py test myapp), I get:
<... bunch of unimportant print messages I print from my settings>
Creating test database for alias 'default'...
......
----------------------------------------------------------------
Ran 6 tests in 2.161s
OK
Destroying test database for alias 'default'...
real 1m21.612s
user 1m17.170s
sys 0m1.400s
About 1m18s of the 1m21s total is spent between the
Creating test database for alias 'default'...
line and the
......
line. In other words, the tests themselves take under 3 seconds, but the database initialization seems to be taking about 1m18s.
I have about 30 apps, most with 1 to 3 database models, which should give an idea of the project size. I use SQLite for unit testing and have implemented some of the suggested improvements. I cannot post my whole settings file, but I am happy to add any information that is required.
I do use a custom test runner:
from django.test.runner import DiscoverRunner
from django.conf import settings


class ExcludeAppsTestSuiteRunner(DiscoverRunner):
    """Override the default django 'test' command, exclude from testing
    apps which we know will fail."""

    def run_tests(self, test_labels, extra_tests=None, **kwargs):
        if not test_labels:
            # No app names specified on the command line, so we run all
            # tests, but remove those which we know are troublesome.
            test_labels = (
                'app1',
                'app2',
                # ... more app labels
            )
        print('Testing: ' + str(test_labels))
        return super(ExcludeAppsTestSuiteRunner, self).run_tests(
            test_labels, extra_tests, **kwargs)
and in my settings:
TEST_RUNNER = 'config.test_runner.ExcludeAppsTestSuiteRunner'
I have also tried using django-nose with django-nose-exclude.
I have read a lot about how to speed up the tests themselves, but have not found any leads on how to optimize or avoid the database initialization. I have seen the suggestions about trying not to test with the database, but I cannot, or don't know how to, avoid that completely.
Please let me know whether:
- This is normal and expected, or
- This is not expected (and hopefully there is a fix or a lead on what to do)
Again, I don't need help speeding up the tests themselves, but rather the initialization (or overhead). I want the example above to take 10 seconds instead of 80.
Many thanks
Update: I ran the test (for a single app) with --verbosity 3 and discovered this is all related to migrations:
Rendering model states... DONE (40.500s)
Applying authentication.0001_initial... OK (0.005s)
Applying account.0001_initial... OK (0.022s)
Applying account.0002_email_max_length... OK (0.016s)
Applying contenttypes.0001_initial... OK (0.024s)
Applying contenttypes.0002_remove_content_type_name... OK (0.048s)
Applying s3video.0001_initial... OK (0.021s)
Applying s3picture.0001_initial... OK (0.052s)
... Many more like this
I squashed all my migrations, but it is still slow.
The final solution that fixed my problem was to force Django to disable migrations during testing, which can be done from the settings like this:
import sys

TESTING = 'test' in sys.argv[1:]

if TESTING:
    print('=========================')
    print('In TEST Mode - Disabling Migrations')
    print('=========================')

    class DisableMigrations(object):

        def __contains__(self, item):
            return True

        def __getitem__(self, item):
            return "notmigrations"

    MIGRATION_MODULES = DisableMigrations()
or use https://pypi.python.org/pypi/django-test-without-migrations
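For reference, the packaged alternative works roughly along these lines; this is only a sketch based on its documentation, so check the package's README for your version, as the app label and flag name may differ:

# Sketch of django-test-without-migrations usage; verify against the README.
# settings.py
INSTALLED_APPS = list(INSTALLED_APPS) + ['test_without_migrations']

# Then run the tests with the flag the package adds:
#   ./manage.py test --nomigrations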
My whole test suite now takes about 1 minute, and a small app takes 5 seconds.
In my case, migrations are not needed for testing as I update tests as I migrate, and I don't use migrations to add data. This won't work for everybody.
Summary
Use pytest!
Operations
pip install pytest-django
Run pytest --nomigrations instead of ./manage.py test.
Result
./manage.py test costs 2 min 11.86 sec; pytest --nomigrations costs 2.18 sec.
Hints
You can create a file called pytest.ini in your project root directory, and specify default command line options and/or Django settings there:

# content of pytest.ini
[pytest]
addopts = --nomigrations
DJANGO_SETTINGS_MODULE = yourproject.settings
Now you can simply run tests with pytest and save yourself a bit of typing. You can speed up subsequent tests even further by adding --reuse-db to the default command line options:

[pytest]
addopts = --nomigrations --reuse-db
However, as soon as your database model is changed, you must run pytest --create-db once to force re-creation of the test database.

If you need to enable gevent monkey patching during testing, you can create a file called pytest in your project root directory with the following content, set the execution bit on it (chmod +x pytest) and run ./pytest for testing instead of pytest:

#!/usr/bin/env python
# -*- coding: utf-8 -*-
# content of pytest
from gevent import monkey
monkey.patch_all()

import os
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "yourproject.settings")

from django.db import connection
connection.allow_thread_sharing = True

import re
import sys

from pytest import main

if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(main())
You can create a test_gevent.py file for testing whether gevent monkey patching is successful:

# -*- coding: utf-8 -*-
# content of test_gevent.py
import time

from django.test import TestCase
from django.db import connection
import gevent


def f(n):
    cur = connection.cursor()
    cur.execute("SELECT SLEEP(%s)", (n,))
    cur.execute("SELECT %s", (n,))
    cur.fetchall()
    connection.close()


class GeventTestCase(TestCase):
    longMessage = True

    def test_gevent_spawn(self):
        timer = time.time()
        d1, d2, d3 = 1, 2, 3
        t1 = gevent.spawn(f, d1)
        t2 = gevent.spawn(f, d2)
        t3 = gevent.spawn(f, d3)
        gevent.joinall([t1, t2, t3])
        cost = time.time() - timer
        self.assertAlmostEqual(cost, max(d1, d2, d3), delta=1.0,
                               msg='gevent spawn not working as expected')
Use ./manage.py test --keepdb when there are no changes in the migration files.
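If you want that behaviour without remembering the flag, one possible approach (my own sketch, not part of this answer) is to force keepdb on in a small custom runner, reusing the TEST_RUNNER hook shown in the question; the class name here is illustrative:

from django.test.runner import DiscoverRunner


class KeepDBDiscoverRunner(DiscoverRunner):
    """Always behave as if --keepdb had been passed."""

    def __init__(self, *args, **kwargs):
        # The test command passes keepdb=False by default, so override it here.
        kwargs['keepdb'] = True
        super(KeepDBDiscoverRunner, self).__init__(*args, **kwargs)

Point TEST_RUNNER at this class as in the question above; note that if you combine this with disabled migrations, the preserved database will not pick up model changes and has to be deleted by hand.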
Database initialization indeed takes too long...
I have a project with about the same number of models/tables (about 77) and approximately 350 tests, and it takes 1 minute total to run everything. I develop in a Vagrant machine with 2 CPUs allocated and 2 GB of RAM. I also use py.test with the pytest-xdist plugin for running multiple tests in parallel.
Another thing you can do is tell Django to reuse the test database and only re-create it when you have schema changes. You can also use SQLite so that the tests will use an in-memory database. Both approaches are explained here: https://docs.djangoproject.com/en/dev/topics/testing/overview/#the-test-database
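As a rough illustration of the SQLite route (a sketch only; the 'test' in sys.argv guard mirrors the TESTING check from the settings snippet earlier on this page):

import sys

# Use an in-memory SQLite database when running the test command,
# so no database file ever has to be created on disk.
if 'test' in sys.argv[1:]:
    DATABASES['default'] = {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': ':memory:',
    }

Note that recent Django versions already use an in-memory database whenever the test engine is SQLite, so the NAME override is mostly belt and braces.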
EDIT: In case none of the options above work, one more option is to have your unit tests inherit from Django's SimpleTestCase or use a custom test runner that doesn't create a database, as explained in this answer: django unit tests without a db.
Then you can just mock Django's calls to the database using a library like this one (which, admittedly, I wrote): https://github.com/stphivos/django-mock-queries
This way you can run your unit tests locally fast and let your CI server worry about running integration tests that require a database, before merging your code to some stable dev/master branch that isn't the production one.
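A minimal sketch of what that split can look like with plain SimpleTestCase and the standard mock library (the myapp.services module and count_active_users function are made up for this example; django-mock-queries offers a richer version of the same idea):

from unittest import mock

from django.test import SimpleTestCase

from myapp.services import count_active_users  # hypothetical helper under test


class CountActiveUsersTests(SimpleTestCase):
    """SimpleTestCase refuses database queries, so no test database is needed."""

    @mock.patch('myapp.services.User.objects')
    def test_counts_only_active_users(self, mock_objects):
        # The ORM manager is replaced, so nothing touches a real database.
        mock_objects.filter.return_value.count.return_value = 3
        self.assertEqual(count_active_users(), 3)
        mock_objects.filter.assert_called_once_with(is_active=True)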
Source: https://stackoverflow.com/questions/36487961/django-unit-testing-taking-a-very-long-time-to-create-test-database