Make Django test case database visible to Celery

野性不改 2020-12-03 05:55

When a Django test case runs, it creates an isolated test database so that database writes get rolled back when each test completes. I am trying to create an integration test…

3 Answers
  • 2020-12-03 06:29

I found another workaround based on @drhagen's solution:

Call celery.contrib.testing.app.TestApp() before calling start_worker(app); this makes the celery.ping task expected by start_worker's readiness check available.

    from celery.contrib.testing.worker import start_worker
    from celery.contrib.testing.app import TestApp
    
    from myapp.tasks import app, my_task
    
    
    class TestTasks:
        def setup(self):
            TestApp()  # makes celery.ping available to the worker readiness check
            self.celery_worker = start_worker(app)
            self.celery_worker.__enter__()  # enter the worker context manager manually

        def teardown(self):
            self.celery_worker.__exit__(None, None, None)  # shut the worker down
    
  • 2020-12-03 06:30

    For your unit tests I would recommend skipping the Celery dependency; the two following links will provide you with the necessary information to get started:

    • http://docs.celeryproject.org/projects/django-celery/en/2.4/cookbook/unit-testing.html
    • http://docs.celeryproject.org/en/latest/userguide/testing.html

    If you really want to test the Celery function calls, including a queue, I would probably set up a docker-compose stack with the server, worker, and queue combination and extend the custom CeleryTestRunner from the django-celery docs. But I don't see much benefit in it, because the test system would probably be too far from production to be representative.
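    For the eager approach described in those links, Celery can be told to run tasks synchronously inside the test process, so no worker or broker is needed. A sketch of the relevant settings, assuming Celery 4+ names and the standard Django integration where Celery reads Django settings under the "CELERY" namespace (older versions spell these CELERY_ALWAYS_EAGER and CELERY_EAGER_PROPAGATES_EXCEPTIONS):

```python
# settings.py overrides for the test run (Celery 4+ setting names;
# assumes Celery is configured from Django settings with the
# "CELERY" namespace, as in the standard Django integration)
CELERY_TASK_ALWAYS_EAGER = True      # run tasks inline in the calling process
CELERY_TASK_EAGER_PROPAGATES = True  # re-raise task exceptions in the test
```

    Note that eager mode executes tasks in the same transaction as the test, which is exactly why it sidesteps the test-database visibility problem, at the cost of not exercising the real worker path.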

  • 2020-12-03 06:35

    This is possible by starting a Celery worker within the Django test case.

    Background

    Django's in-memory test database is SQLite. As the SQLite documentation on in-memory databases says, "[A]ll database connections sharing the in-memory database need to be in the same process." This means that, as long as Django uses an in-memory test database and Celery is started in a separate process, it is fundamentally impossible for Celery and Django to share a test database.
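    The same-process restriction is easy to demonstrate with the standard sqlite3 module (an illustrative sketch; the database name "testdb" and table name "t" are made up):

```python
import sqlite3

# Two plain ":memory:" connections create two independent databases,
# even within one process.
a = sqlite3.connect(":memory:")
b = sqlite3.connect(":memory:")
a.execute("CREATE TABLE t (x INTEGER)")
# b has its own empty database and cannot see table t.

# A shared-cache URI lets connections in the SAME process share one
# in-memory database; a separate process can never attach to it.
c = sqlite3.connect("file:testdb?mode=memory&cache=shared", uri=True)
d = sqlite3.connect("file:testdb?mode=memory&cache=shared", uri=True)
c.execute("CREATE TABLE t (x INTEGER)")
d.execute("INSERT INTO t VALUES (1)")
print(d.execute("SELECT x FROM t").fetchall())  # [(1,)]
```

    Django uses exactly this shared-cache URI form for its in-memory test database, which is why a worker thread in the same process can see it while a worker process cannot.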

    However, with celery.contrib.testing.worker.start_worker, it is possible to start a Celery worker in a separate thread within the same process. This worker can access the in-memory database.

    This assumes that Celery is already setup in the usual way with the Django project.

    Solution

    Because Django-Celery involves some cross-thread communication, only test cases that don't run in isolated transactions will work. The test case must inherit directly from SimpleTestCase or its REST framework equivalent APISimpleTestCase and set the class attribute allow_database_queries to True. (In Django 2.2 and later, allow_database_queries has been replaced by the databases class attribute.)

    The key is to start a Celery worker in the setUpClass method of the TestCase and close it in the tearDownClass method. The central function is celery.contrib.testing.worker.start_worker(app), which requires an instance of the current Celery app, presumably obtained from mysite.celery.app. It returns a Python ContextManager, whose __enter__ and __exit__ methods must be called in setUpClass and tearDownClass, respectively. There is probably a way to avoid manually entering and exiting the ContextManager with a decorator or something, but I couldn't figure it out. Here is an example tests.py file:

    from celery.contrib.testing.worker import start_worker
    from django.test import SimpleTestCase
    
    from mysite.celery import app
    
    class BatchSimulationTestCase(SimpleTestCase):
        allow_database_queries = True
    
        @classmethod
        def setUpClass(cls):
            super().setUpClass()
    
            # Start up celery worker
            cls.celery_worker = start_worker(app)
            cls.celery_worker.__enter__()
    
        @classmethod
        def tearDownClass(cls):
            super().tearDownClass()
    
            # Close worker
            cls.celery_worker.__exit__(None, None, None)
    
        def test_my_function(self):
            # my_task.delay() or something
            pass
    

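    One way to avoid calling __enter__ and __exit__ by hand, as mentioned above, is contextlib.ExitStack from the standard library: enter the context manager on a stack in setUpClass and close the stack in tearDownClass. A minimal sketch with a stand-in context manager (fake_worker is hypothetical, standing in for start_worker(app)):

```python
import contextlib
import unittest


@contextlib.contextmanager
def fake_worker():
    # Stand-in for celery.contrib.testing.worker.start_worker(app);
    # any context manager can be managed the same way.
    yield "worker"


class WorkerTestCase(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        super().setUpClass()
        # ExitStack enters the context manager now and remembers
        # how to exit it later, replacing manual __enter__/__exit__.
        cls._stack = contextlib.ExitStack()
        cls.celery_worker = cls._stack.enter_context(fake_worker())

    @classmethod
    def tearDownClass(cls):
        # Exits everything entered on the stack, in reverse order.
        cls._stack.close()
        super().tearDownClass()

    def test_worker_available(self):
        self.assertEqual(self.celery_worker, "worker")
```

    The same pattern works with the real start_worker(app) in place of fake_worker().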
    For whatever reason, the testing worker tries to use a task called 'celery.ping', probably to provide better error messages in the case of worker failure. Even setting perform_ping_check to False as a keyword argument to start_worker still tests for its existence. The task it is looking for is celery.contrib.testing.tasks.ping, but this task is not installed by default.

    It should be possible to provide this task by adding celery.contrib.testing to INSTALLED_APPS in settings.py. However, that only makes it visible to the worker, not to the code that generates the worker, which does an assert 'celery.ping' in app.tasks and fails. Commenting this assertion out makes everything work, but modifying an installed library is not a good solution. I am probably doing something wrong, but the workaround I settled on is to copy the simple ping function somewhere it will be picked up by app.autodiscover_tasks(), such as celery.py:

    @app.task(name='celery.ping')
    def ping():
        # type: () -> str
        """Simple task that just returns 'pong'."""
        return 'pong'
    

    Now, when the tests are run, there is no need to start a separate Celery process. A Celery worker will be started in the Django test process as a separate thread. This worker can see any in-memory databases, including the default in-memory test database. To control the number of workers, there are options available in start_worker, but it appears the default is a single worker.
