In-memory Celery backend for tests #3151

Merged: mouse-reeve merged 2 commits into bookwyrm-social:main from dato:celery_inmem on Aug 9, 2024

Conversation

dato (Contributor) commented on Dec 12, 2023

At the moment, tests avoid Redis by mocking the .delay method of the relevant Celery tasks. An alternative is to run the tests against an in-memory Celery backend and let the task objects actually be created there.

This PR implements that approach without touching any existing patch('...') blocks or decorators. Once it is merged, patch blocks will no longer be needed in new tests (and can be removed from existing ones as they are edited). A rough sketch of the settings side is included after the checklist below.

  • in-memory backend when running from pytest
  • in-memory backend when running from manage.py
  • check test suite run time if all mocks are dropped
  • add an example worker test
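
For reference, a minimal sketch of the settings side of this approach. The pytest detection and the setting names are assumptions for illustration, not the exact code in this PR:

```python
# settings.py (sketch) -- assumes the Celery app reads its broker and
# result backend from Django-style settings; names are illustrative.
import sys

if "pytest" in sys.modules:
    # kombu's in-memory transport: .delay()/.apply_async() publish messages
    # in-process, so tests never need a running Redis.
    CELERY_BROKER_URL = "memory://"
    CELERY_RESULT_BACKEND = "cache+memory://"
```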

dato changed the title from "rfc: in-memory Celery backend for tests" to "wip: in-memory Celery backend for tests" on Jul 28, 2024
dato force-pushed the celery_inmem branch 2 times, most recently from 74c9b5f to d7c7125, on July 28, 2024 09:30
dato added 2 commits on July 28, 2024 06:42
This makes it easy to configure an in-memory transport for tests.
At the moment it doesn't have much of an effect, since most task
calls are mocked out.
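
An example worker test along these lines could look like the sketch below. It uses Celery's bundled testing helper with a throwaway app; everything here (app, task, and test name) is illustrative rather than code from this PR:

```python
from celery import Celery
from celery.contrib.testing.worker import start_worker

# Throwaway Celery app wired to the in-memory transport and result backend.
app = Celery(broker="memory://", backend="cache+memory://")

@app.task
def add(x, y):
    return x + y

def test_add_runs_on_embedded_worker():
    # start_worker runs a worker in a background thread of the test process,
    # so the task round-trips entirely in memory, with no Redis involved.
    with start_worker(app, perform_ping_check=False):
        assert add.delay(2, 3).get(timeout=10) == 5
```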
dato changed the title from "wip: in-memory Celery backend for tests" to "In-memory Celery backend for tests" on Jul 28, 2024
dato marked this pull request as ready for review on July 28, 2024 22:30
mouse-reeve merged commit 66dd39e into bookwyrm-social:main on Aug 9, 2024
10 checks passed
dato deleted the celery_inmem branch on August 9, 2024 21:59