I have a project split into 2 separate apps, and both use another shared app as a reusable one (I store my models there and install it from a git link). When I run the Celery worker locally, everything works perfectly, but if I use Docker (with Celery in a separate image) I get this error:
celery_api | Error: Unable to load celery application.
celery_api | The module common was not found.
For some reason Celery gets an improper configuration from Django.
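For context, the shared app shows up in INSTALLED_APPS like any other Django app. A minimal sketch (the name common comes from the error above; the rest is hypothetical):

# settings.py (sketch) - the reusable app installed from the git link
# is listed alongside the built-in apps
INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    # ...
    'common',  # shared models app, pip-installed from the git link
]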
Celery docker-compose config:
celery_api:
  image: celery_api
  container_name: celery_api
  restart: unless-stopped
  build: .
  command: celery worker -A djookyapi --loglevel=info
  env_file:
    - ./.dev.env
  volumes:
    - .:/usr/src/app
  working_dir: /usr/src/app
  networks:
    - dev
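A quick way to see whether the container's Python can resolve the shared app at all is an import probe (a hypothetical helper, not part of the project; run it with something like docker-compose run celery_api python probe.py):

# probe.py (hypothetical) - prints where the `common` module would be
# imported from inside the container, or reports that it is missing
import importlib.util

spec = importlib.util.find_spec('common')
if spec is None:
    print('common is NOT importable from this environment')
else:
    print('common resolves to {}'.format(spec.origin))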
The celery.py file in the root of the project folder (next to the settings.py file):
from __future__ import absolute_import, unicode_literals
import os
import sys

from celery import Celery
from django.conf import settings

BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
sys.path.append(BASE_DIR)

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproj.settings')
os.environ.setdefault('DJANGO_CONFIGURATION', 'Local')

import configurations
configurations.setup()

app = Celery('myproj')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes. (No namespace argument
# is passed, so Celery reads the settings keys as-is rather than
# requiring a `CELERY_` prefix.)
app.config_from_object('django.conf:settings')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
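For reference, autodiscover_tasks() looks for a tasks module inside every app returned by the lambda, so the shared app would expose its tasks roughly like this (a hypothetical example; refresh_shared_models is made up):

# common/tasks.py (hypothetical sketch)
from celery import shared_task

@shared_task
def refresh_shared_models():
    # placeholder body; shared_task binds it to whichever Celery app
    # is configured in the running process
    return 'done'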
So why doesn't Celery see my app?
Answer
I changed the config in docker-compose like this:
celery_api:
  image: celery_api:latest
  container_name: celery_api
  restart: unless-stopped
  build:
    context: ./
  command: celery worker -A djookyapi --loglevel=info -B
  env_file:
    - ./.dev.env
  working_dir: /usr/src/app
  networks:
    - dev
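Note the visible differences from the original config: the volumes bind mount of the host directory over /usr/src/app is gone, so the worker now runs against the code copied into the image at build time rather than the host checkout, and -B was added so the same worker also runs the beat scheduler.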