
Celery task results are not stored in the Django database #450

Open
nico2am opened this issue Oct 24, 2024 · 4 comments


nico2am commented Oct 24, 2024

Environment

  • Django: 5.1.2
  • Celery: 5.4.0
  • django-celery-results: 2.5.1
  • Flower: 2.0.1
  • psycopg2-binary==2.9.10 (PostgreSQL)

Description

Tasks are visible in Flower but are not being saved to the database task results table. The issue persists despite using the recommended configuration from the documentation and various solutions found in other issues and forums.

Current Configuration

Django Settings

INSTALLED_APPS = [
   ...,
   'django_celery_results',
]
CELERY_TIMEZONE = 'UTC'  # Europe/Paris
CELERY_BROKER_URL = os.environ.get('CELERY_BROKER_URL', 'redis://localhost:6379/0')
CELERY_RESULT_BACKEND = 'django-db'
CELERY_CACHE_BACKEND = 'django-cache'

Celery Configuration (celery.py)

import os

from celery import Celery

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings')

app = Celery('project')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

app.conf.update(
    # Task settings
    task_track_started=True,
    task_time_limit=30 * 60,
    task_ignore_result=False,
    result_extended=True,
    # Serialization settings
    accept_content=['application/json'],
    task_serializer='json',
    result_serializer='json',
    # Worker settings
    worker_send_task_events=True,  # Same as -E option
    worker_prefetch_multiplier=1,  # For better task distribution
)

# Load task modules from all registered Django apps.
app.autodiscover_tasks()


@app.task(bind=True, ignore_result=True)
def debug_task(self):
    print(f'Request: {self.request!r}')

Attempted Solution

I've implemented a signal handler to manually track task completion and save results:

import logging
from celery.signals import task_postrun
from django_celery_results.models import TaskResult

logger = logging.getLogger(__name__)

@task_postrun.connect
def task_postrun_handler(task_id=None, task=None, state=None, retval=None, **kwargs):
    logger.info(f"Task completed: {task_id}")
    logger.info(f"State: {state}")
    logger.info(f"Result: {retval}")
    
    try:
        TaskResult.objects.get_or_create(
            task_id=task_id,
            defaults={
                'status': state,
                'result': retval,
                'task_name': task.name if task else '',
            }
        )
        logger.info(f"Result saved for task: {task_id}")
    except Exception as e:
        logger.error(f"Failed to save result for task {task_id}: {str(e)}")

The signal handler successfully creates records in the database, but the default Celery result backend still isn't saving results automatically as expected.
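One detail worth noting: the handler above writes the raw retval into TaskResult, whereas the django-db backend stores return values serialized with the configured result_serializer ('json' here). A minimal sketch of that round trip, using an illustrative payload:

```python
import json

# With result_serializer='json', the return value is stored as a JSON
# document and decoded again when the result is read back.
retval = {"email_id": 42, "sent": True}  # hypothetical task return value

encoded = json.dumps(retval)   # what ends up in the result column
decoded = json.loads(encoded)  # what AsyncResult.get() would hand back

assert decoded == retval
print(encoded)
```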

Questions

  • Why aren't task results being saved automatically to the database?
  • Is there a configuration issue preventing the results from being saved?
  • Are there any known compatibility issues between these specific versions?

Additional Information

  • The database connection is working correctly (confirmed by signal handler saving results)
  • Tasks are executing successfully (visible in Flower)
  • Database migrations for django_celery_results have been applied
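To double-check the points above from inside the project, a couple of diagnostic commands may help (a sketch; assumes the standard manage.py layout, and the TaskResult count is purely illustrative):

```shell
# Confirm the django_celery_results migrations are applied
python manage.py showmigrations django_celery_results

# Confirm the table is reachable and see how many rows exist
python manage.py shell -c "from django_celery_results.models import TaskResult; print(TaskResult.objects.count())"
```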

Any help or guidance would be greatly appreciated!


rootart commented Nov 6, 2024

Hi @nico2am, it looks like your task explicitly tells Celery to ignore the result and not store it:

@app.task(bind=True, ignore_result=True)
...

Could you please check if results are stored when removing ignore_result or setting it to False?
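For reference, the task without the flag would look something like this (a sketch; the per-task ignore_result option takes precedence over the global task_ignore_result setting):

```python
@app.task(bind=True)  # no ignore_result=True, so the result backend can store the outcome
def debug_task(self):
    print(f'Request: {self.request!r}')
```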


nico2am commented Nov 13, 2024

Hello,

I checked and I don't have ignore_result; this is my configuration:

celery.py

app.conf.update(
    # Task settings
    task_track_started=True,
    task_time_limit=30 * 60,
    task_ignore_result=False,
    result_extended=True,
    # Serialization settings
    accept_content=['application/json'],
    task_serializer='json',
    result_serializer='json',
    # Worker settings
    worker_send_task_events=True,  # Same as -E option
    worker_prefetch_multiplier=1,  # For better task distribution
)

# Load task modules from all registered Django apps.
app.autodiscover_tasks()

tasks/tasks.py

from celery import shared_task

from .models import Email  # assuming an app-local Email model


@shared_task
def send_mail(email_id):
    email = Email.objects.get(id=email_id)
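One way to confirm whether rows are actually being written (a sketch; assumes a running worker, applied migrations, and an existing Email with id=1):

```python
from django_celery_results.models import TaskResult

# Enqueue the task and wait for the worker to finish it.
async_result = send_mail.delay(1)  # hypothetical email id
async_result.get(timeout=10)       # raises if the result backend is broken

# With CELERY_RESULT_BACKEND = 'django-db', a row should now exist.
print(TaskResult.objects.filter(task_id=async_result.id).exists())
```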

@edwinludik-tease

I have the same problem; it seems to be related to the PostgreSQL module.
When I start the worker, the startup banner is blank at "results:":
[screenshot: worker startup banner with an empty "results:" value]

When I comment out CELERY_RESULT_BACKEND = 'django-db', I see it gets disabled:
[screenshot: worker startup banner showing the results backend as disabled]

I stumbled upon this when I tried to view the result of one of the tasks that I had manually executed:
celery -A djangoproject result 30949f92-13c0-4945-8f13-e5c1332ea74c

and it threw an error. First psycopg2-binary was missing; now I get django.core.exceptions.AppRegistryNotReady: Apps aren't loaded yet. :( Current status: not fixed
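AppRegistryNotReady from the celery CLI usually means Django was not initialised before the django-db result backend was imported. A common first check is to make sure the settings module is exported in the shell running the command (the settings path below is an assumption):

```shell
# Make the Django settings module visible to the celery CLI before it
# loads the django-db result backend.
export DJANGO_SETTINGS_MODULE=djangoproject.settings
celery -A djangoproject result 30949f92-13c0-4945-8f13-e5c1332ea74c
```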

@sssuperman

I have the same issue. By adding a parameter in app.conf.update():

[screenshot of the added parameter]

I can successfully store results to the Django database.
