
celery.send_task does not return an AsyncResult causing application to hang intermittently #9113

Open

BrQE opened this issue Jul 1, 2024 · 0 comments

Description:
I am seeing an issue where tasks sent to Celery workers via executer.send_task from another process (other_app.py) sometimes cause the other_app process to hang, even though executer.send_task is expected to return an AsyncResult immediately. Below are the details of my setup and the issue.

Environment:

Redis Version: 4.6.0
Celery Version: 5.3.1
Celery Singleton Version: 0.3.1
Kombu Version: 5.3.7
SQLAlchemy Celery Beat Version: 0.7.1
SQLAlchemy Version: 1.4.52
OS Version: Debian GNU/Linux 11 (bullseye)
Kernel Version: 5.15.38-amd64

requirements.txt
redis==4.6.0
celery==5.3.1
celery-singleton==0.3.1
kombu==5.3.7
sqlalchemy-celery-beat==0.7.1
SQLAlchemy==1.4.52

Celery Configuration (celery.py):

from celery import Celery
from datetime import timedelta

class CeleryConfig:
    task_serializer = 'json'
    accept_content = ['json']
    result_serializer = 'json'
    broker_url = 'redis://localhost/3'
    result_backend = 'redis://localhost/4'
    broker_connection_retry_on_startup = True
    result_backend_thread_safe = True
    result_expires = timedelta(seconds=1)
    worker_cancel_long_running_tasks_on_connection_loss = True
    task_time_limit = 300
    worker_send_task_events = False
    task_default_queue = 'subtask'
    include = [
        'tasks.job_first',
        'tasks.job_second',
    ]

executer = Celery(__name__)
executer.config_from_object(CeleryConfig)
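One possible mitigation, assuming (not yet confirmed) that the hang is the publish call blocking on a stalled Redis socket: bound socket operations via `broker_transport_options`, whose keys Kombu's Redis transport forwards to redis-py. The timeout values below are illustrative, not from my actual config:

```python
class CeleryConfig:
    # ... existing settings from celery.py ...
    # Bound how long a Redis socket operation may block while publishing.
    # Keys are passed through to redis-py; values here are illustrative.
    broker_transport_options = {
        "socket_timeout": 10,         # fail a blocked read/write after 10 s
        "socket_connect_timeout": 5,  # fail TCP connect after 5 s
        "retry_on_timeout": True,     # retry commands that hit the timeout
    }
```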

Run Script (run.sh):

celery --workdir "${celery_working_dir}" -A "celery" worker \
                    -Ofair --without-heartbeat --without-gossip --without-mingle \
                    --pidfile="${celery_pid_file}" --loglevel="${celeryd_log_level}" \
                    --concurrency="${celery_concurrency}" -n "${worker_name}" --pool=gevent -Q "${queue_name}"

Other App (other_app.py):

from celery import executer

executer.send_task(message.event, kwargs=message.__dict__, task_id=message.id, queue="cloud")
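For context, `send_task` returns an AsyncResult as soon as the message is published to the broker; only the publish step itself can block. A sketch of the call site with an explicit publish-retry policy (`retry`/`retry_policy` are standard `apply_async` options; the policy values and the `send_event` helper are my own, not part of the original app):

```python
# Assumed publish-retry policy -- values are illustrative.
PUBLISH_RETRY_POLICY = {
    "max_retries": 3,       # give up publishing after 3 attempts
    "interval_start": 0.0,  # retry immediately the first time
    "interval_step": 0.5,   # then back off 0.5 s more per attempt
    "interval_max": 2.0,    # never wait more than 2 s between attempts
}

def send_event(executer, message):
    """Publish the task and return its AsyncResult without waiting for execution."""
    return executer.send_task(
        message.event,
        kwargs=message.__dict__,
        task_id=message.id,
        queue="cloud",
        retry=True,
        retry_policy=PUBLISH_RETRY_POLICY,
    )
```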

Issue:

  • I am sending tasks to the Celery workers from other_app.py at irregular intervals using executer.send_task.
  • The tasks are processed correctly by the Celery workers.
  • Occasionally, the executer.send_task call never returns, causing the other_app process to hang.
  • This happens even though executer.send_task is asynchronous and should return an AsyncResult as soon as the message is published.

Expected Behavior:

The executer.send_task call should always return an AsyncResult promptly, without blocking the other_app process.

Additional Information:

No errors or exceptions are logged by the Celery workers or by other_app.py when the hang occurs.

Please let me know if any additional information is required. Thank you for your assistance.
