ProcessPoolExecutor deadlocks on Python 3.9 or newer #115

Open
andyneff opened this issue Nov 25, 2021 · 1 comment
Comments

@andyneff
Member

andyneff commented Nov 25, 2021

Starting in Python 3.9.0, using a ProcessPoolExecutor has a good chance of deadlocking on a terra task. It almost always happens with 10 workers, and is practically guaranteed with 16 workers.

I've managed to put together a piece of code to reproduce the error:

#!/usr/bin/env python

from concurrent.futures import ProcessPoolExecutor, as_completed

from celery import shared_task

# A trivial Celery task; submitting it to a ProcessPoolExecutor is enough
# to trigger the deadlock on Python 3.9+.
@shared_task
def foo(x):
  return x*x

if __name__ == '__main__':
  futures = {}
  with ProcessPoolExecutor(max_workers=10) as executor:
    # Submit one job per worker; the workers hang before running any of them.
    for x in range(10):
      futures[executor.submit(foo, x)] = x

    results = {}
    for future in as_completed(futures):
      task_id = futures[future]
      results[task_id] = future.result()
      print(len(results))  # progress indicator; never reaches 10 when deadlocked

As you can see, the bug is not actually in terra; it can be reproduced using nothing but the Celery task object. Something about how Celery treats a "task" differently from a plain "function" is causing the workers to hang before they ever process a single job.
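
As a sanity check, here is the same pool driving a plain, undecorated function (a minimal control sketch, no Celery involved); if the task object is really the culprit, this variant should run to completion:

#!/usr/bin/env python

from concurrent.futures import ProcessPoolExecutor, as_completed

# Control case: identical pool usage, but a plain function instead of a
# Celery task, so any hang here would rule out the task-object theory.
def foo(x):
  return x*x

if __name__ == '__main__':
  futures = {}
  with ProcessPoolExecutor(max_workers=10) as executor:
    for x in range(10):
      futures[executor.submit(foo, x)] = x

    results = {}
    for future in as_completed(futures):
      results[futures[future]] = future.result()
      print(len(results))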

@andyneff
Member Author

Diving into the problem further, it is very easy to perturb the problem with print statements: adding or removing them can hide or expose the deadlock, so timing definitely seems to be a factor in the issue.


I've also replaced @shared_task with the following, and the problem persists, so I'm leaving shared_task in place for ease of debugging:

from celery import Celery

app = Celery('tasks')

@app.task  # applied to foo in place of @shared_task

Roughly speaking, Celery creates a "task" object out of your function, and then wraps that in a Proxy class:

# bar = shared_task(foo), spelled out
from celery import _state as celery_state
from celery.local import Proxy

app = celery_state.get_current_app()
foo_task = app._task_from_fun(foo)  # build the Task object from the plain function

def thing():
  return foo_task

bar = Proxy(thing)  # the name you import and submit is this Proxy, not the Task
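
A quick diagnostic sketch to see this in practice (assuming the repro script above is saved as tasks.py; _get_current_object is a Celery internal): the object you import is the Proxy, and _get_current_object() exposes the Task it wraps:

# Diagnostic: confirm the decorated name is a Proxy wrapping a Task.
# Assumes the repro script above is importable as tasks.py.
from celery.local import Proxy

from tasks import foo

print(type(foo))                        # <class 'celery.local.Proxy'>
print(isinstance(foo, Proxy))           # True
print(type(foo._get_current_object()))  # the underlying Task subclass

That indirection means the Proxy, not the Task itself, is what gets pickled and sent to each pool worker, which is presumably where a "task" starts behaving differently from a "function".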

andyneff added a commit that referenced this issue Dec 2, 2021
andyneff mentioned this issue Dec 2, 2021