I run Redis in Docker using the following command:
$ docker run -p 6379:6379 redis
My Dash app is in a module called playground.py and its content is copy-pasted from the documentation:
import time
import dash
from dash import html
from dash.long_callback import CeleryLongCallbackManager
from dash.dependencies import Input, Output
from celery import Celery

celery_app = Celery(
    __name__, broker="redis://localhost:6379/0", backend="redis://localhost:6379/1"
)
long_callback_manager = CeleryLongCallbackManager(celery_app)

app = dash.Dash(__name__, long_callback_manager=long_callback_manager)

app.layout = html.Div(
    [
        html.Div([html.P(id="paragraph_id", children=["Button not clicked"])]),
        html.Button(id="button_id", children="Run Job!"),
        html.Button(id="cancel_button_id", children="Cancel Running Job!"),
    ]
)

@app.long_callback(
    output=Output("paragraph_id", "children"),
    inputs=Input("button_id", "n_clicks"),
    running=[
        (Output("button_id", "disabled"), True, False),
        (Output("cancel_button_id", "disabled"), False, True),
    ],
    cancel=[Input("cancel_button_id", "n_clicks")],
)
def callback(n_clicks):
    time.sleep(2.0)
    return [f"Clicked {n_clicks} times"]

if __name__ == "__main__":
    app.run_server(debug=True)
I run the app by executing:
$ python playground.py
The app starts and I get no errors. I open the app in the browser to see this:
[screenshot: the app renders, with the 'Run Job!' button disabled]
My assumption is that the callback() is triggered on page load and that’s why the ‘Run Job!’ button is disabled. The button never gets activated so I assume that the job never finishes. The app does not throw any errors.
What am I missing? Perhaps I need another separate process for the Celery worker? I tried to follow the Celery tutorial but couldn’t adapt it to use with Dash. Thanks for any advice!
As I suspected, a separate process for the Celery worker is needed. In fact, two processes - one for the worker and another one for something called beat.
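For completeness, with the module above I'd expect the two processes to be started along these lines (standard Celery CLI usage, not something taken from the docs example itself):
$ celery -A playground.celery_app worker --loglevel=INFO
$ celery -A playground.celery_app beat --loglevel=INFO
As far as I understand, beat only dispatches periodically scheduled tasks (e.g. cache-expiry jobs), so a plain long_callback should work with just the worker.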
You are right. It also works without the beat process. I'll need to do some more reading about the purpose of the beat process.
Anyways, I’m happy it works for you. I’m now struggling with making long_callbacks work in a multi-page app. Have you tried that?
Nice! Sorry, I haven't tried it with multi-page Dash apps. Maybe you'll have issues with circular imports, which are common with Dash multi-page apps, so you'll have to use the suggested approach of separating the app declaration and the server declaration into two Python files (see URL Routing and Multiple Apps | Dash for Python Documentation | Plotly, section "Structuring a Multi-Page App").
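Roughly, I imagine the split looking something like this (file and page names are placeholders of mine following the docs layout, not something from this thread):

# app.py - declares the Celery app, the long_callback manager and the Dash app;
# it deliberately imports no page modules, which is what avoids the circular import
import dash
from dash.long_callback import CeleryLongCallbackManager
from celery import Celery

celery_app = Celery(
    __name__, broker="redis://localhost:6379/0", backend="redis://localhost:6379/1"
)
long_callback_manager = CeleryLongCallbackManager(celery_app)

app = dash.Dash(
    __name__,
    suppress_callback_exceptions=True,
    long_callback_manager=long_callback_manager,
)
server = app.server

# index.py - entry point; imports the app plus the page modules and wires up routing
from dash import dcc, html
from dash.dependencies import Input, Output

from app import app
import page1, page2  # hypothetical page modules defining layouts and (long_)callbacks

app.layout = html.Div([dcc.Location(id="url"), html.Div(id="page-content")])

@app.callback(Output("page-content", "children"), Input("url", "pathname"))
def display_page(pathname):
    if pathname == "/page1":
        return page1.layout
    if pathname == "/page2":
        return page2.layout
    return "404"

if __name__ == "__main__":
    app.run_server(debug=True)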
I am not sure about how exactly it all works nor am I sure about the correct terminology but I’ll try to formulate it the way I understand it…
The Celery app is created in playground.py as an object of type Celery. On its own it does not run the asynchronous tasks. It probably only sends information about the tasks to be executed to Redis and keeps an eye on finished tasks, whose results are also collected in Redis.
There must be a separate worker process whose job is to look for tasks submitted to Redis, execute them (asynchronously from the main Dash process) and, once finished, submit the results back to Redis. This is the process that is started with the command:
$ celery -A playground.celery_app worker --loglevel=INFO
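At least that's consistent with how plain Celery behaves outside of Dash. A minimal standalone sketch of the broker/backend round trip (the file name and task are made up, not part of the app above):

# standalone_celery_demo.py - not part of the Dash app, just illustrates the flow
from celery import Celery

celery_app = Celery(
    "standalone_celery_demo",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/1",
)

@celery_app.task
def add(x, y):
    return x + y

if __name__ == "__main__":
    # .delay() puts a task message on the broker (Redis db 0); a separately
    # started worker picks it up, runs it and writes the result to the backend
    # (Redis db 1), where .get() retrieves it.
    result = add.delay(2, 3)
    print(result.get(timeout=10))  # -> 5

Start the worker with $ celery -A standalone_celery_demo.celery_app worker --loglevel=INFO, then $ python standalone_celery_demo.py should print 5.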
The Celery worker would be started by executing:
$ celery -A app.celery_app worker --loglevel=INFO
I found that I could avoid this need, and have the Celery worker automatically detect all long_callbacks, by including in index.py the line:
from app import celery_app # noqa: F401
and then running the Celery worker with:
$ celery -A index.celery_app worker --loglevel=INFO
I just hit a wall with @long_callback, realising that it currently does NOT support pattern-matching dependencies.
I created a feature request for this functionality. You can add your comment there if you also find this feature important.
As a workaround, you can use a normal callback with pattern matching and store the inputs as a JSON string in a hidden div.
Then add a long_callback with the hidden div as input.
Not pretty but works for me.
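For anyone who wants a concrete starting point, here's roughly what I mean (the component ids and the dummy inputs are made up for illustration):

import json
import time

import dash
from dash import dcc, html
from dash.long_callback import CeleryLongCallbackManager
from dash.dependencies import Input, Output, ALL
from celery import Celery

celery_app = Celery(
    __name__, broker="redis://localhost:6379/0", backend="redis://localhost:6379/1"
)
long_callback_manager = CeleryLongCallbackManager(celery_app)
app = dash.Dash(__name__, long_callback_manager=long_callback_manager)

app.layout = html.Div(
    [
        # a few pattern-matching inputs (ids are hypothetical)
        html.Div(
            [dcc.Input(id={"type": "my-input", "index": i}, value="") for i in range(3)]
        ),
        # hidden div bridging the pattern-matching callback and the long_callback
        html.Div(id="inputs-json", style={"display": "none"}),
        html.Div(id="result"),
    ]
)

# Normal callback: supports pattern matching, collects all values into one JSON string
@app.callback(
    Output("inputs-json", "children"),
    Input({"type": "my-input", "index": ALL}, "value"),
)
def collect_inputs(values):
    return json.dumps(values)

# long_callback: only depends on the plain hidden div, so no pattern matching needed
@app.long_callback(
    output=Output("result", "children"),
    inputs=Input("inputs-json", "children"),
)
def slow_job(inputs_json):
    values = json.loads(inputs_json) if inputs_json else []
    time.sleep(2.0)  # stand-in for the real long-running work
    return f"Processed {len(values)} inputs: {values}"

if __name__ == "__main__":
    app.run_server(debug=True)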