I am writing an application in Flask, which works really well except that WSGI
is synchronous and blocking. I have one task in particular which calls out to a third-party API, and that task can take several minutes to complete. I would like to make that call (it's actually a series of calls) and let it run while control is returned to Flask.
My view looks like:
@app.route('/render/<id>', methods=['POST'])
def render_script(id=None):
    ...
    data = json.loads(request.data)
    text_list = data.get('text_list')
    final_file = audio_class.render_audio(data=text_list)
    # do stuff
    return Response(
        mimetype='application/json',
        status=200
    )
Now, what I want to do is have the line
final_file = audio_class.render_audio()
run and provide a callback to be executed when the method returns, whilst Flask can continue to process requests. This is the only task which I need Flask to run asynchronously, and I would like some advice on how best to implement this.
I have looked at Twisted and Klein, but I'm not sure whether they would be overkill, as perhaps Threading would suffice (a rough sketch of what I mean is below). Or maybe Celery is a good choice for this?
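To be concrete, the threading version I have in mind would look roughly like this (just a sketch; on_render_done is a placeholder for whatever callback I end up needing, not code from my app):

import threading

def on_render_done(final_file):
    # Placeholder callback: do whatever should happen once rendering finishes
    print('render finished:', final_file)

def render_in_background(text_list):
    def worker():
        final_file = audio_class.render_audio(data=text_list)
        on_render_done(final_file)
    # Daemon thread so a hung render never blocks interpreter shutdown
    threading.Thread(target=worker, daemon=True).start()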
1 Answer
I would use Celery to handle the asynchronous task for you. You'll need to install a broker to serve as your task queue (RabbitMQ and Redis are recommended).
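If you don't already have a broker running, one way to get set up (assuming a Debian-based system; a Docker container works just as well):

$ pip install celery
$ sudo apt-get install rabbitmq-server
# or run the broker in a container instead:
$ docker run -d -p 5672:5672 rabbitmq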
app.py:
import json

from flask import Flask, request, Response
from celery import Celery

broker_url = 'amqp://guest@localhost'  # Broker URL for RabbitMQ task queue

app = Flask(__name__)
celery = Celery(app.name, broker=broker_url)
celery.config_from_object('celeryconfig')  # Your celery configurations in a celeryconfig.py

@celery.task(bind=True)
def some_long_task(self, x, y):
    # Do some long task
    ...

@app.route('/render/<id>', methods=['POST'])
def render_script(id=None):
    ...
    data = json.loads(request.data)
    text_list = data.get('text_list')
    final_file = audio_class.render_audio(data=text_list)
    some_long_task.delay(x, y)  # Call your async task and pass whatever necessary variables
    return Response(
        mimetype='application/json',
        status=200
    )
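Since the app above points config_from_object at celeryconfig, a minimal celeryconfig.py might contain something like the following (a sketch using Celery 4+ lowercase setting names; the Redis result backend is an assumption, only needed if you want to query task state or results later):

# celeryconfig.py -- example settings only; adjust broker/backend to your environment
broker_url = 'amqp://guest@localhost'
result_backend = 'redis://localhost:6379/0'  # assumed backend so task state/results can be queried
task_serializer = 'json'
result_serializer = 'json'
accept_content = ['json']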
Run your Flask app, and start another process to run your celery worker.
$ celery worker -A app.celery --loglevel=debug
(With Celery 5 and later, the app option comes before the sub-command: celery -A app.celery worker --loglevel=debug.)
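Note that some_long_task.delay(x, y) returns an AsyncResult right away, so the view never blocks on the task. If the client needs to know when the work is done, one common pattern is to return the task id from the POST view and poll a small status endpoint like the sketch below (the /status route and JSON shape are illustrative, not part of the original answer, and require a result backend to be configured):

# Sketch of a polling endpoint for task state; names are examples only
from celery.result import AsyncResult

@app.route('/status/<task_id>')
def task_status(task_id):
    result = AsyncResult(task_id, app=celery)
    return Response(
        json.dumps({'state': result.state}),
        mimetype='application/json',
        status=200
    )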
I would also refer you to Miguel Grinberg's write-up for a more in-depth guide to using Celery with Flask.