Django and Celery - reloading code into Celery after a change

Date: 2022-08-20 19:17:33

If I make a change to tasks.py while Celery is running, is there a mechanism by which it can reload the updated code? Or do I have to shut Celery down and reload?

I read that Celery had an --autoreload argument in older versions, but I can't find it in the current version:

celery: error: unrecognized arguments: --autoreload

3 solutions

#1
Unfortunately --autoreload doesn't work: it was deprecated and has been removed from current versions of Celery.

You can use Watchdog, which provides watchmedo, a shell utility that performs actions based on file events.

pip install watchdog

You can start the worker with:

watchmedo auto-restart -- celery worker -l info -A foo

By default it watches all files in the current directory. This can be changed by passing the corresponding parameters:

watchmedo auto-restart -d . -p '*.py' -- celery worker -l info -A foo
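
On Celery 5.x the CLI changed so that the -A app option comes before the sub-command; assuming the same app name foo as above, the equivalent invocation would be:

watchmedo auto-restart -d . -p '*.py' -- celery -A foo worker -l info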

If you are using Django and don't want to depend on watchdog, there is a simple trick to achieve this. Django has an autoreload utility which runserver uses to restart the WSGI server when the code changes.

The same functionality can be used to reload Celery workers. Create a separate management command called celery, write a function that kills the existing worker and starts a new one, and hook that function into autoreload as follows.

import shlex
import subprocess

from django.core.management.base import BaseCommand
from django.utils import autoreload


def restart_celery():
    # Kill any existing worker, then start a fresh one
    # so the new code is picked up.
    cmd = 'pkill celery'
    subprocess.call(shlex.split(cmd))
    cmd = 'celery worker -l info -A foo'
    subprocess.call(shlex.split(cmd))


class Command(BaseCommand):

    def handle(self, *args, **options):
        print('Starting celery worker with autoreload...')
        # Re-run restart_celery whenever a watched file changes.
        autoreload.main(restart_celery)
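
Note that autoreload.main was removed in Django 2.2; on newer versions the equivalent hook is autoreload.run_with_reloader. A minimal sketch of the updated handle method (assuming the same restart_celery function as above):

    def handle(self, *args, **options):
        print('Starting celery worker with autoreload...')
        # Django >= 2.2: run_with_reloader replaces autoreload.main
        autoreload.run_with_reloader(restart_celery)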

Now you can run the Celery worker with python manage.py celery, and it will autoreload when the codebase changes.

This is for development purposes only; do not use it in production.

#2
You could try sending SIGHUP to the parent worker process; it restarts the worker, but I'm not sure whether it picks up new tasks. Worth a shot, though :)
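
For example, a hypothetical invocation (assuming the worker's command line contains "celery worker" so the pattern matches):

pkill -HUP -f 'celery worker'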

#3
FYI, for anyone using Docker: I couldn't find an easy way to make the above options work, but I found (along with others) another little script here which does use watchdog and works perfectly.

Save it as a some_name.py file in your main directory, add psutil and watchdog to requirements.txt, update the path/cmdline variables at the top, and then, in the worker container of your docker-compose.yml, insert:

command: python ./some_name.py
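
A minimal docker-compose.yml sketch of where that line goes (the service name worker and the build context are assumptions; only the command itself comes from this answer):

  worker:
    build: .
    command: python ./some_name.py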
