Django + Celery executing tasks on multiple worker nodes

Date: 2021-01-22 19:18:53

I've deployed Django (1.10) + Celery (4.x) on a single VM, with RabbitMQ as the broker (on the same machine). I want to run the same application on a multi-node architecture, so that I can simply replicate a number of worker nodes and scale out to get tasks done faster. Here:

  1. How do I configure Celery with RabbitMQ for this architecture?

  2. On the other worker nodes, what should the setup be?

1 solution

#1


You should have the broker on one node and configure it so that workers from other nodes can access it.

For that, you can create a new user/vhost on rabbitmq.

# add new user
sudo rabbitmqctl add_user <user> <password>

# add new virtual host
sudo rabbitmqctl add_vhost <vhost_name>

# set permissions for user on vhost
sudo rabbitmqctl set_permissions -p <vhost_name> <user> ".*" ".*" ".*"

# restart RabbitMQ (rabbitmqctl itself has no restart subcommand)
sudo systemctl restart rabbitmq-server
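
On the Django side, the project just needs to point at this user/vhost on the broker node. Below is a minimal sketch assuming the standard Celery-with-Django layout (a proj/celery.py module loaded from proj/__init__.py); the project name proj and the <...> placeholders are hypothetical and should be replaced with your own values:

# proj/settings.py -- assumed names; adjust to your project
# every Django process and every worker node points at the broker node
CELERY_BROKER_URL = 'amqp://<user>:<password>@<broker-ip>/<vhost_name>'
CELERY_RESULT_BACKEND = 'amqp'

# proj/celery.py -- standard Celery-with-Django wiring
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')
# read all CELERY_* settings from Django's settings.py
app.config_from_object('django.conf:settings', namespace='CELERY')
# discover tasks.py modules in the installed Django apps
app.autodiscover_tasks()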

From other nodes, you can queue up tasks or you can just run workers to consume tasks.

from celery import Celery

# point the app at the broker (and result backend) running on the broker node
app = Celery('tasks', backend='amqp',
             broker='amqp://<user>:<password>@<ip>/<vhost>')

@app.task
def add(x, y):
    return x + y

If you have a file (say task.py) like this, you can queue up tasks using add.delay().
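
For example, a producer script on any node that can reach the broker only needs the task module and the broker URL (a minimal sketch; the arguments 4, 4 and the timeout are arbitrary, and task.py is assumed to be importable on that node):

from task import add

# .delay() publishes the task to RabbitMQ; any worker connected to the same vhost can pick it up
result = add.delay(4, 4)

# with the amqp result backend configured, you can block and fetch the result
print(result.get(timeout=10))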

You can also start a worker (on each worker node that should consume tasks) with:

celery worker -A task -l info

You can see my answer here to get a brief idea of how to run tasks on remote machines. For a step-by-step process, you can check out a post I have written on scaling Celery.
