Execute two tasks at the same time with Celery

Date: 2021-11-14 18:09:12

I'm testing celery in a local environment. My Python file has the following two lines of code:

celery_app.send_task('tasks.test1', args=[self.id], kwargs={})
celery_app.send_task('tasks.test2', args=[self.id], kwargs={})

Looking at the console output, the tasks seem to execute one after another in sequence: test2 only runs after test1 has finished. At least that is how it appears from the console output.

These tasks have no dependencies on each other, so I don't want one task waiting for another to complete before moving on to the next line.

How can I execute both tasks at the same time?

---- **** -----
--- * ***  * -- Darwin-14.0.0-x86_64-i386-64bit
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         tasks:0x104cd8c10
- ** ---------- .> transport:   sqs://123
- ** ---------- .> results:     disabled
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ----
--- ***** ----- [queues]
 -------------- .> celery           exchange=celery(direct) key=celery
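
For reference, a minimal tasks.py that the two send_task calls could resolve against might look like the sketch below. The task bodies, argument name, and sleep durations are illustrative assumptions; only the task names tasks.test1 and tasks.test2 come from the question.

# tasks.py -- hypothetical module backing the task names used above
import time

from celery import Celery

# SQS broker inferred from the "transport: sqs://" line in the banner
celery_app = Celery('tasks', broker='sqs://')

@celery_app.task(name='tasks.test1')
def test1(item_id):
    # Stand-in body; the real work is unknown
    time.sleep(5)
    return item_id

@celery_app.task(name='tasks.test2')
def test2(item_id):
    # Stand-in body; the real work is unknown
    time.sleep(5)
    return item_id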

2 Answers

#1 (12 votes)

There are multiple ways to achieve this.

1. Single Worker - Single Queue.

$ celery -A my_app worker -l info  -c 2 -n my_worker

This will start one worker that executes up to 2 tasks at the same time (-c 2 sets the pool concurrency to two processes).

2. Multiple Workers - Single Queue.

$ celery -A my_app worker -l info  -c 1 -n my_worker1
$ celery -A my_app worker -l info  -c 1 -n my_worker2

This will start two workers, each executing one task at a time. Note that both workers consume from the same queue.

3. Multiple Workers - Multiple Queues.

$ celery -A my_app worker -l info  -c 1 -n my_worker1 -Q queue1
$ celery -A my_app worker -l info  -c 1 -n my_worker2 -Q queue2

This will start two workers, each executing one task at a time. But here you have to route the tasks to the appropriate queues yourself:

celery_app.send_task('tasks.test1', args=[self.id], kwargs={}, queue='queue1')
celery_app.send_task('tasks.test2', args=[self.id], kwargs={}, queue='queue2')
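
Equivalently, the routing can be declared once in configuration rather than on every call. Below is a minimal sketch using Celery's task_routes setting, with the queue names taken from the worker commands above:

# Route tasks by name so callers no longer need to pass queue= explicitly
celery_app.conf.task_routes = {
    'tasks.test1': {'queue': 'queue1'},
    'tasks.test2': {'queue': 'queue2'},
}

# The original calls then work unchanged; each task lands on its own queue
celery_app.send_task('tasks.test1', args=[self.id], kwargs={})
celery_app.send_task('tasks.test2', args=[self.id], kwargs={})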

#2 (1 vote)

Start the worker with the --autoscale option, which scales the number of pool processes up and down as required.

--autoscale AUTOSCALE
                       Enable autoscaling by providing max_concurrency,
                       min_concurrency. Example:: --autoscale=10,3 (always
                       keep 3 processes, but grow to 10 if necessary)

Example:

celery -A sandbox worker --autoscale=10,0 --loglevel=info 
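
To confirm that the two tasks actually overlap once the pool has grown, you can list what a worker is currently executing with the standard inspect subcommand:

celery -A sandbox inspect active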
