Python Celery - How to call a Celery task from within another task

Time: 2022-05-01 19:18:08

I'm calling a task from within another task in Django-Celery.

Here are my tasks.

import json
import requests
from celery import shared_task

@shared_task
def post_notification(data, url):
    url = "http://posttestserver.com/data/?dir=praful"  # remove this line in production
    headers = {'content-type': 'application/json'}
    requests.post(url, data=json.dumps(data), headers=headers)


@shared_task
def shipment_server(data, notification_type):
    # Notification and ServerNotificationMapping are Django models defined elsewhere in the project
    notification_obj = Notification.objects.get(name=notification_type)
    server_list = ServerNotificationMapping.objects.filter(notification_name=notification_obj)

    for server in server_list:
        task = post_notification.delay(data, server.server_id.url)
        print task.status  # it prints: 'NoneType' has no attribute 'id'

How can I call a task from within another task? I read somewhere that it can be done with group, but I can't work out the correct syntax. How do I do it?

I tried this

for server in server_list:
    task = group(post_notification.s(data, server.server_id.url))().get()
    print task.status

Throws a warning saying

TxIsolationWarning: Polling results with transaction isolation level repeatable-read
within the same transaction may give outdated results. Be sure to commit the
transaction for each poll iteration.
  'Polling results with transaction isolation level '

I don't know what this means!

How do I solve my problem?

3 Answers

#1 (6 votes)

This should work:

import celery

celery.current_app.send_task('mymodel.tasks.mytask', args=[arg1, arg2, arg3])
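
Applied to the question's loop, that might look like the sketch below; the dotted task name 'myapp.tasks.post_notification' is an assumption and has to match the name the task is registered under:

# Sketch of answer #1 used inside the question's shipment_server task.
# 'myapp.tasks.post_notification' is an assumed module path, not taken from the question.
from celery import current_app, shared_task

@shared_task
def shipment_server(data, notification_type):
    notification_obj = Notification.objects.get(name=notification_type)
    server_list = ServerNotificationMapping.objects.filter(notification_name=notification_obj)
    for server in server_list:
        # send_task dispatches by registered name, so this module never imports post_notification
        result = current_app.send_task('myapp.tasks.post_notification',
                                       args=[data, server.server_id.url])
        print(result.id)  # id of the AsyncResult for the queued sub-task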

#2 (2 votes)

You are right: each iteration of your for loop overwrites the task variable.

You can try celery.group, like this:

from celery import group

and

@shared_task
def shipment_server(data, notification_type):
    notification_obj = Notification.objects.get(name=notification_type)
    server_list = ServerNotificationMapping.objects.filter(notification_name=notification_obj)

    tasks = [post_notification.s(data, server.server_id.url) for server in server_list]
    results = group(tasks)()
    print results.get()  # or inspect the GroupResult however you want
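
One caveat, not part of the answer itself: the Celery documentation discourages calling .get() on a result from inside a task, since it blocks the worker, and recent Celery versions refuse to do it by default. If you only need to fire the sub-tasks, a non-blocking dispatch is enough; the following is a sketch of that variant:

# Non-blocking variant: dispatch the group and return immediately instead of
# waiting on the results inside the worker (a sketch, not the answerer's code).
from celery import group, shared_task

@shared_task
def shipment_server(data, notification_type):
    notification_obj = Notification.objects.get(name=notification_type)
    server_list = ServerNotificationMapping.objects.filter(notification_name=notification_obj)

    tasks = [post_notification.s(data, server.server_id.url) for server in server_list]
    group_result = group(tasks).apply_async()  # returns a GroupResult without blocking
    return group_result.id                     # the group id can be used to check status later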

#3 (0 votes)

You can call a task from another task using delay() or apply_async():

from app.tasks import celery_add_task

celery_add_task.apply_async(args=[task_name])

... it will work
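
For reference, delay() is just a shortcut for apply_async() with positional arguments, so the two calls below are equivalent (celery_add_task and task_name are the names used in this answer):

# delay() is shorthand for apply_async(args=...)
celery_add_task.delay(task_name)
celery_add_task.apply_async(args=[task_name])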
