django - Is it possible to delegate celery tasks and block until all is processed?


I have this task:

@app.task(name='somesmalltask')
def some_small_task(some_input):
    some_list = []
    # build some_list
    return some_list

Is something like this possible?

all_results = map(lambda x: some_small_task.delay(x), inputs)
# do other stuff, use all_results later

but with each call returning the actual result instead of the Celery task object?

Or, if I keep every task id, could I do this later:

result = some_small_task.AsyncResult(task_id)
result.get()
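
In other words, the end goal is roughly this sketch (assuming a result backend such as Redis or the Django database backend is configured so that .get() can actually fetch results; collect_results is just a placeholder name):

def collect_results(inputs):
    # Fire off one task per input; each .delay() call returns an AsyncResult.
    async_results = [some_small_task.delay(x) for x in inputs]

    # Every AsyncResult carries its task id, so the ids could also be stored
    # and the results rebuilt later via some_small_task.AsyncResult(task_id).
    task_ids = [r.id for r in async_results]

    # Block until every task has finished and collect the return values in order.
    return [r.get() for r in async_results]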

If you want to start multiple tasks as a group, you can do this:

>>> job = group([
...     add.subtask((2, 2)),
...     add.subtask((4, 4)),
...     add.subtask((8, 8)),
...     add.subtask((16, 16)),
...     add.subtask((32, 32)),
... ])
>>> result = job.apply_async()
>>> result.join()
[4, 8, 16, 32, 64]

This processes the tasks simultaneously and returns the results with join(), which blocks until the last task has finished.
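
Applied to the task from the question, the same pattern might look like the following sketch (reusing the question's some_small_task and inputs; .s() is the signature shortcut equivalent to subtask):

from celery import group

# Build one signature per input and run them all as a group.
job = group(some_small_task.s(x) for x in inputs)
result = job.apply_async()

# join() blocks until every task in the group has finished and returns
# the results in the same order as the inputs.
all_results = result.join()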

More information on groups and other workflows is available in the Celery canvas documentation.

