task-queue

In Celery Task Queue, is running tasks in a group any different from multiple asyncs in a loop?

Submitted by 笑着哭i on 2019-12-12 12:40:28
Question: Let's say I have a very simple task like this:

```python
@celery.task(ignore_result=True)
def print_page(page):
    with open('path/to/page', 'w') as f:
        f.write(page)
```

(Please ignore the potential race condition in the above code... this is a simplified example.) My question is whether the following two code samples would produce identical results, or if one is better than the other:

Choice A:

```python
@celery.task(ignore_result=True)
def print_pages(page_generator):
    for page in page_generator:
        print_page.s(page)
```
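Two notes on the excerpt above. First, as written, Choice A never actually dispatches anything: `print_page.s(page)` only builds a signature; it would need `.delay(page)` or `.apply_async()` to send a message. Second, for `ignore_result=True` tasks, `celery.group` ends up sending essentially the same per-task messages a loop would — the group mainly adds a shared `GroupResult` handle. The equivalence can be illustrated with a standard-library analogy (a minimal sketch, not Celery itself — `ThreadPoolExecutor` stands in for the broker, and `print_page` is a toy stand-in that returns instead of writing a file):

```python
from concurrent.futures import ThreadPoolExecutor

def print_page(page):
    return page.upper()  # stand-in for writing the page to disk

pages = ["alpha", "beta", "gamma"]

with ThreadPoolExecutor() as pool:
    # "loop" style: one submission per page
    loop_results = [f.result() for f in [pool.submit(print_page, p) for p in pages]]
    # "group" style: one batched submission of the same work
    group_results = list(pool.map(print_page, pages))

# both styles fan out the same unit of work and produce the same results
assert loop_results == group_results
```

In Celery terms: `group(print_page.s(p) for p in pages)()` versus a loop of `print_page.delay(p)` — same messages on the wire, but the group gives you one handle to wait on.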

How to design an app that does heavy tasks and shows the result in the frontend (e.g. Google Search Console)

Submitted by 十年热恋 on 2019-12-12 04:12:19
Question: Let's imagine this: I have to download an XML document from a URL; I then have to process this document and persist its information in the database, creating or updating a lot of entities. I think the best way is to use queues. Or maybe I can also use cron jobs. My problem is this: if I use the same app to do the heavy tasks and also to show the end user the results of those heavy tasks, the heavy tasks may slow down the main website. Take a more concrete example from real life:
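The usual answer to the concern above is to keep the web process and the heavy work in separate processes that share only a queue: the web request enqueues a small job descriptor and returns immediately, and a dedicated worker does the download and persistence. A minimal sketch of that split, with `queue.Queue` standing in for a real broker (Redis, RabbitMQ, Beanstalkd) and `import_xml` as a hypothetical placeholder for the download-and-persist step:

```python
import queue
import threading

job_queue = queue.Queue()
results = []

def import_xml(url):
    # placeholder: download the XML at `url` and upsert entities in the DB
    results.append(f"imported {url}")

def worker():
    # this loop would run in a separate worker process in production
    while True:
        url = job_queue.get()
        if url is None:            # sentinel: shut the worker down
            break
        import_xml(url)
        job_queue.task_done()

t = threading.Thread(target=worker)
t.start()
job_queue.put("https://example.com/feed.xml")   # all the web request does
job_queue.put(None)
t.join()
```

Because the worker runs on its own process (or machine), a slow import never blocks the frontend; the frontend just reads whatever results the worker has persisted so far.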

How do I integrate Google App Engine, the task queue, and Google Compute Engine?

Submitted by 我的未来我决定 on 2019-12-12 03:16:20
Question: I've been trying to understand how I can set up the following architecture on Google's cloud:

1. Google App Engine receives an HTTP request.
2. Google App Engine queues a pull task as a result of the HTTP request.
3. The task is received by an auto-scaling Google Compute Engine instance group.

Are there solutions someone can point me to for setting up an auto-scaling pull task queue handler? I estimate each of my tasks will take approximately a minute to process.

Answer 1: GCE has a new feature called autoscaler
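The pull-queue half of step 3 is a lease/process/delete loop on each worker: lease a batch of tasks, do the work, then delete each task so it is never re-leased. A minimal sketch of that control flow — a real worker would call the Task Queue REST API (`tasks.lease` / `tasks.delete`); the tiny in-memory class here is a stand-in so only the loop shape matters:

```python
class PullQueue:
    """Toy stand-in for a GAE pull queue: lease, process, delete."""
    def __init__(self, tasks):
        self._tasks = list(tasks)
        self.completed = []

    def lease(self, num_tasks):
        # hand out up to num_tasks tasks for exclusive processing
        leased, self._tasks = self._tasks[:num_tasks], self._tasks[num_tasks:]
        return leased

    def delete(self, task):
        # acknowledge a finished task so it is never re-leased
        self.completed.append(task)

def worker_loop(q, handle):
    # each autoscaled GCE instance would run a loop like this
    while True:
        batch = q.lease(10)
        if not batch:
            break
        for task in batch:
            handle(task)       # ~1 minute of real work per task
            q.delete(task)

q = PullQueue(range(25))
worker_loop(q, lambda task: None)
```

The autoscaler then scales the instance group on a signal such as CPU utilization or queue depth, and each new instance simply starts the same loop.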

Google App Engine: Modifying 1000 entities using TaskQueue

Submitted by 岁酱吖の on 2019-12-11 23:38:27
Question: I am hoping to modify 1000 entities using the task queue, as suggested by Zig Mandel in my original question here: Google App Engine: Modifying 1000 entities. I have a UserAccount kind like this:

```python
class UserAccount(ndb.Model):
    email = ndb.StringProperty()
```

Some of the UserAccount emails contain uppercase characters (example: JohnathanDough@email.com), and I would like to apply email.lower() to every entity's email. So I've set up a task queue like this:

```python
class LowerEmailQueue(BaseHandler):
    def get(self):
        all
```
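The usual shape for this kind of bulk update is to process one page per task rather than all 1000 entities in one request: fetch a page, fix it, write it back, and re-enqueue yourself with a cursor for the next page. A minimal sketch of that paging logic, with a plain list of dicts standing in for the ndb query results — on App Engine the same shape would use `fetch_page()` with a `Cursor` and `ndb.put_multi()` per page:

```python
BATCH_SIZE = 100

def lowercase_batch(accounts, start=0):
    """Lower-case one page of emails; return the next offset, or None if done."""
    for account in accounts[start:start + BATCH_SIZE]:
        account["email"] = account["email"].lower()
    next_start = start + BATCH_SIZE
    return next_start if next_start < len(accounts) else None

accounts = [{"email": f"User{i}@Email.com"} for i in range(250)]
cursor = 0
while cursor is not None:   # each iteration would be one queued task
    cursor = lowercase_batch(accounts, cursor)
```

Keeping each page under the batch size keeps every individual task well within the request deadline, which is the point of moving the work to the queue in the first place.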

Google App Engine Task Queue: DeadlineExceededError on file upload

Submitted by て烟熏妆下的殇ゞ on 2019-12-11 13:02:28
Question: I have a large file I'm uploading. The entire request can take more than the 30-second limit, so I moved it to a task queue. The problem is that I'm still getting this error, even in a task. I'm assuming this is because it's a single request to upload a file, and is not immune to the 30-second limit because of that. Is there any way to circumvent this limit, aside from using a 'backend' solution (App Engine just added this I think, but it's a paid feature and looks a bit complicated)? I can't

Undefined property: Illuminate\Queue\Jobs\BeanstalkdJob::$name

Submitted by 那年仲夏 on 2019-12-11 10:26:29
Question: I'm using beanstalkd with Laravel to queue some tasks, but I'm having trouble sending data to the function that handles the queue. Here is my code:

```php
// Where I call the function
$object_st = new stdClass();
$object_st->Person_id = 2; // If I do: echo($object_st->Person_id); , I get 2
Queue::push('My_Queue_Class@My_Queue_Function', $object_st);
```

And the function that handles the queue is the following:

```php
public function My_Queue_Function($Data)
{
    $Person_id = $Data->Person_id; // This generate
}
```

GAE - What is the fastest way to add tasks to a queue? Why does this appear to be so slow?

Submitted by ∥☆過路亽.° on 2019-12-11 10:23:17
Question: I am using Google App Engine (Python) to process some event messages in real time. In short, I have 100+ tasks that I need to run fast when a message comes in. I have tried a few approaches (deferred library, threads) and I think the best solution involves using the task queue and asynchronously adding these tasks to the queue I want. Here's an example of what I am doing:

```python
tasks = []
task = Task(url=url_for('main.endpoints_worker'), params={'id': id})
tasks.append(task.add_async(queue_name=
```
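One task per RPC is usually what makes this feel slow; the push-queue API also accepts a list of tasks per `add()` call (up to 100, if I recall the limit correctly — worth checking against the docs), so 100+ tasks can go out in a couple of RPCs, issued in parallel. A minimal sketch of the batching shape — `ThreadPoolExecutor` stands in for `add_async()`, and `enqueue_batch` is a hypothetical placeholder for `taskqueue.Queue('my-queue').add(batch)`:

```python
from concurrent.futures import ThreadPoolExecutor

def enqueue_batch(batch):
    # placeholder: in reality, one batched add() RPC enqueues the whole list
    return len(batch)

task_ids = list(range(120))
# chunk into batches of at most 100 tasks per add() call
batches = [task_ids[i:i + 100] for i in range(0, len(task_ids), 100)]

with ThreadPoolExecutor() as pool:
    # issue the (few) batched add RPCs concurrently
    enqueued = sum(pool.map(enqueue_batch, batches))
```

Two RPCs instead of 120 is the whole win; the async variant just overlaps what little latency remains.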

Handling failure after the maximum number of retries in Google App Engine task queues

Submitted by 瘦欲@ on 2019-12-11 09:05:58
Question: I am using google-api-python-client, and I am using Google App Engine task queues for some async operations. For the specific task queue, I am also setting the maximum number of times the task should be retried (in my case retries are unlikely to be successful, so I want to limit them). Is there a way to write a handler which can handle the case where the task is still failing even after the specified number of retries? Basically, if my retry limit is 5, after 5 unsuccessful retries I want to
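App Engine has no built-in "all retries exhausted" callback, but each task request carries an `X-AppEngine-TaskExecutionCount` header, so the handler itself can detect the last permitted attempt and run the failure logic before deliberately returning a 2xx (which stops further retries). A minimal sketch of that pattern — check the header's exact counting convention against the docs; the shape of the idea is what matters here:

```python
RETRY_LIMIT = 5  # should match task_retry_limit in queue.yaml

def handle_task(headers, do_work, on_final_failure):
    # number of prior executions of this task (0 on the first attempt)
    attempt = int(headers.get("X-AppEngine-TaskExecutionCount", "0"))
    try:
        do_work()
        return 200                       # success: the queue drops the task
    except Exception:
        if attempt >= RETRY_LIMIT - 1:   # the final allowed attempt just failed
            on_final_failure()           # e.g. alert, mark record as failed
            return 200                   # 2xx so the queue stops retrying
        return 500                       # non-2xx: ask the queue to retry
```

Returning 200 on the final failure is the key trick: from the queue's point of view the task "succeeded", so it is removed, but your own failure handler has already run.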

App Engine - Task Queue Retry Count with Mapper API

Submitted by 99封情书 on 2019-12-11 06:28:57
Question: Here is what I'm trying to do: I set up a MapReduce job with the new Mapper API. This basically works fine. The problem is that the Task Queue retries all tasks that have failed, but I don't want it to do that. Is there a way to delete a task from the queue, or to tell it that the task was completed successfully? Perhaps by passing a 200 status code? I know that I can fetch X-AppEngine-TaskRetryCount, but that doesn't really help since I don't know how to stop the task. I tried using a
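The 200-status intuition above is right: a push-queue task is only retried when the handler responds with a non-2xx status, so catching the exception and returning 200 marks the task as completed and no retry happens — effectively "deleting" it from the queue. A minimal sketch of that swallow-the-failure wrapper (the handler names are illustrative, not the Mapper API's own):

```python
import logging

def run_once(work):
    """Run `work` at most once: log any failure instead of letting it retry."""
    try:
        work()
    except Exception:
        # record the failure for later inspection, then give up
        logging.exception("task failed; reporting success to stop retries")
    return 200   # always 2xx from the queue's point of view
```

The trade-off is that the queue can no longer tell success from permanent failure, so the log (or a datastore flag) becomes the only record that the task actually failed.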

How can I tell if a set of app engine tasks have all completed?

Submitted by 时光总嘲笑我的痴心妄想 on 2019-12-11 05:12:39
Question: If I have a loop that enqueues, say, 100 tasks, and each one of those tasks potentially enqueues a task of its own, how can I tell if all tasks have completed? I've thought about this problem using sharded counters. Once each task is completed, I could increment a counter and then check to see if count == 100. Of course, that falls apart with tasks spawning their own tasks, unless I get into this recursive counting scenario. I'm not sure it's a good idea to go down that rabbit hole because it appears the sharded
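One way out of the recursive-counting problem is to keep two counters, "scheduled" and "done", instead of comparing against a fixed 100: every enqueue — including tasks enqueued by other tasks — bumps "scheduled", every completion bumps "done", and the whole batch is finished exactly when the two are equal, however deep the fan-out goes. A minimal sketch of the bookkeeping (in-memory counters and a list stand in for the sharded counters and the real queue; `random` simulates tasks that spawn children):

```python
import random

random.seed(0)          # deterministic for the sake of the example
scheduled = done = 0
pending = []

def enqueue(task):
    # every enqueue, top-level or child, bumps "scheduled" first
    global scheduled
    scheduled += 1
    pending.append(task)

for i in range(100):    # the original loop of 100 tasks
    enqueue(i)

while pending:          # simulate the queue draining
    task = pending.pop()
    if random.random() < 0.2:        # some tasks spawn a child task
        enqueue(("child", task))
    done += 1                        # the task reports its own completion

batch_finished = (scheduled == done and not pending)
```

On App Engine both counters would be sharded to avoid write contention, and the ordering matters: a child must be counted as scheduled before its parent reports completion, or the batch can look "finished" too early.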