Running a cron job in Elastic Beanstalk

[愿得一人] 2021-01-18 06:55

So I have a functionality in a Django Elastic Beanstalk app that works like so:

  • Download a file
  • Parse the file, run some calls to APIs with the data
1 Answer
  • 2021-01-18 07:14

    Only one instance will run the command, because the cron job does not actually run in a cron daemon per se.

    There are a few concepts that might help you quickly grok Amazon's Elastic Beanstalk mindset.

    • An Elastic Beanstalk environment elects a leader instance, of which there must only ever be one (and it must be a healthy instance).
    • A worker environment allocates work via an SQS (Simple Queue Service) queue.
    • Once a message has been read from the queue it is considered 'in-flight' until the worker returns a 200 or the request times out/fails. In the first case the message is deleted; in the latter it re-enters the queue. (Redrive policies determine how many times a message can fail before it is sent to a Dead Letter Queue; see the sketch after this list.)
    • In-flight messages cannot be read again (unless they are returned to the queue).
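
    For reference, a redrive policy is just an attribute on the queue. Below is a minimal sketch, assuming boto3 and an already-created dead letter queue; the queue URL and DLQ ARN are placeholders. In a worker environment the daemon normally creates and manages the queue for you, so in practice you would usually configure retries through the environment settings rather than calling SQS directly:

    import json
    import boto3

    sqs = boto3.client("sqs")

    # Placeholder identifiers -- substitute your worker queue URL and DLQ ARN.
    WORKER_QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/my-worker-queue"
    DEAD_LETTER_QUEUE_ARN = "arn:aws:sqs:us-east-1:123456789012:my-worker-dlq"

    # After 5 failed receives, SQS moves the message to the dead letter queue
    # instead of putting it back on the worker queue.
    sqs.set_queue_attributes(
        QueueUrl=WORKER_QUEUE_URL,
        Attributes={
            "RedrivePolicy": json.dumps({
                "deadLetterTargetArn": DEAD_LETTER_QUEUE_ARN,
                "maxReceiveCount": "5",
            })
        },
    )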

    A message in the queue is picked up by only one of the instances in the worker environment at a time.

    Now, cron.yaml actually just tells the leader to create a message with special attributes on the queue at the times specified in the schedule. When the worker daemon then finds this message, it is dispatched to one instance only, as a POST request to the specified URL.

    When I use Django in a worker environment I create a cron app with views that map to the actions I want. For example, if I wanted to periodically poll a Facebook endpoint, I might have a path /cron/facebook/poll/ that calls a poll_facebook() function in views.py.
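
    To make that wiring concrete, here is a minimal sketch of what the cron app might look like. The body of poll_facebook() is left as a placeholder and the URL name is an assumption; the important part is that the view answers the daemon's POST with a 200 so the message is deleted rather than retried:

    # cron/views.py
    from django.http import HttpResponse
    from django.views.decorators.csrf import csrf_exempt

    @csrf_exempt  # the worker daemon POSTs without a CSRF token
    def poll_facebook(request):
        # ... call the Facebook API and process the results here ...
        # Returning 200 tells the daemon the job succeeded, so the SQS
        # message is deleted; any other status (or a timeout) puts the
        # message back on the queue to be retried.
        return HttpResponse(status=200)

    # cron/urls.py
    from django.urls import path
    from . import views

    urlpatterns = [
        path("cron/facebook/poll/", views.poll_facebook, name="poll_facebook"),
    ]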

    That way, if I have a cron.yaml as follows, it'll poll Facebook once every hour:

    version: 1
    cron:
     - name: "pollfacebook"
       url: "/cron/facebook/poll/"
       schedule: "0 * * * *"
    