A Celery-like Python Task Queue in 55 Lines of Code

  • For an alternative, check out RQ: http://python-rq.org/

    We use it in production and it's been rock-solid. The documentation is sparse but the source is easy to follow.
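
    For anyone who hasn't used it, the core API is tiny. Roughly this, assuming a hypothetical mytasks module that the worker can also import:

      from redis import Redis
      from rq import Queue

      from mytasks import slow_add  # hypothetical; must live in an importable module

      q = Queue(connection=Redis())
      job = q.enqueue(slow_add, 2, 3)  # returns immediately with a Job handle

    Then start a worker in another shell (the rq worker command in recent versions) and it picks the job up.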

  • I'd recommend looking at alternative serialization formats. Pickle is a security risk that programmers writing distributed systems in Python should be educated about: loading a pickle can execute arbitrary code, so workers should never unpickle data from untrusted sources.
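
    To make that concrete, here's a minimal sketch: pickle's __reduce__ hook means loading a pickle can execute arbitrary code, so a worker that unpickles attacker-controlled bytes runs attacker-controlled commands (the echo payload is just for illustration):

      import os
      import pickle

      class Exploit(object):
          def __reduce__(self):
              # tells pickle: "to rebuild this object, call os.system('echo pwned')"
              return (os.system, ('echo pwned',))

      payload = pickle.dumps(Exploit())
      pickle.loads(payload)  # runs the shell command on load

    A JSON payload like {"func": "send_email", "args": [...]} sidesteps this entirely, since decoding JSON only ever produces plain data.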

  •     Having a way to pickle code objects and their dependencies is a huge win, and I'm angry I hadn't heard of PiCloud earlier.

    That's a nice use of the cloud library without the PiCloud service. Unfortunately, the PiCloud service itself is shutting down on February 25th (or thereabouts).
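
    The code-pickling part lives on, though: the standalone cloudpickle package descends from PiCloud's cloud library. A rough sketch of what it buys you over plain pickle:

      import pickle
      import cloudpickle

      def make_adder(n):
          def add(x):
              return x + n
          return add

      # plain pickle.dumps(make_adder(5)) fails -- it can't serialize a
      # closure by value. cloudpickle ships the code plus the captured n.
      blob = cloudpickle.dumps(make_adder(5))

      add5 = pickle.loads(blob)  # stock pickle can load it on the other end
      print(add5(2))             # 7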

  • Although Celery can use it, why is Amazon SQS treated as a second-class citizen in Python background worker systems?

    I've yet to find a background worker pool that plays nicely (properly) with SQS.
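
    The raw receive loop isn't much code with boto3; the hard part is everything around it (visibility timeouts, retries, ordering). A minimal sketch, with a hypothetical queue URL and handler:

      import boto3

      sqs = boto3.client('sqs')
      queue_url = 'https://sqs.us-east-1.amazonaws.com/123456789012/tasks'  # hypothetical

      def handle(body):
          print('processing', body)  # stand-in for real task dispatch

      while True:
          resp = sqs.receive_message(QueueUrl=queue_url,
                                     MaxNumberOfMessages=1,
                                     WaitTimeSeconds=20)  # long polling
          for msg in resp.get('Messages', []):
              handle(msg['Body'])
              # delete only after success; otherwise the message reappears
              # after the visibility timeout and gets retried
              sqs.delete_message(QueueUrl=queue_url,
                                 ReceiptHandle=msg['ReceiptHandle'])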

  • Thanks Jeff. As someone else mentioned, I love these little projects that demonstrate the basics of what the big projects actually do. Makes it much easier to understand the big picture.

  • http://docs.python.org/2/library/multiprocessing.html#sharin...

    Why doesn't anyone build a Celery/Redis alternative on top of this?
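
    A minimal sketch of what I mean, using only multiprocessing.Queue:

      import multiprocessing as mp

      def worker(queue):
          # pull (func, args) pairs until the None sentinel arrives
          for func, args in iter(queue.get, None):
              func(*args)

      def add(a, b):
          print(a + b)

      if __name__ == '__main__':
          q = mp.Queue()
          workers = [mp.Process(target=worker, args=(q,)) for _ in range(2)]
          for w in workers:
              w.start()
          q.put((add, (1, 2)))
          q.put((add, (3, 4)))
          for w in workers:
              q.put(None)  # one sentinel per worker
          for w in workers:
              w.join()

    The obvious catch: it's confined to one machine and one process tree, and anything in flight dies with the parent -- which is most of what Redis buys you.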

  • I scratched an itch in this space and built a web hook task queue in Python. I wrote it up here: http://ntorque.com -- would love to know if the rationale makes sense...

  • Are there any non-distributed task queues for Python? I need something for a tiny web application: just a persisted queue for background tasks, so work can resume if the application crashes or restarts. Installing Redis or even ZeroMQ seems excessive given that the application runs on a Raspberry Pi and serves at most 5 users at a time.
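
    The simplest thing I can think of is backing the queue with a SQLite table and polling it from a worker process. A rough sketch, stdlib only (schema and polling interval are arbitrary):

      import json
      import sqlite3
      import time

      DB = 'tasks.db'

      def connect():
          # each thread/process should open its own connection
          db = sqlite3.connect(DB)
          db.execute('CREATE TABLE IF NOT EXISTS tasks '
                     '(id INTEGER PRIMARY KEY, payload TEXT, done INTEGER DEFAULT 0)')
          return db

      def enqueue(db, func_name, *args):
          with db:
              db.execute('INSERT INTO tasks (payload) VALUES (?)',
                         (json.dumps({'func': func_name, 'args': args}),))

      def run_forever(registry):
          db = connect()
          while True:
              row = db.execute('SELECT id, payload FROM tasks '
                               'WHERE done = 0 ORDER BY id LIMIT 1').fetchone()
              if row is None:
                  time.sleep(1.0)  # idle poll; cheap at this scale
                  continue
              task = json.loads(row[1])
              registry[task['func']](*task['args'])
              # mark done only after the task ran, so a crash mid-task
              # means the row is retried on restart
              with db:
                  db.execute('UPDATE tasks SET done = 1 WHERE id = ?', (row[0],))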

  • I like these one-off projects that Jeff is doing, but it would be particularly instructive to see one, or a combination, make it to 'real' status.