Python asynchronous callbacks and generators

Asked by 生来不讨喜, 2021-02-04 12:13

I'm trying to convert a synchronous library to use an internal asynchronous IO framework. I have several methods that look like this:

def foo(somearg, some_other_arg):
    ...
    x = sync_call1(somearg, some_other_arg)
    ...
3 Answers
  • 2021-02-04 12:57

    There are several ways to multiplex tasks. We can't say which is best for your case without deeper knowledge of what you are doing. Probably the easiest and most universal way is to use threads. Take a look at this question for some ideas.

  • 2021-02-04 13:00

    UPDATE: take this with a grain of salt, as I'm out of touch with modern Python async developments, including gevent and asyncio, and don't have serious experience with async code.


    There are 3 common approaches to thread-less async coding in Python:

    1. Callbacks - ugly but workable, Twisted does this well.

    2. Generators - nice but require all your code to follow the style.

    3. Use Python implementation with real tasklets - Stackless (RIP) and greenlet.
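    As a hypothetical illustration of the difference between the first two styles, here is the callback approach in miniature. Everything in this sketch is invented; `fetch` just invokes its callback immediately, which is enough to show how each step ends up nested inside the previous one:

    ```python
    # Hypothetical sketch of callback-style chaining.
    # `fetch` stands in for any async primitive; here it simply
    # calls the callback right away.

    def fetch(key, callback):
        # A real framework would schedule I/O and call back later.
        callback("value-for-" + key)

    results = []

    def lookup_twice(key):
        # Each async step becomes a nested function: the "pyramid
        # of doom" that makes callback style ugly but workable.
        def on_first(v1):
            def on_second(v2):
                results.append((v1, v2))
            fetch(v1, on_second)
        fetch(key, on_first)

    lookup_twice("a")
    # results == [("value-for-a", "value-for-value-for-a")]
    ```

    Generator style (answer below) flattens this nesting back into straight-line code at the cost of requiring every caller to follow the same convention.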

    Unfortunately, ideally the whole program should use one style, or things become complicated. If you are OK with your library exposing a fully synchronous interface, you are probably fine; but if you want several calls to your library to work in parallel, especially in parallel with other async code, then you need a common event "reactor" that can work with all the code.

    So if you have (or expect the user to have) other async code in the application, adopting the same model is probably smart.

    If you don't want to untangle the whole mess, consider using plain old threads. They are also ugly, but they work with everything else.
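    The thread route can stay close to the existing synchronous code. A minimal sketch with the standard library (`blocking_call` is a placeholder for a real blocking I/O function):

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def blocking_call(x):
        # Placeholder for a real blocking I/O call.
        return x * 2

    # Run several blocking calls in parallel without rewriting them.
    with ThreadPoolExecutor(max_workers=4) as pool:
        futures = [pool.submit(blocking_call, n) for n in range(4)]
        results = [f.result() for f in futures]

    # results == [0, 2, 4, 6]
    ```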

    If you do want to understand how coroutines might help you, and how they might complicate things, David Beazley's "A Curious Course on Coroutines and Concurrency" is good stuff.

    Greenlets might actually be the cleanest way if you can use the extension. I don't have any experience with them, so I can't say much.

  • 2021-02-04 13:09

    You need to make the function foo async as well. How about this approach?

    @make_async
    def foo(somearg, callback):
        # This function is now async. Expect a callback argument.
        ...
    
        # change 
        #       x = sync_call1(somearg, some_other_arg)
        # to the following:
        x = yield async_call1, somearg, some_other_arg
        ...
    
        # same transformation again
        y = yield async_call2, x
        ...
    
        # change
        #     return bar
        # to a callback call
        callback(bar)
    

    And make_async can be defined like this:

    def make_async(f):
        """Decorator to convert sync function to async
        using the above mentioned transformations"""
        def g(*a, **kw):
            async_call(f(*a, **kw))
        return g
    
    def async_call(it, value=None):
        # This function is the core of async transformation.
    
        try: 
            # send the current value to the iterator and
            # expect function to call and args to pass to it
            x = it.send(value)
        except StopIteration:
            return
    
        func = x[0]
        args = list(x[1:])
    
        # define callback and append it to args
        # (assuming that callback is always the last argument)
    
        callback = lambda new_value: async_call(it, new_value)
        args.append(callback)
    
        func(*args)
    

    CAUTION: I haven't tested this
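    To see the trampoline in action, here is a self-contained run with fake async primitives that invoke their callbacks immediately. The `make_async`/`async_call` pair is taken from the answer above; `async_call1`, `async_call2`, and the sample `foo` are invented for the demo:

    ```python
    def make_async(f):
        """Decorator: wrap a generator function so that calling it
        starts driving it through async_call."""
        def g(*a, **kw):
            async_call(f(*a, **kw))
        return g

    def async_call(it, value=None):
        # Send the current value into the generator; it yields
        # (function, *args) tuples describing the next async step.
        try:
            x = it.send(value)
        except StopIteration:
            return
        func, args = x[0], list(x[1:])
        # The callback resumes the generator with the step's result
        # (assuming the callback is always the last argument).
        args.append(lambda new_value: async_call(it, new_value))
        func(*args)

    # Fake async primitives: call the callback straight away.
    def async_call1(a, b, callback):
        callback(a + b)

    def async_call2(x, callback):
        callback(x * 10)

    results = []

    @make_async
    def foo(somearg, callback):
        x = yield async_call1, somearg, 1   # x = somearg + 1
        y = yield async_call2, x            # y = x * 10
        callback(y)

    foo(2, results.append)
    # results == [30]
    ```

    With real async primitives the callbacks would fire later from the event loop, but the control flow through the generator is the same.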
