This is a follow-up to Handle an exception thrown in a generator and discusses a more general problem.
I have a function that reads data in different formats. All fo
Thinking more deeply about what would happen in a more complex case kind of vindicates Python's choice of not letting a generator be resumed after an exception has bubbled out of it.
If I got an I/O error from a stream object, the odds of simply being able to recover and continue reading, without the structures local to the generator being reset in some way, would be low. I would somehow have to resynchronize with the reading process in order to continue: skip garbage, push back partial data, reset some incomplete internal tracking structure, etc.
Only the generator has enough context to do that properly. Even if you could keep the generator context, having the outer block handle the exceptions would totally flout the Law of Demeter. All the important information that the surrounding block needs to reset and move on is in local variables of the generator function! And getting or passing that information, though possible, is disgusting.
The resulting exception would almost always be thrown after cleaning up, in which case the reader-generator will already have an internal exception block. Trying very hard to maintain this cleanliness in the brain-dead-simple case, only to have it break down in almost every realistic context, would be silly. So just put the try in the generator; you are going to need the body of the except block anyway, in any complex case.
It would be nice if exceptional conditions could look like exceptions, though, and not like return values. So I would add an intermediate adapter to allow for this: the generator would yield either data or exceptions, and the adapter would re-raise the exception if applicable. The adapter should be called first thing inside the for loop, so that we have the option of catching the exception within the loop and cleaning up to continue, or breaking out of the loop to catch it and abandon the process. And we should put some kind of lame wrapper around the setup to indicate that tricks are afoot, and to force the adapter to get called if the function is adapting.
That way each layer is presented with the errors that it has the context to handle, at the expense of the adapter being a tiny bit intrusive (and perhaps also easy to forget).
So we would have:
def read(stream, parsefunc):
    try:
        for source in frozen(parsefunc(stream)):
            try:
                record = source.thaw()
                do_stuff(record)
            except Exception as e:
                # per-record errors: log, then either recover or escalate
                log_error(e)
                if not is_recoverable(e):
                    raise
                recover()
    except Exception as e:
        properly_give_up()
    wrap_up()
(Where the two try blocks are optional.)
The adapter looks like:
class Frozen(object):
    def __init__(self, item):
        self.value = item
    def thaw(self):
        # re-raise stored exceptions; hand back ordinary data untouched
        if isinstance(self.value, Exception):
            raise self.value
        return self.value

def frozen(generator):
    for item in generator:
        yield Frozen(item)
And parsefunc looks like:
def parsefunc(stream):
    while not eof(stream):
        try:
            rec = read_record(stream)
            do_some_stuff()
            yield rec
        except Exception as e:
            properly_skip_record_or_prepare_retry()
            yield e  # report the error as a value instead of raising it
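As a quick sanity check of the mechanics, here is a minimal toy run; the trivial generator below is mine and stands in for parsefunc, and it relies only on Frozen and frozen as defined above:

def toy_parsefunc():
    # stands in for parsefunc: yields ordinary data plus one error-as-a-value
    yield 1
    yield ValueError("bad record")
    yield 2

for source in frozen(toy_parsefunc()):
    try:
        print(source.thaw())           # prints 1, then 2
    except ValueError as e:
        print("recovered from:", e)    # the yielded exception resurfaces here

The iteration keeps going after the error because, from the generator's point of view, nothing was ever raised.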
To make it harder to forget the adapter, we could also change frozen from a function to a decorator on parsefunc:
import functools

def frozen_results(func):
    @functools.wraps(func)
    def freezer(*args, **kw):
        for item in func(*args, **kw):
            yield Frozen(item)
    return freezer
In which case we would declare:
@frozen_results
def parsefunc(stream):
    ...
And we would obviously not bother to declare frozen, or wrap it around the call to parsefunc.
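To make that concrete, here is a sketch of read with the decorated parsefunc (same placeholder helpers as before; the only difference from the earlier version is that the frozen() call is gone):

def read(stream, parsefunc):
    try:
        # parsefunc is decorated with @frozen_results, so it already yields Frozen items
        for source in parsefunc(stream):
            try:
                record = source.thaw()
                do_stuff(record)
            except Exception as e:
                log_error(e)
                if not is_recoverable(e):
                    raise
                recover()
    except Exception as e:
        properly_give_up()
    wrap_up()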
(I answered the other question linked in the OP, but my answer applies to this situation as well.)
I have needed to solve this problem a couple of times and came upon this question while searching for what other people have done.
One option, which will probably require refactoring things a little bit, would be to create a separate error-handling generator, and to throw the exception into it from inside the parsing generator rather than raise it.
Here is what the error handling generator function might look like:
def err_handler():
    # a generator for processing errors
    while True:
        try:
            # errors are thrown to this point in the function
            yield
        except Exception1:
            handle_exc1()
        except Exception2:
            handle_exc2()
        except Exception3:
            handle_exc3()
        except Exception:
            raise
An additional handler argument is provided to the parsefunc function so it has a place to put the errors:
def parsefunc(stream, handler):
    # the handler argument fixes errors/problems separately
    while not eof(stream):
        try:
            rec = read_record(stream)
            # do some stuff
            yield rec
        except Exception as e:
            handler.throw(e)
    handler.close()
Now just use (almost) the original read function, but now with an error handler:
def read(stream, parsefunc):
    handler = err_handler()
    next(handler)  # prime the handler so it is paused at its yield, ready to accept throws
    for record in parsefunc(stream, handler):
        do_stuff(record)
This isn't always going to be the best solution, but it's certainly an option, and relatively easy to understand.
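As a rough, self-contained illustration of the whole flow (the toy names, the in-memory record list, and the ValueError handling below are mine, not part of the answer's code):

def err_handler():
    # same shape as above, specialized to one concrete, recoverable error type
    while True:
        try:
            yield
        except ValueError as e:
            print("handled bad record:", e)
        except Exception:
            raise

def toy_parsefunc(records, handler):
    # a list of strings stands in for the stream; non-numeric entries are "bad records"
    for raw in records:
        try:
            yield int(raw)
        except Exception as e:
            handler.throw(e)
    handler.close()

def toy_read(records):
    handler = err_handler()
    next(handler)  # prime the handler so throws land inside its try block
    for record in toy_parsefunc(records, handler):
        print("got record:", record)

toy_read(["1", "oops", "2"])
# got record: 1
# handled bad record: invalid literal for int() with base 10: 'oops'
# got record: 2

If the handler re-raises (the final except Exception clause), the exception propagates back out of handler.throw() and then out of the parsing generator, which is what you want for unrecoverable errors.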