Streaming results with Blaze and SqlAlchemy
Question

I am trying to use Blaze/Odo to read a large (~70M rows) result set from Redshift. By default SQLAlchemy will try to read the whole result into memory before starting to process it. This can be prevented either with execution_options(stream_results=True) on the engine/session or with yield_per(sane_number) on the query. When working from Blaze, the SQLAlchemy queries are generated behind the covers, which leaves the execution_options approach. Unfortunately, the following throws an error:

from sqlalchemy
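For context, here is a minimal sketch of the two streaming approaches mentioned above, using plain SQLAlchemy rather than Blaze. The connection URL, table name, and per-row handling are placeholders, and the redshift+psycopg2 dialect assumes the sqlalchemy-redshift package is installed.

```python
import sqlalchemy as sa

# Placeholder connection URL -- substitute your own Redshift credentials.
REDSHIFT_URL = "redshift+psycopg2://user:password@host:5439/dbname"

# Approach 1: request streaming at the engine level, so queries issued
# through this engine use a server-side cursor and fetch rows in batches
# instead of loading the full result set into memory.
engine = sa.create_engine(REDSHIFT_URL, execution_options={"stream_results": True})

# Approach 2: request streaming per connection/statement.
with engine.connect() as conn:
    result = conn.execution_options(stream_results=True).execute(
        sa.text("SELECT * FROM big_table")  # placeholder table name
    )
    for row in result:
        pass  # process each row incrementally here

# With the ORM, query.yield_per(n) achieves a similar batched fetch.
```

Whether this streaming behaviour survives when the engine is handed to Blaze/Odo is exactly the question at hand, since Blaze constructs its own queries internally.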