I would like to make this process in batches, because of the volume.
Here's my code:
    getconn = conexiones()
    con = getconn.mysqlDWconnect()
    with con:
First point: a Python DB-API cursor is an iterator, so unless you really need to load a whole batch into memory at once, you can just start by using this feature, i.e. instead of:
    cursor.execute("SELECT * FROM mytable")
    rows = cursor.fetchall()
    for row in rows:
        do_something_with(row)
you could just:
    cursor.execute("SELECT * FROM mytable")
    for row in cursor:
        do_something_with(row)
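Applied to your snippet, that would look roughly like this (a minimal sketch, assuming `mysqlDWconnect()` returns a standard DB-API connection and that `do_something_with` stands in for your actual row processing):

    getconn = conexiones()
    con = getconn.mysqlDWconnect()
    with con:
        cursor = con.cursor()
        cursor.execute("SELECT * FROM mytable")
        # iterating the cursor avoids building an explicit `rows` list;
        # whether the driver streams rows from the server or buffers the
        # whole result client-side is a separate question (see below)
        for row in cursor:
            do_something_with(row)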
Then if your db connector's implementation still doesn't make proper use of this feature, it will be time to add LIMIT and OFFSET to the mix:
    # py2 / py3 compat
    try:
        # xrange is defined in py2 only
        xrange
    except NameError:
        # py3 range is actually py2 xrange
        xrange = range

    cursor.execute("SELECT count(*) FROM mytable")
    count = cursor.fetchone()[0]
    batch_size = 42  # whatever
    for offset in xrange(0, count, batch_size):
        cursor.execute(
            "SELECT * FROM mytable LIMIT %s OFFSET %s",
            (batch_size, offset))
        for row in cursor:
            do_something_with(row)
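Putting it together with your connection code, a self-contained sketch could look like the following. I'm assuming `mysqlDWconnect()` returns a DB-API connection and that the table has a unique `id` column to sort on; the `ORDER BY` is worth adding because without a deterministic order, successive LIMIT/OFFSET queries may return overlapping or missing rows.

    # py2 / py3 compat, same shim as above
    try:
        xrange
    except NameError:
        xrange = range

    def batched_rows(cursor, batch_size=1000):
        """Yield rows from mytable, fetched in LIMIT/OFFSET batches."""
        cursor.execute("SELECT count(*) FROM mytable")
        count = cursor.fetchone()[0]
        for offset in xrange(0, count, batch_size):
            cursor.execute(
                # ORDER BY a unique column keeps the paging stable;
                # `id` is an assumption about your schema
                "SELECT * FROM mytable ORDER BY id LIMIT %s OFFSET %s",
                (batch_size, offset))
            for row in cursor:
                yield row

    getconn = conexiones()
    con = getconn.mysqlDWconnect()
    with con:
        cursor = con.cursor()
        for row in batched_rows(cursor):
            do_something_with(row)

Note that OFFSET gets progressively more expensive on large tables, since the server still has to walk past all the skipped rows, so keep batch_size reasonably large.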