I'm trying to insert a bunch of data into a database:

import pyodbc

insert_list = [(1, 1, 1, 1, 1, 1), (2, 2, 2, 2, 2, 2), (3, 3, 3, 3, 3, 3), ...]  # up to 10000 tuples in this list
conn = pyodbc.connect(conn_str)  # conn_str: your ODBC connection string
Your problem is not with the volume of data per se; it is that some of your tuples contain numpy.int64 values that cannot be used directly as parameter values for your SQL statement. For example,
import numpy

a = numpy.array([10, 11, 12], dtype=numpy.int64)
params = (1, 1, a[1], 1, 1, 1)  # a[1] is a numpy.int64, not a plain Python int
crsr.execute(sql, params)
will throw
ProgrammingError: ('Invalid parameter type. param-index=2 param-type=numpy.int64', 'HY105')
because the third parameter value is a numpy.int64 element from your numpy array a. Converting that value with int() avoids the issue:
a = numpy.array([10, 11, 12], dtype=numpy.int64)
params = (1, 1, int(a[1]), 1, 1, 1)  # int() converts the numpy.int64 to a plain Python int
crsr.execute(sql, params)
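Applied to a whole list like yours, you can normalize every row before inserting. A minimal sketch, reusing the insert_list, sql, and crsr names from above (clean_list is just an illustrative name):

import numpy

clean_list = [
    tuple(int(v) if isinstance(v, numpy.integer) else v for v in row)  # numpy ints -> plain ints
    for row in insert_list
]
crsr.executemany(sql, clean_list)
conn.commit()

Note that isinstance(v, numpy.integer) catches all of numpy's integer scalar types, not just numpy.int64.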
By the way, the reason that
sql = 'SET GLOBAL max_allowed_packet=50*1024*1024'
cursor.execute(sql)
didn't work is that max_allowed_packet is a MySQL setting that has no meaning for Microsoft SQL Server.
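As an aside on inserting thousands of rows with pyodbc and SQL Server: the cursor's fast_executemany attribute can speed up executemany() considerably by sending the parameters in batches instead of one round trip per row. A hedged sketch, reusing the names above (it requires a recent Microsoft ODBC driver for SQL Server, and the int() conversion shown earlier still applies):

crsr = conn.cursor()
crsr.fast_executemany = True  # batch the parameters rather than executing row by row
crsr.executemany(sql, clean_list)
conn.commit()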
For anyone out there reading this, this error was driving me up the wall. My eventual solution was to convert all of the parameter values to str, and it worked fine.
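In code, that workaround might look like this; a minimal sketch (str_list is just an illustrative name), leaving it to SQL Server to convert the strings back to the column types:

str_list = [tuple(str(v) for v in row) for row in insert_list]  # every value passed as a string
crsr.executemany(sql, str_list)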