I have managed to work with bulk insert in SQLAlchemy like this:

conn.execute(addresses.insert(), [
    {'user_id': 1, 'email_address': 'jack@yahoo.com'},
    ...
])

Is there a similar way to do a bulk update?
Read the Inserts, Updates and Deletes section of the documentation. The following code should get you started:
from sqlalchemy.sql.expression import bindparam

stmt = addresses.update().\
    where(addresses.c.id == bindparam('_id')).\
    values({
        'user_id': bindparam('user_id'),
        'email_address': bindparam('email_address'),
    })

conn.execute(stmt, [
    {'user_id': 1, 'email_address': 'jack@yahoo.com', '_id': 1},
    {'user_id': 1, 'email_address': 'jack@msn.com', '_id': 2},
    {'user_id': 2, 'email_address': 'www@www.org', '_id': 3},
    {'user_id': 2, 'email_address': 'wendy@aol.com', '_id': 4},
])
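To make the answer above concrete, here is a self-contained sketch of the same executemany UPDATE against an in-memory SQLite database; the table layout is assumed to match the question, and the `_id` bind name avoids a clash with the real `id` column:

```python
from sqlalchemy import (Column, Integer, MetaData, String, Table,
                        bindparam, create_engine, select)

engine = create_engine("sqlite://")
metadata = MetaData()
addresses = Table(
    "addresses", metadata,
    Column("id", Integer, primary_key=True),
    Column("user_id", Integer),
    Column("email_address", String),
)
metadata.create_all(engine)

with engine.begin() as conn:
    # Seed two rows so there is something to update.
    conn.execute(addresses.insert(), [
        {"id": 1, "user_id": 1, "email_address": "old1@example.com"},
        {"id": 2, "user_id": 1, "email_address": "old2@example.com"},
    ])
    # '_id' is a bind parameter name, deliberately different from the
    # column name 'id' so it does not collide with the UPDATE target.
    stmt = (addresses.update()
            .where(addresses.c.id == bindparam("_id"))
            .values(email_address=bindparam("email_address")))
    # Passing a list of dicts makes SQLAlchemy emit an executemany UPDATE.
    conn.execute(stmt, [
        {"_id": 1, "email_address": "jack@yahoo.com"},
        {"_id": 2, "email_address": "jack@msn.com"},
    ])
    rows = conn.execute(
        select(addresses.c.email_address).order_by(addresses.c.id)).all()
    emails = [r[0] for r in rows]
```

`engine.begin()` wraps the inserts and the update in a single transaction, so the statements are committed together when the block exits.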
@Jongbin Park's solution DID work for me with a composite primary key (Azure SQL Server):
update_vals = []
update_vals.append(dict(Name='name_a', StartDate='2020-05-26 20:17:32', EndDate='2020-05-26 20:46:03', Comment='TEST COMMENT 1'))
update_vals.append(dict(Name='name_b', StartDate='2020-05-26 21:31:16', EndDate='2020-05-26 21:38:37', Comment='TEST COMMENT 2'))
s.bulk_update_mappings(MyTable, update_vals)
s.commit()
where Name, StartDate, and EndDate are all part of the composite primary key, and Comment is the value to update in the database.
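A runnable sketch of this composite-key case, using in-memory SQLite instead of Azure SQL Server; the model name and columns follow the snippet above, and the dates are kept as plain strings for simplicity:

```python
from sqlalchemy import Column, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class MyTable(Base):
    __tablename__ = "my_table"
    # All three columns together form the composite primary key.
    Name = Column(String, primary_key=True)
    StartDate = Column(String, primary_key=True)
    EndDate = Column(String, primary_key=True)
    Comment = Column(String)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as s:
    s.add(MyTable(Name="name_a", StartDate="2020-05-26 20:17:32",
                  EndDate="2020-05-26 20:46:03", Comment="old comment"))
    s.commit()

    # Every dict must carry the full composite key; only Comment changes.
    update_vals = [dict(Name="name_a", StartDate="2020-05-26 20:17:32",
                        EndDate="2020-05-26 20:46:03",
                        Comment="TEST COMMENT 1")]
    s.bulk_update_mappings(MyTable, update_vals)
    s.commit()

    # Session.get takes the composite key as a tuple in declaration order.
    row = s.get(MyTable, ("name_a", "2020-05-26 20:17:32",
                          "2020-05-26 20:46:03"))
    comment = row.Comment
```

The key columns are used only in the WHERE clause; the generated UPDATE sets just the non-key values.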
The Session has methods called bulk_insert_mappings and bulk_update_mappings (see the documentation). Be aware that you have to provide the primary key in the mappings:
# List of dictionaries, each including the primary key
user_mappings = [{
    'user_id': 1,  # is this the pk?
    'email_address': 'jack@yahoo.com',
    '_id': 1
}, ...]

session.bulk_update_mappings(User, user_mappings)
session.commit()
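Putting both methods together, here is a hedged, self-contained sketch against in-memory SQLite; the simple `User` model with an `id` primary key and an `email_address` column is an assumption, since the question does not show the mapped class:

```python
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class User(Base):
    __tablename__ = "users"
    id = Column(Integer, primary_key=True)
    email_address = Column(String)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    # bulk_insert_mappings: one INSERT executemany from plain dicts.
    session.bulk_insert_mappings(User, [
        {"id": 1, "email_address": "jack@yahoo.com"},
        {"id": 2, "email_address": "jack@msn.com"},
    ])
    # bulk_update_mappings: the primary key MUST appear in each dict;
    # it drives the WHERE clause and is not itself updated.
    session.bulk_update_mappings(User, [
        {"id": 1, "email_address": "wendy@aol.com"},
    ])
    session.commit()

    emails = [u.email_address
              for u in session.query(User).order_by(User.id)]
```

Both calls skip most of the unit-of-work machinery (no identity-map reconciliation, no cascades), which is why they are faster than adding ORM objects one by one.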