I need help getting this working. I have a pd.DataFrame (df) which I need to load into a MySQL database. I don't understand what the error message means or how to fix it.
I was able to resolve this issue: I was trying to load a large table into MySQL, which is what caused the error. A simple for-loop that uploads the data in chunks solved it! Many thanks to everyone who replied.
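Roughly, the idea is the following (a sketch, not my exact code; the connection URL and chunk size are placeholders):

import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("mysql+mysqldb://user:password@localhost/dbname")  # placeholder credentials

chunk_size = 10000  # rows per batch; tune to what your server accepts
for start in range(0, len(df), chunk_size):
    # append each slice so the table accumulates all rows instead of one huge INSERT
    df.iloc[start:start + chunk_size].to_sql(
        'demand_forecast_t', engine, if_exists='append', index=False)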
When using sqlalchemy, you should pass the engine and not the raw connection:
from sqlalchemy import create_engine

engine = create_engine("mysql+mysqldb://...")
df.to_sql('demand_forecast_t', engine, if_exists='replace', index=False)
Writing to MySQL without sqlalchemy (i.e., specifying flavor='mysql') is deprecated.
When the problem is that the frame is too large to write at once, you can use the chunksize keyword (see the docstring). E.g.:
df.to_sql('demand_forecast_t', engine, if_exists='replace', chunksize=10000)
You can write a pandas DataFrame to a MySQL table using the mysql flavor (with a DBAPI connection) in the following way:
Step 1: install the MySQLdb module:
$ sudo apt-get install python-dev libmysqlclient-dev
then
$ pip install MySQL-python
Step 2: make a connection to MySQL:
import MySQLdb
con = MySQLdb.connect("hostname", "username", "password", "databasename")  # host, user, password, database name
Step 3: write the pandas DataFrame to the MySQL table using df.to_sql:
df.to_sql('TableName', con=con, flavor='mysql', if_exists='replace', chunksize=100)
For me this was fixed by using

sql_cnxn = MySQLdb.connect("127.0.0.1", "root", "", "db")

instead of

sql_cnxn = MySQLdb.connect("localhost", "root", "", "db")

(with "localhost", MySQLdb tries to connect over the Unix socket, while "127.0.0.1" forces a TCP/IP connection), and then

df.to_sql('df', sql_cnxn, flavor='mysql', if_exists='replace', chunksize=100)