How can I import an "xxxx.sql" dump from MySQL to a PostgreSQL database?
Don't expect that to work without editing. Maybe a lot of editing.
mysqldump has a compatibility argument, --compatible=name, where "name" can be "oracle" or "postgresql", but that doesn't guarantee compatibility. I think server settings like ANSI_QUOTES have some effect, too.
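For example, on a MySQL version whose mysqldump still accepts the postgresql value (newer releases may only accept ansi, so check your documentation), the dump might be created like this; dbname and dump.sql are placeholders, and --skip-extended-insert is only a suggestion that writes one INSERT per row, which makes the output easier to post-process:

mysqldump --compatible=postgresql --default-character-set=utf8 --skip-extended-insert dbname > dump.sql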
You'll get more useful help here if you include the complete command you used to create the dump, along with any error messages you got, instead of just saying "Nothing worked for me."
You could potentially export to CSV from MySQL and then import CSV into PostgreSQL.
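A minimal sketch of that route for a single table, assuming a table called mytable already exists on the Postgres side with a matching column layout, and that the MySQL server is allowed to write to /tmp (user, dbname, and the path are placeholders):

# export one table; the MySQL server itself writes the file, so the path is on the MySQL host
mysql -u user -p dbname -e "SELECT * FROM mytable INTO OUTFILE '/tmp/mytable.csv' FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n'"
# load it into Postgres; \copy reads the file from the client machine
psql -U user -d dbname -c "\copy mytable FROM '/tmp/mytable.csv' WITH (FORMAT csv)"

You would repeat this per table, and INTO OUTFILE is restricted by MySQL's secure_file_priv setting, so the output directory may need adjusting.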
Use your xxx.sql file to set up a MySQL database and then make use of FromMySqlToPostgreSql. Very easy to use, short configuration, and it works like a charm. It imports your database with its primary keys, foreign keys, and indices on the tables. You can even import data alone if you set the appropriate flag in the config file.
The FromMySqlToPostgreSql migration tool by Anatoly Khaytovich provides an accurate migration of table data, indices, PKs, and FKs, and makes extensive use of the PostgreSQL COPY protocol.
See here too: PG Wiki Page
brew update && brew install pgloader
pgloader mysql://user@host/db_name postgresql://user@host/db_name
I had to do this recently to a lot of large .sql files, approximately 7 GB in size. Even Vim had trouble editing those. Your best bet is to import the .sql into MySQL and then export it as a CSV, which can then be imported into Postgres.
But the MySQL export to CSV is horrendously slow, since it runs a SELECT * FROM yourtable query. If you have a large database or table, I would suggest using some other method. One way is to write a script that reads the SQL INSERT statements line by line and uses string manipulation to reformat them into "Postgres-compliant" INSERT statements, and then executes those statements in Postgres; a minimal sketch follows.
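A very rough sketch of that idea using sed rather than a full script (file, user, and database names are placeholders; it only rewrites backtick-quoted identifiers to double quotes and MySQL's backslash-escaped single quotes to standard doubled single quotes, so expect to handle data types, AUTO_INCREMENT, and other MySQL-isms separately):

sed -e 's/`/"/g' -e "s/\\\\'/''/g" mysql_dump.sql > postgres_dump.sql
psql -U user -d dbname -f postgres_dump.sql

The target tables still have to exist in Postgres first; this only cleans up the INSERT statements themselves.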