Psycopg2, Postgresql, Python: Fastest way to bulk-insert

Asked by 轻奢々 on 2020-11-30 22:48

I'm looking for the most efficient way to bulk-insert several million tuples into a database. I'm using Python, PostgreSQL, and psycopg2.

I have created a long list of tuples…

8 answers
  • 2020-11-30 23:16

    Anyone using SQLAlchemy could try version 1.2, which added bulk-insert support: when you initialize your engine with use_batch_mode=True, it uses psycopg2.extras.execute_batch() instead of executemany(), like:

    engine = create_engine(
        "postgresql+psycopg2://scott:tiger@host/dbname",
        use_batch_mode=True)
    

    http://docs.sqlalchemy.org/en/latest/changelog/migration_12.html#change-4109

    Then anyone who has to use SQLAlchemy won't need to bother trying different combinations of SQLAlchemy, psycopg2, and direct SQL.
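For anyone not on SQLAlchemy, the same mechanism is available directly: psycopg2.extras.execute_batch() sends the INSERTs in pages of many statements per server round trip instead of one round trip per row. A minimal sketch, assuming a hypothetical table `items (id integer, name text)` and connection details of your own:

```python
def insert_rows(conn, rows, page_size=1000):
    """Bulk-insert (id, name) tuples using psycopg2's execute_batch."""
    # Imported lazily here so the helper below can be read/tested
    # without psycopg2 installed; normally this import goes at the top.
    from psycopg2.extras import execute_batch

    with conn.cursor() as cur:
        # execute_batch groups `page_size` INSERT statements into a
        # single round trip to the server.
        execute_batch(
            cur,
            "INSERT INTO items (id, name) VALUES (%s, %s)",  # hypothetical table
            rows,
            page_size=page_size,
        )
    conn.commit()


def pages(rows, page_size):
    """Illustrates the paging execute_batch performs internally."""
    return [rows[i:i + page_size] for i in range(0, len(rows), page_size)]


# Usage (requires psycopg2 and a running PostgreSQL server):
#   import psycopg2
#   conn = psycopg2.connect("dbname=test")   # hypothetical DSN
#   insert_rows(conn, [(1, "a"), (2, "b"), (3, "c")])
```

Tuning page_size trades memory for fewer round trips; the default of 100 is often worth raising for millions of rows.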

  • 2020-11-30 23:18

    Yeah, I would vote for COPY, provided you can write a file to the server's hard drive (not the drive the app is running on), since COPY will only read from the server's own disk.
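Worth noting: with psycopg2 you don't actually need file access on the server, because cursor.copy_expert() (and copy_from()) issue COPY ... FROM STDIN and stream the data from the client over the existing connection. A minimal sketch, again assuming a hypothetical table `items (id integer, name text)`:

```python
import io


def rows_to_tsv(rows):
    """Serialize tuples into a tab-separated in-memory buffer for COPY.

    Naive escaping: real data needs handling for NULLs, tabs, and
    newlines inside values (COPY's text format expects \\N for NULL).
    """
    buf = io.StringIO()
    for row in rows:
        buf.write("\t".join(str(col) for col in row) + "\n")
    buf.seek(0)
    return buf


def copy_rows(conn, rows):
    """Bulk-load rows via COPY FROM STDIN, no server-side file needed."""
    buf = rows_to_tsv(rows)
    with conn.cursor() as cur:
        cur.copy_expert("COPY items (id, name) FROM STDIN", buf)  # hypothetical table
    conn.commit()


# Usage (requires psycopg2 and a running PostgreSQL server):
#   import psycopg2
#   conn = psycopg2.connect("dbname=test")   # hypothetical DSN
#   copy_rows(conn, [(1, "a"), (2, "b")])
```

For millions of rows this is typically the fastest path from Python, since the server parses the COPY stream directly instead of planning individual INSERT statements.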
