Question
I have an application that parses CSV files and loads the data into a Postgres 9.3 database. In serial execution the insert statements/cursor executions work without issue.
I added Celery to the mix to parse and insert the data files in parallel. Parsing works fine. However, when I run the insert statements I get:
[2015-05-13 11:30:16,464: ERROR/Worker-1] ingest_task.work_it: Exception
Traceback (most recent call last):
File "ingest_tasks.py", line 86, in work_it
rowcount = ingest_data.load_data(con=con, statements=statements)
File "ingest_data.py", line 134, in load_data
ingest_curs.execute(statement)
DatabaseError: error with no message from the libpq
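For context, the failing pattern usually looks something like the sketch below: a connection opened once in the parent process and then reused inside the forked Celery workers. The module and task names echo the traceback; the rest is an assumed reconstruction for illustration, not the actual code.

import psycopg2
from celery import Celery

app = Celery('ingest', broker='redis://localhost:6379/0')

# Connection opened at import time, i.e. in the parent process before Celery forks.
con = psycopg2.connect('dbname=ingest user=ingest host=localhost')

@app.task
def work_it(statements):
    # Every forked worker now shares the parent's libpq socket, so concurrent
    # execute() calls corrupt the protocol stream and libpq fails without a message.
    cur = con.cursor()
    for statement in statements:
        cur.execute(statement)
    con.commit()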
Answer 1:
I ran into a similar problem when calling engine.execute() from multiple processes. I finally solved it by adding engine.dispose() as the very first line of the function that the subprocess enters, as suggested in the official documentation:
When a program uses multiprocessing or fork(), and an Engine object is copied to the child process, Engine.dispose() should be called so that the engine creates brand new database connections local to that fork. Database connections generally do not travel across process boundaries.
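In a Celery task that means disposing of the inherited engine as the first step inside the task, so the pool reconnects with connections that are local to that worker process. A minimal sketch, assuming a module-level SQLAlchemy engine (the URL and task name are placeholders):

from celery import Celery
from sqlalchemy import create_engine, text

app = Celery('ingest', broker='redis://localhost:6379/0')
engine = create_engine('postgresql+psycopg2://ingest@localhost/ingest')

@app.task
def load_data(statements):
    # Drop any pooled connections copied over from the parent process;
    # the engine will lazily open fresh ones owned by this worker.
    engine.dispose()
    with engine.begin() as con:
        for statement in statements:
            con.execute(text(statement))

An alternative with the same effect is to open a new psycopg2 connection inside each task and close it when the task finishes, so no connection ever crosses the fork.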
Source: https://stackoverflow.com/questions/30241911/psycopg2-error-databaseerror-error-with-no-message-from-the-libpq