I have a script that generates tens of thousands of inserts into a Postgres DB through a custom ORM. As you can imagine, it's quite slow. This is used for development purposes.
If you are just initializing constant test data, you could also put the test data into one or more staging tables, then copy the table contents using
INSERT INTO... SELECT...
That should be about as fast as using COPY (though I did not benchmark it), with the advantage that everything happens through plain SQL commands, without the hassle of setting up an external file as COPY requires.
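A minimal sketch of the idea (table and column names here are made up for illustration): load the test data into a staging table once, then repopulate the real table from it in a single set-based statement instead of thousands of individual inserts.

    -- One-time setup: staging table with the same shape as the target
    CREATE TABLE users_staging (LIKE users INCLUDING DEFAULTS);

    -- Populate the staging table once (still row-by-row, but only once)
    INSERT INTO users_staging (name, email) VALUES
        ('alice', 'alice@example.com'),
        ('bob',   'bob@example.com');

    -- Every time you need fresh test data: one fast set-based copy
    TRUNCATE users;
    INSERT INTO users (name, email)
    SELECT name, email FROM users_staging;

Wrapping the TRUNCATE and INSERT in a single transaction also avoids per-statement commit overhead, which is usually the main cost of issuing many small inserts through an ORM.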