Question
I'm looking for a good way to load fixed-width data into Postgres tables. I normally do this in SAS and Python, not Postgres, and I gather there is no native method. The files are a few GB. The one approach I have seen (linked below) doesn't work on my file for some reason, possibly memory issues: it loads the whole file as one large column and then parses it into tables. I could use psycopg2, but because of the memory issues I'd rather not. Any ideas or tools that work? Does pgloader handle fixed-width well, or are there native methods?
http://www.postgresonline.com/journal/index.php?/archives/157-Import-fixed-width-data-into-PostgreSQL-with-just-PSQL.html
Thanks
Answer 1:
There's no convenient built-in method to ingest fixed-width tabular data in PostgreSQL. I suggest using a tool like Pentaho Kettle or Talend Studio to do the data loading, as they're good at consuming many different file formats. I don't remember whether pg_bulkload supports fixed-width, but I suspect not.
Alternatively, you can generally write a simple script with something like Python and the psycopg2 module, reading the fixed-width data row by row and sending it to PostgreSQL. psycopg2's support for the COPY command via copy_from makes this vastly more efficient. I didn't find a convenient fixed-width file reader for Python in a quick search, but I'm sure they're out there. You can use whatever language you like anyway; Perl's DBI and DBD::Pg do just as well, and there are plenty of fixed-width file reader modules for Perl.
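As a rough sketch of the Python/psycopg2 route: the idea is to slice each fixed-width line into fields, write the results into a tab-delimited buffer, and hand that buffer to copy_from so the data goes through COPY rather than per-row INSERTs. The column layout (FIELDS), table name, and connection string below are placeholders for illustration, not anything from the question's actual file.

```python
# Sketch: convert fixed-width records into a buffer that psycopg2's
# copy_from can consume. Field names and widths here are assumptions.
import io

# Hypothetical layout: name (10 chars), city (8 chars), amount (6 chars)
FIELDS = [("name", 10), ("city", 8), ("amount", 6)]

def parse_fixed_width(line, fields=FIELDS):
    """Slice one fixed-width record into stripped column values."""
    values, pos = [], 0
    for _name, width in fields:
        values.append(line[pos:pos + width].strip())
        pos += width
    return values

def to_copy_buffer(lines):
    """Build a tab-delimited in-memory buffer in COPY's text format."""
    buf = io.StringIO()
    for line in lines:
        buf.write("\t".join(parse_fixed_width(line)) + "\n")
    buf.seek(0)
    return buf

# Usage against a live server (placeholder DSN and table name):
# import psycopg2
# conn = psycopg2.connect("dbname=mydb")
# with conn, conn.cursor() as cur, open("data.txt") as f:
#     cur.copy_from(to_copy_buffer(f), "target_table",
#                   columns=[name for name, _ in FIELDS])
```

For multi-GB files you would want to process the input in chunks (e.g. a few hundred thousand lines per copy_from call) rather than buffering the whole file, which keeps memory flat regardless of file size.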
Source: https://stackoverflow.com/questions/20873227/fixed-width-data-into-postgres