Question
I have multiple JSON files; they all have the same format, but the values differ per transaction. I want to migrate this data into a PostgreSQL table. What is the best way to proceed?
Right now, I am using the following query:
CREATE TABLE test (multiprocess VARCHAR(20), http_referer VARCHAR(50));
INSERT INTO test
SELECT multiprocess, http_referer
FROM json_populate_record(NULL::test, '{"multiprocess": true, "http_referer": "http://localhost:9000/"}');
But once the number of files becomes large, this technique becomes very difficult to maintain. Is there another way to do this effectively?
Answer 1:
You could use a LATERAL join to insert more than one row at a time:
WITH json AS (
  VALUES
    ('{"multiprocess": true, "http_referer": "http://localhost:9000"}'),
    ('{"multiprocess": false, "http_referer": "http://localhost:9001/"}'),
    ('{"multiprocess": true, "http_referer": "http://localhost:9002/"}')
)
INSERT INTO test
SELECT multiprocess, http_referer
FROM json, LATERAL json_populate_record(NULL::test, json.column1::json);
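When the rows come from many files on disk, the VALUES list above does not need to be written by hand. A minimal sketch of generating that same query programmatically, assuming each input string is one JSON document; the table and column names come from the query above, while `build_insert` itself is an illustrative helper:

```python
import json


def build_insert(json_docs):
    """Build the multi-row INSERT shown above from a list of JSON strings.

    Each document is embedded as a quoted SQL literal in the VALUES list;
    single quotes inside a document are doubled so the literal stays valid.
    """
    for doc in json_docs:
        json.loads(doc)  # fail fast on any malformed input file
    values = ",\n    ".join(
        "('{}')".format(doc.replace("'", "''")) for doc in json_docs
    )
    return (
        "WITH json AS (\n  VALUES\n    " + values + "\n)\n"
        "INSERT INTO test\n"
        "SELECT multiprocess, http_referer\n"
        "FROM json, LATERAL json_populate_record(NULL::test, json.column1::json);"
    )
```

The resulting string can be fed to psql or any client library; for very large batches, a parameterized query or COPY is the safer route than string interpolation.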
Or you could insert into a staging table first and then populate the target table from it.
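Either way, the file handling can happen client-side: read every file once, collect the rows, and hand them to the database in a single batch. A minimal sketch, assuming all files sit in one directory with a `.json` extension and that psycopg2 is available for the final insert (the connection string and directory path are placeholders):

```python
import json
from pathlib import Path


def row_from_json(text):
    """Turn one transaction file's JSON into a (multiprocess, http_referer) tuple."""
    doc = json.loads(text)
    return (str(doc["multiprocess"]), doc["http_referer"])


def load_rows(directory):
    """One row per *.json file under `directory`, in filename order."""
    return [row_from_json(p.read_text()) for p in sorted(Path(directory).glob("*.json"))]


# With the rows in hand, one batched INSERT replaces the per-file queries
# (psycopg2 assumed installed; dbname and path below are hypothetical):
#
#   import psycopg2
#   from psycopg2.extras import execute_values
#   with psycopg2.connect("dbname=mydb") as conn, conn.cursor() as cur:
#       execute_values(cur,
#           "INSERT INTO test (multiprocess, http_referer) VALUES %s",
#           load_rows("/path/to/json/files"))
```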
Source: https://stackoverflow.com/questions/29497662/how-to-insert-multiple-json-files-into-postgresql-table-at-a-time