Question
I have a bunch of pairs of values [(foo1, bar1), (foo2, bar2), ...]
and I want to run an update for each pair: "set the 'foo' column to 'foo1' where the 'bar' column is 'bar1'".
I am doing this in Python with psycopg2. I could call executemany
with the query UPDATE table SET foo = %s WHERE bar = %s,
but that's a lot of little updates and would take forever.
How can I do this easily and fast? Perhaps something with a temp table?
Postgres version 9.3.
Answer 1:
UPDATE tbl t
SET foo = v.foo
FROM (
VALUES ('foo1'::text, 'bar1'::text), ('foo2', 'bar2'), ...
) v(foo, bar)
WHERE t.bar = v.bar;
Explicit type casts are required only in the first row of the VALUES expression (text in the example; it could be any type). String literals in subsequent rows are coerced to the same types.
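
From Python, a minimal sketch of running this with psycopg2 (the table name tbl, the columns foo/bar, and the connection string are placeholders for your setup; psycopg2.extras.execute_values, available in psycopg2 2.7+, expands a single %s into the multi-row VALUES list):

import psycopg2
from psycopg2.extras import execute_values

pairs = [("foo1", "bar1"), ("foo2", "bar2")]  # your (foo, bar) pairs

conn = psycopg2.connect("dbname=mydb user=myuser")  # placeholder DSN
with conn, conn.cursor() as cur:                    # commits on success
    execute_values(
        cur,
        """
        UPDATE tbl AS t
        SET    foo = v.foo
        FROM  (VALUES %s) AS v(foo, bar)
        WHERE  t.bar = v.bar
        """,
        pairs,
        template="(%s::text, %s::text)",  # casts applied to every generated row
    )
conn.close()

This sends the whole batch as a single statement instead of one UPDATE per pair.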
Depending on the form your key-value pairs are in, other methods may be more convenient. For example: create a temporary table, COPY the pairs into it, then use the temp table in the UPDATE like any other table (a psycopg2 sketch follows the link below). Details:
- How to update selected rows with values from a CSV file in Postgres?
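
A hedged sketch of that temp-table route from psycopg2 (same assumed names as above; the pairs are written as tab-separated text, so this naive formatting assumes values contain no tabs, newlines, or backslashes):

import io
import psycopg2

pairs = [("foo1", "bar1"), ("foo2", "bar2")]

conn = psycopg2.connect("dbname=mydb user=myuser")  # placeholder DSN
with conn, conn.cursor() as cur:
    cur.execute("CREATE TEMP TABLE tmp_pairs (foo text, bar text) ON COMMIT DROP")
    buf = io.StringIO("".join(f"{foo}\t{bar}\n" for foo, bar in pairs))
    cur.copy_from(buf, "tmp_pairs", columns=("foo", "bar"))  # COPY, tab-separated by default
    cur.execute(
        """
        UPDATE tbl AS t
        SET    foo = p.foo
        FROM   tmp_pairs AS p
        WHERE  t.bar = p.bar
        """
    )
conn.close()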
Or you can pass two simple arrays and unnest in parallel (syntax for Postgres 9.3):
UPDATE tbl t
SET foo = v.foo
FROM (
SELECT unnest('{foo1,foo2,...}'::text[]) AS foo
, unnest('{bar1,bar2,...}'::text[]) AS bar
) v(foo, bar)
WHERE t.bar = v.bar;
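
psycopg2 adapts Python lists to Postgres arrays, so from Python the two-array form can take your lists as ordinary query parameters. A sketch under the same assumed names:

import psycopg2

pairs = [("foo1", "bar1"), ("foo2", "bar2")]
foos = [p[0] for p in pairs]
bars = [p[1] for p in pairs]

conn = psycopg2.connect("dbname=mydb user=myuser")  # placeholder DSN
with conn, conn.cursor() as cur:
    cur.execute(
        """
        UPDATE tbl AS t
        SET    foo = v.foo
        FROM  (SELECT unnest(%s::text[]) AS foo
                    , unnest(%s::text[]) AS bar) AS v
        WHERE  t.bar = v.bar
        """,
        (foos, bars),  # each list is sent as a Postgres array
    )
conn.close()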
Postgres 9.4 has a better way:
- Is there something like a zip() function in PostgreSQL that combines two arrays?
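
For reference, the better way on 9.4+ (not the asker's 9.3) is unnest() with several array arguments in the FROM clause, which expands the arrays in parallel. A sketch under the same assumptions as above:

import psycopg2

foos, bars = ["foo1", "foo2"], ["bar1", "bar2"]  # parallel value lists

conn = psycopg2.connect("dbname=mydb user=myuser")  # placeholder DSN
with conn, conn.cursor() as cur:
    cur.execute(
        """
        UPDATE tbl AS t
        SET    foo = v.foo
        FROM   unnest(%s::text[], %s::text[]) AS v(foo, bar)
        WHERE  t.bar = v.bar
        """,
        (foos, bars),
    )
conn.close()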
Source: https://stackoverflow.com/questions/30204102/bulk-update-table-with-where-clause-that-is-variable