I have a DataTable with thousands of records and a Postgres table with the same fields as the DataTable. Every day I want to truncate this table and fill it again with the data from the DataTable.
PostgreSQL definitely does have a bulk copy (it's actually called COPY), and Npgsql has a nice wrapper for it in .NET. If you are loading data, you want NpgsqlCopyIn, and if you are extracting data you can use NpgsqlCopyOut.
Your question is a little vague on details -- I don't know the fields in your DataTable or anything about your actual database -- so take this as a brief example of how to bulk insert data into a table using C# and PostgreSQL:
// Start a COPY ... FROM STDIN operation on the open connection
NpgsqlCopyIn copy = new NpgsqlCopyIn("COPY table1 FROM STDIN WITH NULL AS '' CSV;",
    conn);
copy.Start();

// The serializer formats each row as CSV and streams it through the open COPY
NpgsqlCopySerializer cs = new NpgsqlCopySerializer(conn);
cs.Delimiter = ",";

// One row per record: add the fields in table-column order, then end the row
foreach (var record in RecordList)
{
    cs.AddString(record.UserId);
    cs.AddInt32(record.Age);
    cs.AddDateTime(record.HireDate);
    cs.EndRow();
}
cs.Close();

copy.End();   // finish the COPY
-- Edit 8/27/2019 --
The Npgsql API for this has completely changed: NpgsqlCopyIn and NpgsqlCopySerializer are gone, and COPY is now exposed directly on the connection (BeginBinaryImport, BeginTextImport, and their export counterparts). Below is boilerplate for the same example as above, using binary import (a text import is also available):
using (var writer = conn.BeginBinaryImport(
    "COPY user_data.part_list FROM STDIN (FORMAT BINARY)"))
{
    foreach (var record in RecordList)
    {
        // One row per record; the writer handles the binary COPY wire format
        writer.StartRow();
        writer.Write(record.UserId);
        writer.Write(record.Age, NpgsqlTypes.NpgsqlDbType.Integer);
        writer.Write(record.HireDate, NpgsqlTypes.NpgsqlDbType.Date);
    }

    // Complete() commits the import; disposing the writer without calling it cancels the COPY
    writer.Complete();
}
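Since the goal is to truncate the table and reload it from the DataTable every day, here is a minimal sketch of that workflow with the binary importer, wrapped in a transaction so the truncate and the reload succeed or fail together. The schema (user_data.part_list with user_id, age, hire_date columns), the DataTable column names, conn (an open NpgsqlConnection), and myDataTable are assumptions standing in for your actual objects:

using (var tx = conn.BeginTransaction())
{
    // Empty the table; this rolls back together with the COPY if anything below fails
    using (var cmd = new NpgsqlCommand("TRUNCATE TABLE user_data.part_list", conn, tx))
        cmd.ExecuteNonQuery();

    using (var writer = conn.BeginBinaryImport(
        "COPY user_data.part_list (user_id, age, hire_date) FROM STDIN (FORMAT BINARY)"))
    {
        foreach (DataRow row in myDataTable.Rows)
        {
            writer.StartRow();
            writer.Write((string)row["user_id"], NpgsqlTypes.NpgsqlDbType.Text);
            writer.Write((int)row["age"], NpgsqlTypes.NpgsqlDbType.Integer);
            writer.Write((DateTime)row["hire_date"], NpgsqlTypes.NpgsqlDbType.Date);
        }
        writer.Complete();   // finish the COPY itself
    }

    tx.Commit();   // make the truncate + reload visible atomically
}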
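For the extraction direction mentioned at the top, the old NpgsqlCopyOut is likewise replaced by BeginTextExport / BeginBinaryExport on the connection. A minimal sketch of streaming a table out as CSV, reusing the table1 placeholder from the first example:

// COPY ... TO STDOUT via BeginTextExport, which hands back a TextReader over the stream
using (var reader = conn.BeginTextExport("COPY table1 TO STDOUT WITH (FORMAT CSV)"))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        // Each line is one CSV row from table1 -- write it to a file, parse it, etc.
        Console.WriteLine(line);
    }
}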