Question
I don't know whether this is an issue with how I'm using them or Microsoft's implementation, but SQL 2008 table-value parameters are painfully slow.
Generally if I need to use a TVP it's because I've got lots of records; currently they appear to be unusably slow for anything more than a handful of records.
I'm calling them in .Net like this:
// get the data
DataTable data = GetData();
com.CommandText = "sprocName";
// create the table-value parameter
var tvp = com.Parameters.AddWithValue("data", data);
tvp.SqlDbType = SqlDbType.Structured;
com.ExecuteNonQuery();
I ran profiler to see why, and the actual SQL statement is something like this:
declare @data table ...
insert into @data ( ... fields ... ) values ( ... values ... )
-- for each row
insert into @data ( ... fields ... ) values ( ... values ... )
sprocName(@data)
That's a really slow way to do it though. It would be much quicker if it did this instead:
insert into @data ( ... fields ... )
values ( ... values ... ),
( ... values ... ),
-- for each row
( ... values ... )
I'm not sure why it isn't using the newer, much quicker syntax, or even whatever it does under the hood with SqlBulkCopy.
The new syntax was added in SQL 2008, but then so were TVPs (I think).
Is there some option to make it do this? Or something that I'm missing?
Answer 1:
If TVPs are "noticeably slower" than the other options, then most likely you are not implementing them correctly.
- You should not be using a DataTable unless your application has a use for it outside of sending the values to the TVP. Using the IEnumerable<SqlDataRecord> interface is faster and uses less memory, as you are not duplicating the collection in memory only to send it to the DB (see the sketch after this list). I have this documented in the following places:
  - How can I insert 10 million records in the shortest time possible? (lots of extra info and links here as well)
  - Pass Dictionary to Stored Procedure T-SQL
  - Streaming Data Into SQL Server 2008 From an Application (on SQLServerCentral.com; free registration required)
- You should not use AddWithValue for the SqlParameter, though this is not likely a performance issue. But still, it should be:
SqlParameter tvp = com.Parameters.Add("data", SqlDbType.Structured);
tvp.Value = MethodThatReturnsIEnumerable<SqlDataRecord>(MyCollection);
- TVPs are Table Variables and as such do not maintain statistics. Meaning, they report only having 1 row to the Query Optimizer. So, in your proc, either:
  - Use statement-level recompile on any queries using the TVP for anything other than a simple SELECT: OPTION (RECOMPILE)
  - Create a local temporary table (i.e. single #) and copy the contents of the TVP into the temp table
- You could try adding a clustered primary key to the User-Defined Table Type
- If using SQL Server 2014 or newer, you can try making use of In-Memory OLTP / memory-optimized tables. Please see: Faster temp table and table variable by using memory optimization
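A minimal sketch of the streaming approach from #1, assuming a user-defined table type such as dbo.IntList (Id INT); the type name, column, and helper names are placeholders, and only "sprocName" and the "data" parameter come from the question:
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using Microsoft.SqlServer.Server;

// Yields one SqlDataRecord per value; each row is sent to SQL Server as it is
// produced, so the full collection is never duplicated in memory.
static IEnumerable<SqlDataRecord> ToSqlDataRecords(IEnumerable<int> ids)
{
    // Must match the columns of the user-defined table type (placeholder: Id INT).
    var metaData = new[] { new SqlMetaData("Id", SqlDbType.Int) };
    foreach (int id in ids)
    {
        var record = new SqlDataRecord(metaData);
        record.SetInt32(0, id);
        yield return record;
    }
}

static void CallSproc(SqlConnection con, IEnumerable<int> ids)
{
    using (var com = new SqlCommand("sprocName", con))
    {
        com.CommandType = CommandType.StoredProcedure;
        SqlParameter tvp = com.Parameters.Add("@data", SqlDbType.Structured);
        tvp.TypeName = "dbo.IntList";       // placeholder table type name
        tvp.Value = ToSqlDataRecords(ids);  // streamed, not a DataTable copy
        com.ExecuteNonQuery();
    }
}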
Regarding why you are seeing:
insert into @data ( ... fields ... ) values ( ... values ... )
-- for each row
insert into @data ( ... fields ... ) values ( ... values ... )
instead of:
insert into @data ( ... fields ... )
values ( ... values ... ),
( ... values ... ),
IF that is actually what is happening, then:
- If the inserts are being done within a Transaction then there is no real performance difference
- The newer value-list syntax (i.e. VALUES (row1), (row2), (row3)) is limited to something like 1000 rows and hence is not a viable option for TVPs, which do not have that limit. HOWEVER, this is not likely the reason that individual inserts are being used, given that there is no limit when doing INSERT INTO @data (fields) SELECT tab.[col] FROM (VALUES (), (), ...) tab([col]), which I documented here: Maximum Number of Rows for the Table Value Constructor. Instead...
- The reason is most likely that doing individual inserts allows for streaming the values from the app code into SQL Server:
  - using an iterator (i.e. the IEnumerable<SqlDataRecord> noted in #1 above), the app code sends each row as it is returned from the method, and
  - constructing the VALUES (), (), ... list, even if doing the INSERT INTO ... SELECT FROM (VALUES ...) approach (which is not limited to 1000 rows), would still require building the entire VALUES list before sending any of the data into SQL Server. If there is a lot of data, that would take longer to construct the super-long string, and it would take up a lot more memory while doing it.
Please also see this whitepaper from the SQL Server Customer Advisory Team: Maximizing Throughput with TVP
Answer 2:
See the section "Table-Valued Parameters vs. BULK INSERT Operations"
http://msdn.microsoft.com/en-us/library/bb510489.aspx
Quote: "... table-valued parameters perform well for inserting less than 1000 rows."
It also has a table to show what technology to use based on the speed of insert operations.
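For the row counts where that article recommends bulk loading instead, a minimal SqlBulkCopy sketch (the destination table name and batch size are placeholders, not from the article):
using System.Data;
using System.Data.SqlClient;

// Bulk-loads an in-memory DataTable into a staging table via the bulk-load API.
static void BulkLoad(SqlConnection con, DataTable data)
{
    using (var bulk = new SqlBulkCopy(con))
    {
        bulk.DestinationTableName = "dbo.StagingTable";  // placeholder table name
        bulk.BatchSize = 5000;                           // tune for the workload
        bulk.WriteToServer(data);
    }
}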
I hope this helps, good luck.
Source: https://stackoverflow.com/questions/6071818/problems-with-table-value-parameter-performance