Question
I have a DataTable/collection that is cached in memory, and I want to use it as the source for an auto-complete textbox (using AJAX, of course). I am evaluating various options for fetching the data quickly. The number of items in the collection/rows in the DataTable could vary from 10,000 to 2,000,000. (So that we don't get diverted: for the moment assume the decision has been made, I have ample RAM, and I will be using the cache rather than a database query for this.)
I have some additional business logic for this processing: I have to prioritize the auto-complete list according to a priority column (int) in the collection. So if someone searches for Micro and I get, say, 20 results for words/sentences that start with Micro, then I would pick the top 10 items with the highest priority (hence the need to have a priority property associated with the string value).
The collection items are already sorted alphabetically.
What would be the best solution in this case?
1. Using DataTable.Select().
2. Using DataTable.Rows.Find().
3. Using a custom collection with foreach or for to iterate through its values.
4. Using a generic collection with anonymous delegates or lambdas (do both give the same performance?); a sketch of this option follows the list.
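For reference, a minimal sketch of what the option-4 lookup could look like, assuming the cache is a generic list. The Entry class and its Text/Priority members are hypothetical names used only for illustration, not part of the original collection.

// Needs: using System; using System.Collections.Generic; using System.Linq;
public class Entry
{
    public string Text { get; set; }
    public int Priority { get; set; }
}

// Filter by prefix, then keep the 10 highest-priority matches.
public static List<Entry> Lookup(List<Entry> cache, string prefix)
{
    return cache
        .Where(e => e.Text.StartsWith(prefix, StringComparison.OrdinalIgnoreCase))
        .OrderByDescending(e => e.Priority)
        .Take(10)
        .ToList();
}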
Answer 1:
The charts aren't posted on my blog entry; more details can be found at http://msdn.microsoft.com/en-us/library/dd364983.aspx
One other thing that I've since discovered is that, for large data sets, using a chained generic dictionary performs incredibly well. It also helps alleviate many of the issues caused by the sort operations required for aggregation operations such as min and max (whether with DataTable.Compute or LINQ).
By "chained generic dictionary," I mean a Dictionary(Of String, Dictionary(Of String, Dictionary(Of Integer, List(Of DataRow))))
or similar technique, where the key for each dictionary is a search term.
Granted, this won't be useful in all circumstances, but I have at least one scenario where implementing this approach led to a 500x performance improvement.
In your case, I'd consider using a simple dictionary keyed on the first 1-5 characters, each entry mapping to a List(Of String). You'd have to build up this dictionary once, adding each word to the list for its first 1-5 characters, but after that you'll be able to get blazingly fast results.
I generally wrap things like this in a class that lets me add words easily. You may also want to use a SortedList(Of String) to get the results sorted automatically. This way, you can quickly look up the list of words that match the first N characters that have been typed.
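A minimal sketch of the prefix dictionary described above, built once up front. The class name, the fixed PrefixLength of 3, and the member names are illustrative assumptions; the answer suggests anywhere from 1 to 5 characters.

// Needs: using System; using System.Collections.Generic; using System.Linq;
public class AutoCompleteIndex
{
    // Illustrative choice: index on the first three characters of each word.
    private const int PrefixLength = 3;

    private readonly Dictionary<string, List<string>> index =
        new Dictionary<string, List<string>>(StringComparer.OrdinalIgnoreCase);

    // Build the index once from the cached words.
    public void Add(string word)
    {
        if (word.Length < PrefixLength)
            return; // short words could instead be indexed under themselves
        string key = word.Substring(0, PrefixLength);
        List<string> bucket;
        if (!index.TryGetValue(key, out bucket))
        {
            bucket = new List<string>();
            index[key] = bucket;
        }
        bucket.Add(word);
    }

    // After the one-time build, each lookup is a single dictionary hit plus a scan of one bucket.
    public IEnumerable<string> Lookup(string typed)
    {
        if (typed.Length < PrefixLength)
            return Enumerable.Empty<string>(); // wait until enough characters have been typed
        List<string> bucket;
        if (!index.TryGetValue(typed.Substring(0, PrefixLength), out bucket))
            return Enumerable.Empty<string>();
        // A bucket only guarantees a shared PrefixLength-character prefix, so filter on the full input.
        return bucket.Where(w => w.StartsWith(typed, StringComparison.OrdinalIgnoreCase));
    }
}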
Answer 2:
For my autocomplete, I tried the LINQ/lambda approach first, and the performance was a little slow. DataTable.Select is faster than LINQ, so I use that. I haven't yet compared the performance of DataTable.Select and DataTable.Rows.Find.
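For context, the two lookups being contrasted might look roughly like this. The column names Word and Priority are assumptions, and the LINQ version needs a reference to System.Data.DataSetExtensions.

// DataTable.Select with a filter expression and a sort order.
DataRow[] viaSelect = table.Select("Word LIKE 'Micro%'", "Priority DESC");

// Roughly equivalent LINQ-to-DataSet query.
var viaLinq = table.AsEnumerable()
    .Where(r => r.Field<string>("Word").StartsWith("Micro"))
    .OrderByDescending(r => r.Field<int>("Priority"))
    .Take(10);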
Answer 3:
We could speculate about it all day, but since this is not a huge piece of code, why not write each one and benchmark them against each other?
// Needs: using System; using System.Collections.Generic; using System.Linq;
public delegate void TestProcedure();

// Runs the procedure several times and returns the best (minimum) elapsed time,
// which helps filter out one-off noise such as JIT compilation.
public TimeSpan Benchmark(TestProcedure tp)
{
    const int testBatchSize = 5;
    List<TimeSpan> results = new List<TimeSpan>();
    for (int i = 0; i < testBatchSize; i++)
    {
        DateTime start = DateTime.Now;
        tp();
        results.Add(DateTime.Now - start);
    }
    return results.Min();
}
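Usage might look like this; the lookup methods are placeholders for whichever candidate implementations you want to compare.

// RunDataTableSelect and RunLinqLookup are hypothetical wrappers around two of the options.
TimeSpan selectTime = Benchmark(() => RunDataTableSelect("Micro"));
TimeSpan linqTime = Benchmark(() => RunLinqLookup("Micro"));
Console.WriteLine("Select: {0}  LINQ: {1}", selectTime, linqTime);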
Answer 4:
As per the following blog post:
http://blog.dotnetspeech.net/archive/2008/08/26/performance----datatable.select-vs-dictionary.aspx
DataTable.Rows.Find is much, much faster than DataTable.Select.
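One caveat when reading that comparison: Rows.Find is a keyed lookup against the table's primary key and returns at most one exact match, so it answers a different question than a prefix filter. A rough illustration, with the Word column name assumed:

// Rows.Find requires a primary key on the table and returns a single exact match (or null).
table.PrimaryKey = new[] { table.Columns["Word"] };
DataRow exact = table.Rows.Find("Microsoft");

// Select, by contrast, evaluates a filter expression over the rows.
DataRow[] matches = table.Select("Word LIKE 'Micro%'");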
Answer 5:
What about a DataView? You could apply your filter condition AND sort by the priority column, then easily iterate through the results to build your suggestion list.
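A minimal sketch of that approach, assuming column names Word and Priority:

// Filter to the prefix and sort by priority in one step, then take the top 10.
DataView view = new DataView(table, "Word LIKE 'Micro%'", "Priority DESC", DataViewRowState.CurrentRows);
List<string> top10 = new List<string>();
foreach (DataRowView row in view)
{
    top10.Add((string)row["Word"]);
    if (top10.Count == 10) break;
}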
Source: https://stackoverflow.com/questions/626679/datatable-select-vs-datatable-rows-find-vs-foreach-vs-findpredicatet-lambda