Question
I have read the theory and views behind SQL Server 2008's "OPTIMIZE FOR UNKNOWN" query plan option. I understand what it does well enough.
I did some limited experiments and found that with a warm cache, it was only of benefit above roughly 100k rows. However, this was on a simple table and query with no joins, filtering, etc. On a cold cache, the picture would undoubtedly be much more in its favor.
I don't currently have a production system on which to benchmark the before/after. So I am curious whether anyone has done before/after testing and made any useful discoveries about exactly when to use this option and when not to.
UPDATE:
I created a table with three columns, a PK on the UID column, and an index on Col2 (int). All processing was against Col2. The figures below are row counts and times (DATEDIFF × 1,000,000):
Type          1,000     5,000      20,000    100,000
Normal        0.3086    6.327      26.427    144.83, 141.126
Recompile     117.59    584.837
For Unknown   0.8101    6.52       26.89     143.788, 143.248
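The timings above read like DATEDIFF output scaled to microseconds. A minimal harness for that kind of measurement might look like the sketch below; the table and column names (dbo.TestTable, Col2) and the predicate are guesses based on the description, not the poster's actual script:

```sql
-- Capture a high-resolution start time (SQL Server 2008+).
DECLARE @t0 datetime2 = SYSDATETIME();

-- Some processing against the indexed Col2, with the hint under test.
SELECT COUNT(*)
FROM dbo.TestTable
WHERE Col2 BETWEEN 100 AND 200
OPTION (OPTIMIZE FOR UNKNOWN);

-- Elapsed time in microseconds.
SELECT DATEDIFF(microsecond, @t0, SYSDATETIME()) AS ElapsedMicroseconds;
```

For warm-cache vs. cold-cache comparisons, the cold runs would additionally need `DBCC DROPCLEANBUFFERS` (and `DBCC FREEPROCCACHE` to force recompilation) before each measurement, on a non-production server.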
Answer 1:
You would use it when your data is sufficiently skewed that a plan generated with one parameter value would be completely unsuitable for another potential value of the parameter. i.e. to resolve a parameter sniffing issue.
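A typical parameter-sniffing scenario can be sketched in T-SQL as follows; the procedure, table, and column names here are illustrative, not from the question:

```sql
-- Hypothetical skewed data: a handful of customers own most orders.
-- Without a hint, the @CustomerId value of the FIRST execution is
-- "sniffed" and determines the cached plan for all later callers.
CREATE PROCEDURE dbo.GetOrders
    @CustomerId int
AS
BEGIN
    SELECT OrderId, OrderDate, Amount
    FROM dbo.Orders
    WHERE CustomerId = @CustomerId
    -- OPTIMIZE FOR UNKNOWN tells the optimizer to ignore the sniffed
    -- value and cost the plan from average density statistics instead,
    -- so one atypical first call cannot lock in an unsuitable plan.
    OPTION (OPTIMIZE FOR UNKNOWN);
END;
```

The trade-off is that the "average" plan may be worse than a sniffed plan for any individual value; the hint buys plan stability, not a plan tuned to each parameter.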
The remainder of your question doesn't seem particularly relevant to the purpose of the hint or answerable IMO.
Source: https://stackoverflow.com/questions/4376313/experience-with-when-to-use-optimize-for-unknown