Conor vs. Dynamic SQL vs. Procedures vs. Plan Quality for Parameterized Queries

I received a question from a customer today about query plan choice for a parameterized query - specifically, different plans get picked depending on where the query is compiled, and sometimes one is better than another.  Apparently, for their application, a particular query plan was “better” when compiled in dynamic sql instead of a stored procedure.  Others say that query plans are just as good when compiled within a stored procedure.  Which is it?

(If I were still consulting, the answer to this is “it depends” ;))

There is a lot behind answering this question, because it really isn’t about the exact location as such.  It has more to do with how different batches/procedures are compiled and how parameters are sniffed during plan compilation.

Let’s start at the beginning:  T-SQL (the procedural part of the programming surface area) is compiled in batches in SQL Server.  This means that if you submit a batch of 5 queries, SQL Server will generally try to compile them all before executing any of them.  (There are exceptions.)

Now, let’s mix something like “set @p=5” into that query batch, and let’s have a later query reference that parameter.  In this example, the whole batch is “compiled” before it is “executed”.  That means that “set @p=5” does not actually set @p to 5 until that statement is executed, so the later query that references that parameter will sniff a NULL instead of a 5.
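Here is a minimal sketch of that situation (the table and column names are made up for illustration):

DECLARE @p INT;
SET @p = 5;

-- The whole batch is compiled before SET @p = 5 actually runs, so when the
-- optimizer builds the plan for this query it sees @p as NULL/unknown
-- rather than 5.
SELECT *
FROM dbo.Orders
WHERE CustomerID = @p;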

I wrote about the basics of this issue a few years ago in the QO team blog:

I Smell a Parameter!

So, back to our question – which one is “better”?  The answer is that this problem is about scopes, not procedures vs. dynamic sql.  Basically, putting the parameterized query within a nested scope (dynamic sql or another procedure) happens to affect whether the parameter gets sniffed or not.  It’s the sniffing that impacts the plan choice, not where the query is hosted in T-SQL.
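For contrast, here is a sketch of the same hypothetical query pushed into a nested scope via sp_executesql (again, dbo.Orders is a made-up table).  Because @cid is a true parameter of the inner batch, its value is available to be sniffed when that inner batch compiles:

DECLARE @p INT;
SET @p = 5;

-- The inner batch compiles in its own scope, with @cid = 5 known at
-- compile time, so that value can be sniffed for the plan.
EXEC sp_executesql
    N'SELECT * FROM dbo.Orders WHERE CustomerID = @cid;',
    N'@cid INT',
    @cid = @p;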

 

Now, why did the plan become “good” or “bad”:

* Sometimes when you sniff a NULL, you get a perfectly reasonable plan (if NULL is represented in similar proportions to other possible values). 

* Sometimes if you sniff a NULL, you can get an absolutely horrible plan.

 

Generally, you want to get the Optimizer to see a good, representative value for your parameterized queries so that it will produce a plan that works well, on average, for the typical parameter case.  If you never call the procedure with NULL for a parameter AND the NULL generates a different plan than an actual value you would use, then you probably want to look at how to get it to sniff a more common value – you want to get that Optimizer working for _you_ and not for that dreaded NULL :).
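As a sketch of what that looks like in practice (the procedure and table names are hypothetical): a stored procedure’s parameters are sniffed on the first execution after a (re)compile, so that first call effectively decides how “average” the cached plan is.

CREATE PROCEDURE dbo.GetOrdersByCustomer
    @cid INT
AS
    SELECT *
    FROM dbo.Orders
    WHERE CustomerID = @cid;
GO

-- The first execution after compilation sniffs @cid = 5 and caches a plan
-- built for that value; later calls reuse the same cached plan.
EXEC dbo.GetOrdersByCustomer @cid = 5;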

 

So, we’re done, right?  Unfortunately no – it gets more complex again.  There are some cases where one plan may not perform well for all parameter values (either there is no good average value, or there is otherwise some large performance difference, based on the parameter value, for the “best” plan).  Let’s say that you are querying some big table for the “USA” data during the day and the “Togo” (a small West African nation) data at night.  These plans may be very different because the filters’ selectivities will be different.  We refer to this as a parameter-sensitive plan.
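A sketch of that kind of parameter sensitivity, using a hypothetical dbo.Sales table with a Country column:

DECLARE @country NVARCHAR(60);
SET @country = N'USA';    -- matches a large fraction of the table
-- SET @country = N'Togo'; -- matches only a handful of rows

-- The best plan for the frequent value (say, a scan) can be very different
-- from the best plan for the rare value (say, a seek), so a single cached
-- plan cannot be ideal for both.
SELECT SUM(Amount)
FROM dbo.Sales
WHERE Country = @country;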

 

For these parameter-sensitive plans, you can also see “good” vs. “bad” performance.  By that I mean the performance differs, perhaps even by a lot, based on the parameter value.  If you have cases like this, you have a few options.

So, there are a few different ways you can see “good” vs. “bad” plans for parameterized queries.  Let’s talk about some practical guidance for how to avoid these issues.

 

First, you can move things to a new scope – that’s what happens when you move to dynamic sql.  This usually gets the parameter sniffed and will almost always give you a good plan (at least for that parameter value).  Another option is to use an OPTIMIZE FOR hint to supply a common value to be sniffed – this is a great way to avoid worrying about whether the batch compiles before it executes, etc.  A third option is OPTION(RECOMPILE).  If you have a parameter-sensitive case where sniffing a single value does not work well, then you will want to consider OPTION(RECOMPILE) – when the compilation cost is smaller than the performance penalty of running a suboptimal plan for the wrong parameter value, it is a very good idea.
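Here are sketches of the two hint-based options against the same hypothetical dbo.Sales table.  OPTIMIZE FOR pins the compile-time value, while OPTION (RECOMPILE) trades compilation cost for a plan tailored to each call:

DECLARE @country NVARCHAR(60);
SET @country = N'Togo';

-- OPTIMIZE FOR: compile the plan as if @country were N'USA', regardless of
-- the value actually supplied at run time.
SELECT SUM(Amount)
FROM dbo.Sales
WHERE Country = @country
OPTION (OPTIMIZE FOR (@country = N'USA'));

-- OPTION (RECOMPILE): build a fresh plan on every execution, sniffing the
-- current value of @country each time.
SELECT SUM(Amount)
FROM dbo.Sales
WHERE Country = @country
OPTION (RECOMPILE);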

 

I’ve given you the short version :)  There are some additional details around “deferred compilation”, where a batch is only partially compiled before parts of it get executed.  SQL 2005+ also has statement-level recompile, and this can change the sniffing logic as well.  I am not trying to give you a complete answer so much as the tools to ask the right questions – this stuff is tricky!

 

Happy Querying!

 

Conor Cunningham

Architect, SQL Server QP

Comments

  • Anonymous
    June 03, 2009
    How about some samples Conner? Err.... the long version? :)