Parameter Sniffing Problem and Possible Workarounds

SQL Server compiles a stored procedure using (sniffing) the parameter values sent the first time the procedure is executed, and stores the resulting execution plan in the plan cache. Every subsequent execution retrieves that plan from the cache and reuses it (unless there is a reason for recompilation). The potential problem arises when the parameter values used for the first execution produce a plan that is acceptable for those values but very bad for other, more common sets of parameters.

There are some workarounds to overcome this problem.

  • OPTION (RECOMPILE)
  • OPTION (OPTIMIZE FOR (@VARIABLE=VALUE))
  • OPTION (OPTIMIZE FOR (@VARIABLE UNKNOWN))
  • Use local variables

We will examine these four workarounds with the scripts below.

/*

Turgay Sahtiyan - Microsoft SQL Server Premier Field Engineer

Blog : www.turgaysahtiyan.com

Twitter : @turgaysahtiyan

*/

--Parameter Sniffing

Use AdventureWorks2012

GO

--Hit ctrl+M to Include Actual Execution Plan

--Here is our query

--Select * from Person.Address where City=@City

--If you run this query for "Seattle" you get a Clustered Index Scan

Select * from Person.Address

where city='Seattle'

--If you run it for "Bothell" you get an Index Seek + Key Lookup

Select * from Person.Address

where city='Bothell'

--SQL Server uses the statistics to determine which index and method should be used
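
--(Not in the original script) If you want to see the statistics the optimizer consults for the
--City column, you can list them and look at the histogram. The statistics name below is only a
--placeholder/assumption; use whichever name sp_helpstats returns on your system.

exec sp_helpstats 'Person.Address', 'ALL'

--DBCC SHOW_STATISTICS ('Person.Address', 'IX_Address_City') WITH HISTOGRAM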

--OK, now we can turn to the main topic: parameter sniffing

--Create a stored procedure

GO

create proc my_AddressSP (@city nvarchar(30))

as

      select *

      from Person.Address

      where city=@city

GO

--Call SP first time with "Seattle" parameter

exec my_AddressSP 'Seattle'

--It did a Clustered Index Scan

--Call it again, but this time with "Bothell" parameter

exec my_AddressSP 'Bothell'

--Normally the "Bothell" query does an Index Seek + Key Lookup

--But a query plan was created when the SP was called the first time (with the "Seattle" parameter),

--cached, and reused for the "Bothell" execution

--We call this problem parameter sniffing
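
--(Not in the original post) One way to confirm that the "Seattle" plan was cached and then reused
--for "Bothell" is to look at the plan cache; usecounts = 2 means the same plan served both calls.

select cp.usecounts, cp.objtype, st.text
from sys.dm_exec_cached_plans cp
cross apply sys.dm_exec_sql_text(cp.plan_handle) st
where st.text like '%my_AddressSP%'
  and cp.objtype = 'Proc'

--To repeat the demo with "Bothell" first, clear the cache on a test box with DBCC FREEPROCCACHE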

------------------------------------

--Workarounds

------------------------------------

/*

1. Workaround : OPTION (RECOMPILE)

- Every time the query or stored procedure is executed, when execution reaches the query marked with OPTION (RECOMPILE), that query is recompiled using the current parameter values.

- In this way the plan behaves well for any combination of parameters (if the statistics are good), but the overhead is that the query is recompiled on every execution.

*/

--Alter SP

GO

alter proc my_AddressSP (@city nvarchar(30))

as

      select *

      from Person.Address

      where city=@city

      option (recompile)

GO

--Call it for "Seattle"

exec my_AddressSP 'Seattle'

--Call it for "Bothell"

exec my_AddressSP 'Bothell'
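
--(Not in the original post) A related, procedure-level alternative: WITH RECOMPILE recompiles the
--whole procedure on every call and never caches its plan, while OPTION (RECOMPILE) above
--recompiles only the marked statement.

GO

alter proc my_AddressSP (@city nvarchar(30))
with recompile
as
      select *
      from Person.Address
      where city=@city

GO

exec my_AddressSP 'Seattle'

exec my_AddressSP 'Bothell'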

/*

2. Workaround : OPTION (OPTIMIZE FOR (@VARIABLE=VALUE))

- Let's assume that this SP is called 99 percent of the time for "Bothell" and only 1 percent of the time for "Seattle"

- But if the first time it is called with the "Seattle" parameter, after that all "Bothell" executions run with an undesired query plan

- If you have a workload like this, you can use the OPTION (OPTIMIZE FOR (@VARIABLE=VALUE)) workaround

- This option causes the compilation process to ignore the sniffed value of the specified parameter and use the specified value instead.

- Unlike the first workaround (OPTION (RECOMPILE)), this option avoids the overhead of recompiling the query every time.

*/

--Alter SP

GO

alter proc my_AddressSP (@city nvarchar(30))

as

      select *

      from Person.Address

      where city=@city

      option (optimize for (@city='Bothell'))

GO

--We call it the first time with the "Seattle" parameter, but the query plan is optimized for "Bothell" and cached that way

exec my_AddressSP 'Seattle'

--Call it for "Bothell", same query plan

exec my_AddressSP 'Bothell'
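
--(Not in the original post) You can verify which value the plan was compiled for by looking at the
--ParameterCompiledValue attribute in the cached plan XML; it should show N'Bothell' even though
--the first call passed "Seattle".

select qp.query_plan
from sys.dm_exec_cached_plans cp
cross apply sys.dm_exec_query_plan(cp.plan_handle) qp
where qp.objectid = object_id('dbo.my_AddressSP')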

/*

3. Workaround : OPTION (OPTIMIZE FOR (@VARIABLE UNKNOWN))

- In this way SQL Server uses the statistics density instead of the statistics histogram.

- So it estimates the same number of records for every parameter value

- The disadvantage is that some queries will use suboptimal plans, because the density is not as precise as the statistics histogram.

*/

--Alter SP

GO

alter proc my_AddressSP (@city nvarchar(30))

as

      select *

      from Person.Address

      where city=@city

      option (optimize for (@city UNKNOWN))

GO

--check the estimated number of rows. It's 34.1113 for every city

exec my_AddressSP 'Seattle'

exec my_AddressSP 'Bothell'
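
--(Not in the original post) Where the 34.1113 estimate comes from: with OPTIMIZE FOR UNKNOWN the
--optimizer multiplies the table's row count by the "All density" value of the City statistics,
--roughly 19614 rows * (1 / number of distinct cities) ~= 34.11. The statistics name below is only
--a placeholder/assumption; use whichever name sp_helpstats returns on your system.

--DBCC SHOW_STATISTICS ('Person.Address', 'IX_Address_City') WITH DENSITY_VECTOR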

/*

4. Workaround : Use local variables

- This workaround is very similar to the previous one (OPTION (OPTIMIZE FOR (@VARIABLE UNKNOWN)))

- When you assign the parameters to local variables, SQL Server uses the statistics density instead of the statistics histogram

- So it estimates the same number of records for every parameter value

- The disadvantage is that some queries will use suboptimal plans, because the density is not as precise as the statistics histogram.

*/

--Alter SP

GO

alter proc my_AddressSP (@city nvarchar(30))

as

      declare @city_x nvarchar(30)

      set @city_x = @city

      select *

      from Person.Address

      where city=@city_x

GO

     

exec my_AddressSP 'Seattle'

exec my_AddressSP 'Bothell'

--drop sp

drop proc my_AddressSP

Comments

  • Anonymous
    November 18, 2013
    I just want to know how you would approach a scenario where all SPs (+-500 SPs) already exist in your reporting database. Which option would you apply in order to limit the parameter sniffing problem? Would you go through all the SPs and alter them, changing the where clause to use a local variable (option 4)?


  • Anonymous
    December 04, 2013
    Turgay, the issue will also happen when the stored proc is called the first time: SQL Server designs and saves the plan based on the number of records there at that time. If the number of records grows, it will still use the same plan, right?

  • Anonymous
    March 30, 2014
    Hi Kathir, yes it might be, but we cannot count it as the same problem, because you can handle your situation by updating the statistics. The basic point behind the parameter sniffing problem is: when the stored procedure is executed for the first time, the set of parameters generates a plan that is acceptable for that set of parameters but very bad for other, more common sets of parameters. As I demoed, the statistics estimate 9 records for "Bothell" but 126 records for "Seattle", even when the statistics are up to date.

  • Anonymous
    July 22, 2014
    Would something such as below make sense? i.e. would SQL be able to judge whether the parameter-sniffed vs unknown-local method was more likely to be appropriate based on table stats; thus the developer defers this judgement to the query engine (e.g. if addresses per city are fairly evenly distributed it goes with the parameter sniffing plan; if there are big differences between the number of results for each city it uses the unknown local one)?

    alter proc my_AddressSP (@city nvarchar(30))
    as
          declare @city_x nvarchar(30)
          set @city_x = @city
          select *
          from Person.Address
          where city in (@city_x, @city)

    --or

    alter proc my_AddressSP (@city nvarchar(30))
    as
          declare @city_x nvarchar(30)
          set @city_x = @city
          select *
          from Person.Address
          where city = @city_x
          or city = @city

  • Anonymous
    November 05, 2014
    I am a beginner with indexes in SQL. Can you tell me why the query for "Seattle" got a Clustered Index Scan and the one for "Bothell" got an Index Seek + Key Lookup in Parameter Sniffing Problem and Possible Workarounds? Please describe it briefly.

  • Anonymous
    November 18, 2014
    Excellent article, helped me sort out my problem.

  • Anonymous
    November 30, 2014
    Informative.... keep on writing  !!!!


  • Anonymous
    January 06, 2015
    Sort of a reply to karthik. Basically, when the optimizer examines statistics for that table, it determines that a clustered index "scan" would suit the query better than multiple clustered index "seek"+"lookup" operations.

    This is because the non-clustered index does not contain any non-key data (unless it is specifically "included" when the NCI is created). It contains the index key(s), plus the clustering key which is used to "lookup" the data in the table itself. So because of that, for a large enough sample it is more efficient to just scan the clustered index (table) to get the data directly to begin with. For a smaller sample the seek+lookup can be more effective. And as pointed out, it uses statistics to get a general idea of the distribution of data in the table itself.

    Now, if space/storage is less of a concern, you could also consider using "include" in your nonclustered index, which can prevent the lookup. But that's another topic :) Hope this helps!

  • Anonymous
    January 11, 2016
    How is the sniffing handled in an inline query in .Net ? I don't have a procedure in my case and due to limitations and impact i cannot convert change the inline query to a proc. To give you an idea, I have the following - IDataReader dbRdr = dbSession.ExecuteQuery                                (                                    SQL_QUERY,                                    new SqlParameter("@OrganizationType", (int)type),                                    new SqlParameter("@OrganizationId", id)