SQL Server 2022 Parameter Sensitive Plan Optimization: How PSP Can Help Some Queries With IF Branches

Time Served


I’ve spent a bit of time talking about how IF branches can break query performance really badly in SQL Server.

While the Parameter Sensitive Plan (PSP) optimization won’t fix every problem with this lazy coding habit, it can fix some of them in very specific circumstances, assuming:

  • The parameter is eligible for PSP
  • The parameter is present across IF branches

We’re going to use a simple one-parameter example to illustrate the potential utility here.

After all, if I make these things too complicated, someone might leave a question in the comments.

The horror

IFTTT


Here’s the procedure we’re using. The point is to execute one branch if the @Reputation parameter is equal to one, and another branch if it equals something else.

In the bad old days, both queries would get a plan optimized at compile time, and neither one would get the performance boost that you hoped for.

In the good new days that you’ll probably get to experience around 2025, things are different!

CREATE OR ALTER PROCEDURE 
    dbo.IFTTT 
(
    @Reputation int
)
AS 
BEGIN
SET NOCOUNT, XACT_ABORT ON;

SET STATISTICS XML ON;  

    IF @Reputation = 1
    BEGIN
        SELECT 
            u.Id, 
            u.DisplayName, 
            u.Reputation, 
            u.CreationDate
        FROM dbo.Users AS u
        WHERE u.Reputation = @Reputation;
    END;

    IF @Reputation > 1
    BEGIN
        SELECT 
            u.Id, 
            u.DisplayName, 
            u.Reputation, 
            u.CreationDate
        FROM dbo.Users AS u
        WHERE u.Reputation = @Reputation;
    END;

SET STATISTICS XML OFF; 

END;
GO 

Johnson & Johnson


If we execute these queries back to back, each one gets a new plan:

EXEC dbo.IFTTT 
    @Reputation = 1;
GO 

EXEC dbo.IFTTT 
    @Reputation = 2;
GO
[Image: SQL Server query plan, captioned “psychic driving”]

Optimize For You


The reason why is in the resulting queries, as usual. The Reputation column has enough skew present to trigger the PSP optimization, so executions with differently-bucketed parameter values end up with different plans.

option (PLAN PER VALUE(QueryVariantID = 3, predicate_range([StackOverflow2013].[dbo].[Users].[Reputation] = @Reputation, 100.0, 1000000.0)))

option (PLAN PER VALUE(QueryVariantID = 2, predicate_range([StackOverflow2013].[dbo].[Users].[Reputation] = @Reputation, 100.0, 1000000.0)))
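
If you want to poke at the variants yourself, Query Store tracks them in SQL Server 2022. Here’s a minimal sketch, assuming Query Store is enabled for the database and has collected the executions above:

/*
Each PSP variant links back to a parent query and a dispatcher plan.
Assumes Query Store is on and has seen the queries above.
*/
SELECT
    qv.query_variant_query_id,
    qv.parent_query_id,
    qv.dispatcher_plan_id,
    qsqt.query_sql_text
FROM sys.query_store_query_variant AS qv
JOIN sys.query_store_query AS qsq
    ON qsq.query_id = qv.query_variant_query_id
JOIN sys.query_store_query_text AS qsqt
    ON qsqt.query_text_id = qsq.query_text_id;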

And of course, each plan has different compile and runtime values:

[Image: SQL Server query plan, captioned “care”]

If I were to run this demo in a compatibility level under 160, this would all look totally different.
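
If you’re not sure where a database stands, here’s a quick sketch to check and change things. The database name comes from the demo, and the scoped configuration line assumes you want PSP available at all:

/*Check the current compatibility level*/
SELECT
    d.name,
    d.compatibility_level
FROM sys.databases AS d
WHERE d.name = N'StackOverflow2013';

/*PSP needs 160; it can also be toggled per database*/
ALTER DATABASE StackOverflow2013 SET COMPATIBILITY_LEVEL = 160;

ALTER DATABASE SCOPED CONFIGURATION
    SET PARAMETER_SENSITIVE_PLAN_OPTIMIZATION = ON;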

This is one change I’m sort of interested to see play out.

Thanks for reading!

Going Further


If this is the kind of SQL Server stuff you love learning about, you’ll love my training. I’m offering a 75% discount to my blog readers if you click from here. I’m also available for consulting if you just don’t have time for that and need to solve performance problems quickly.

SQL Server IF Branches And Query Performance Part 5: Does Dynamic SQL Work?

Bestest


If we want to start this post in the right place, we have to go back to 2009. That was the first time a (much) young(er) Erik Darling wrote dynamic SQL that didn’t suck.

There was a terrible piece of SSRS report code that used the standard catch-all pattern repeated over most of a dozen columns: (a.col = @a_parameter OR @a_parameter IS NULL)
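
In case you haven’t had the pleasure, here’s a rough sketch of that pattern next to the dynamic SQL fix, with a made-up table and parameter:

/*A stand-in for a proc parameter*/
DECLARE
    @a_parameter int = NULL;

/*The catch-all: one plan has to cover every combination of inputs*/
SELECT
    a.*
FROM dbo.a_table AS a
WHERE (a.col = @a_parameter OR @a_parameter IS NULL);

/*The non-sucky version: only build the predicates you need*/
DECLARE
    @sql nvarchar(MAX) = N'
SELECT
    a.*
FROM dbo.a_table AS a
WHERE 1 = 1';

IF @a_parameter IS NOT NULL
BEGIN
    SELECT
        @sql += N'
AND   a.col = @a_parameter';
END;

EXEC sys.sp_executesql
    @sql,
    N'@a_parameter int',
    @a_parameter;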

I neither pioneered the non-sucky dynamic SQL approach, nor did I end up solving the problem entirely just by implementing non-sucky dynamic SQL. I did get things most of the way there, but every so often something bad would still happen.

At no point did I stop, take a deep breath, and just use a recompile hint on this piece of code that would run every 6-8 hours at most.

I was obsessed. I was possessed.  I had to do things the right way.

But the right way didn’t always give me the best results, and I had a bad case of Egg-On-Face syndrome from swearing that I did things the right way but still ended up with a bad result.

Not all the time, but when it happened I’d get just as much guff as when things were always bad.

Goodie Howser


Good dynamic SQL and good stored procedures can suffer from the same issue: Parameter Sensitivity.

  • Plan A works great for Parameter A
  • Plan A does not work great for Parameter B
  • Plan B works great for Parameter B
  • Plan B does not work great for Parameter A

And so on. Forever. Until the dragons return. Which, you know, any day now would be great 🤌

In our quest to solve the problems with IF branching, we’ve tried a lot of things. Most of them didn’t work. Stored procedures worked for the most part, but we’ve still got a problem.

A parameter sensitivity problem.

The problem within stored procedures is a lot like the problems we saw with IF branches in general: the optimizer can’t ever pause to take a breath and make better decisions.

It would be nice if the concept of deferred name resolution had a deeper place in plan creation that would allow for deferred query optimization when IF branch boundaries are involved.

Instead, we’ll probably just keep getting different ways to work with JSON.

Looking forward to FOR YAML PATH queries, too.

Toothsome


The lovely part about dynamic SQL is that you get to construct your own query, and you can take what you know about data distributions to construct different queries.

You know all the Intelligent Query Processing and Adaptive doodads and gizmos galore that the optimizer is getting added to it? You could have been the optimizer all along.

But you were too busy rebuilding indexes and other memes that don’t bear mentioning at this late date.

We still need to write dynamic SQL the right way, but we also need to offer the optimizer a way to understand that while every parameter is created equally, not all data distributions are.

In other words, we need to give it some room to breathe.

But here I am meandering again. Let’s get on to the SQL.

Grim


I’m going to use the example from yesterday’s post to show you what you can do to further optimize queries like this.

To make the code fit in the post a little better, I’m going to skip the IF branch for the Posts table and go straight to Votes. Using dynamic SQL here will get you the same behavior as stored procedures, though.

CREATE OR ALTER PROCEDURE
    dbo.counter_if_dynamic
(
    @PostTypeId int = NULL,
    @VoteTypeId int = NULL,
    @CreationDate datetime = NULL
)
AS
SET NOCOUNT, XACT_ABORT ON;
BEGIN

DECLARE
    @sql nvarchar(4000) = N'',
    @posts_parameters nvarchar(4000) = N'@PostTypeId int, @CreationDate datetime',
    @votes_parameters nvarchar(4000) = N'@VoteTypeId int, @CreationDate datetime';

/*
    Cutting out the Posts query for brevity.
    This will behave similarly to stored procedures
    in separating the execution plan creation out.
*/

IF @VoteTypeId IS NOT NULL
BEGIN

SELECT
    @sql += N'
    /*dbo.counter_if_votes*/
    SELECT
        c = COUNT_BIG(*)
    FROM dbo.Posts AS p
    JOIN dbo.Votes AS v
        ON p.Id = v.PostId
    WHERE v.VoteTypeId = @VoteTypeId
    AND   v.CreationDate >= @CreationDate
    '
IF @VoteTypeId IN 
    (2, 1, 3, 5, 10, 6, 16, 15, 11, 8)
BEGIN
    SELECT
        @sql += 
      N'AND   1 = (SELECT 1);'
END;
ELSE
BEGIN
    SELECT
        @sql += 
      N'AND   2 = (SELECT 2);'
END;

EXEC sys.sp_executesql
    @sql,
    @votes_parameters,
    @VoteTypeId,
    @CreationDate;

END;

END;
GO 

I determined which values to assign to each branch by grabbing a count from the Votes table. You could theoretically automate this a bit by dumping the count into a helper table, and updating it every so often to reflect current row counts.

You’re essentially building your own histogram by doing that, which for this case makes sense because:

  • There’s a limited range of values to store and evaluate for an equality predicate
  • Getting the count is very fast, especially if you get Batch Mode involved

I basically ran the proc with recompile hints to see at which row threshold for the VoteTypeId filter I’d tip between Nested Loops and Hash Joins, which is what bit us yesterday.
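
If you wanted to automate the helper table idea, here’s a rough sketch. The table name and shape are mine, not gospel:

/*A hypothetical helper table for per-value row counts*/
CREATE TABLE dbo.VoteTypeCounts
(
    VoteTypeId int NOT NULL PRIMARY KEY,
    VoteCount bigint NOT NULL,
    UpdatedDate datetime2 NOT NULL
        DEFAULT SYSDATETIME()
);

/*Refresh every so often to reflect current row counts*/
TRUNCATE TABLE dbo.VoteTypeCounts;

INSERT
    dbo.VoteTypeCounts
(
    VoteTypeId,
    VoteCount
)
SELECT
    v.VoteTypeId,
    VoteCount = COUNT_BIG(*)
FROM dbo.Votes AS v
GROUP BY
    v.VoteTypeId;

The proc could then branch on whether the stored count crosses your tipping point, instead of hard-coding the IN list.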

Running Wild


Let’s compare these two runs!

DBCC FREEPROCCACHE
GO 
EXEC dbo.counter_if_dynamic
    @PostTypeId = NULL, 
    @VoteTypeId = 7,
    @CreationDate = '20080101';
GO 

EXEC dbo.counter_if_dynamic
    @PostTypeId = NULL, 
    @VoteTypeId = 2,
    @CreationDate = '20080101';
GO

Here’s what happens:

[Image: SQL Server query plan, captioned “exacto”]

For this example, we can consider this separation good enough.

D-Rive


Perhaps the larger point behind this series is that multi-purpose stored procedures are a pretty bad idea. In many cases they’re completely avoidable, but in others the queries are similar enough that grouping them into one proc makes sense at first glance.

If you’re going to do this, you need to be prepared to provide some additional separation for each query, or at least each branch of the logic.

Stored procedures can provide a reasonable amount of shelter, but you can still encounter parameter sensitivity issues.

In the end, parameterized dynamic SQL lets you generate queries that are just different enough, without changing their logic, that the optimizer spits out different query plans for them.

Thanks for reading!

Going Further


If this is the kind of SQL Server stuff you love learning about, you’ll love my training. I’m offering a 75% discount to my blog readers if you click from here. I’m also available for consulting if you just don’t have time for that and need to solve performance problems quickly.

SQL Server IF Branches And Query Performance Part 4: Do Stored Procedures Work?

Sub Sub


You know and I know and everyone knows that stored procedures are wonderful things that let you tune queries in magickal ways that stupid ORMs and ad hoc queries don’t really allow for.

Sorry about your incessant need to use lesser ways to manifest queries. They just don’t stack up.

But since we’re going to go high brow together, we need to learn how to make sure we don’t do anything to tarnish the image of our beloved stored procedures.

Like… Probably not doing most of the stuff that we talked about so far this week. We’ve seen some really bad ideas in action, which are largely only possible with stored procedures.

Giveth. Taketh.

It’s up to you.

Likened


The good news is that stored procedures help us with the problem we’ve been hitting all week: when different IF branches are executed, we get quality query plans for each one.

Here’s today’s setup, starting with the two procs that we’re going to be calling from our main proc in lieu of plain selects.

CREATE OR ALTER PROCEDURE
    dbo.counter_if_posts
(
    @PostTypeId int = NULL,
    @CreationDate datetime = NULL
)
AS
SET NOCOUNT, XACT_ABORT ON;
BEGIN

        SELECT
            c = COUNT_BIG(*)
        FROM dbo.Posts AS p
        JOIN dbo.Votes AS v
            ON p.Id = v.PostId
        WHERE p.PostTypeId = @PostTypeId
        AND   p.CreationDate >= @CreationDate;

END;
GO 

CREATE OR ALTER PROCEDURE
    dbo.counter_if_votes
(
    @VoteTypeId int = NULL,
    @CreationDate datetime = NULL
)
AS
SET NOCOUNT, XACT_ABORT ON;
BEGIN

        SELECT
            c = COUNT_BIG(*)
        FROM dbo.Posts AS p
        JOIN dbo.Votes AS v
            ON p.Id = v.PostId
        WHERE v.VoteTypeId = @VoteTypeId
        AND   v.CreationDate >= @CreationDate;

END;
GO

And here’s the main calling proc now:

CREATE OR ALTER PROCEDURE
    dbo.counter_if
(
    @PostTypeId int = NULL,
    @VoteTypeId int = NULL,
    @CreationDate datetime = NULL
)
AS
SET NOCOUNT, XACT_ABORT ON;
BEGIN
    
    IF @PostTypeId IS NOT NULL
    BEGIN
    
    EXEC dbo.counter_if_posts
        @PostTypeId = @PostTypeId,
        @CreationDate = @CreationDate;
    
    END;

    IF @VoteTypeId IS NOT NULL
    BEGIN
 
    EXEC dbo.counter_if_votes
        @VoteTypeId = @VoteTypeId,
        @CreationDate = @CreationDate;
    
    END;

END;
GO

In this example, I’m letting the two procs “share” the creation date parameter for simplicity. There is almost no good reason to have a separate one.

The Plans


The good news here is that we can run queries like this back to back and get well-optimized plans for each:

DBCC FREEPROCCACHE
GO 

EXEC dbo.counter_if
    @PostTypeId = 2, 
    @VoteTypeId = NULL,
    @CreationDate = '20080101';
GO 

EXEC dbo.counter_if
    @PostTypeId = NULL, 
    @VoteTypeId = 2,
    @CreationDate = '20080101';
GO
[Image: SQL Server query plan, captioned “tip top”]

This is much happier than any of the results we’ve seen so far. The queries run quickly, have no spills, and cardinality estimation is on point.

Miced


The problem is when we do something like this, which is an actual parameter sensitivity issue. We can use this as an extreme example:

DBCC FREEPROCCACHE
GO 
EXEC dbo.counter_if
    @PostTypeId = NULL, 
    @VoteTypeId = 4,
    @CreationDate = '20080101';
GO 

EXEC dbo.counter_if
    @PostTypeId = NULL, 
    @VoteTypeId = 2,
    @CreationDate = '20080101';
GO

There are not many Vote Types of 4. There are quite a few manys of Type 2.

[Image: SQL Server query plan, captioned “summation”]

Going from 3 milliseconds to one minute and twenty seconds is what many would consider a failure.

I would agree.

More Problem Than Problem


We’ve solved the problem of getting different individual queries to optimize well in IF branches by using stored procedures.

The different execution contexts here provide optimization fences between IF branches, but we still need to worry about parameter sensitivity for each stored procedure.
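
If a branch runs infrequently enough, one blunt option is a statement-level recompile hint inside the sub-procedure. A sketch, not a blanket recommendation, since you pay compile CPU on every execution:

/*Inside dbo.counter_if_votes, recompile just the sensitive statement*/
SELECT
    c = COUNT_BIG(*)
FROM dbo.Posts AS p
JOIN dbo.Votes AS v
    ON p.Id = v.PostId
WHERE v.VoteTypeId = @VoteTypeId
AND   v.CreationDate >= @CreationDate
OPTION (RECOMPILE);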

Tomorrow we’re going to look at how we can use dynamic SQL to resolve some parameter sensitivity issues.

Thanks for reading!

Going Further


If this is the kind of SQL Server stuff you love learning about, you’ll love my training. I’m offering a 75% discount to my blog readers if you click from here. I’m also available for consulting if you just don’t have time for that and need to solve performance problems quickly.

SQL Server IF Branches And Query Performance Part 3: Local Variables Still Suck

Blip In The Wall


Parameters get a bad name. It seems everything that gets sniffed does, whether it’s a stored procedure, a piece of anatomy, or an illicit substance.

Once you stop worshipping memes, you’ll find that things aren’t always black and white. End results are important.

What never seems to get a bad name, despite numerical supremacy in producing terrible results, are local variables.

In this particular scenario, I see developers use them to try to beat “parameter sniffing” to no avail.

A chorus of “it seemed to work at the time”, “I think it made things a little better”, “it worked on my machine”, and all that will ensue.

But we know the truth.

Hush, Puppy


This general strategy is one that I see often enough to warrant a blog post to talk about how bad it is.

It’s a little different from yesterday, but equally misguided in its effect.

CREATE OR ALTER PROCEDURE
    dbo.counter_if
(
    @PostTypeId int = NULL,
    @VoteTypeId int = NULL,
    @CreationDate datetime = NULL
)
AS
SET NOCOUNT, XACT_ABORT ON;
BEGIN
    
    IF @PostTypeId IS NOT NULL
    BEGIN

        DECLARE
            @PostTypeIdCopy int = @PostTypeId,
            @CreationDatePostCopy datetime = @CreationDate;
    
        SELECT
            c = COUNT_BIG(*)
        FROM dbo.Posts AS p
        JOIN dbo.Votes AS v
            ON p.Id = v.PostId
        WHERE p.PostTypeId = @PostTypeIdCopy
        AND   p.CreationDate >= @CreationDatePostCopy;
    
    END;

    IF @VoteTypeId IS NOT NULL
    BEGIN

        DECLARE
            @VoteTypeIdCopy int = @VoteTypeId,
            @CreationDateVoteCopy datetime = @CreationDate;
    
        SELECT
            c = COUNT_BIG(*)
        FROM dbo.Posts AS p
        JOIN dbo.Votes AS v
            ON p.Id = v.PostId
        WHERE v.VoteTypeId = @VoteTypeIdCopy
        AND   v.CreationDate >= @CreationDateVoteCopy;
    
    END;

END;

In this (bad, threatening individual) display of Transact-SQL, we declare local copies of each parameter as variables, and then use those copies in the queries.

There are other variations on this theme that I see from time to time, but I think that you get the point.

Cardinality Sins


What I want to show you here is related to cardinality estimation issues with the above strategy.

DBCC FREEPROCCACHE
GO 
EXEC dbo.counter_if
    @PostTypeId = 8, 
    @VoteTypeId = NULL,
    @CreationDate = '20080101';
GO 
DBCC FREEPROCCACHE
GO 
EXEC dbo.counter_if
    @PostTypeId = 2, 
    @VoteTypeId = NULL,
    @CreationDate = '20080101';
GO 

DBCC FREEPROCCACHE
GO 
EXEC dbo.counter_if
    @PostTypeId = NULL, 
    @VoteTypeId = 4,
    @CreationDate = '20080101';
GO 
DBCC FREEPROCCACHE
GO 
EXEC dbo.counter_if
    @PostTypeId = NULL, 
    @VoteTypeId = 2,
    @CreationDate = '20080101';
GO

The point here is to execute each query with a cleared plan cache, and to show that the cardinality estimate for both parameter values is the same, and also completely wrong.

Posts


Here are the plans that hit the Posts table:

[Image: SQL Server query plan, captioned “nope”]

Votes


Here are query plans for the Votes table:

[Image: SQL Server query plan, captioned “also nope”]

Planama!


Regardless of how many rows may actually flow out of the table, SQL Server makes the same “density vector” guess, which sucks in these cases.

This isn’t parameter sniffing, or plan caching, because we’re wiping out the plan cache between executions. This is what happens when local variables get used, even when you copy parameter values over to them.
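
If you want to see where that guess comes from, you can dig the density vector out yourself. A sketch, assuming a statistics object leads on PostTypeId; the stats name in the DBCC call is a placeholder you’d fill in from the first query:

/*Find a statistics object that leads on PostTypeId*/
SELECT
    s.name
FROM sys.stats AS s
JOIN sys.stats_columns AS sc
    ON  sc.object_id = s.object_id
    AND sc.stats_id = s.stats_id
JOIN sys.columns AS c
    ON  c.object_id = sc.object_id
    AND c.column_id = sc.column_id
WHERE s.object_id = OBJECT_ID(N'dbo.Posts')
AND   c.name = N'PostTypeId'
AND   sc.stats_column_id = 1;

/*Plug that name in here; "All density" * table rows = the flat guess*/
DBCC SHOW_STATISTICS ('dbo.Posts', 'stats_name_goes_here') WITH DENSITY_VECTOR;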

I’ve seen this work under rare circumstances where SQL Server’s estimates were bad regardless of the supplied parameters, but… I want to emphasize that it’s incredibly rare for this to be a good long-term solution.

You can sort of ignore the cardinality estimates on the joined tables, because the Bitmaps are going to throw those off a bit.

So, we’ve seen the problem, and we’ve seen things that don’t work to fix the problem.

Tomorrow we’re going to see if stored procedures will give us better results.

Thanks for reading!

Going Further


If this is the kind of SQL Server stuff you love learning about, you’ll love my training. I’m offering a 75% discount to my blog readers if you click from here. I’m also available for consulting if you just don’t have time for that and need to solve performance problems quickly.

SQL Server IF Branches And Query Performance Part 2: Trying To Fix Parameters Doesn’t Work

Jerked


Everyone thinks they’ve outsmarted the optimizer. All the time.

Like it’s a bumbling video game security guard that walks in the same circle and can’t see you if you just hold real still.

In reality, the optimizer is more like a dutiful parent playing along with your childish ruses.

One thing I see developers do quite a bit is try to “fix” a parameter in an IF branch.

Maybe it’s to protect against bad search values, but more often it’s to nix NULLs.

I know that the stored procedure I’m showing only has one branch in it where a query is executed, and the series is supposed to be about IF branching with multiple queries.

I’m only doing that to simplify the point of this post, which is that “fixing” supplied values does not correct performance and cardinality estimation issues with queries in IF branches.

Sometimes that’s easier to demonstrate without additional noise.

The Thing


Here’s close to what I normally see someone trying:

CREATE OR ALTER PROCEDURE
    dbo.counter_if
(
    @PostTypeId int = NULL,
    @CreationDate datetime = NULL
)
AS
SET NOCOUNT, XACT_ABORT ON;
BEGIN

    IF @CreationDate IS NULL
    BEGIN
        SET @CreationDate = '20080101';
    END;
    
    IF @PostTypeId IS NOT NULL
    BEGIN
    
        SELECT
            c = COUNT_BIG(*)
        FROM dbo.Posts AS p
        JOIN dbo.Users AS u
            ON p.OwnerUserId = u.Id
        WHERE p.PostTypeId = @PostTypeId
        AND   p.CreationDate >= @CreationDate;
    
    END;

END;
GO

The problem here is that by the time we hit the point where @CreationDate gets set to another value, we’ve already got a query plan.

You might get a search for the value you assign there, but the plan gets optimized for NULL.

Puddings


If you execute the proc like so, and get the query plan for it, here’s what happens:

EXEC dbo.counter_if
    @PostTypeId = 2;
[Image: SQL Server query plan, captioned “humbled”]

We get a real bad cardinality estimate there, and I’ll show you that it’s because of the NULL we passed in, even though we set it to 2008-01-01 later.

[Image: query plan parameter list, captioned “video”]

Digging into the operator properties of the select, here’s what the execution plan shows us about the parameters:

  • @PostTypeId is compiled and executed with 2 for both
  • @CreationDate is compiled with NULL, but executed with 2008-01-01 00:00:00.000

Different World


If we clear out the procedure cache — and I’m allowed to do that because I am a doctor (in Minecraft) — and re-run the proc with 2008-01-01, we get accurate cardinality estimation.

EXEC dbo.counter_if
    @PostTypeId = 2, 
    @CreationDate = '20080101';
[Image: SQL Server query plan, captioned “morphin’”]

We no longer get a one row estimate. Look at us. Look at how smart we are.

I’m starting to understand why so many people hate NULLs.

But Is It Null?


For brevity, I’m going to list out a bunch of similar patterns that also end up poorly:

SELECT
    c = COUNT_BIG(*)
FROM dbo.Posts AS p
JOIN dbo.Users AS u
    ON p.OwnerUserId = u.Id
WHERE p.PostTypeId = @PostTypeId
AND   p.CreationDate >= ISNULL(@CreationDate, '20080101')
AND   p.CreationDate >= COALESCE(@CreationDate, '20080101')
AND   (p.CreationDate >= @CreationDate OR @CreationDate IS NULL)
AND   p.CreationDate >= CASE WHEN @CreationDate IS NULL THEN p.CreationDate ELSE @CreationDate END

None of these patterns or similar permutations yield desirable results in most cases.

You may find an edge case where they’re acceptable, but most folks I end up talking to aren’t calling me because what they’ve done is working out well.

More or less, they all result in this estimate/plan:

[Image: SQL Server query plan, captioned “skewpie”]

See? You’re still not clever, and I still got your nose. Go play outside, slugger.

S Dot


Hopefully by now you can see why this technique doesn’t necessarily give you good results.

In tomorrow’s post, we’ll look at another anti-pattern I see a lot with local variables.

If you’re looking for working solutions, you’re gonna have to hang on until the end of the week.

That’s just how culminations work. They Culm and then they Inate.

Duh.

Thanks for reading!

Going Further


If this is the kind of SQL Server stuff you love learning about, you’ll love my training. I’m offering a 75% discount to my blog readers if you click from here. I’m also available for consulting if you just don’t have time for that and need to solve performance problems quickly.

SQL Server IF Branches And Query Performance Part 1: The Problem

Manifesto


This is a problem I deal with quite a bit when helping people track down performance problems and, you know, solve them.

The basic scenario is something like this:

CREATE PROCEDURE
    dbo.iffy_kid
(
    @p1 int,
    @p2 int,
    @decider varchar(10)
)
AS
SET NOCOUNT, XACT_ABORT ON;

IF @decider = 'this_table'
BEGIN

    SELECT
        this.*
    FROM dbo.this_table AS this
    WHERE this.this_column = @p1;

END;

IF @decider = 'that_table'
BEGIN

    SELECT
        that.*
    FROM dbo.that_table AS that
    WHERE that.that_column = @p2;

END

ELSE
BEGIN

    /*Do something else*/

END;

You have some parameter that decides which logical execution path a query will take, and different queries that run based on that path.

What this does not control is query optimization paths, or cardinality estimation paths, at least not when written in this manner.

First Blood


When this stored procedure is executed for the first time, or when some recompilation event happens, both queries will get a query plan generated and cached.

For simplicity, let’s say that when a query plan is cached, it’s compiled and executed with

  • @p1 = 100
  • @p2 = NULL
  • @decider = ‘this_table’

SQL Server’s query optimizer will generate a query plan for the entire stored procedure based on cardinality estimation for:

  • @p1 = 100 as a predicate on this_table
  • @p2 = NULL as a predicate on that_table

On future executions, if the runtime execution parameters change to:

  • @p1 = NULL
  • @p2 = 200
  • @decider = ‘that_table’

The query plan with cardinality estimation for @p2 = NULL will be reused.
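
You don’t have to take my word for it. The compile-time values get baked into the cached plan XML, and a sketch like this can dig them out, assuming the proc above exists and has a cached plan:

/*Pull compile-time parameter values out of the cached plan*/
WITH XMLNAMESPACES
    (DEFAULT 'http://schemas.microsoft.com/sqlserver/2004/07/showplan')
SELECT
    parameter_name =
        p.pr.value('@Column', 'nvarchar(128)'),
    compiled_value =
        p.pr.value('@ParameterCompiledValue', 'nvarchar(256)')
FROM sys.dm_exec_procedure_stats AS deps
CROSS APPLY sys.dm_exec_query_plan(deps.plan_handle) AS deqp
CROSS APPLY deqp.query_plan.nodes
    ('//ParameterList/ColumnReference') AS p(pr)
WHERE deps.object_id = OBJECT_ID(N'dbo.iffy_kid');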

You’ve essentially multiplied any parameter sensitivity issue by:

  • The number of separate IF branched queries
  • The number of parameters fed into the stored procedure

Exploration


Over the rest of the week, I’m going to cover this topic from a few different angles to show you what works and what doesn’t work for fixing the problem.

Clients that I work with are often very surprised by the gotchas, intricacies, and weird details that crop up when writing queries like this.

Thanks for reading!

Going Further


If this is the kind of SQL Server stuff you love learning about, you’ll love my training. I’m offering a 75% discount to my blog readers if you click from here. I’m also available for consulting if you just don’t have time for that and need to solve performance problems quickly.

Software Vendor Mistakes With SQL Server: IF Branching In Stored Procedures

What Goes Wrong


I still see people falling into this trap when writing stored procedures. What no one seems to realize until it’s too late, when data grows past toy-database size, is that SQL Server’s query optimizer doesn’t respect IF branches as boundaries in the way you’d expect.

Sure, the control-flow logic boundaries are respected, but the big surprise to most people comes from how the optimizer treats query plans in IF branches: It compiles them all no matter what.

In the video below, I’ll show you exactly what that means, and how it can really screw up performance. This video is a small part of my paid training offering, and if you enjoy it, there’s a link at the end of the post to get 75% off.

IF Branching Video!


Thanks for reading!

Going Further


If this is the kind of SQL Server stuff you love learning about, you’ll love my training. I’m offering a 75% discount to my blog readers if you click from here. I’m also available for consulting if you just don’t have time for that and need to solve performance problems quickly.

Starting SQL: Why Not Make Every SQL Server Query Dynamic SQL?

Problem Solver


I think it was Doug Lane who coined the stages of dynamic SQL. One of them dealt with the fact that once you start using it, you just wanna use it everywhere, even places where you think you don’t need it.

Most of the time, you don’t. A parameter is good enough. But like we saw last week, sometimes parameters can backfire, and you can use dynamic SQL to save query performance.

That’s one great use, but it’s not one you’re gonna have to use constantly. Most of the time when you’re using dynamic SQL, it’s not going to be to correct performance.

  • You want to build or use a different string based on some input
  • You have a statement you want to execute over multiple targets
  • Your script has to support multiple versions of SQL Server

Of course, one can’t reasonably write about dynamic SQL in SQL Server without linking to Erland. Boy howdy, did I learn most everything I know from him.

I Disconnect From You


One of the big downsides of dynamic SQL is that statements in a stored procedure are no longer associated with that stored procedure in system DMVs.

Of course, you can address this somewhat by adding a comment to the query inside the dynamic SQL:

/*headless*/
DECLARE @super_cool_sql NVARCHAR(MAX) = N'
SELECT * 
FROM ALL THE TABLES!
';

/*more headed*/
DECLARE @super_cool_sql NVARCHAR(MAX) = N'
SELECT * 
/*super cool procedure name*/
FROM ALL THE TABLES!
';

Where you put the comment is irrelevant, but if it’s a particularly long query, I’d probably want it close to or above the select so it doesn’t get truncated.

But we’re all nice people who don’t write queries with more than 65k characters and spaces in them.

Right? Right.

While it’s nice to know where they came from, they’re not easy to track down because they don’t have a parent object id — they’re rogue agents in the plan cache.

It can also make troubleshooting a stored procedure difficult because it can be a little harder to see which statements did the most work.

  • You might be conditionally executing certain blocks of dynamic SQL for different reasons
  • You might be building dynamic where clauses that have different performance profiles
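
Circling back to those comment tags: once they’re in place, a sketch like this can fish the tagged statements back out of the plan cache, using the tag from the example above:

SELECT
    dest.text,
    deqs.execution_count,
    deqs.total_worker_time,
    deqs.total_elapsed_time
FROM sys.dm_exec_query_stats AS deqs
CROSS APPLY sys.dm_exec_sql_text(deqs.sql_handle) AS dest
WHERE dest.text LIKE N'%super cool procedure name%'
AND   dest.text NOT LIKE N'%dm_exec_query_stats%'; /*skip this query itself*/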

Other Thans


The first thing I want to cover outright is that IF branching without dynamic SQL does not work. Flat out.

Yes, you can control the flow of logic, but it has no desirable impact on query plan compilation. Everything gets compiled the first time.

Instead of dynamic SQL, though, you could use separate stored procedures, which at least makes the objects a little easier to track down in the plan cache or Query Store.

CREATE OR ALTER PROCEDURE dbo.VoteCount (@VoteTypeId INT, @YearsBack INT)
AS
BEGIN

IF @VoteTypeId IN (2, 1, 3, 5, 10, 6)
BEGIN
    EXEC dbo.VoteCount_InnerBig @VoteTypeId, @YearsBack;
END;

IF @VoteTypeId IN (16, 15, 11, 8, 9, 7, 12, 4)
BEGIN
    EXEC dbo.VoteCount_InnerSmall @VoteTypeId, @YearsBack;
END;

END;

Both of those stored procedures can have the same statement in them, without the ? = (SELECT ?) addition needed with the dynamic SQL option.

That they’re owned by different objects is enough to get them separate optimization paths. You’re also a bit less prone to permissions issues, if you’re the kind of person who takes those seriously. But if your app logs in as db_owner or whatever, well, BOFL with that.

Speaking of which, even though I find security incredibly dull and frustrating, let’s talk a little about how people can take advantage of bad dynamic SQL.
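
As a preview, most of the danger comes down to whether user input gets concatenated into the string or passed as a parameter. A quick sketch with a made-up search value:

DECLARE
    @search nvarchar(100) = N'Eggs'; /*pretend this came from an app*/

/*Concatenated: whatever is in @search executes as code*/
DECLARE
    @unsafe nvarchar(MAX) = N'
SELECT
    u.Id
FROM dbo.Users AS u
WHERE u.DisplayName = ''' + @search + N''';';

EXEC (@unsafe); /*please don't*/

/*Parameterized: input stays data, and plans get reused*/
EXEC sys.sp_executesql
    N'
SELECT
    u.Id
FROM dbo.Users AS u
WHERE u.DisplayName = @search;',
    N'@search nvarchar(100)',
    @search;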

Thanks for reading!

Going Further


If this is the kind of SQL Server stuff you love learning about, you’ll love my training. I’m offering a 75% discount to my blog readers if you click from here. I’m also available for consulting if you just don’t have time for that and need to solve performance problems quickly.

SQL Server’s Not Very Mighty IF Branch

Falling For Rome


Okay, so like, an IF branch in a stored proc can be helpful to control logic, but not to control performance.

That’s the most important line in the blog post, now lemme show you why.

All Possible Permutations Thereof


Let’s say for our stored procedure, we want to use a different plan for different TOPs, and our tipping point is 10,000.

That’s the tip of our TOP, if you will. And you will, because my name’s on the blog, pal.

CREATE OR ALTER PROCEDURE dbo.top_sniffer (@top INT, @vtid INT)
AS
BEGIN

    IF @top <= 10000
    BEGIN
    SELECT   TOP (@top) 
             v.Id, 
             v.PostId, 
             v.UserId, 
             v.BountyAmount, 
             v.VoteTypeId, 
             v.CreationDate
    FROM     dbo.Votes AS v
    WHERE    v.VoteTypeId = @vtid
    ORDER BY v.CreationDate DESC;
    END
    
    IF @top > 10000
    BEGIN
    SELECT   TOP (@top) 
             v.Id, 
             v.PostId, 
             v.UserId, 
             v.BountyAmount, 
             v.VoteTypeId, 
             v.CreationDate
    FROM     dbo.Votes AS v
    WHERE    v.VoteTypeId = @vtid
    ORDER BY v.CreationDate DESC;
    END

END;

Soup Sandwich


This goes quite poorly. If we just get estimated plans, here’s what they produce.

/*Run me*/
EXEC dbo.top_sniffer @top = 1, @vtid = 2;

/*Run me*/
EXEC dbo.top_sniffer @top = 5000000, @vtid = 2;

Contestant Number One

[Image: SQL Server query plan, captioned “Why are there two with the same plan?”]

Contestant Number Two

The optimizer explores both paths, and the plan cache concurs.

[Image: plan cache results, captioned “dead giveaway”]

If you were to run it with the higher value first, you’d see the same thing for the parallel plans.
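
If you want to check for yourself, the single cached plan for the proc contains both branch statements, regardless of which one actually ran. A sketch:

/*Both branch queries live in the one cached plan's XML*/
SELECT
    decp.objtype,
    decp.usecounts,
    deqp.query_plan
FROM sys.dm_exec_cached_plans AS decp
CROSS APPLY sys.dm_exec_query_plan(decp.plan_handle) AS deqp
WHERE deqp.objectid = OBJECT_ID(N'dbo.top_sniffer');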

Logic, Not Performance


Making plan choices with IF branches like this plain doesn’t work.

The optimizer compiles a plan for both branches based on the initial compile value.

What you end up with is a stored proc that doesn’t do what it’s supposed to do, and parameter sniffing times two.

For a lot more information and examples, check out this Stack Exchange Q&A.

Thanks for reading!

Going Further


If this is the kind of SQL Server stuff you love learning about, you’ll love my training. I’m offering a 75% discount to my blog readers if you click from here. I’m also available for consulting if you just don’t have time for that and need to solve performance problems quickly.