Why SQL Server’s OPTIMIZE FOR UNKNOWN Hint Hurts Performance

“Best Practice”


It’s somewhat strange to hear people carry on about best practices that are actually worst practices.

One worst practice that has strong staying power is the OPTIMIZE FOR UNKNOWN hint, which we talked about yesterday.

It probably doesn’t help that Microsoft has products (I’m looking at you, Dynamics) which have a setting to add the hint to every query. Shorter: If Microsoft recommends it, it must be good.

Thanks, Microsoft. Dummies.

Using the OPTIMIZE FOR UNKNOWN hint, or declaring variables inside a code block to be used in a where clause, has the same issue, though: both make SQL Server’s query optimizer rely on bad guesses, which often lead to bad execution plans.

You can read great detail about that here.

Mistakenly


We’re going to create two indexes on the Posts table:

CREATE INDEX
    p0
ON dbo.Posts
(
    OwnerUserId
)
WITH
(
    SORT_IN_TEMPDB = ON,
    DATA_COMPRESSION = PAGE
);
GO

CREATE INDEX
    p1
ON dbo.Posts
(
    ParentId,
    CreationDate,
    LastActivityDate
)
INCLUDE
(
    PostTypeId
)
WITH
(
    SORT_IN_TEMPDB = ON,
    DATA_COMPRESSION = PAGE
);
GO

The indexes themselves are not as important as how SQL Server goes about choosing them.

Support Wear


This stored procedure is going to call the same query in three different ways:

  • One with the OPTIMIZE FOR UNKNOWN hint that uses parameters
  • One with local variables set to parameter values with no hints
  • One that accepts parameters and uses no hints
CREATE OR ALTER PROCEDURE
    dbo.unknown_soldier
(
    @ParentId int,
    @OwnerUserId int
)
AS
BEGIN
SET NOCOUNT, XACT_ABORT ON;

    SELECT TOP (1)
        p.*
    FROM dbo.Posts AS p
    WHERE p.ParentId = @ParentId
    AND   p.OwnerUserId = @OwnerUserId
    ORDER BY
        p.Score DESC,
        p.Id DESC
    OPTION(OPTIMIZE FOR UNKNOWN);

    DECLARE
        @ParentIdInner int = @ParentId,
        @OwnerUserIdInner int = @OwnerUserId;

    SELECT TOP (1)
        p.*
    FROM dbo.Posts AS p
    WHERE p.ParentId = @ParentIdInner
    AND   p.OwnerUserId = @OwnerUserIdInner
    ORDER BY
        p.Score DESC,
        p.Id DESC;

    SELECT TOP (1)
        p.*
    FROM dbo.Posts AS p
    WHERE p.ParentId = @ParentId
    AND   p.OwnerUserId = @OwnerUserId
    ORDER BY
        p.Score DESC,
        p.Id DESC;

END;
GO

Placebo Effect


If we call the stored procedure with actual execution plans enabled, we get the following plans back.

EXEC dbo.unknown_soldier 
    @OwnerUserId = 22656, 
    @ParentId = 0;
SQL Server Query Plan With Optimize For Unknown Hint
Not a good guess.

The assumed selectivity that the OPTIMIZE FOR UNKNOWN hint produces as a cardinality estimate is way off the rails.

SQL Server thinks three rows are going to come back, but we get 6,050,820 rows back.

We get identical behavior from the second query that uses variables declared within the stored procedure, and set to the parameter values passed in.

SQL Server Query Plan With Local Variables
release me

Same poor guesses, same index choices, same long running plan.

Parameter Effect


The query that accepts parameters and doesn’t have any hints applied to it fares much better.

SQL Server Query Plan
transporter

In this case, we get an accurate cardinality estimate, and a more suitable index choice.

Note that both queries perform lookups, but this one performs far fewer of them because it uses an index that filters way more rows out prior to doing the lookup.

The optimizer is able to choose the correct index because it’s able to evaluate predicate values against the statistics histograms rather than using the assumed selectivity guess.
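If you want to see what the optimizer sees, you can peek at the statistics on the p0 index we created above. This is just a quick sketch of mine, not part of the original demo: the histogram is what gets used when real values get sniffed, and the density vector is where the “unknown” guess comes from.

/* Histogram steps: what the optimizer uses when it can sniff actual values */
DBCC SHOW_STATISTICS('dbo.Posts', p0) WITH HISTOGRAM;

/* Density vector: where the assumed selectivity for "unknown" comes from */
DBCC SHOW_STATISTICS('dbo.Posts', p0) WITH DENSITY_VECTOR;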

Thanks for reading!

Going Further


If this is the kind of SQL Server stuff you love learning about, you’ll love my training. I’m offering a 75% discount to my blog readers if you click from here. I’m also available for consulting if you just don’t have time for that, and need to solve database performance problems quickly. You can also get a quick, low cost health check with no phone time required.

How OPTIMIZE FOR UNKNOWN Makes Troubleshooting SQL Server Performance Problems Harder

Detained


Despite the many metric tons of blog posts warning people about this stuff, I still see plenty of local variables and optimize for unknown hints. As a solution to parameter sniffing, it’s the right choice maybe one time in a thousand. I still end up having to fix the other 999, though.

In this post, I want to show you how using either optimize for unknown or local variables makes my job — and the job of anyone trying to fix this stuff — harder than it should be.

Passenger


Like most things, we’re going to start with an index:

CREATE INDEX r ON dbo.Users(Reputation);
GO 

I’m going to have a stored procedure that uses three different ways to pass a value to a where clause:

CREATE OR ALTER PROCEDURE 
    dbo.u 
(
    @r int, 
    @u int
)
AS
BEGIN

    /* Regular parameter */
    SELECT
        c = COUNT_BIG(*)
    FROM dbo.Users AS u
    WHERE u.Reputation = @r
    AND   u.UpVotes = @u;

    /* Someone who saw someone else do it at their last job */
    DECLARE 
        @LookMom int = @r,
        @IDidItAgain int = @u;
    
    SELECT
        c = COUNT_BIG(*)
    FROM dbo.Users AS u
    WHERE u.Reputation = @LookMom
    AND   u.UpVotes = @IDidItAgain;

    /* Someone who read the blog post URL wrong */
    SELECT
        c = COUNT_BIG(*)
    FROM dbo.Users AS u
    WHERE u.Reputation = @r
    AND   u.UpVotes = @u
    OPTION(OPTIMIZE FOR UNKNOWN);

END;
GO

First Way


The best case is we run this for a small number of rows, and no one really notices. Even though we get bad guesses for the second two queries, it’s not a huge deal.

SQL Server Query Plan
hands on

When you run procs like this, SQL Server doesn’t cache the compile time values the same way it does when you use parameters. Granted, this is because it technically shouldn’t matter, but if you’re looking for a way to execute the proc again to reproduce the issue, it’s up to you to go figure out what someone did.
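For parameterized queries, the compiled values live in the plan XML, and you can dig them out with something like this rough sketch of mine (not from the post) against the dbo.u procedure. For the hinted and local variable queries, there’s nothing useful waiting for you.

/* Pull compiled parameter values out of the cached plan for dbo.u */
WITH XMLNAMESPACES
    (DEFAULT 'http://schemas.microsoft.com/sqlserver/2004/07/showplan')
SELECT
    proc_name = OBJECT_NAME(ps.object_id, ps.database_id),
    parameter_name = pl.c.value('@Column', 'sysname'),
    compiled_value = pl.c.value('@ParameterCompiledValue', 'nvarchar(256)')
FROM sys.dm_exec_procedure_stats AS ps
CROSS APPLY sys.dm_exec_query_plan(ps.plan_handle) AS qp
CROSS APPLY qp.query_plan.nodes('//ParameterList/ColumnReference') AS pl(c)
WHERE ps.object_id = OBJECT_ID(N'dbo.u');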

SQL Server Query Plan
compiled values vs. runtime values

Since I’m getting the actual plans here, I get the runtime values for both, but those don’t show up in the plan cache or query store version of plans.

That’s typically a huge blind spot when you’re trying to fix performance issues of any kind, but it’s up to you to capture that stuff.

Just, you know, good luck doing it in a way that doesn’t squash performance.

Second Way


In this example, our index is only on the Reputation column, but our where clause is also on the UpVotes column.

In nearly every situation, it’s better to have your query do all the filtering it can from one index source — there are obviously exceptions — but the point here is that the optimizer doesn’t bother with a missing index request for the second two queries, only for the first one.

That doesn’t matter much if you’re looking at the query and plan right in front of you, but if you’re also using the missing index DMVs to get some idea about how useful a new index might be overall, you’re out of luck.
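If you do want to check those DMVs anyway, here’s a quick sketch of mine of the kind of query I mean — the missing index trio joined together:

/* What the missing index DMVs have collected for this database */
SELECT
    mid.statement,
    mid.equality_columns,
    mid.inequality_columns,
    mid.included_columns,
    migs.user_seeks,
    migs.avg_user_impact
FROM sys.dm_db_missing_index_details AS mid
JOIN sys.dm_db_missing_index_groups AS mig
    ON mig.index_handle = mid.index_handle
JOIN sys.dm_db_missing_index_group_stats AS migs
    ON migs.group_handle = mig.index_group_handle
WHERE mid.database_id = DB_ID()
ORDER BY migs.avg_user_impact DESC;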

SQL Server Query Plan
mattered

In this case, the optimizer doesn’t think the second two plans are costly enough to warrant anything, but it does for the first plan.

I’m not saying that queries with local variables or optimize for unknown hints always do this, or that parameterized plans will always ask for (good) indexes. There are many issues with costing and SARGability that can prevent them from showing up, including getting a trivial plan.

This is just a good example of how Doing Goofy Things™ can backfire on you.

Thanks for reading!

Going Further


If this is the kind of SQL Server stuff you love learning about, you’ll love my training. I’m offering a 75% discount to my blog readers if you click from here. I’m also available for consulting if you just don’t have time for that and need to solve performance problems quickly.

What’s The Best Statistics Sample Rate For SQL Server?

Big Think


Most people fire off their index and statistics maintenance jobs with the default options and do just fine. Sometimes, though, statistics need a little bit more consideration, and I get a lot of questions about them.

  • Do I need to use a sampling rate other than the default?
  • Do I need to use FULLSCAN?
  • What if auto update stats ruins my good statistics?
  • Should I update statistics asynchronously?
  • Are there easy ways to tell if my statistics are causing problems?

There are a lot of local factors involved in answering these questions, and I can’t cover all of them in this post, but I can outline some general patterns I see when people have these concerns.

Is The Default Sampling Rate Good Enough?


Look — usually it is — I know you hate to hear it. You want to be that special edge case where it’s not. Moreover, it’s really efficient.

The tradeoff between higher sampling rates (and potentially better statistics) and the time it takes to update stats can be significant, especially on larger tables. And a lot of the time, those 201 histogram steps don’t end up a whole lot more informative than before.

I’m not saying never, but in particular the difference between 75% and 100% scans is kind of a shrug.

Do I Need A Full Scan?


If you’re at the point where you’re asking this question, you’re in one of two situations:

  • You’re just starting out with stats updates
  • You’re unhappy with the way your stats look now

If you’re just starting out, stop right here and just make sure you’re keeping them updated regularly.

If you’re unhappy with the way they look because of the way updates — manual or automatic — are going now, you might want to think about creating filtered statistics on the parts of the data that aren’t well-reflected by them.
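To sketch out what that might look like — the table, column, and date range here are just my placeholders, not a prescription — a filtered statistic is nothing more than CREATE STATISTICS with a WHERE clause:

/* Filtered stats over the slice the regular histogram doesn't represent well */
CREATE STATISTICS s_posts_recent
ON dbo.Posts (CreationDate)
WHERE CreationDate >= '20200101'
WITH FULLSCAN;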

Is Auto Update Stats Bad?


No, not usually. It can be quite helpful. Especially on supported versions of SQL Server where trace flag 2371 is the default behavior, it doesn’t take as many modifications for them to happen on large tables.

This can better help you avoid the ascending key problem that used to really hurt query plans, where off-histogram values got some really bad estimates. That’s somewhat addressed in higher compat levels, but I’ve still seen some wonkiness.

If you don’t want auto update stats to ruin your higher sampling rates, you probably wanna use the PERSIST_SAMPLE_PERCENT option, with your chosen value.
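Something along these lines — the 75 percent number is just a placeholder, pick whatever you’ve settled on:

/* Pin the sampling rate so automatic updates don't fall back to the default */
UPDATE STATISTICS dbo.Posts
WITH SAMPLE 75 PERCENT, PERSIST_SAMPLE_PERCENT = ON;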

But again, you wanna be careful with this on big tables, and this is when you might also wanna use…

Asynchronous Stats Updates


This setting can be really useful for avoiding long running queries while waiting for updated stat histograms to be available. The downside is that you might hit some woof query plans until the background task does its thing and the plans get invalidated.

Make sure you’re solving the right problem by flipping this switch. SQL Server 2019 can help you figure out if this is something you’re hitting regularly, with the WAIT_ON_SYNC_STATISTICS_REFRESH wait type. Keep in mind this only shows up when stats are refreshed. Not when they’re created initially.

You can also see stuff like this crop up in monitoring tools and query store when queries take a while, but they’re doing stuff like SELECT StatMan.
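If you want to try it out, the switch and the check look roughly like this — a sketch, with the wait type only showing up on SQL Server 2019 and later:

/* Let auto stats updates happen in the background */
ALTER DATABASE CURRENT
SET AUTO_UPDATE_STATISTICS_ASYNC ON;

/* Have queries been stuck waiting on synchronous stats refreshes? */
SELECT
    ws.wait_type,
    ws.waiting_tasks_count,
    ws.wait_time_ms
FROM sys.dm_os_wait_stats AS ws
WHERE ws.wait_type = N'WAIT_ON_SYNC_STATISTICS_REFRESH';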

Are Statistics My Problem?


I see a lot of people blaming statistics, when they’re really doing other stuff wrong.

There are many ways to write a query that are equivalent to the Calvin Peeing™ decal, with cardinality estimation being on the receiving end, like eschewing SARGability, or using local variables or table variables.

Another common issue is around cached plans and parameter sniffing. Those inaccurate guesses might have been really accurate for the compiled parameters, but not for the runtime parameters.

There are a lot of query hints that can help you figure out if statistics are the problem, or if there’s something in the cardinality estimation process that’s yucking things up.

  • ASSUME_JOIN_PREDICATE_DEPENDS_ON_FILTERS
  • ASSUME_MIN_SELECTIVITY_FOR_FILTER_ESTIMATES
  • ENABLE_HIST_AMENDMENT_FOR_ASC_KEYS
  • FORCE_LEGACY_CARDINALITY_ESTIMATION
  • FORCE_DEFAULT_CARDINALITY_ESTIMATION

I use these pretty regularly to see if they end up giving me a better plan. They don’t always pan out, but it’s worth trying when what you get ain’t what you want.
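Trying one looks about like this — a sketch of mine against the Users table, with whichever hint from the list you want to audition swapped in:

SELECT
    c = COUNT_BIG(*)
FROM dbo.Users AS u
WHERE u.Reputation = 1
AND   u.UpVotes = 0
OPTION (USE HINT('FORCE_LEGACY_CARDINALITY_ESTIMATION'));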

Just remember that things like query complexity will have a lot to do with how accurate the estimates you get are. Chaining together a bunch of CTEs, derived tables, views, or 20 joins usually isn’t gonna end up well for you, or your query plan.

There are many times when I see people complaining that statistics are bad or out of date, and basic stuff like proper indexes aren’t there.

Thanks for reading!

Going Further


If this is the kind of SQL Server stuff you love learning about, you’ll love my training. I’m offering a 75% discount to my blog readers if you click from here. I’m also available for consulting if you just don’t have time for that and need to solve performance problems quickly.

Local Variables Get Bad Cardinality Estimates In Cursors In SQL Server, Too

Work Off


While helping a client out with weird performance issues, we isolated part of the code that was producing a whole bunch of bad plans.

At the intersection of bad ideas, there was a cursor looping over a table gathering some data points with a local variable in the where clause.

For more background on that, check out these posts:

One might expect the query inside the cursor to be subject to some special rules, but alas, nothing good comes of it.

Eau de Sample


To repro a little bit, we need to create a certainly suboptimal index.

CREATE INDEX p ON dbo.Posts
    (OwnerUserId);

If you have a lot of single key column indexes, you’re probably doing indexing wrong.

Full demo code is at the end because it’s a bit long, but the end result is five query plans that all share the same bad estimate based off the density vector.

The worst of them looks like this:

SQL Server Query Plan
Occurling

And the best of them looks like this:

SQL Server Query Plan
Gruntled

Over and Over


If you’re getting bad guesses like this over and over again in any loop-driven code, local variables might just be to blame.

That guess of 10 rows of course comes from this calculation:

SELECT 
    density = 
        (
            1 / 
            CONVERT
            (
                float, 
                COUNT(DISTINCT p.OwnerUserId)
            )
        ) *
        COUNT_BIG(*)
FROM dbo.Posts AS p;

Which, with a little bit of rounding, gets us to the estimate we see in the query plan:

SQL Server Query Plan
hectic
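If you’d rather not do the math by hand, the same number is sitting in the density vector of the statistics on the index we created up top:

/* "All density" for OwnerUserId times the table's row count = the bad guess */
DBCC SHOW_STATISTICS('dbo.Posts', p) WITH DENSITY_VECTOR;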

Thanks for reading!

Going Further


If this is the kind of SQL Server stuff you love learning about, you’ll love my training. I’m offering a 75% discount to my blog readers if you click from here. I’m also available for consulting if you just don’t have time for that and need to solve performance problems quickly.

Demo Code


SET STATISTICS XML OFF;
SET NOCOUNT ON;

DECLARE 
    @CurrentId int;

DROP TABLE IF EXISTS #UserIds; 

CREATE TABLE #UserIds 
(
    UserId int PRIMARY KEY CLUSTERED
);

INSERT
    #UserIds WITH(TABLOCK)
(
    UserId
)
SELECT
    u.Id
FROM dbo.Users AS u
WHERE u.Reputation > 850000
ORDER BY u.Reputation DESC;


DECLARE counter CURSOR 
    LOCAL STATIC
FOR
SELECT 
    UserId 
FROM #UserIds;
 
OPEN counter;

FETCH NEXT FROM counter 
    INTO @CurrentId;

WHILE @@FETCH_STATUS = 0
BEGIN 
   
    SET STATISTICS XML ON;
    
    SELECT
        p.PostTypeId,
        c = COUNT_BIG(*)
    FROM dbo.Posts AS p
    WHERE p.OwnerUserId = @CurrentId
    GROUP BY p.PostTypeId
    ORDER BY c DESC;
    
    SET STATISTICS XML OFF;
 
FETCH NEXT FROM counter 
    INTO @CurrentId;
END;

CLOSE counter;
DEALLOCATE counter;

 

Why Expressions Are Better Than Local Variables For Performance In SQL Server Queries

Often Enough


I see people writing stored procedures that use local variables either because they read that all functions in a where clause are dangerous, or because they need to perform a calculation on one of the parameters that get passed in.

In nearly all cases, you’re better off… not doing that.

Here are a couple examples of why.

Hard Yes


When you pass in a function like GETDATE, or even wrap it in another function like DATEADD, the optimizer can evaluate it to its current value and make a pretty decent guess based on that.
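Here’s a sketch of the kind of query I mean, borrowing the demo table from the procedures further down — the functions stay right in the where clause:

SELECT
    records = 
        COUNT_BIG(*)
FROM dbo.express_yourself AS ey
WHERE ey.some_date >= DATEADD(YEAR, -1, GETDATE());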

SQL Server Query Plan
goo

This is a Good Enough™ guess.

Hard Pass


Once you assign that function’s result to a variable, everything gets awkward.
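Same predicate, but with the result of those functions stashed in a local variable first — a sketch of the shape of the problem, not the exact demo:

/* The estimate is now a fixed guess that won't change with the date range */
DECLARE
    @since datetime = DATEADD(YEAR, -1, GETDATE());

SELECT
    records = 
        COUNT_BIG(*)
FROM dbo.express_yourself AS ey
WHERE ey.some_date >= @since;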

SQL Server Query Plan
bang bang

That’s a Pretty Bad™ guess, but not the end of the world. What’s bad is that the guess never changes even if you change the span of time.

Look what happens if we just add one day instead of one year.

SQL Server Query Plan
one day at a time

We get the exact same guess as before — 821,584 rows. Bummer.

Storied Tradition


It’s nearly the same thing in stored procedures. What I see more often is people passing in one parameter for a start date, and then using a local variable to figure out an end date.

CREATE OR ALTER PROCEDURE dbo.dangit_bobby
(
    @start_date datetime
)
AS
BEGIN
SET NOCOUNT, XACT_ABORT ON;

DECLARE
    @later datetime = DATEADD(DAY, 1, GETDATE());

    SELECT
        records = 
            COUNT_BIG(*)
    FROM dbo.express_yourself AS ey
    WHERE ey.some_date >= @start_date
    AND   ey.some_date <  @later;

END;
GO

EXEC dbo.dangit_bobby 
    @start_date = '2021-05-19';
GO
SQL Server Query Plan
FAR OFF DUDE

Advanced Calculus


Let’s change how we use the parameter, and put it into some date math in the where clause instead.

CREATE OR ALTER PROCEDURE dbo.dangit_bobby
(
    @start_date datetime
)
AS
BEGIN
SET NOCOUNT, XACT_ABORT ON;

    SELECT
        records = 
            COUNT_BIG(*)
    FROM dbo.express_yourself AS ey
    WHERE ey.some_date >= @start_date
    AND   ey.some_date <  DATEADD(DAY, 1, @start_date);

END;
GO 
EXEC dbo.dangit_bobby 
    @start_date = '2021-05-19';
GO

We get a Much Closer™ estimate. What a lovely day.

SQL Server Query Plan
go for both

Hardest Part


It’s often tempting to take shortcuts like this, especially if you need to reuse the same calculation multiple times.

Now, look, if you stare closely enough at these plans, you’ll notice that they all finish quickly enough. That’s because I have a small table with a wonderful index on the column I care about.

The point here is to show you how bad estimates can turn up even in ideal circumstances when you do things The Wrong Way™.

As long as you’re not wrapping columns in functions like this, you’re probably okay.

Thanks for reading!

Going Further


If this is the kind of SQL Server stuff you love learning about, you’ll love my training. I’m offering a 75% discount to my blog readers if you click from here. I’m also available for consulting if you just don’t have time for that and need to solve performance problems quickly.

Things SQL Server vNext Should Address: Table Variable Modification Performance

Canard


People still tell me things like “I only put 100 rows in table variables”, and think that’s the only consideration for their use.

There are definitely times when table variables can be better, but 100 rows is meaningless.

Even if you put one row in a table variable it can fudge up performance because SQL Server doesn’t know what’s in your table variable. That’s still true in SQL Server 2019, even if the optimizer knows how many rows are in your table variable.

The problem that you can run into, even with just getting 100 rows into a table variable, is that it might take a lot of work to get those 100 rows.

Bernard


I’ve blogged before about workarounds for this problem, but the issue remains that inserts, updates, and deletes against table variables aren’t naturally allowed to go parallel.

The reason why is a bit of a mystery to me, since table variables are all backed by temp tables anyway. If you run this code locally, you’ll see what I mean:

SET NOCOUNT ON;
SET STATISTICS IO ON;
DECLARE @t table(id int);
SELECT * FROM @t AS t;
SET STATISTICS IO OFF;

Over in the messages tab you’ll see something like this:

Table '#B7A53B3E'. Scan count 1, logical reads 0, physical reads 0, page server reads 0, read-ahead reads 0, page server read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob page server reads 0, lob read-ahead reads 0, lob page server read-ahead reads 0.

Now, look, I’m not asking for the update or delete portions of the query plan to go parallel, but it might be nice if other child operators could go parallel. That’s how things go with regular tables and #temp tables. It would be nice if inserts could go parallel too, but hey.
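If you want to see the difference side by side, here’s a rough sketch of mine (assuming the Stack Overflow Posts table, not the post’s demo): modifying the table variable keeps its whole plan serial, while the #temp table insert is at least eligible for parallelism on modern versions.

DECLARE @tv table (Id int NOT NULL);

/* Modifying a table variable keeps the entire plan serial */
INSERT @tv (Id)
SELECT p.Id
FROM dbo.Posts AS p
WHERE p.Score > 10;

CREATE TABLE #t (Id int NOT NULL);

/* The same insert into a #temp table can use a parallel plan */
INSERT #t WITH (TABLOCK) (Id)
SELECT p.Id
FROM dbo.Posts AS p
WHERE p.Score > 10;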

Ardbeg


The problem this would solve is one that I see often, usually from vendor code where the choice of which temporary object to use came down to individual developer preference, or they fell for the meme that table variables are “in memory” or something. Maybe the choice was immaterial at first with low data volume, and over time performance slowly degraded.

If I’m allowed to change things, it’s easy enough to replace @table variables with #temp tables, or use a workaround like from the above linked post about them to improve performance. But when I’m not, clients are often left begging vendors to make changes, who aren’t receptive.

Thanks for reading!

Going Further


If this is the kind of SQL Server stuff you love learning about, you’ll love my training. I’m offering a 75% discount to my blog readers if you click from here. I’m also available for consulting if you just don’t have time for that and need to solve performance problems quickly.

Things SQL Server vNext Should Address: Local Variable Estimates

And Then The World


I’ve written at length about what local variables do to queries, so I’m not going to go into it again here.

What I do want to talk about are better alternatives to what you currently have to do to fix issues:

  • RECOMPILE the query
  • Pass the local variable to a stored procedure
  • Pass the local variable to dynamic SQL

It’s not that I hate those options, they’re just tedious. Sometimes I’d like the benefit of recompiling with local variables without all the other strings that come attached to recompiling.
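For reference, the least tedious of the three looks like this — a minimal sketch against the Users table:

DECLARE
    @r int = 1;

SELECT
    c = COUNT_BIG(*)
FROM dbo.Users AS u
WHERE u.Reputation = @r
/* A statement-level recompile lets the optimizer see the variable's runtime value */
OPTION (RECOMPILE);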

Hint Me Baby One More Time


Since I’m told people rely on this behavior to fix certain problems, you would probably need a few different places and ways to alter this behavior:

  • Database level setting
  • Query Hint
  • Variable declaration

Database level settings are great for workloads you can’t alter, either because the queries come out of a black box, or you use an ORM and queries… come out of a nuclear disaster area.

Query hints are great if you want all local variables to be treated like parameters. But you may not want that all the time. I mean, look: you all do wacky things and you’re stuck in your ways. I’m not kink shaming here, but facts are facts.

You have to draw the line somewhere and that somewhere is always “furries”.

And then local variables.

It may also be useful to allow local variables to be declared with a special property that will allow the optimizer to treat them like parameters. Something like this would be easy enough:

DECLARE @p int PARAMETER = 1;

Hog Ground


Given that in SQL Server 2019 table variables got deferred compilation, I think this feature is doable.

Of course, it’s doable today if you’ve got a debugger and don’t mind editing memory space.

Thanks for reading!

Going Further


If this is the kind of SQL Server stuff you love learning about, you’ll love my training. I’m offering a 75% discount to my blog readers if you click from here. I’m also available for consulting if you just don’t have time for that and need to solve performance problems quickly.

Parameter Sniffing Is Usually A Good Thing In SQL Server

Tick Tock


I talk to a lot of people about performance tuning. It seems like once someone is close enough to a database for long enough, they’ll have some impression of parameter sniffing. Usually a bad one.

You start to hear some funny stuff over and over again:

  • We should always recompile
  • We should always use local variables
  • We should always recompile and use local variables

Often, even if it means writing unsafe dynamic SQL, people will be afraid to parameterize things.

Between Friends


To some degree, I get it. You’re afraid of incurring some new performance problem.

You’ve had the same mediocre performance for years, and you don’t wanna make something worse.

The thing is, you could be making things a lot better most of the time.

  • Fewer compiles and recompiles, fewer single-use plans, fewer queries with multiple plans
  • Avoiding the local variable nonsense is, more often than not, going to get you better performance

A Letter To You


I’m going to tell you something that you’re not going to like, here.

Most of the time when I see a parameter sniffing problem, I see a lot of other problems.

Shabbily written queries, obvious missing indexes, and a whole list of other things.

It’s not that you have a parameter sniffing problem, you have a general negligence problem.

After all, the bad kind of parameter sniffing means that you’ve got variations of a query plan that don’t perform well on variations of parameters.

Once you start taking care of the basics, you’ll find a whole lot less of the problems that keep you up at night.

If that’s the kind of thing you need help with, drop me a line.

Thanks for reading!

Going Further


If this is the kind of SQL Server stuff you love learning about, you’ll love my training. I’m offering a 75% discount to my blog readers if you click from here. I’m also available for consulting if you just don’t have time for that and need to solve performance problems quickly.

Signs You Need Dynamic SQL To Fix Query Performance Problems In SQL Server

Nothing Works


There are things that queries just weren’t meant to do all at once. Multi-purpose queries are often just a confused jumble with crappy query plans.

If you have a Swiss Army Knife, pull it out. Open up all the doodads. Now try to do one thing with it.

If you didn’t end up with a corkscrew in your eye, I’m impressed.

En Masse


The easiest way to think of this is conditionals. If what happens within a stored procedure or query depends on something that is decided based on user input or some other state of data, you’ve introduced an element of uncertainty to the query optimization process.

Of course, this also depends on if performance is of some importance to you.

Since you’re here, I’m assuming it is. It’s not like I spend a lot of time talking about backups and crap.

There are a lot of forms this can take, but none of them lead to you winning an award for Best Query Writer.

IFTTT


Let’s say a stored procedure will execute a different query based on some prior logic, or an input parameter.

Here’s a simple example:

IF @i = 1
BEGIN
    SELECT
        u.*
    FROM dbo.Users AS u
    WHERE u.Reputation = @i;
END;

IF @i = 2
BEGIN
    SELECT
        p.*
    FROM dbo.Posts AS p
    WHERE p.PostTypeId = @i;
END;

If the stored procedure runs for @i = 1 first, the second query will get optimized for that value too.

Using parameterized dynamic SQL can get you the type of optimization separation you want, to avoid cross-optimization contamination.

I made half of that sentence up.
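Here’s a sketch of what I mean — the DECLARE of @i at the top just stands in for the parameter from the snippet above. Each branch becomes its own statement text, so each one gets its own plan, optimized for its own value.

DECLARE
    @i int = 1, /* stand-in for the parameter from the example above */
    @sql nvarchar(MAX);

IF @i = 1
BEGIN
    SET @sql = N'
    SELECT
        u.*
    FROM dbo.Users AS u
    WHERE u.Reputation = @i;';
END;

IF @i = 2
BEGIN
    SET @sql = N'
    SELECT
        p.*
    FROM dbo.Posts AS p
    WHERE p.PostTypeId = @i;';
END;

/* Different @sql text = a separate cached plan per branch */
EXEC sys.sp_executesql
    @sql,
    N'@i int',
    @i;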

For more information, read this article.

Act Locally


Local variables are another great use of dynamic SQL, because one query’s local variable is another query’s parameter.

DECLARE @i int = 2;
SELECT
    v.*
FROM dbo.Votes AS v
WHERE v.VoteTypeId = @i;

Doing this will get you weird estimates, and you won’t be happy.

You’ll never be happy.
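Here’s a sketch of the dynamic SQL version of that same query — the local variable goes back to being a parameter, so the optimizer gets a real value to sniff again.

DECLARE
    @i int = 2,
    @sql nvarchar(MAX) = N'
SELECT
    v.*
FROM dbo.Votes AS v
WHERE v.VoteTypeId = @i;';

/* Inside the dynamic SQL, @i is a parameter, not a local variable */
EXEC sys.sp_executesql
    @sql,
    N'@i int',
    @i;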

For more information, read this article.

This Or That


You can replace or reorder the where clause with lots of different attempts at humor, but none of them will be funny.

SELECT
    c.*
FROM dbo.Comments AS c
WHERE (c.Score >= @i OR @i IS NULL);

The optimizer does not consider this SARGable, and it will take things out on you in the long run.

Maybe you’re into that, though. I won’t shame you.

We can still be friends.
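A sketch of the dynamic SQL alternative — only glue the predicate on when there’s actually a value to filter by:

DECLARE
    @i int = 100,
    @sql nvarchar(MAX) = N'
SELECT
    c.*
FROM dbo.Comments AS c
WHERE 1 = 1';

/* Append the predicate only when it actually filters something */
IF @i IS NOT NULL
BEGIN
    SET @sql += N'
AND   c.Score >= @i';
END;

SET @sql += N';';

EXEC sys.sp_executesql
    @sql,
    N'@i int',
    @i;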

For more information, watch this video.

Snortables


Dynamic SQL is so good at helping you with parameter sniffing issues that I have an entire session about it.

Thanks for reading!

Going Further


If this is the kind of SQL Server stuff you love learning about, you’ll love my training. I’m offering a 75% discount to my blog readers if you click from here. I’m also available for consulting if you just don’t have time for that and need to solve performance problems quickly.

Local Variables Also Cause Performance Problems In The ORDER BY Clause In SQL Server

Local Disasters


I see this kind of pattern a lot in paging queries where people are doing everything in their power to avoid writing dynamic SQL for some reason.

It’s almost as if an entire internet worth of SQL Server knowledge and advice doesn’t exist when they’re writing these queries.

Quite something. Quite something indeed.

First, let’s get what doesn’t work out of the way.

DECLARE @order_by INT = 3

SELECT p.Id
FROM dbo.Posts AS p
WHERE p.OwnerUserId = 22656
ORDER BY CASE WHEN @order_by = 1 THEN p.Score
              WHEN @order_by = 2 THEN p.CreationDate 
              WHEN @order_by = 3 THEN p.Id 
              ELSE NULL 
         END;
GO

You can’t write this as a single case expression with mismatched data types.

It’ll work for the first two options, but not the third. We’ll get this error, even with a recompile hint:

Msg 8115, Level 16, State 2, Line 46
Arithmetic overflow error converting expression to data type datetime.

What Works But Still Stinks


Is when you break the options out into separate case expressions, like so:

DECLARE @order_by INT = 1

SELECT p.Id
FROM dbo.Posts AS p
WHERE p.OwnerUserId = 22656
ORDER BY CASE WHEN @order_by = 1 THEN p.Score ELSE NULL END,
         CASE WHEN @order_by = 2 THEN p.CreationDate ELSE NULL END,
         CASE WHEN @order_by = 3 THEN p.Id ELSE NULL END;
GO

This will work no matter which option we choose, but something rather disappointing happens when we choose option three.

Here’s the query plan. Before you read below, take a second to try to guess what it is.

SQL Server Query Plan
Sorta Kinda

What Stinks Even Though It Works


My issue with this plan is that we end up with a sort operator, even though we’re ordering by Id, which is the primary key and clustered index key, and we use that very same index. We technically have the data in order, but the index scan has False for the Ordered attribute, and the Sort operator shows a series of expressions.

SQL Server Query Plan
stunk

The Sort of course goes away if we add a recompile hint, and the Scan now has True for the Ordered attribute.

DECLARE @order_by INT = 3

SELECT p.Id
FROM dbo.Posts AS p
WHERE p.OwnerUserId = 22656
ORDER BY CASE WHEN @order_by = 1 THEN p.Score ELSE NULL END,
         CASE WHEN @order_by = 2 THEN p.CreationDate ELSE NULL END,
         CASE WHEN @order_by = 3 THEN p.Id ELSE NULL END
OPTION(RECOMPILE);
GO 
SQL Server Query Plan
no worse

You Shouldn’t Do This


Unless you’re fine with recompile hints, which I don’t blame you if you are.

SQL Server seems to get a whole lot more right when you use one, anyway.

My point though, is that adding uncertainty like this to your queries is more often than not harmful in the long term. Though this post is about local variables, the same thing would happen with parameters, for example:

DECLARE @order_by INT = 3
DECLARE @sql NVARCHAR(MAX) = N'
SELECT p.Id
FROM dbo.Posts AS p
WHERE p.OwnerUserId = 22656
ORDER BY CASE WHEN @order_by = 1 THEN p.Score ELSE NULL END,
         CASE WHEN @order_by = 2 THEN p.CreationDate ELSE NULL END,
         CASE WHEN @order_by = 3 THEN p.Id ELSE NULL END;
';

EXEC sys.sp_executesql @sql, N'@order_by INT', 1;
EXEC sys.sp_executesql @sql, N'@order_by INT', 3;
GO

The way to address it would be something like this:

DECLARE @order_by INT = 3
DECLARE @sql NVARCHAR(MAX) = N'
SELECT p.Id
FROM dbo.Posts AS p
WHERE p.OwnerUserId = 22656
ORDER BY ';

SET @sql += 
CASE WHEN @order_by = 1 THEN N'p.Score'
     WHEN @order_by = 2 THEN N'p.CreationDate'
     WHEN @order_by = 3 THEN N'p.Id' 
     ELSE N'' 
END;

EXEC sys.sp_executesql @sql
GO

Thanks for reading!

Going Further


If this is the kind of SQL Server stuff you love learning about, you’ll love my training. I’m offering a 75% discount to my blog readers if you click from here. I’m also available for consulting if you just don’t have time for that and need to solve performance problems quickly.