
I built a stored procedure that finds the shortest route between a company's many warehouses. The procedure is heavily used by the system. After many optimizations, including memory-optimized tables, when I run sp_BlitzCache I get 2 warnings:

  1. Parameter Sniffing;
  2. You have 83866 total plans in your cache, with 100.00% plans created in the past 24 hours, 100.00% created in the past 4 hours, and 100.00% created in the past 1 hour.

The procedure is called only with parameters (no hardcoded values). You can find here a query plan: https://www.brentozar.com/pastetheplan/?id=r1rW59QvN.

I really don't understand why the engine generates a new plan for each execution, or what I should check. Also, sp_BlitzCache says "We couldn't find a plan for this query. Possible reasons for this include dynamic SQL, RECOMPILE hints, and encrypted code." I don't have dynamic SQL, RECOMPILE hints, or encrypted code.
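For reference, this is the kind of plan-cache query I can run to check whether the cache really holds many near-duplicate plans for this procedure (a sketch; the procedure name in the LIKE filter is a placeholder for the real one):

```sql
-- Count distinct cached plans per query_hash for statements that
-- mention the procedure. Many plans for one hash suggests the same
-- logical query is being compiled over and over.
SELECT qs.query_hash,
       COUNT(DISTINCT qs.query_plan_hash) AS distinct_plans,
       SUM(qs.execution_count)            AS executions,
       MIN(qs.creation_time)              AS oldest_plan
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
WHERE st.text LIKE '%usp_FindShortestRoute%'   -- placeholder name
GROUP BY qs.query_hash
ORDER BY distinct_plans DESC;
```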

What should I check?

UPDATE

I checked the impact of this procedure on the server, as Jeroen Mostert suggested in the comments (thank you for your response!). Among the top 10 procedures on the server by average IO, this procedure accounts for only 0.66%, but it is the only procedure without a plan, so the question still remains: why so many plans?

I haven't tried running the stored procedure without the memory-optimized tables. I had to use them because the proc needs to run as fast as lightning (it runs in less than 30 milliseconds in some cases). Also, I don't have the guts to run the procedure without the memory-optimized tables, because I cannot reproduce the production workload on test servers.
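One thing I could try is capturing why statements recompile. A sketch of an Extended Events session for that (run briefly during load, then stop; the session name is arbitrary):

```sql
-- Capture statement-level recompiles with their cause
-- (the event payload includes a recompile_cause field).
CREATE EVENT SESSION [TrackRecompiles] ON SERVER
ADD EVENT sqlserver.sql_statement_recompile
    (ACTION (sqlserver.sql_text, sqlserver.database_name))
ADD TARGET package0.ring_buffer;

ALTER EVENT SESSION [TrackRecompiles] ON SERVER STATE = START;
-- ...reproduce the load, inspect the ring buffer, then:
-- ALTER EVENT SESSION [TrackRecompiles] ON SERVER STATE = STOP;
-- DROP EVENT SESSION [TrackRecompiles] ON SERVER;
```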

Thank you!

  • Have you considered performing the shortest-path calculation in your application instead of the database? Then you would be able to use existing graph libraries.
    – Magnus
    Commented Mar 11, 2019 at 8:39
  • It is not actually a geographical route calculation, but a schedule with cars' departure and arrival times, sources and destinations, so only a SELECT with a few filters and an ORDER BY. Commented Mar 11, 2019 at 8:43
  • Does the same thing happen if those two tables are not memory-optimized? Mixing memory-optimized tables with disk-based tables is a scenario that still doesn't work quite as smoothly as using only one type of object (while using only natively compiled stored procedures for the memory-based tables). Also, try things with the server option "optimize for ad hoc workloads" enabled if it's not already -- this reduces one-off plan caching to stubs, which will not cut down on compilations, but may reduce their impact (and is almost never bad in and of itself). Commented Mar 12, 2019 at 15:23
  • Another thing worth exploring is OPTION (KEEP[FIXED] PLAN) for the parts of the procedure that are suspected sources of recompilation (again, the memory-optimized tables spring to mind). Obviously, some care needs to be taken to ensure this does not cache an inappropriate plan, but it can cut down considerably on unnecessary compilations. (I don't know if it has any effect with memory-optimized tables, I've only ever used it for disk-based tables.) Commented Mar 12, 2019 at 15:27
  • Last but not least: check if this is actually measurably impacting your server. Although recompilations are not good, if the work done by your stored procedure is considerably more than the work needed for recompilation (which tends to be the case) the excess plans may be survivable, if not optimal. Plan stubs can keep down the memory use, while the CPU cost can just be swallowed. Commented Mar 12, 2019 at 15:35
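The comments above suggest two concrete mitigations. A hedged sketch of both, to be validated on a test server first (the table, columns, and parameters in the second snippet are hypothetical stand-ins for a statement inside the procedure):

```sql
-- 1) Cache one-off plans as small stubs instead of full plans
--    ("optimize for ad hoc workloads" is an advanced server option).
EXEC sys.sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sys.sp_configure 'optimize for ad hoc workloads', 1;
RECONFIGURE;

-- 2) Suppress unnecessary recompiles on a suspect statement
--    inside the procedure (names below are placeholders).
SELECT TOP (1) RouteId, DepartureTime
FROM dbo.Schedule                                -- hypothetical table
WHERE SourceId = @SourceId
  AND DestinationId = @DestinationId
ORDER BY DepartureTime
OPTION (KEEPFIXED PLAN);
```

The first change affects the whole instance but only turns first-time plans into stubs; the hint in the second snippet is per-statement and should only be added where recompilation is confirmed as the cause.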
