I have a delete based on a simple select. I tuned it to delete in batches, and it's way faster than a plain delete.
DECLARE
    @StartDate DATETIME = '05/01/2024',
    @EndDate   DATETIME = '05/31/2024';

SELECT TABLE_ID
INTO #temp
FROM [BIGGUS_TABLE]
WHERE TransactionDate BETWEEN @StartDate AND @EndDate
  AND ANOTHER_CHECK = 18;
GO
DECLARE @BatchSize INT = 50000;
-- Initialize @RowsDeleted to a non-zero value so the loop starts
DECLARE @RowsDeleted INT = @BatchSize;
DECLARE @DeletedIDs TABLE (TABLE_ID INT PRIMARY KEY);

WHILE @RowsDeleted > 0
BEGIN
    DELETE TOP (@BatchSize)
    FROM [BIGGUS_TABLE]
    OUTPUT DELETED.TABLE_ID INTO @DeletedIDs
    WHERE TABLE_ID IN (SELECT TABLE_ID FROM #temp);

    -- Capture the main delete's rowcount before any other statement runs
    SET @RowsDeleted = @@ROWCOUNT;

    DELETE FROM #temp
    WHERE TABLE_ID IN (SELECT TABLE_ID FROM @DeletedIDs);

    -- Clear the table variable so the IN list doesn't grow every batch
    DELETE FROM @DeletedIDs;
END
*TABLE_ID* is the PK.
At the current rate it would take about 5 days to delete 100 million rows.
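One variation worth trying (just a sketch, using the same table and column names as my script above): skip the table variable entirely and delete through a join to #temp. Rows already deleted from BIGGUS_TABLE simply stop matching the join, so #temp never needs to be trimmed.

```sql
-- Sketch: batch delete via a join, no @DeletedIDs bookkeeping.
-- A clustered index on #temp(TABLE_ID) helps the join; the index
-- name IX_temp is made up.
CREATE CLUSTERED INDEX IX_temp ON #temp (TABLE_ID);

DECLARE @BatchSize INT = 50000;

WHILE 1 = 1
BEGIN
    DELETE TOP (@BatchSize) b
    FROM [BIGGUS_TABLE] AS b
    INNER JOIN #temp AS t
        ON t.TABLE_ID = b.TABLE_ID;

    IF @@ROWCOUNT = 0 BREAK;  -- nothing left to match, we're done
END
```

With both sides indexed on TABLE_ID this should give a cheap join per batch, at the cost of re-touching already-processed #temp rows on each pass.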
The green table is BIGGUS_TABLE, where the main deletes happen. The red table is a secondary, EMPTY table that has a beautiful FK pointing to BIGGUS_TABLE.
The index being used by BIGGUS_TABLE has TABLE_ID (the PK) but doesn't have the ANOTHER_CHECK column, yet funnily it's still doing a seek?
So I'm trying to think about what we can do to improve this:
1 - Remove the FK from the empty table and re-add it later (we don't even know if that table is actually being used).
2 - Add the ANOTHER_CHECK column to the index being used by the green table/BIGGUS_TABLE?
3 - Anything we can add/remove from the query?
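For options 1 and 2, the DDL would look roughly like this. It's only a sketch: the red table's name, the constraint name, and the index name are all placeholders I made up, so check sys.foreign_keys and sys.indexes for the real ones.

```sql
-- Option 1: drop the FK on the empty red table, re-add it afterwards.
-- [RED_TABLE] and FK_Red_Biggus are placeholder names.
ALTER TABLE [RED_TABLE] DROP CONSTRAINT FK_Red_Biggus;

-- ... run the batched delete here ...

ALTER TABLE [RED_TABLE] WITH CHECK
    ADD CONSTRAINT FK_Red_Biggus FOREIGN KEY (TABLE_ID)
    REFERENCES [BIGGUS_TABLE] (TABLE_ID);

-- Option 2: cover the #temp-building query so ANOTHER_CHECK is part of
-- the index key instead of a residual lookup. Index name is made up.
CREATE NONCLUSTERED INDEX IX_Biggus_TxDate_Check
    ON [BIGGUS_TABLE] (TransactionDate, ANOTHER_CHECK)
    INCLUDE (TABLE_ID);
```

Re-adding the FK WITH CHECK revalidates existing rows; since the red table is empty that should be instant.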
We went from some 400k rows per hour to 700k per hour, but now I'm stuck.

BIGGUS_TABLE? Are you trying to empty the entire table (which is what your example code would do)? How many rows are in BIGGUS_TABLE total?