
Need help determining ideal batch size for querying a large table

1 vote
1 answer
77 views
I have a table with roughly 180-200k rows (~40 columns) that I'm trying to query. There is a list of strings to compare against an FK column, around 170k values in total, which arrives via API requests in batches; each batch is then further chunked into smaller batches that are used to filter with an IN clause.
SELECT *
FROM table
WHERE IsActive = 1 AND Fk IN ( .. list of batch size .. )
Testing on a local database with around 5k rows, I found that the optimizer switched from *Nested Loops -> Index Seek -> Key Lookup* to *Hash Match -> Constant Scan -> Clustered Index Scan* for batch sizes larger than 64. Index seeks are generally said to be better, but in this case, where the output is likely to contain >60% of the table's rows, how should I determine what batch size to use? Would index seeks or scans perform better here? Or is this method itself inefficient, and should I be using a TVP instead?

Query plan for an example batch size of 100: https://www.brentozar.com/pastetheplan/?id=3xy2ILe0K4
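If it helps, my understanding is that I could force each plan shape with table hints and time them at a given batch size; a minimal sketch (placeholder values, table/column names as in my query above):

SELECT *
FROM dbo.[table] WITH (FORCESEEK)  -- force the seek-based shape
WHERE IsActive = 1 AND Fk IN ('A1', 'A2', 'A3');

SELECT *
FROM dbo.[table] WITH (FORCESCAN)  -- force the scan-based shape
WHERE IsActive = 1 AND Fk IN ('A1', 'A2', 'A3');

For the TVP route, I imagine it would look roughly like this (the type name dbo.FkList, the varchar(50) length, and the values are assumptions for illustration, not my actual schema):

-- Hypothetical table type holding one batch of FK values
CREATE TYPE dbo.FkList AS TABLE (Fk varchar(50) PRIMARY KEY);
GO

DECLARE @FkValues dbo.FkList;
INSERT INTO @FkValues (Fk) VALUES ('A1'), ('A2'), ('A3');  -- placeholder values

-- Join against the TVP instead of building an IN list per batch
SELECT t.*
FROM dbo.[table] AS t
JOIN @FkValues AS f ON f.Fk = t.Fk
WHERE t.IsActive = 1;

From the application side, the values would be passed as a structured parameter (e.g. SqlDbType.Structured in ADO.NET) rather than built with INSERT statements as above.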
Asked by Kyrop (11 rep)
Sep 13, 2025, 03:24 AM
Last activity: Sep 14, 2025, 01:49 PM