I have one table, [table], with two columns that need to be filtered on: [column1] and [column2].
In my program I execute a query like:
select * from [table] where [column1] = 'foo' and [column2] = 'bar';
Which is faster:
- Creating two indexes, one on each column. ([column1] and [column2])
- Creating one index containing both columns. ([column1]+[column2])
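For concreteness, the two options I'm comparing would look roughly like this (the bracketed names are just placeholders for my real table and columns, and the index names are made up):

-- Option 1: two separate single-column indexes
create index ix_table_column1 on [table] ([column1]);
create index ix_table_column2 on [table] ([column2]);

-- Option 2: one composite index containing both columns
create index ix_table_column1_column2 on [table] ([column1], [column2]);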
This question has been bugging me for a while; I have no idea how query optimization works or how SQL Server uses the created indexes to speed up queries.