I'm working on a web application backed by PostgreSQL. One of my queries fetches data from a large table with millions of rows:

    SELECT * FROM products WHERE category = 'electronics' AND price > 1000;
The query works, but it gets quite slow as the table grows. I've tried creating separate indexes on the category and price columns, but the improvement is minimal. What are some best practices or more advanced techniques for optimizing a query like this? Should I consider restructuring my database, or using a specific type of index? Any advice would be appreciated.
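For reference, here is roughly what I have now (default btree indexes; the index names are just placeholders), plus a composite index I'm wondering about:

```sql
-- Existing separate single-column btree indexes:
CREATE INDEX idx_products_category ON products (category);
CREATE INDEX idx_products_price ON products (price);

-- Would a composite index like this be the right direction?
-- Equality column (category) first, range column (price) second:
CREATE INDEX idx_products_category_price ON products (category, price);

-- Inspecting the plan for the slow query:
EXPLAIN (ANALYZE, BUFFERS)
SELECT * FROM products
WHERE category = 'electronics' AND price > 1000;
```

I can post the EXPLAIN output if that helps.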