Refactoring a very large database table with a join over millions of records

1 vote
1 answer
243 views
I have a fairly big PostgreSQL **jobs** table with **more than 60 attributes**, and one critical query joins against this table's millions of records. I cannot reduce its size (partitioning by range was only recently introduced, in PostgreSQL 10).

Only 10 fields are required for this critical query; all the other ~50 fields are a burden when joining. Is there any way I can join against a smaller table (with those 10 fields only, because I think Postgres loads all the fields when applying a join) for this critical query, while the rest of the app keeps using the full table?

E.g. the jobs table fields critical for the query are: cost, id, user_id, params, location. The fields not required for this query are quite numerous.

Note: if I create two separate tables, I would have to update a lot of code everywhere.
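To make the shape of the problem concrete, here is a minimal sketch; the `users` table it joins against and the column types are hypothetical stand-ins for my real schema:

```sql
-- Wide jobs table: only the columns the critical query reads are shown;
-- the comment below stands in for the ~50 columns it never touches.
CREATE TABLE jobs (
    id       bigint PRIMARY KEY,
    user_id  bigint NOT NULL,
    cost     numeric,
    params   jsonb,
    location text
    -- ... roughly 50 more columns omitted ...
);

-- Shape of the critical query: it joins millions of jobs rows
-- but only ever needs the handful of columns above.
SELECT j.id, j.cost, j.params, j.location
FROM jobs j
JOIN users u ON u.id = j.user_id
WHERE j.location = 'somewhere';
```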
Asked by Radio Active (177 rep)
Mar 7, 2018, 06:58 PM
Last activity: Mar 8, 2018, 09:44 AM