Fetching rows from a million-row table (optimization)
0 votes · 1 answer · 103 views
I have millions of rows in a table in a Greenplum database, out of which I have to fetch around 45k rows and store them in a Python list.
It's taking more than 2 hours to fetch the data. How can I reduce the time taken to fetch it?
resultList = []
for item in items:
    # one round trip per item
    cursor.execute(
        "SELECT column_1, ... column_n FROM TABLE WHERE column = %s",
        (item,),
    )
    resultList.append(cursor.fetchall())
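The loop above issues roughly 45k separate queries, so most of the 2 hours is network round trips and per-query planning, not data transfer. A single query with `WHERE column = ANY(...)` (or a few batched `IN` queries) fetches everything in a handful of round trips. A minimal sketch, assuming a DB-API connection such as psycopg2, and hypothetical table and column names (`my_table`, `my_column`):

```python
def chunked(seq, size):
    """Split a list into sublists of at most `size` items."""
    return [seq[i:i + size] for i in range(0, len(seq), size)]

def fetch_rows(conn, items, batch_size=10000):
    """Fetch all matching rows with a few batched queries
    instead of one query per item."""
    result_list = []
    with conn.cursor() as cur:
        for batch in chunked(items, batch_size):
            # With psycopg2, = ANY(%s) binds the whole batch
            # as a single array parameter.
            cur.execute(
                "SELECT column_1, column_n FROM my_table"
                " WHERE my_column = ANY(%s)",
                (batch,),
            )
            result_list.extend(cur.fetchall())
    return result_list
```

With `batch_size=10000`, 45k keys become 5 queries instead of 45k, which typically turns hours into seconds. Another common option is to load the keys into a temporary table and join against it, which scales better still for very large key sets.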
Asked by Krishna Kumar S
(1 rep)
Jun 12, 2019, 07:57 AM
Last activity: Jun 12, 2019, 11:30 AM