How to add an index to a big table with 60M records without downtime?

10 votes
1 answer
3031 views
We have been struggling with one issue for the past few days: we want to add an index to a huge table with 60M records.

At first we tried adding it with the basic MySQL syntax, but it clogged our production DB. That table is used very frequently in production queries, so everything suffered. Our DB is hosted on AWS RDS, it's MySQL 5.7, and we are using Laravel as our PHP framework.

The next thing we read about was that we could copy the current table into a new one, add the index to the new table, and then switch the Laravel model to use the new table. We thought it made sense and would be easy enough. But copying the data from the old table into the new one was taking a lot of time; our calculations showed it would take days. We tried using Laravel as well as plain SQL commands, but it was too slow either way.

Then we tried exporting the data as CSV and importing it, but again, too slow. The first few million records would insert fast, but then inserts into the new table would become extremely slow.

Finally we tried mysqldump, and we realised it also locks the new table while inserting, so maybe that's why it is fast enough. It took around 6 hours to copy the table into the new one, BUT we were missing 2M records with this method. We also checked how many records came into the existing table while we were exporting/importing, and it was only around 100K. So the export/import was missing about 1.9M records, and we couldn't figure out why.

After going through all these different approaches, we have decided to put the app into downtime and add the index on the huge table directly. (Rough sketches of the statements we ran are at the end of the question.)

What I wanted to know is: do others face this issue as well? Is there a way to add an index to a huge table without causing downtime in production? Or is there a faster way to copy a big MySQL table without losing data?
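For reference, this is roughly the shape of the statements described above. The table name `big_table` and the column `created_by_id` are placeholders, not our real schema, and the exact options may have differed slightly:

```sql
-- 1) The straightforward index add we tried first (it blocked production traffic):
ALTER TABLE big_table ADD INDEX idx_created_by (created_by_id);

-- 2) The copy-into-a-new-table approach: create an empty copy, fill it,
--    then add the index. The INSERT ... SELECT was the step that was far too slow.
CREATE TABLE big_table_new LIKE big_table;
INSERT INTO big_table_new SELECT * FROM big_table;
ALTER TABLE big_table_new ADD INDEX idx_created_by (created_by_id);
```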
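The CSV import was along these lines (again with placeholder table name and file path; the exact field and line options we used may have differed):

```sql
-- Client-side import of the exported CSV into the new table.
-- The first few million rows loaded quickly, then throughput dropped sharply.
LOAD DATA LOCAL INFILE '/tmp/big_table.csv'
INTO TABLE big_table_new
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';
```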
Asked by Rohan (53 rep)
Dec 15, 2023, 12:19 PM
Last activity: Dec 20, 2023, 10:24 AM