I have a very huge table, around 100 M records and 100 GB in a dump file. When I try to restore it to a different DB, the SQL query fails with "lost connection". I want to try to dump this table in chunks (something like 10 chunks of 10 GB each), where each chunk goes into a separate table.
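One approach, since mysqldump ends every statement with `;` followed by a newline, is to cut the existing dump file at line boundaries so every piece is itself valid SQL and can be restored independently. This is a sketch using GNU `split -C` (max bytes per piece, whole lines only); the tiny stand-in file below just demonstrates the mechanics, and on the real dump the command would be something like `split -C 10G -d {table_name}.sql {table_name}_part_` (file names here are assumptions):

```shell
# Demo: split a dump at line boundaries so each piece ends on a complete
# statement. Each demo line is 26 bytes, so a 30-byte cap (-C 30) yields
# one statement per piece; -d uses numeric suffixes (_00, _01, ...).
printf 'INSERT INTO t VALUES (1);\nINSERT INTO t VALUES (2);\nINSERT INTO t VALUES (3);\n' > /tmp/demo.sql
split -C 30 -d /tmp/demo.sql /tmp/demo_part_
ls /tmp/demo_part_*
```

Each resulting piece can then be loaded separately with `mysql ... < piece.sql`, so a single dropped connection only costs one chunk instead of the whole 100 GB.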
What I have managed to optimize so far is this:
mysqldump --skip-triggers --compact --no-create-info --single-transaction --quick --max_allowed_packet=1G -h {host} -u {user} -P 3306 -p{password} {my_schema} {table_name} > /mnt/datadir/{table_name}.sql
and now the output is that I have one file, {table_name}.sql, with a size of 100 GB. I want to get 10 files of 10 GB each.
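One way to get ten separate dump files directly from the server is to run the mysqldump command above once per primary-key range using its `--where` option. This is only a sketch: it assumes an integer auto-increment primary key named `id`, and the `MIN_ID`/`MAX_ID` values are placeholders you would get from `SELECT MIN(id), MAX(id) FROM {table_name}`. The mysqldump line is echoed as a dry run; drop the `echo` to actually dump:

```shell
# Sketch: dump one table in CHUNKS pieces by primary-key range,
# one file per chunk. Assumes an integer PK named `id`.
MIN_ID=1            # placeholder: SELECT MIN(id) FROM the table
MAX_ID=100000000    # placeholder: SELECT MAX(id) FROM the table
CHUNKS=10
# Ceiling division so the last chunk still covers MAX_ID.
STEP=$(( (MAX_ID - MIN_ID + CHUNKS) / CHUNKS ))

for i in $(seq 0 $((CHUNKS - 1))); do
  LO=$(( MIN_ID + i * STEP ))
  HI=$(( LO + STEP - 1 ))
  # Echoed as a dry run; remove `echo` to run the real dump.
  echo mysqldump --skip-triggers --compact --no-create-info \
    --single-transaction --quick --max_allowed_packet=1G \
    -h "{host}" -u "{user}" -P 3306 \
    "--where=id BETWEEN $LO AND $HI" \
    "{my_schema}" "{table_name}" ">" "/mnt/datadir/{table_name}_part$i.sql"
done
```

Each `_part$i.sql` file can then be restored on its own (and, if you really want each chunk in a separate table, loaded and renamed one at a time), so a lost connection only forces a retry of one 10 GB chunk.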
Asked by Omer Anisfeld
(161 rep)
Jul 7, 2019, 06:29 AM
Last activity: Jul 7, 2025, 09:07 PM