
How to back up many large files to a single compressed file on S3

0 votes
1 answer
245 views
I have an application with many thousands of files totaling over 10 TB. I need to back up this data somewhere (probably to AWS S3). I'd like to:

1. compress the data being backed up
2. save the backup as a single file

For example, as a gzipped tarball. Because of the total size, I cannot create the gzipped tarball locally; it would be too large. How can I:

1. stream all of these folders and files onto AWS S3 as a single compressed file?
2. stream the compressed file from S3 back onto my disk in the original filesystem layout?
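A minimal sketch of the streaming described above, assuming the AWS CLI is installed and configured (the bucket name, object key, and `/data` path are placeholders): `tar` writes the compressed archive to stdout, and `aws s3 cp` reads from stdin when given `-`, so nothing is ever written to local disk.

```shell
# Backup: tar the tree, gzip the stream, and upload straight from stdin.
# "-" tells aws s3 cp to read from stdin; --expected-size (bytes) helps the
# CLI pick multipart chunk sizes for streams larger than 5 GB.
tar -czf - /data | aws s3 cp - s3://my-backups/data.tar.gz --expected-size 10995116277760

# Restore: download to stdout ("-") and untar back into the original layout.
aws s3 cp s3://my-backups/data.tar.gz - | tar -xzf - -C /
```

Note that gzip gives no random access: restoring even one file means streaming the whole archive back down.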
Asked by nick314 (3 rep)
Nov 30, 2022, 06:57 PM
Last activity: Nov 30, 2022, 07:22 PM