Is there any way to upload large amounts of files to Google Drive with a decent speed besides Filezilla Pro and RClone?
0 votes, 3 answers, 742 views
I signed up for Google Drive's Premium 2 TB plan for my backups and needed a way to send more than 1 million files there. The web interface is very problematic for this purpose: if the transfer has any errors, identifying what's online and what's not is very difficult.
So I started to look for some safe way to send files there.
First I found google-drive-ftp-adapter, but while trying to install it I got an error I couldn't solve: *"This app is blocked. This app tried to access sensitive info in your Google Account. To keep your account safe Google blocked this access."* The project shows no activity, so I guessed this was some change on Google's side that the maintainers never addressed.
Then I tried adding Google Drive to GNOME's Online Accounts. It did mount the drive, but when I tried to upload the files the speed was ridiculous: around 15 KB/s, which would take something like 10,000 hours for the files (around 543 GB), and I got errors even on the first files.
After that I bought FileZilla Pro, which connects to Google Drive, and it worked fine. The speed is around 15 MB/s, which is not ideal (it will take about 10 hours to upload), but that's much better than 10,000 and it's doable. FileZilla also has a "Failed transfers" tab that lets me see which transfers had an error, so I just redo them and that's it.
I wanted a GUI tool, but I also gave rclone a chance, and it proved to work really well: I could make it reach 32 MB/s, which brought the time down to about 5 hours. The drawback is that if any transfer has an error I have to cherry-pick the failures out of a log file to retransfer them, which is more inconvenient than FileZilla.
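For reference, this is roughly the kind of invocation I mean; a minimal sketch, assuming a remote named `gdrive:`, placeholder paths, and illustrative flag values rather than the exact ones from my run:

```
# Hedged sketch, not the exact command from my run: copy a local tree to a
# Google Drive remote with more parallelism than rclone's defaults and write a
# log file so failures can be found afterwards.
#   --transfers 16       parallel file transfers (helps with many small files)
#   --checkers 32        parallel existence/size checks
#   --drive-chunk-size   larger upload chunks for the bigger files
#   --fast-list          fewer listing API calls on a large directory tree
rclone copy /data/backups gdrive:backups \
  --transfers 16 --checkers 32 \
  --drive-chunk-size 64M --fast-list \
  --log-file rclone.log --log-level INFO --progress
```

Re-running the same copy is at least idempotent (rclone skips files that already match by size and modification time), but with a million files even the re-scan takes a while.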
But none of these solutions felt like my endgame; each one has its drawbacks. I would really prefer a FOSS solution, preferably with a GUI, that simply lets me choose the dirs I want to copy, redo the errors easily, and is fast enough. GUI tools are preferred, but I would consider a command-line tool if it's really efficient, lets me choose what I want, and has decent logging while copying, preferably visual.
Does anyone have any other solution besides FileZilla Pro and rclone? How would you send 1 million+ files from several different folders to Google Drive?
**Edit**
I really had underestimated how long this would take. It's been copying files for 17 hours and only 368 GB have been copied so far, with rclone. 🙄
**Edit2**
Underestimated? That turned out to be quite the understatement. The upload finally concluded, and guess what: 3d 1h 13m 42.3s to upload 463.169 GiB*.
Now... OK, I know about small-file overhead and all that. But this level of overhead is not acceptable by any standard. I really have to find some way to speed things up to an acceptable level.
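Just to put a number on it, a back-of-the-envelope check using the figures above (3d 1h 13m 42s ≈ 263,622 s, 463.169 GiB, roughly a million files):

```
$ echo "scale=2; 463.169 * 1024^3 / 263622 / 1024^2" | bc   # average MiB/s
1.79
$ echo "scale=2; 1000000 / 263622" | bc                     # files per second
3.79
```

So the bottleneck is clearly per-file overhead, not bandwidth.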
While searching for a solution I saw that TeraBox has a feature called "cloud decompression". I'll give it a try.
*I know this value differs from what I stated above, but that's because I had already uploaded part of the files in a previous run.
**Edit3**
TeraBox does have cloud decompression. They offer 1 TB free, their paid 2 TB plan is as cheap as it gets (US$3.49/month), and it can decompress files online. Unfortunately, decompression of files larger than 12 GiB isn't supported, and to split 530 GiB into packages of at most 12 GiB I'd have to cherry-pick files in the sub-dirs, which is a lot of work. Impractical. So this solution only applies if I'm either willing to redo that 3-day upload or to keep the sub-dirs as packages.
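If I ever go the "sub-dirs as packages" route, something along these lines would do it; a rough sketch with placeholder paths, assuming every sub-directory compresses to under the 12 GiB limit (which would still need checking):

```
# Pack each top-level sub-directory into its own archive so it can be
# decompressed in the cloud. Anything over the 12 GiB limit would still have
# to be split by hand (and split parts can't be decompressed independently).
src=/data/backups
dst=/data/packages
mkdir -p "$dst"
for dir in "$src"/*/; do
    name=$(basename "$dir")
    tar -czf "$dst/$name.tar.gz" -C "$src" "$name"
done
```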
No FTP client, paid or free, supports TeraBox the way FileZilla supports Google Drive. But they do have desktop apps for all major OSes (Windows, Mac, Linux) and a mobile app for Android.
To quote U2: "And I still haven't found what I'm looking for" :D
Asked by Nelson Teixeira
(470 rep)
May 9, 2024, 07:16 AM
Last activity: May 15, 2024, 02:22 AM