
How can I download a very large list of URLs so that the downloaded files are split into subfolders named after the first letter of the filenames?

0 votes
2 answers
369 views
I want to download many files (tens of millions or more). I have the URL for each file, listed in a file URLs.txt:
http://mydomain.com/0wd.pdf 
http://mydomain.com/asz.pdf 
http://mydomain.com/axz.pdf 
http://mydomain.com/b00.pdf 
http://mydomain.com/bb0.pdf 
etc.
I can download them via `wget -i URLs.txt`, but that would exceed the [maximum](https://stackoverflow.com/a/466596/395857) number of files that can be placed in one folder. How can I download this large list of URLs so that the downloaded files are split into subfolders named after the first letter of the filenames? E.g.:
0/0wd.pdf
a/asz.pdf
a/axz.pdf
b/b00.pdf
b/bb0.pdf
etc.
If that matters, I use Ubuntu.
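For reference, a minimal sketch of one possible approach (my own, not a tested solution at this scale): loop over URLs.txt, derive the subfolder from the first character of each filename, and pass it to wget's `-P` (`--directory-prefix`) option. It assumes the URLs contain no spaces, as in the sample list above.

```shell
#!/bin/sh
# dest_dir: print the subfolder for a URL, i.e. the first
# character of the filename after the last slash.
dest_dir() {
  file=${1##*/}            # strip everything up to the last "/"
  printf '%.1s' "$file"    # first character of the filename
}

# Fetch every URL in URLs.txt into its one-letter subfolder.
if [ -f URLs.txt ]; then
  while read -r url; do
    d=$(dest_dir "$url")
    mkdir -p "$d"
    wget -q -P "$d" "$url"
  done < URLs.txt
fi
```

A sequential loop like this would be very slow for tens of millions of files; in practice one would likely want to parallelize the downloads (e.g. with `xargs -P` or GNU parallel), but the subfolder derivation stays the same.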
Asked by Franck Dernoncourt (5533 rep)
Dec 21, 2023, 10:22 PM
Last activity: Dec 24, 2023, 12:15 AM