How do I use wget to download all links from my site and save to a text file?

14 votes
5 answers
74,304 views
I am trying to download all links from aligajani.com. There are 7 of them, excluding the domain facebook.com, which I want to ignore: I don't want to download from links that start with the facebook.com domain. I also want the links saved in a .txt file, one per line, so there would be 7 lines. Here's what I've tried so far:

wget -r -l 1 http://aligajani.com

This just downloads everything, which is not what I want.
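To show the shape of what I'm after, here is a rough sketch of a pipeline I considered (assuming GNU wget and GNU grep built with PCRE support for -P; links.txt is just a placeholder name), though I'm not sure it's the right approach:

# Fetch the page quietly to stdout, extract href attribute values,
# drop any links containing facebook.com, and write the rest to
# links.txt, one per line.
wget -qO- http://aligajani.com \
  | grep -oP 'href="\K[^"]+' \
  | grep -v 'facebook\.com' \
  > links.txt

Is there a cleaner way to do this with wget itself?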
Asked by Ali Gajani (295 rep)
Feb 26, 2014, 06:35 AM
Last activity: Apr 12, 2025, 02:37 AM