I used to use the following command to get all links of a web page and then `grep` for what I want:
    curl $URL 2>&1 | grep -o -E 'href="([^"#]+)"' | cut -d'"' -f2 | egrep $CMP-[0-9].[0-9].[0-9]$ | cut -d'-' -f3
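As an aside, the pattern given to `egrep` above is unquoted and its dots are unescaped, so the shell may expand it and `.` will match any character. A quoted equivalent (assuming `$CMP` holds a plain name prefix, and using `-s` to silence the progress meter instead of merging stderr into the pipe) would be:

    curl -s "$URL" | grep -o -E 'href="([^"#]+)"' | cut -d'"' -f2 | grep -E "${CMP}-[0-9]\.[0-9]\.[0-9]$" | cut -d'-' -f3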
It was working fine until yesterday. Then I tried running `curl` by itself and saw that it returns:
      % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                     Dload  Upload   Total   Spent    Left  Speed
      0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
Was there some update that could have caused the command to stop working, or is something else going on?
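One way to check whether the server now returns a redirect or an error (which would explain the empty body, since `curl` does not follow redirects by default; this is a guess, not a confirmed cause) is to inspect the response headers:

    curl -sI "$URL"       # print only the response headers
    curl -sIL "$URL"      # same, but follow any redirects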
**EDIT 1:**
I changed my approach to `wget`, following [this answer](https://stackoverflow.com/a/2804721/1626977):
    wget -q $URL -O - | grep -o -E 'href="([^"#]+)"' | cut -d'"' -f2 | egrep $CMP-[0-9].[0-9].[0-9]$ | cut -d'-' -f3
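For what it's worth, `wget` follows HTTP redirects by default, so if the page moved (for example to HTTPS) that alone could explain why this pipeline works while the plain `curl` one returns nothing; that is an assumption on my part, not a confirmed cause. A `curl` variant of the quoted pipeline above that also follows redirects would be:

    curl -sL "$URL" | grep -o -E 'href="([^"#]+)"' | cut -d'"' -f2 | grep -E "${CMP}-[0-9]\.[0-9]\.[0-9]$" | cut -d'-' -f3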
But I still don't know why the `curl` approach suddenly stopped working.
Asked by Zeinab Abbasimazar (303 rep), Aug 29, 2017, 06:21 AM
Last activity: Jan 3, 2023, 05:28 PM