
#1 2014-08-27 15:30:50

hauntedbyshadow
Member
Registered: 2014-07-02
Posts: 19

Curl timeout

Hey Guys,

So I'm hitting a bit of a brick wall here. I'm writing a script that, all in one go, queries my MySQL server for the domains on the server and appends them to a file. Then, using a for/do loop, it goes through that file, assigning each domain to a variable, and curls each one to check for a sitemap. This is what I have so far:

    c=$(curl -sLw --max-time 5 "%{http_code}\n" $dir/sitemap.xml -o /dev/null)

    if [[ $c != 200 ]]
    then
        echo -e "\033[0;35m \t $dir's A record is $d and its status is $c which means no sitemap found. \t \033[0m" >> (file)
    fi
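
For context, the surrounding loop looks roughly like this (the MySQL query, table/column names, and file paths below are placeholders, not my exact commands):

    # Sketch only: pull the domain list out of MySQL into a flat file
    mysql -N -B -e "SELECT domain FROM domains" mydb > /tmp/domain-list

    # Loop over every domain in the file, one per line
    for dir in $(cat /tmp/domain-list); do
        # ...the curl/sitemap check from the snippet above runs here for each $dir...
        echo "checked $dir"
    done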

I also added a timer to the loop to see how long each pass takes:

 t="$(date +%s)"
t="$(($(date +%s)-t))"
echo -e "Time for $dir in seconds: ${t} " >> /var/log/part-1-timelog 

However, it's taking forever because some of the domains are taking well over 100 seconds to curl. I added the --max-time 5 flag, which I thought would kill the request after 5 seconds, but it's still curling some of them well past that. As a sanity check, this is how I'd expect the flag to behave on its own against a slow host (the URL below is just an example):
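
    # If the URL takes longer than 5 seconds, curl should give up and exit with code 28 (timeout)
    time curl -s --max-time 5 -o /dev/null "http://example.com/sitemap.xml"
    echo "curl exit code: $?"

Am I missing something? Any suggestions? Thanks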

