Bash script to check url status

Shell Scripting

There are many instances where an administrator needs to check the status of application URLs from time to time, to ensure all applications deployed on a server are running fine. The script below is designed to do exactly that, and it helps a lot when you have a large number of application URLs which need to be checked regularly. This bash script to check URL status can also be used as a script to check URL availability.

This setup consists of 3 files:

urllist: File to store the application URLs, one per line.

Example "urllist" file (needs to be created in the same directory where the script will reside):

http://www.google.com
http://www.yahoo.com

Script_Monitor.log: The log file generated in the same directory where the script is run. It contains the status of all URLs checked.

CheckURL.sh: The bash script which will be used to check the application URLs.

Functions and commands used inside the script:

SetParam() : Contains all the variables that need to be set in the script.

URL_Status() : Main function which evaluates the HTTP response codes.

Mail_Admin() : Sends mail to the admin mailing list in case of a server down issue.

Send_Log() : Sends all logs generated by the script to the admin mailing list.

Main_Menu() : Used to call all functions.

tee command: Used to record the logs.

curl command: Used to get the status codes.
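Before the full listing, the core of the safe-list idea can be sketched in isolation (a minimal sketch; the is_safe helper name is illustrative, but the SAFE_STATUSCODES array mirrors the one set in SetParam):

```shell
#!/bin/bash
# Sketch of the safe-list membership test performed by URL_SafeStatus.
SAFE_STATUSCODES=( 200 201 202 203 204 205 206 207 208 226 401 )

# is_safe CODE -> exit 0 if CODE is in the safe list, 1 otherwise.
is_safe() {
  local code=$1 safestatus
  for safestatus in "${SAFE_STATUSCODES[@]}"; do
    if [ "$code" -eq "$safestatus" ]; then
      return 0
    fi
  done
  return 1
}

is_safe 200 && echo "200 -> [ RUNNING ]"
is_safe 503 || echo "503 -> [ DOWN ]"
```

URL_SafeStatus in the full script does the same walk over the array, printing a RUNNING or DOWN status line instead of returning an exit code.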

#!/bin/bash
# Bash script to check URL status.
#set -x; # Uncomment to enable debug mode.
#clear # Uncomment to clear your screen on each run.

SetParam() {
export URLFILE="urllist"
export TIME=`date +%d-%m-%Y_%H.%M.%S`
SAFE_STATUSCODES=( 200 201 202 203 204 205 206 207 208 226 401 )
export STATUS_UP=`echo -e "\E[32m[ RUNNING ]\E[0m"`
export STATUS_DOWN=`echo -e "\E[31m[ DOWN ]\E[0m"`
export MAIL_TO="admin(at)techpaste(dot)com"
export SCRIPT_LOG="Script_Monitor.log"
}

URL_Status() {

SetParam
sed -i '/^$/d' "$URLFILE"; # Remove blank lines from URLFILE
cat "$URLFILE" | while read -r next
do
STATUS_CODE=`curl --output /dev/null --silent --head --write-out '%{http_code}' "$next"`
# If you want to set a timeout, add --max-time 15 (here 15 means 15 seconds).
case $STATUS_CODE in

100) echo "At $TIME: $next url status returned $STATUS_CODE : Continue" ;;
101) echo "At $TIME: $next url status returned $STATUS_CODE : Switching Protocols" ;;
102) echo "At $TIME: $next url status returned $STATUS_CODE : Processing (WebDAV) (RFC 2518) " ;;
103) echo "At $TIME: $next url status returned $STATUS_CODE : Checkpoint" ;;
122) echo "At $TIME: $next url status returned $STATUS_CODE : Request-URI too long" ;;
200) echo "At $TIME: $next url status returned $STATUS_CODE : OK" ;;
201) echo "At $TIME: $next url status returned $STATUS_CODE : Created" ;;
202) echo "At $TIME: $next url status returned $STATUS_CODE : Accepted" ;;
203) echo "At $TIME: $next url status returned $STATUS_CODE : Non-Authoritative Information" ;;
204) echo "At $TIME: $next url status returned $STATUS_CODE : No Content" ;;
205) echo "At $TIME: $next url status returned $STATUS_CODE : Reset Content" ;;
206) echo "At $TIME: $next url status returned $STATUS_CODE : Partial Content" ;;
207) echo "At $TIME: $next url status returned $STATUS_CODE : Multi-Status (WebDAV) (RFC 4918) " ;;
208) echo "At $TIME: $next url status returned $STATUS_CODE : Already Reported (WebDAV) (RFC 5842) " ;;
226) echo "At $TIME: $next url status returned $STATUS_CODE : IM Used (RFC 3229) " ;;
300) echo "At $TIME: $next url status returned $STATUS_CODE : Multiple Choices" ;;
301) echo "At $TIME: $next url status returned $STATUS_CODE : Moved Permanently" ;;
302) echo "At $TIME: $next url status returned $STATUS_CODE : Found" ;;
303) echo "At $TIME: $next url status returned $STATUS_CODE : See Other" ;;
304) echo "At $TIME: $next url status returned $STATUS_CODE : Not Modified" ;;
305) echo "At $TIME: $next url status returned $STATUS_CODE : Use Proxy" ;;
306) echo "At $TIME: $next url status returned $STATUS_CODE : Switch Proxy" ;;
307) echo "At $TIME: $next url status returned $STATUS_CODE : Temporary Redirect (since HTTP/1.1)" ;;
308) echo "At $TIME: $next url status returned $STATUS_CODE : Resume Incomplete" ;;
400) echo "At $TIME: $next url status returned $STATUS_CODE : Bad Request" ;;
401) echo "At $TIME: $next url status returned $STATUS_CODE : Unauthorized" ;;
402) echo "At $TIME: $next url status returned $STATUS_CODE : Payment Required" ;;
403) echo "At $TIME: $next url status returned $STATUS_CODE : Forbidden" ;;
404) echo "At $TIME: $next url status returned $STATUS_CODE : Not Found" ;;
405) echo "At $TIME: $next url status returned $STATUS_CODE : Method Not Allowed" ;;
406) echo "At $TIME: $next url status returned $STATUS_CODE : Not Acceptable" ;;
407) echo "At $TIME: $next url status returned $STATUS_CODE : Proxy Authentication Required" ;;
408) echo "At $TIME: $next url status returned $STATUS_CODE : Request Timeout" ;;
409) echo "At $TIME: $next url status returned $STATUS_CODE : Conflict" ;;
410) echo "At $TIME: $next url status returned $STATUS_CODE : Gone" ;;
411) echo "At $TIME: $next url status returned $STATUS_CODE : Length Required" ;;
412) echo "At $TIME: $next url status returned $STATUS_CODE : Precondition Failed" ;;
413) echo "At $TIME: $next url status returned $STATUS_CODE : Request Entity Too Large" ;;
414) echo "At $TIME: $next url status returned $STATUS_CODE : Request-URI Too Long" ;;
415) echo "At $TIME: $next url status returned $STATUS_CODE : Unsupported Media Type" ;;
416) echo "At $TIME: $next url status returned $STATUS_CODE : Requested Range Not Satisfiable" ;;
417) echo "At $TIME: $next url status returned $STATUS_CODE : Expectation Failed" ;;
500) echo "At $TIME: $next url status returned $STATUS_CODE : Internal Server Error" ;;
501) echo "At $TIME: $next url status returned $STATUS_CODE : Not Implemented" ;;
502) echo "At $TIME: $next url status returned $STATUS_CODE : Bad Gateway" ;;
503) echo "At $TIME: $next url status returned $STATUS_CODE : Service Unavailable" ;;
504) echo "At $TIME: $next url status returned $STATUS_CODE : Gateway Timeout" ;;
505) echo "At $TIME: $next url status returned $STATUS_CODE : HTTP Version Not Supported" ;;
000) echo "At $TIME: $next url status returned $STATUS_CODE : No response (host not resolved or connection failed)" ;;
*) echo "At $TIME: $next url status returned $STATUS_CODE : Unrecognized status code" ;;
esac

URL_SafeStatus "$STATUS_CODE"

done;

}

URL_SafeStatus() {
flag=0
for safestatus in "${SAFE_STATUSCODES[@]}"
do
#echo "got Value of STATUS CODE= $1";
#echo "Reading Safe Code= $safestatus";
if [ "$1" -eq "$safestatus" ] ; then

echo "At $TIME: Status Of URL $next = $STATUS_UP";
flag=1
break;
fi
done

if [ $flag -ne 1 ] ; then
echo "At $TIME: Status Of URL $next = $STATUS_DOWN"
Mail_Admin "$TIME" "$next"
fi

}

Mail_Admin() {
SetParam
echo "At $1 URL $2 is DOWN!!" | mailx -s " Application URL: $2 DOWN!!!" $MAIL_TO
}

Send_Log() {
SetParam
if [ -f "$SCRIPT_LOG" ] ; then
mailx -s "$0 Script All Url Check Log Details Till $TIME" $MAIL_TO < "$SCRIPT_LOG"
else
echo "$SCRIPT_LOG NOT FOUND!!"
fi
}

Main_Menu() {

URL_Status

}
SetParam
Main_Menu | tee -a "$SCRIPT_LOG"
Send_Log


After running the script with the above urllist, the output will look like below. It will send a mail to the MAIL_TO list in case any URL is found down, or its status code is not in the safe list defined in the SetParam function.
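The check is usually most useful when scheduled; a sketch of a crontab entry that runs it every 15 minutes (the /opt/scripts path is only an assumed location — adjust it to wherever CheckURL.sh resides):

```
# m h dom mon dow  command
*/15 * * * * /opt/scripts/CheckURL.sh >/dev/null 2>&1
```

Since the script already appends all status lines to Script_Monitor.log via tee, the cron output itself can safely be discarded.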

[Screenshot: url check script output]

Note: MAIL_TO has been edited due to a bombardment of test emails from users :). See my comment below.


40 Responses

  1. admin says:

    Hi All,

    Please change the email address in MAIL_TO="admin(at)techpaste(dot)com" before you test the script.
    My INBOX is getting bombarded with your test emails. Please make sure you change it to some other email id before you test.

    🙂
    Thanks.
    Admin

  2. Kevin says:

    I like this script – it’s easy to understand and it works well when checking application urls.

    It’s worth adding --max-time 10 to the curl command so that it drops out after 10 seconds. Then, if there are network issues or the site is down, the script will not hang.

    PS I hate your share banner as it hides the lhs of the text in Firefox 13.0.1. Annoying.

    • Kevin says:

      I forgot to add that curl returns a status of 28 if there is no response before max-time is reached, so it’s easy to include a check to see if there was a timeout.

      • admin says:

        Hi Kevin,

        We liked your suggestion and have added a comment to add max-time to the script.

        PS: We liked your PS too and have disabled the share banner… 🙂

        Thanks.
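Following up on Kevin's point, a hedged sketch of how a timeout could be distinguished from a normal HTTP response (curl's documented exit code 28 means the --max-time limit was hit; the check_with_timeout helper name is illustrative and not part of the original script):

```shell
#!/bin/bash
# Sketch: combine --max-time with a check of curl's exit status.
# curl exits with code 28 when the transfer times out before completing.
check_with_timeout() {
  local url=$1 code rc
  code=$(curl --max-time 10 --output /dev/null --silent --head --write-out '%{http_code}' "$url")
  rc=$?
  if [ "$rc" -eq 28 ]; then
    echo "TIMEOUT"    # no response before --max-time elapsed
  else
    echo "$code"      # normal HTTP status code path
  fi
}
```

In the script above, the TIMEOUT case could then be routed to Mail_Admin the same way a DOWN status is.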

  3. Veer says:

    Hi

    I was looking for this same kind of script which checks URLs.

    I tried running your script but did not get the expected output.

    I guess I missed making appropriate changes to the lines below:

    STATUS_CODE=`curl --output /dev/null --silent --head --write-out '%{http_code}\n' $next`
    # If you want to set a timeout then add --max-time 15, here 15 is 15 seconds

    could you please explain this ...

    Thanks in advance 🙂

    • admin says:

      Hi Veer,

      What’s the error you are getting while running the script on your side?

      STATUS_CODE=`curl --output /dev/null --silent --head --write-out '%{http_code}\n' $next`

      For the above, max-time is the timeout in seconds for the url check. You can use it like below, but it is not required; without the max-time option the script should work just fine.

      STATUS_CODE=`curl --max-time 15 --output /dev/null --silent --head --write-out '%{http_code}\n' $next`

  4. Veer says:

    Hi Admin,

    I am trying to use the above script for checking a few application URLs.
    I just need to check whether a particular URL is up & running or not.

    Also, please let me know how to check https URLs.

    • admin says:

      you can use the -k option in curl for ssl urls

      -k, --insecure (SSL) This option explicitly allows curl to perform "insecure" SSL connections and transfers.

  5. Veer says:

    Hi Admin,

    Let me clear the picture a bit more...

    I am trying to access the application URL which I used to launch via Citrix, i.e. remote Internet Explorer.
    When I run this script I get return code "302" for a few urls, which means URL FOUND,
    but I can say that the url is up & running only when the return code is "200", i.e. URL is OK.

    Also, for some of the urls I am getting return code '000', which is not a valid return code.

    Appreciate your help on this.

    Thanks 🙂

    • admin says:

      Hi Veer,

      The HTTP response status code 302 Found is a common way of performing a redirection. Your few urls are giving 302 as they MUST be redirecting to some new url, e.g. for authentication, while being accessed. You will never get a 200 status code from this kind of url. In this case what you can do is add 302 to the safe status list like below:

      replace: SAFE_STATUSCODES=( 200 201 202 203 204 205 206 207 208 226 401 )

      with: SAFE_STATUSCODES=( 200 201 202 203 204 205 206 207 208 226 302 401 )

      After this it will report status OK for urls which get redirected to some other url when you access them.

      Check more about 302 behavior here http://en.wikipedia.org/wiki/HTTP_302

      If you have static urls which do not redirect to multiple urls for authentication, you can also use the below script to check them.

      http://www.techpaste.com/2013/04/shell-script-monitor-soa-infra-url-status-linux/

      HTH
      Admin

  6. Veer says:

    Hi Admin,

    I have made a few changes in the above script.
    And now it works perfectly fine for http connections but does not work for https connections, i.e.

    http://XXXXXX.sg.ap.XXXXX.com:12050/mantas :: WORKS FINE

    https://XXXXXX.sg.ap.XXXXXX.com:9017/abn_adm_ae :: DOES NOT WORK

    Is there any way of checking secured connections as well?

    Thanks

    • admin says:

      Hi Veer,

      Glad things are working fine for you. You can use the -k option to check simple ssl urls.

      Replace: STATUS_CODE=`curl --output /dev/null --silent --head --write-out '%{http_code}\n' $next`
      With: STATUS_CODE=`curl -k --output /dev/null --silent --head --write-out '%{http_code}\n' $next`

      From the curl man pages:

      -k/--insecure
      (SSL) This option explicitly allows curl to perform "insecure" SSL connections and transfers. All SSL connections are attempted
      to be made secure by using the CA certificate bundle installed by default. This makes all connections considered "insecure"
      fail unless -k/--insecure is used.

      If this option is used twice, the second time will again disable it.

      • Veer says:

        Hi Admin,

        Its working fine according to my requirement.

        Thanks a lot for your support!!!

        • veer says:

          Hi Admin,

          once again i am here..

          could you please let me know how the same script can work with wget command.

          Thanks!!

          • admin says:

            Something like this:
            wget --server-response http://www.google.com 2>&1 | awk '/^  HTTP/{print $2}'
            or
            wget --spider -S "http://url/to/be/checked" 2>&1 | grep "HTTP/" | awk '{print $2}'

            do a quick google search you will get many posts about the same

  7. Raj says:

    Hi Admin

    I made all the changes but it is asking for a user id and password?
    I am very new to shell scripting; could you please help?

    Regards
    Raja…

  8. Raj says:

    Hi Admin

    Script is working fine, but for https://xxxxx
    urls I am not getting the proper result.
    Can you please help 🙂

    I tried -k but no luck

    can you give me the exact code to use https:// ?

    Regards
    Raj

  9. Guru says:

    The output file is created but nothing is being written to it.

  10. pavan says:

    The script doesn’t exit; it gets stuck at the command prompt. How do we exit from the script automatically?
    Right now I have to press Ctrl+C twice or Ctrl+D.

  11. Keshav says:

    Hi
    While running the curl command I am getting this error.
    curl: (6) Couldn’t resolve host ‘befr.geindustrial.com’

    Could you please help on this? Just to make sure I don’t have root access to make the entry in /etc/hosts file.

    Urgent reply

    • For me it works fine. See if you have some space or some special characters in between.

      [root@techpasteVM ~]# curl --output /dev/null --silent --head --write-out '%{http_code}\n' befr.geindustrial.com
      200

      • Keshav says:

        curl .output /dev/null .silent .head .write-out .%{http_code}\n. befr.geindustrial.com
        curl: (6) Couldn’t resolve host ‘.output’
        curl: (3) malformed
        curl: (6) Couldn’t resolve host ‘.silent’
        curl: (6) Couldn’t resolve host ‘.head’
        curl: (6) Couldn’t resolve host ‘.write-out’
        curl: (6) Couldn’t resolve host ‘.%http_coden.’
        curl: (6) Couldn’t resolve host ‘befr.geindustrial.com’

        Please see above what I get

  15. Keshav says:

    Hi Team,

    curl --output /dev/null --silent --head --write-out '%{http_code}\n' befr.geindustrial.com
    000

    this is the output I am getting

  17. Saurabh says:

    is there any way to check the URL latency also through the script?

    • You can run commands like below to get the latency:

      curl -s -w '\nLookup time:\t%{time_namelookup}\nConnect time:\t%{time_connect}\nPreXfer time:\t%{time_pretransfer}\nStartXfer time:\t%{time_starttransfer}\n\nTotal time:\t%{time_total}\n' -o /dev/null http://www.google.com

      Example Output:

      Lookup time: 0.004
      Connect time: 0.005
      PreXfer time: 0.005
      StartXfer time: 0.280

      Total time: 0.280

  18. Vinayak says:

    Hi Admin,

    I have enabled the script on our linux box and it looks good, thanks for this. Also I have one more question about the log file. Does the log file get rotated or truncated automatically after some time, or do we have to clear it manually? Please help me understand, thank you.
