Simple bash website monitoring script

  • Not everyone can afford expensive operational monitoring tools, so here’s a cheap and dirty way to use cURL to request a webpage, return its HTTP status line, and compare it with the value you’d get when the page is up. If the two don’t match, we assume the page is down and trigger an email as an alarm.

    #!/bin/bash
    # Your site URL goes here
    website="https://example.com"
    timestamp=$(date "+%Y-%m-%d %H:%M:%S")
    result=$(curl -m 60 -Is "$website" | head -n 1 | tr -d "\r")
    # Note that this value varies depending on platform. You should echo $result
    # while querying the site when you know it's running to see what the correct value is.
    expecting="HTTP/2 200"
    if [ "$result" = "$expecting" ]; then
        output="$timestamp -> $website $result -> Website UP"
        echo "$output"
        echo "$output" >> /path/to/your/logfile/log.txt
    else
        output="$timestamp -> $website $result -> Website DOWN"
        echo "$output"
        echo "$output" >> /path/to/your/logfile/log.txt
        # Fire an alert as the result isn't what we expected
        mailbody="$output"
        echo "From: [email protected]" > /tmp/mail.tmp
        echo "To: [email protected]" >> /tmp/mail.tmp
        echo "Subject: ALERT: Website $website is DOWN" >> /tmp/mail.tmp
        echo "" >> /tmp/mail.tmp
        echo "$mailbody" >> /tmp/mail.tmp
        /usr/sbin/sendmail -t -f "[email protected]" < /tmp/mail.tmp
    fi
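    If you want to sanity-check the status-line extraction without hitting a live site, here’s a quick sketch using the same pipeline as the script; printf stands in for curl’s header output, and the header text is made up for illustration:

    ```shell
    # Simulate curl -Is output: a status line plus headers, CRLF-terminated
    headers=$(printf 'HTTP/2 200\r\nserver: nginx\r\ncontent-type: text/html\r\n')
    # Same pipeline as the monitoring script: take the first line, strip the carriage return
    status=$(printf '%s\n' "$headers" | head -n 1 | tr -d "\r")
    echo "$status"   # prints: HTTP/2 200
    ```

    This is handy for confirming what your `expecting` value should be before you wire up the comparison.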

    This is very primitive, but works very well. Note that sendmail is being used here to trigger the email. If you do not have it, you’ll need to substitute the path and command for whatever mail transport you want to use.
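    The post doesn’t cover scheduling; one common approach (my assumption, not part of the original, and the script path below is hypothetical) is a cron entry so the check runs unattended:

    ```shell
    # Run the monitor every five minutes; /home/you/bin/site-monitor.sh is a hypothetical path
    */5 * * * * /home/you/bin/site-monitor.sh >> /path/to/your/logfile/cron.log 2>&1
    ```

    Add it with `crontab -e`; redirecting output keeps cron from sending you a mail on every run.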

    Have fun…

  • @phenomlab this is useful 👍 thanks
