Not everyone can afford expensive operational monitoring tools, so here's a cheap and dirty way to use cURL to request a webpage, grab its status line, and compare it with the value you'd get if the page were up. If they don't match, we assume the page is down and trigger an email as an alarm.
#!/bin/bash
# If your site uses a self-signed certificate, add -k/--insecure to the curl
# command below to skip TLS verification (the CURLOPT_SSL_VERIFYPEER equivalent).
# Your site URL goes here
website="https://your.forum.url*";
result=$(curl -m 60 -Is $website | head -n 1 | tr -d "\r")
# note that this value varies depending on platform. You should echo $result to see what the correct value is when querying the site itself when you know it's running.
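# For example, against a hypothetical URL:
#   curl -m 60 -Is https://example.com | head -n 1 | tr -d "\r"
# might print "HTTP/2 200" on one server and "HTTP/1.1 200 OK" on another.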
expecting="HTTP/2 200"
timestamp=$(date)
if [ "$result" = "$expecting" ]; then
    output="$timestamp -> $website $result -> Website UP"
    echo "$output"
    echo "$output" >> /path/to/your/logfile/log.txt
else
    output="$timestamp -> $website $result -> Website DOWN"
    echo "$output"
    echo "$output" >> /path/to/your/logfile/log.txt
    # Fire an alert as the result isn't what we expected
    mailbody="$output"
    echo "From: [email protected]" > /tmp/mail.tmp
    echo "To: [email protected]" >> /tmp/mail.tmp
    echo "Subject: ALERT: Website $website is DOWN" >> /tmp/mail.tmp
    echo "" >> /tmp/mail.tmp
    echo "$mailbody" >> /tmp/mail.tmp
    /usr/sbin/sendmail -t -f "[email protected]" < /tmp/mail.tmp
fi
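To actually monitor the site, run the script on a schedule. A minimal sketch using cron, assuming the script is saved at the hypothetical path /path/to/check_site.sh and marked executable with chmod +x:

# crontab -e, then add a line like this to run the check every five minutes
*/5 * * * * /path/to/check_site.sh

Adjust the interval to taste.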
This is very primitive, but it works very well. Note that sendmail is being used here to trigger the email. If you do not have it, you'll need to substitute the path and command for whatever mailer you want to use.
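For example, here's a minimal sketch of the same alert using mailx instead of building the message by hand, assuming a mailx-compatible command is installed and configured on your system:

echo "$output" | mailx -s "ALERT: Website $website is DOWN" [email protected]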
Have fun…