I was doing pretty much exactly that for my recently-ex employer.<br><br>There's a tool called 'curl' which is very good at fetching web pages. You would do something like this in bash:<br><br> curl -s -S -L --max-time 5 -o /tmp/page.html <a href="http://server.domain.com/">http://server.domain.com/</a><br>
 RC=$?<br> if [ $RC -ne 0 ]; then<br> &nbsp;&nbsp;# something went wrong; the exit code identifies the actual error<br> &nbsp;&nbsp;COLOR=red<br> &nbsp;&nbsp;MESSAGE="page fetch failed: curl error $RC"<br> else<br> &nbsp;&nbsp;COLOR=green<br>
 &nbsp;&nbsp;MESSAGE="page fetch successful"<br> fi<br><br>(Note: you have to save curl's exit code into a variable right away — by the time the MESSAGE line runs, $? would hold the result of the [ test, not curl.) Then use server/bin/bb (the Hobbit/BB client program) to send the report to the display server. I generally saved the HTML somewhere in the web server's document tree, then added a link to the report so that Midrange Operations could click through and see the actual page the server returned.<br>
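Putting the pieces together, here's a rough sketch of what such a check script could look like, including the timing side: curl's -w '%{time_total}' option prints how long the whole transfer took, so the script can also go red when a page technically responds but responds slowly. The column name "http", the 2-second threshold, the example hostname/URL, and the pick_color helper are my own assumptions, not anything Hobbit mandates; $BB and $BBDISP are normally provided by the Hobbit client environment.

```shell
#!/bin/sh
# Sketch of a Hobbit/Xymon HTTP response-time check using curl.
# Assumptions: column name "http", 2-second threshold, example URL;
# $BB and $BBDISP come from the Hobbit client environment when run there.
URL="http://server.domain.com/"
THRESHOLD=2
MACHINE="server,domain,com"   # BB/Hobbit hostnames use commas instead of dots

# Fetch the page; -w '%{time_total}' prints the elapsed time on stdout.
ELAPSED=$(curl -s -S -L --max-time 5 -o /tmp/page.html \
              -w '%{time_total}' "$URL" 2>/dev/null)
RC=$?   # save curl's exit code before anything else overwrites $?

pick_color() {
    # $1 = curl exit code, $2 = elapsed seconds, $3 = threshold
    if [ "$1" -ne 0 ]; then
        echo red                  # fetch failed outright
    elif awk -v t="$2" -v max="$3" 'BEGIN { exit !(t > max) }'; then
        echo red                  # fetched, but slower than the threshold
    else
        echo green
    fi
}

COLOR=$(pick_color "$RC" "$ELAPSED" "$THRESHOLD")
if [ "$RC" -ne 0 ]; then
    MESSAGE="page fetch failed: curl error $RC"
else
    MESSAGE="page fetched in ${ELAPSED}s (threshold ${THRESHOLD}s)"
fi

# Report to the display server (only when the client environment is loaded).
[ -n "$BB" ] && $BB $BBDISP "status ${MACHINE}.http ${COLOR} $(date) ${MESSAGE}"
```

The awk test is just a portable way to compare the fractional seconds curl reports, since [ only does integer arithmetic; a yellow warning tier could be added the same way with a second threshold.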
<br>Curl can also tell you how long the transaction took, which you could add to the report in a format that Xymon could pick up for graphing.<br><br>Ralph Mitchell<br><br><br><div class="gmail_quote">On Fri, Feb 27, 2009 at 10:14 AM, Jason Hand <span dir="ltr"><<a href="mailto:jason@hands4christ.org">jason@hands4christ.org</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;">I cannot find anything mentioning whether Hobbit/Xymon can test how long an HTTP or HTTPS page takes to respond and alert on that. In other words, if a site is in a hung state so that it still technically "responds" but takes 5 seconds to return the page, how does Hobbit/Xymon handle that? I want to make sure that if a site we are monitoring takes longer than a couple of seconds to load, we are alerted.<br>
<br>
Do any of you do a test like that and how are you doing it?<br>
<br>
Thanks,<br>
Jason<br>
<br>
To unsubscribe from the hobbit list, send an e-mail to<br>
<a href="mailto:hobbit-unsubscribe@hswn.dk" target="_blank">hobbit-unsubscribe@hswn.dk</a><br>
<br>
<br>
</blockquote></div><br>