URL monitoring
- To: hobbit (at) hswn.dk
- Subject: URL monitoring
- From: "Dan Simoes" <dan.simoes (at) gmail.com>
- Date: Wed, 31 May 2006 12:57:42 -0700
I know that hobbit can monitor URLs, of course.
I have a need to monitor a constantly changing list of URLs, which could
number in the hundreds or thousands.
Will hobbit scale to this level, or should I instead write something custom
using wget, which can take a file containing a list of URLs as input?
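For reference, wget does take a URL list from a file via its -i option, so the "custom" route could be a thin wrapper around that. As a rough illustration only (not a recommendation over hobbit), here is a minimal Python sketch of the same idea using just the standard library; the urls.txt filename and the 10-second timeout are placeholders:

#!/usr/bin/env python3
# Rough sketch of a custom URL checker: read one URL per line from a file
# and report the HTTP status (or the error) for each. File name and timeout
# are arbitrary; real monitoring would also want timing, retries, alerting.
import urllib.request
import urllib.error

def check(url, timeout=10):
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except (urllib.error.URLError, OSError) as exc:
        return exc

def main():
    with open("urls.txt") as fh:
        urls = [line.strip() for line in fh if line.strip()]
    for url in urls:
        print("%s\t%s" % (url, check(url)))

if __name__ == "__main__":
    main()

The equivalent one-liner with wget would be roughly "wget -i urls.txt --spider", but either way the hard part is feeding a constantly changing list of hundreds or thousands of URLs into it and getting alerts back out, which is really what I'm asking about.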
Thanks for your suggestions, and thanks for hobbit.
| Dan |