[hobbit] Testing Internet sites

Ralph Mitchell ralphmitchell at gmail.com
Fri Dec 8 15:19:38 CET 2006


On 12/8/06, Henrik Stoerner <henrik at hswn.dk> wrote:
> I've done it for one or two sites, and abandoned it because it is
> incredibly fragile when dealing with real-world websites. You're
> basically going to mimic the behaviour of a browser, but you probably
> don't have an engine that is fully capable of handling JavaScript,
> cookies, automated redirects and form submissions like a real browser
> does.

Curl handles cookies, redirects that happen via Location headers, and
form submissions, for both regular and secure (HTTPS) sites.
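
For example, one curl call can exercise all three at once (hypothetical
URL and form fields, just a sketch):

    # Keep cookies in a jar (-c writes, -b sends), follow Location
    # redirects (-L), and POST form data (-d) over HTTPS.
    curl -s -L -c cookies.txt -b cookies.txt \
         -d 'user=monitor&pass=secret' \
         https://www.example.com/login > page.html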

My scripts all evolved under Big Brother, so they're all Bourne shell,
using curl, grep, sed, head, tail, etc. to process the pages.  There's
a Perl script on the curl home page (http://curl.haxx.se) called
formfind.pl, which pulls out form elements.  I've got a hacked-up copy
of formfind.pl that hands me the form in a format suitable for posting
back using curl.
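
The basic trick formfind.pl automates looks something like this naive
sed sketch (made-up URL; it assumes one double-quoted hidden input per
line, which real pages often violate -- formfind.pl parses properly):

    # Pull name/value pairs out of hidden <input> tags and glue them
    # into a query string that can go straight into curl -d.
    curl -s http://www.example.com/form.html |
      sed -n 's/.*<input type="hidden" name="\([^"]*\)" value="\([^"]*\)".*/\1=\2/p' |
      paste -s -d '&' -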

I used to have a script that logged into an airline booking system,
picked out flights, and went through the whole booking sequence,
stopping right before it would have had to enter credit card details.
That was about 19 steps, I think.
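
The shape of a sequence like that is just one curl call per step, all
sharing a cookie jar, with each step scraping values out of the
previous page (made-up URLs and field names, only a sketch):

    JAR=/tmp/booking.$$
    curl -s -L -c $JAR -b $JAR -d 'from=LHR&to=JFK' \
         http://www.example.com/search > step1.html
    FLIGHT=`sed -n 's/.*name="flight" value="\([^"]*\)".*/\1/p' step1.html | head -1`
    curl -s -L -c $JAR -b $JAR -d "flight=$FLIGHT" \
         http://www.example.com/select > step2.html
    # ...repeat for each step, stopping before the credit card page.
    rm -f $JAR step1.html step2.html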

You don't necessarily have to be able to process JavaScript.  A lot of
forms just use it to validate the data, which you can ignore.  I've run
into a few that used JavaScript to modify the form variables and store
the results into other form variables, but those aren't too hard to
deal with once you understand what's happening.
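
You just redo the same shuffle in the shell before posting.  Say
(hypothetically) the page's onsubmit copies an uppercased "user" field
into a hidden "login_id" field:

    # Mimic: form.login_id.value = form.user.value.toUpperCase()
    USER=jsmith
    LOGIN_ID=`echo $USER | tr 'a-z' 'A-Z'`
    curl -s -d "user=$USER&login_id=$LOGIN_ID" http://www.example.com/login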

Probably the worst thing I've had to deal with was a page that
assembled a new URL from bits of the form elements, then jumped to it
by setting location.href=[newurl].  That kind of thing almost has to
be hardwired, and as Henrik says, that has a tendency to be a bit
fragile.
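
Hardwiring it means reading the JavaScript once by hand and redoing
the string assembly in the script.  If the page did, say,
location.href = '/app?sid=' + form.sid.value, the shell equivalent
would be (hypothetical field name):

    SID=`sed -n 's/.*name="sid" value="\([^"]*\)".*/\1/p' page.html | head -1`
    curl -s "http://www.example.com/app?sid=$SID" > next.html

The moment the site changes its JavaScript, that breaks, which is
where the fragility comes from.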

I think I've got an example script somewhere.  If I can find it, I'll post it.

Ralph Mitchell


