Adams Paul wrote:
> Hello everyone,
> I have a program which navigates to a page and then extracts the links and
> navigates to each of those links. It works fine for the first web page (the
> program navigates to the first page, extracts the links, and visits all of
> them). When I then try to get the program to navigate to the second page
> (from @page), it puts "http:///" in the address bar and does not return a
> web page. Below is the code:
>  
> use LWP::UserAgent;
> use HTML::LinkExtor;
> use URI::URL;
> use Win32::IEAutomation;
>
> my $ie = Win32::IEAutomation->new(visible => 1, maximize => 1);
>
> @page = ('http://www.ebay.com', 'http://www.google.com', 'http://www.nasa.gov');
>
> $b = 0;
> while ($b < 100) {
>     print $b;
>     print "@page[$b]";
>     $url = "@page[$b]";
>     print "This is $url";
>     print @page[$url];    # for instance
>     $ua = LWP::UserAgent->new;
>
>     # Set up a callback that collects links
>     my @imgs = ();
>     sub callback {
>         my ($tag, %attr) = @_;
>         return if $tag ne 'a';    # we only look closer at <a ...>
>         push(@imgs, values %attr);
>     }
>
>     # Make the parser.  Unfortunately, we don't know the base yet
>     # (it might be different from $url)
>     $p = HTML::LinkExtor->new(\&callback);
>
>     # Request document and parse it as it arrives
>     $res = $ua->request(HTTP::Request->new(GET => $url),
>                         sub { $p->parse($_[0]) });
>
>     # Expand all URLs to absolute ones
>     my $base = $res->base;
>     @imgs = map { $_ = url($_, $base)->abs; } @imgs;
>
>     # Print them out
>     print join("\n", @imgs), "\n";
>     print(@imgs[1]);
>
>     for ($a = 0; $a < 10; $a = $a + 1) {
>         $ie->gotoURL("@imgs[$a]");
>         print($img);
>     }
>     $b++;
> }
> Any help would be appreciated.

Ah, John and Brian will like this one. It was posted with very few line breaks :)
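For what it's worth, the "http:///" is almost certainly the loop bounds: @page holds three URLs, but the while ($b < 100) loop keeps counting, so from the fourth pass onward $page[$b] is undef, and interpolating undef hands gotoURL() an empty string. (The single-element slice @page[$b] should be the scalar $page[$b] as well; "use warnings" will flag it.) A minimal sketch of the point, with a hypothetical @visited array standing in for the real IE calls:

```perl
use strict;
use warnings;

my @page = ('http://www.ebay.com', 'http://www.google.com', 'http://www.nasa.gov');

# Indexing past the end of the array yields undef, which interpolates
# as an empty string -- hence the bare "http:///" in the address bar.
my $bad       = $page[3];                      # undef: @page only has indices 0..2
my $url_shown = 'http://' . ($bad // '') . '/';

# Loop over the URLs that actually exist instead of counting to 100.
my @visited;
foreach my $url (@page) {
    push @visited, $url;                       # e.g. $ie->gotoURL($url) would go here
}
```

Looping with foreach over @page itself makes it impossible to walk off the end of the array, however many URLs it holds.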

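Separately, declaring my @imgs inside the loop next to a named "sub callback" means the sub captures only the first iteration's array (perl warns "Variable will not stay shared"). An anonymous sub closes over the current @imgs on every pass. A small sketch that simulates two passes of link collection without touching the network (the tag/attribute values fed to the callback here are made up for illustration):

```perl
use strict;
use warnings;

# Simulate two loop passes: an anonymous sub closes over the @imgs
# belonging to the current pass, so each pass collects independently.
my @results;
for my $pass (1 .. 2) {
    my @imgs;
    my $callback = sub {
        my ($tag, %attr) = @_;
        return if $tag ne 'a';                 # only look at <a ...> links
        push @imgs, values %attr;
    };
    $callback->(a => (href => "http://example.com/$pass"));
    push @results, [@imgs];
}
```

With a named sub in that position, both passes would have pushed into the first pass's @imgs instead.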
Rob

-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
http://learn.perl.org/

