I've scripted with JS for 12 years and would contribute all I could to a
project of this type. My interest would be more from a reverse-engineering
viewpoint. If LWP could be used to efficiently parse JS at the user-agent
level, it would be both a blessing and a curse. Right now JS is great for
blin…
I've also been interested for a long time and tried to work on
this 2 years ago but didn't get far enough to bother trying
to release anything.
The DOM could be tackled in an HTML::Tree-to-XML-parser fashion.
That way, bad markup could be legitimized and something like
XML::LibXML could handle the…
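The two-step idea above — legitimize the tag soup first, then hand clean XML to a strict parser — could be sketched roughly like this. This is a minimal illustration, not a DOM implementation, and assumes HTML::TreeBuilder (from HTML-Tree) and XML::LibXML are installed:

```perl
use HTML::TreeBuilder;
use XML::LibXML;

# Step 1: legitimize bad markup. HTML::TreeBuilder happily
# parses tag soup, including unclosed elements.
my $soup = '<ul><li>one<li>two';   # unclosed <li> tags
my $tree = HTML::TreeBuilder->new_from_content($soup);
my $xml  = $tree->as_XML;          # emit well-formed XML
$tree->delete;

# Step 2: hand the cleaned-up markup to a strict XML parser.
my $doc = XML::LibXML->new->parse_string($xml);
print $_->textContent, "\n" for $doc->findnodes('//li');
```

From there, XML::LibXML's node API would give you a DOM-ish tree to build on, though wiring it up to live JS is of course the hard part.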
On Nov 22, 2006, at 4:13 PM, Chuck Gelm wrote:
Has anyone a URL for URI, HTML::Parser, and Compress::Zlib 1.10 ?
Same place you got libwww, probably. Did you go to search.cpan.org
to get it?
You can also use the CPAN shell to automagically take care of those
dependencies.
xoxo,
Andy
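For the record, the CPAN-shell route Andy mentions looks something like the following; the module names come from the error message quoted below, and Bundle::LWP is the usual bundle that pulls in libwww-perl's prerequisites:

```
# Install the missing prerequisites one by one...
perl -MCPAN -e 'install URI'
perl -MCPAN -e 'install HTML::Parser'
perl -MCPAN -e 'install Compress::Zlib'

# ...or let CPAN resolve everything libwww-perl needs:
perl -MCPAN -e 'install Bundle::LWP'
```

The first run of the CPAN shell will ask some configuration questions; the defaults are generally fine.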
Howdy:
I guess that this means that my subscription to libwww-perl was
accepted. :-|
I am trying to install libwww-perl-5.805.
In ../libwww-5.805/ I run the ./install.sh script and it complains that
I need URI, HTML::Parser, and Compress::Zlib 1.10.
I need this for an LTSP v4.2 server.
I a…
I'm willing to take a crack at laying out a vision, high-level objectives,
and some implementation requirements based on my experiences, and see how
much interest there is for a group effort, if others are interested in
helping out. I'm sure I'll miss a lot that others with different
experiences cou…
I agree that folks have been talking about JS for a long time, and that it's
frustrating, but what I'm suggesting is that we need to tackle a different
problem first.
This isn't an academic question: without knowing how the DOM is going to
work (or even if there is one), the JS conversation can't…
On Nov 22, 2006, at 2:51 PM, Christopher Hart wrote:
Would an "easier" (yet still monumental) starting point be to
tackle the DOM
implementation independent of a JS engine?
All of this is pointless unless someone is willing to step up and
JFDI. Otherwise, it's just rehashing the same the…
Would an "easier" (yet still monumental) starting point be to tackle the DOM
implementation independent of a JS engine?
It seems like attempting to create any kind of JavaScript framework
implementation would be pretty useless (and horribly incomplete) without the
DOM being present first. An i…
On Wed, 22 Nov 2006, Stefan Seifert wrote:
[...]
I too thought about that. Maybe using the JavaScript or
JavaScript::Spidermonkey module and XML::DOM. I will certainly
experiment around with them, as we need it at work. Doesn't seem to be…
Sigh, we've had this same little discussion at least fiv…
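For anyone wanting to experiment along the lines Stefan mentions, the JavaScript::SpiderMonkey interface looks roughly like the sketch below. The method names follow that module's documented synopsis (new/init/function_set/eval/destroy), but treat this as an untested sketch rather than working code:

```perl
use JavaScript::SpiderMonkey;

my $js = JavaScript::SpiderMonkey->new();
$js->init();   # set up the runtime, context, and global object

# Expose a Perl callback to the JavaScript side
$js->function_set("log", sub { print "JS says: @_\n" });

# Evaluate some JavaScript that calls back into Perl
$js->eval(q{ log("1 + 1 = " + (1 + 1)); });

$js->destroy();
```

The missing piece, as this thread keeps circling around, is populating that global object with a real DOM for scripts to poke at.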
As Jonathan said, File::Listing does not belong to the perl core and
is part of the libwww-perl distribution (whose maintainer is Gisle
Aas). So it is not supported here. I have copied your report to the
libwww@perl.org mailing list (as instructed in the README file of the
distribution) and the…
On Fri, 3 Nov 2006, Christopher Hart wrote:
I know there is a rich history of challenges implementing any kind of
JavaScript interpretation using Mechanize or any other web
scripting/automation utility, but I was wondering if anyone has tried to
focus on "Mechanizing" AJAX?
I realize this would…
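One pragmatic angle on "Mechanizing" AJAX: an XMLHttpRequest is just an ordinary HTTP request, so once you work out the endpoint and parameters (by reading the page's JS or watching the traffic), WWW::Mechanize can replay it directly without any JS engine. The URL and form fields below are made up for illustration:

```perl
use WWW::Mechanize;

my $mech = WWW::Mechanize->new();
$mech->get('http://example.com/app');   # hypothetical page

# Replay the request the page's JS would have sent via
# XMLHttpRequest (endpoint and parameters are hypothetical):
$mech->post('http://example.com/ajax/search',
            [ q => 'libwww-perl', max => 10 ]);

my $body = $mech->content();   # typically XML or JSON, parse as needed
```

This sidesteps interpretation entirely, at the cost of hand-reverse-engineering each application — which is exactly why a real DOM + JS story keeps coming up.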
This is a bug report for perl from [EMAIL PROTECTED],
generated with the help of perlbug 1.35 running under perl v5.8.8.
[Please enter your report here]
I use Listing.pm in cases where I am not sure if the remote ftp site
running '…
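For context, File::Listing's parse_dir function turns a raw FTP directory listing into structured entries. A small self-contained example (the listing text here is fabricated):

```perl
use File::Listing qw(parse_dir);

# A fabricated Unix-style FTP directory listing
my $listing = <<'EOT';
drwxr-xr-x  2 ftp ftp 4096 Nov 22  2006 pub
-rw-r--r--  1 ftp ftp 1234 Nov 20  2006 README
EOT

for my $entry (parse_dir($listing)) {
    my ($name, $type, $size, $mtime, $mode) = @$entry;
    printf "%-4s %s\n", $type, $name;   # e.g. "d    pub", "f    README"
}
```

parse_dir guesses the listing format (Unix ls, DOS, VMS, ...), which is precisely where it can trip up on unusual servers like the one this bug report concerns.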