NOTEBOOKS, MULTIMEDIA PROJECTORS, VIDEO CAMERAS, CISCO

2001-02-18 Thread jupiterv

Dear Sirs,
We offer, from stock in Moscow, at low prices:

Notebooks – Toshiba, Sony, Fujitsu-Siemens, Compaq, IBM, Mitac, Asus
Multimedia projectors – Panasonic, Sony, Nec, Sanyo, Proxima, Mitsubishi
Plasma panels – Panasonic, Nec, JVC, Fujitsu
Digital video cameras – Sony, Panasonic
Network equipment – Cisco, Intel


We look forward to cooperating with you.

Our contact details:

Tel./Fax: (095) 760-79-42
Monday to Friday, from 9:00 to 19:00


[EMAIL PROTECTED]


We apologize if this letter has caused you any inconvenience.

If you do not wish to receive further information from us, please return this letter marked UNSUBSCRIBE.


how to filter only certain URLs?

2001-02-18 Thread Gary Funck

Hello,

I have an application where I want to traverse a given site, but only
retrieve pages whose URLs match a particular pattern.  The pattern
would include a specific directory, and a file name of a particular
form.  If wget won't accept a general pattern, I'd like it if wget
would just return the URLs it finds during its recursive traversal,
without returning the data.  Given the list of URLs, I can pick out
the ones I'm interested in and fetch only those.  Here's an example:
assume I'm interested in fetching all FAQ pages that have "linux" in
their file name.  Using conventional grep patterns, I might be
interested in URLs of the form '.*/faqs/.*linux.*\.html'.  Is there a
way to do something like this in wget, or some other program?

thanks.
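Two possible approaches, sketched below (the site layout and URLs are
made up for illustration): wget's `-I' and `-A' options can restrict a
recursive fetch to a directory and a shell glob (not a regex), and a
saved URL list can be narrowed with the grep pattern from the question
before fetching:

```shell
#!/bin/sh
# Option 1 (shell globs): restrict recursion to the /faqs directory and
# accept only file names containing "linux":
#
#   wget -r -l 0 -I /faqs -A '*linux*.html' http://www.example.com/
#
# Option 2: given a list of URLs discovered some other way, filter it
# with grep and fetch only the matches.  Sample data stands in for a
# real traversal here:
cat > urls.txt <<'EOF'
http://www.example.com/faqs/linux-net.html
http://www.example.com/faqs/bsd-net.html
http://www.example.com/docs/linux-intro.html
EOF
grep '/faqs/.*linux.*\.html$' urls.txt > wanted.txt
cat wanted.txt
# wget -i wanted.txt   # uncomment to actually download the matches
```

Option 1 filters during the traversal; option 2 gives full regex control
at the cost of a second pass.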



compiling --with-ssl under cygwin

2001-02-18 Thread Hack Kampbjørn

I tried to compile wget with SSL support under Cygwin:
$ make realclean
$ make -f Makefile.cvs
$ ./configure --with-ssl
$ make
But I got a bunch of undefined references like:
/usr/lib/libssl.a(ssl_lib.o)(.text+0x585):ssl_lib.c: undefined reference
to `BIO_s_socket'

If I changed the Makefile to link with crypto first,
LIBS = -lintl -lcrypto -lssl
then it compiles fine.

Now, since the Makefile is autogenerated, I looked for where to fix this.
After trying a couple of things I ended up changing the order of the
AC_CHECK_LIB calls for crypto and ssl in configure.in:

  AC_CHECK_LIB(crypto,main,,ssl_lose=yes)
  AC_CHECK_LIB(ssl,SSL_new,,ssl_lose=yes,-lcrypto)

I'm not sure that this is the right solution (AFAIK it might break things
on other platforms) or that it is fixed in the right place.  So if
someone on the list with more knowledge about crypto and ssl can help
here, I will try it out.


-- 
Med venlig hilsen / Kind regards

Hack Kampbjørn   [EMAIL PROTECTED]
HackLine +45 2031 7799



Re: can wget do POSTing?

2001-02-18 Thread Hack Kampbjørn

Cyrus Adkisson wrote:
> 
> I'm trying to retrieve information from a website, but to get to the
> page I want, there is a form submission using the POST method. I've
> tried everything I know to do, including using a --header="POST /
> HTTP/1.0" parameter, but with all the errors I'm getting, I'm starting
> to come to the conclusion that wget is only capable of GET HTTP
> requests. That would explain why it's called wget and not wpost, right?
> Am I correct in this assumption?
> 
As you already found out, wget can only do GET.

> If so, does anyone have any ideas how I might retrieve the webpage
> beyond the POST form? I'd really appreciate any help you might have for
> me.


You can try to use the GET method anyway.  Many web scripts don't really
care which method you use (or even about cookies).  But I suppose you
have already tried that 8-(
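When the form's field names are known, the POST body can often be
re-encoded as a GET query string.  A sketch with a made-up script URL
and fields (whether this works depends entirely on the server-side
script):

```shell
#!/bin/sh
# Hypothetical CGI script and form fields; many scripts read the same
# parameters from the query string as from a POST body:
BASE='http://www.example.com/search.cgi'
QUERY='name=cyrus&page=2'
URL="$BASE?$QUERY"
echo "$URL"
# wget "$URL"   # uncomment to try the GET fallback
```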

Next, you can use your browser to make the POST request there and save
the resulting page.  Then use the `--force-html' and `--input-file'
options to retrieve all the remaining pages.

If those pages also require POST to access, then you could consider
adding support for this method in wget 8-)

> 
> Cyrus

-- 
Med venlig hilsen / Kind regards

Hack Kampbjørn   [EMAIL PROTECTED]
HackLine +45 2031 7799



can wget do POSTing?

2001-02-18 Thread Cyrus Adkisson


I'm trying to retrieve information from a website, but to get to the
page I want, there is a form submission using the POST method. I've
tried everything I know to do, including using a --header="POST /
HTTP/1.0" parameter, but with all the errors I'm getting, I'm starting
to come to the conclusion that wget is only capable of GET HTTP
requests. That would explain why it's called wget and not wpost, right?
Am I correct in this assumption?

If so, does anyone have any ideas how I might retrieve the webpage
beyond the POST form? I'd really appreciate any help you might have for
me.

Cyrus



Re: wget feature request: mail when complete

2001-02-18 Thread Andre Majorel

On 2001-02-18 01:08 -0500, Mordechai T. Abzug wrote:
> 
> Sometimes, I run wget in background to download a file that will take
> hours or days to complete.  It would be handy to have an option for
> wget to send me mail when it's done, so I can fire and forget.

Perhaps this will do the trick (the log goes through a temporary file,
so that $? still holds wget's exit status when mail builds its subject;
in a pipeline, $? would be expanded before wget even runs):

  #!/bin/sh
  wget "$@" > /tmp/wget-job.$$ 2>&1
  mail -s "wget job ($?)" [EMAIL PROTECTED] < /tmp/wget-job.$$
  rm -f /tmp/wget-job.$$

-- 
André Majorel <[EMAIL PROTECTED]>
http://www.teaser.fr/~amajorel/



Re: wget feature request: mail when complete

2001-02-18 Thread Hack Kampbjørn

"Mordechai T. Abzug" wrote:
> 
> Sometimes, I run wget in background to download a file that will take
> hours or days to complete.  It would be handy to have an option for
> wget to send me mail when it's done, so I can fire and forget.
> 
> Thanks!
> 
> - Morty

wget comes from the *nix world, where utilities try to be good at one or
two things and rely on other utilities to be good at the rest, so that
they don't bloat their code.  E.g. wget is good at downloading files
from the internet using HTTP and FTP; adding other protocols might be a
natural thing for wget to do.  But for sending mail there are already a
lot of other utilities that are good at that.

And there are also a bunch of utilities that are good at making other
utilities cooperate and intercommunicate: those are the shells.

I use the bash shell, so if I wanted this feature I'd type something
like:

$ (wget -r -l 0 http://www.vigilante.com/ | mail -s "wget run completed"
`id -un`) &

Arghhh, wget sends its output to STDERR.  Well then, something like:
$ (wget -r -l 0 http://www.vigilante.com/ 2>&1 | mail -s "wget run completed" `id -un`) &

Or, if I used it a lot, make a little script for it:
$ cat > ~/bin/bwget
#!/bin/bash
# Background wget: runs wget and sends a mail when finished
(wget "$@" 2>&1 | mail -s "wget run completed" `id -un`) &
^D
$ chmod 0700 ~/bin/bwget

Look at the documentation for the shell you use.
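A variation on the script above that also reports wget's exit status in
the subject line (a sketch: the log path is arbitrary, and $? is read on
the mail line right after wget finishes, so it still holds wget's
status):

```shell
#!/bin/sh
# Write an enhanced bwget to the current directory; install it wherever
# personal scripts live, e.g. ~/bin:
cat > bwget <<'EOF'
#!/bin/sh
# Background wget: run wget, log the output, then mail the log with the
# exit status in the subject.
log=/tmp/bwget.$$
wget "$@" > "$log" 2>&1
mail -s "wget finished (exit $?)" "`id -un`" < "$log"
rm -f "$log"
EOF
chmod 0700 bwget
```

Using a log file instead of a pipe is what makes the $? expansion pick
up wget's status rather than some earlier command's.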


-- 
Med venlig hilsen / Kind regards

Hack Kampbjørn   [EMAIL PROTECTED]
HackLine +45 2031 7799