Re: downloading files for ftp

2003-10-02 Thread Hrvoje Niksic
Payal Rathod <[EMAIL PROTECTED]> writes:

> On Thu, Oct 02, 2003 at 12:03:34PM +0200, Hrvoje Niksic wrote:
>> Payal Rathod <[EMAIL PROTECTED]> writes:
>> 
>> > On Wed, Oct 01, 2003 at 09:26:47PM +0200, Hrvoje Niksic wrote:
>> >> The way to do it with Wget would be something like:
>> >> 
>> >> wget --mirror --no-host-directories ftp://username:[EMAIL PROTECTED]
>> >
>> > But if I run it thru' crontab, where will it store the downloaded files?
>> > I want it to store them as they are on server 1.
>> 
>> It will store them to the current directory.  You can either cd to the
>> desired target directory, or use the `-P' flag to specify the
>> directory to Wget.
>
> Thanks a lot. It works wonderfully. But one small thing here. I am
> trying to use it thru' cron like this,
>
> 51 * * * * wget --mirror --no-host-directories -P /home/t1 ftp://root:[EMAIL PROTECTED]//home/payal/qmail*
>
> But instead of delivering it to /home/t1, wget makes a directory
> /home/t1/home/payal and puts the qmail* files there.
>
> What is the workaround for this?

Use `--cut-dirs=2', which will tell Wget to get rid of two levels of
directory hierarchy ("home" and "payal").
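
With the paths from your crontab entry, the whole line would then look
something like this (an untested sketch, reusing the same credentials,
paths and options from your example):

51 * * * * wget --mirror --no-host-directories --cut-dirs=2 -P /home/t1 ftp://root:[EMAIL PROTECTED]//home/payal/qmail*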


Re: downloading files for ftp

2003-10-02 Thread Payal Rathod
On Thu, Oct 02, 2003 at 12:03:34PM +0200, Hrvoje Niksic wrote:
> Payal Rathod <[EMAIL PROTECTED]> writes:
> 
> > On Wed, Oct 01, 2003 at 09:26:47PM +0200, Hrvoje Niksic wrote:
> >> The way to do it with Wget would be something like:
> >> 
> >> wget --mirror --no-host-directories ftp://username:[EMAIL PROTECTED]
> >
> > But if I run it thru' crontab, where will it store the downloaded files?
> > I want it to store them as they are on server 1.
> 
> It will store them to the current directory.  You can either cd to the
> desired target directory, or use the `-P' flag to specify the
> directory to Wget.

Thanks a lot. It works wonderfully. But one small thing here. I am
trying to use it thru' cron like this,

51 * * * * wget --mirror --no-host-directories -P /home/t1 ftp://root:[EMAIL PROTECTED]//home/payal/qmail*

But instead of delivering it to /home/t1, wget makes a directory
/home/t1/home/payal and puts the qmail* files there.

What is the workaround for this?
Even if I have to download the whole of /home, that is OK.

With warm regards,
-Payal


-- 
"Visit GNU/Linux Success Stories"
http://payal.staticky.com
Guest-Book Section Updated.


Re: downloading files for ftp

2003-10-02 Thread Hrvoje Niksic
Payal Rathod <[EMAIL PROTECTED]> writes:

> On Wed, Oct 01, 2003 at 09:26:47PM +0200, Hrvoje Niksic wrote:
>> The way to do it with Wget would be something like:
>> 
>> wget --mirror --no-host-directories ftp://username:[EMAIL PROTECTED]
>
> But if I run it thru' crontab, where will it store the downloaded files?
> I want it to store them as they are on server 1.

It will store them to the current directory.  You can either cd to the
desired target directory, or use the `-P' flag to specify the
directory to Wget.
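
For example, with a hypothetical target directory /backup, either of these
would put the mirrored tree under /backup (URL form as in the command above):

cd /backup && wget --mirror --no-host-directories ftp://username:[EMAIL PROTECTED]
wget --mirror --no-host-directories -P /backup ftp://username:[EMAIL PROTECTED]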


Re: downloading files for ftp

2003-10-01 Thread Payal Rathod
On Wed, Oct 01, 2003 at 09:26:47PM +0200, Hrvoje Niksic wrote:
> The way to do it with Wget would be something like:
> 
> wget --mirror --no-host-directories ftp://username:[EMAIL PROTECTED]

But if I run it thru' crontab, where will it store the downloaded files?
I want it to store them as they are on server 1.

> It will preserve permissions.  Having said that, I believe that rsync
> would be better at this because it's much more careful to correctly
> transfer a directory tree from point A to point B.

Yes, I know, but I am quite new to rsync so I thought good ol' wget would
do the job.

With warm regards,
-Payal

> 
> (For better transfer of file names, you should also use Wget 1.9 beta
> and specify `--restrict-file-names=nocontrol'.)
> 
> 

-- 
"Visit GNU/Linux Success Stories"
http://payal.staticky.com
Guest-Book Section Updated.


Re: downloading files for ftp

2003-10-01 Thread Hrvoje Niksic
Payal Rathod <[EMAIL PROTECTED]> writes:

> I have 5-7 user accounts in /home whose data is important. Every day at
> 12:00 I want to back up their data to a different backup machine.
> The remote machine has an FTP server.
> Can I use wget for this? If yes, how do I proceed?

The way to do it with Wget would be something like:

wget --mirror --no-host-directories ftp://username:[EMAIL PROTECTED]

It will preserve permissions.  Having said that, I believe that rsync
would be better at this because it's much more careful to correctly
transfer a directory tree from point A to point B.
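
For comparison, a rough rsync sketch (untested, and assuming server1 is
also reachable over SSH rather than only FTP; the user name and directories
are placeholders) could be:

rsync -a username@server1:/home/ /backup/home/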

(For better transfer of file names, you should also use Wget 1.9 beta
and specify `--restrict-file-names=nocontrol'.)
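
Combined with the command above, that would be something like (again only
a sketch, same URL form as before):

wget --mirror --no-host-directories --restrict-file-names=nocontrol ftp://username:[EMAIL PROTECTED]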



downloading files for ftp

2003-10-01 Thread Payal Rathod
Hi,
I have 5-7 user accounts in /home whose data is important. Every day at
12:00 I want to back up their data to a different backup machine.
The remote machine has an FTP server.
Can I use wget for this? If yes, how do I proceed? I am keen to use wget
rather than rsync for this.
I want to preserve permissions also, so that in case the 1st server fails I
can just ask those users to point their FTP clients to the 2nd backup
server and their work can go on uninterrupted.

Can anyone help?

With warm regards,
-Payal




-- 
"Visit GNU/Linux Success Stories"
http://payal.staticky.com
Guest-Book Section Updated.