Hi,

 

Generally you will use mirror --parallel=X for directories with multiple
files (to pull several files at the same time under a particular directory
tree) and pget -n X for single files (typically large ones, say Linux ISOs).
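For example, the two cases look like this (the directory and file names here
are just placeholders):

lftp :~> mirror --parallel=5 some_dir
lftp :~> pget -n 5 some-large-file.iso

The first pulls up to five files from some_dir at the same time; the second
splits a single file across five connections.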

 

I do not recall offhand whether there is a way to combine mirror --parallel
with pget -n X so that each of the files it mirrors in parallel is itself
split across connections.
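If your version does support it, I believe the relevant mirror option is
--use-pget-n, so combining the two would look roughly like this (treat it as
a sketch and check your man page before relying on it):

lftp :~> mirror --parallel=5 --use-pget-n=5 some_dir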

 

However, I would not recommend that: if you mirrored with 5 parallel
transfers using 5 connections each, that would be 25 simultaneous
connections, whereas most public FTP sites allow only 1-3 connections per
IP.
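If you do want to experiment anyway, I would cap the totals first through
lftp's settings, along these lines (the value 3 is only an example; match it
to whatever the server actually permits):

lftp :~> set net:connection-limit 3
lftp :~> set pget:default-n 3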

 

Does that answer your question?

 

lftp :~> help pget
Usage: pget [OPTS] <rfile> [-o <lfile>]
Gets the specified file using several connections. This can speed up
transfer, but loads the net heavily impacting other users. Use only if you
really have to transfer the file ASAP.

Options:
 -c  continue transfer. Requires <lfile>.lftp-pget-status file.
 -n <maxconn>  set maximum number of connections (default is taken from
     pget:default-n setting)
 -O <base> specifies base directory where files should be placed
lftp :~>
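So in practice a single line like the one below does the job (the file name
is just an example); if the transfer is interrupted, re-running it with -c
resumes from the .lftp-pget-status file that pget leaves behind:

lftp :~> pget -n 5 some-large-file.iso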

 

Justin.

 

From: lftp-boun...@uniyar.ac.ru [mailto:lftp-boun...@uniyar.ac.ru] On Behalf
Of Dave
Sent: Thursday, February 23, 2012 1:07 PM
To: lftp@uniyar.ac.ru
Subject: [lftp] Breaking a single file into multiple parts-segmented
downloads

 

Hi all, I'm a casual user looking to break away from Windows. I use CuteFTP
a lot over there and one function I can't seem to replicate is its ability
to break a single file into multiple parts to increase transfer speed. I've
come to call this segmented downloads but I've no idea if that's the right
term. I have a dedicated server in Europe which sometimes has ridiculously
slow speeds to the U.S. and this function has really come in handy. 

 

I've been browsing through the lftp documentation and man pages and, while I
think I've come across that exact thing, I'm having a hard time knowing for
certain.

 

If I want to grab an entire directory I've found I can use "mirror
--parallel=5 dir_name" to open up five separate threads and increase the
speed of the transfer that way. 

It sounds like pget might do what I want with single files, but I'm unsure.
I've also found that I can use pget in conjunction with mirror but again I'm
unsure if that's doing what I want. In a brief test last night I found I was
getting basically the same speed with and without pget. With CuteFTP I would
see multiple part files in the directory and they would be recombined once
the download finished.

 

Could someone elaborate in basic terms what pget does, if it does what I
want, and maybe how it works with mirror?

 

For the latter I see that using pget with mirror transfers every single file
(something I thought mirror would do on its own). pget can also be used on
its own and gets the file using multiple connections, which seems like it's
what I want, but I'm not sure. 

 

Appreciate any assistance,

    Dave

