I think PHP can help you with the second problem. It has URL wrappers, so
you can fopen() an http or ftp URL and then read from the handle with
fgets($fp, 4096) (or fread() for binary data) -- replace 4096 with however
many bytes you want to read at a time, and just stop reading once you've
downloaded as much as you want.

Hope that helps.
Akshay
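Another angle on the download problem: most http servers also understand the Range request header, so you can ask for just the byte slice you want instead of reading and discarding. A rough Python sketch (the URL is only a placeholder, and the server has to actually support ranges -- one that doesn't will simply send the whole file):

```python
import urllib.request

# Request only bytes 100000-199999 (roughly the 100-200kb slice) of a
# remote file. A server that supports ranges answers with
# "206 Partial Content". The URL below is a placeholder, not a real file.
req = urllib.request.Request(
    "http://example.com/bigfile.bin",
    headers={"Range": "bytes=100000-199999"},
)
# part = urllib.request.urlopen(req).read()  # uncomment to fetch the slice
```
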

On Sat, 2003-06-28 at 08:06, Robins Tharakan wrote:

    hi all,
    
    I would like to be straight to the point here.
    I was doing some file manipulation in Linux and need utilities
    that would help me do the following two things...
    
    1. to be able to split a big file into different parts (not of similar
    sizes), e.g. a 1000 kb file split into 1-100kb, 100-123kb,
    123-500kb, etc.  ('split' only does equal sizes, and 'cut' works on
    one line at a time, not the whole file...!)
    
    2. secondly, and more importantly, to download only a part of a big file
    from an ftp (or an http) server, i.e. download only 100-200kb of a 1Mb
    file from the net. ('wget' only restarts a partial download; it doesn't
    seem to have an option for stopping a download at a particular point..)
    
    any ideas??
    (a pointer to a shell/perl script, or a perl module would be fantastic
    too..!)
    
    tia.
    
    affly
    robins
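For the first problem, a plain seek()/read() loop is enough to cut arbitrary byte ranges out of a file; a minimal Python sketch (the file names and offsets below are just examples, not anything from Robins' setup):

```python
def extract_range(src_path, start, end, dst_path, chunk_size=4096):
    """Copy bytes [start, end) of src_path into dst_path."""
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        src.seek(start)
        remaining = end - start
        while remaining > 0:
            chunk = src.read(min(chunk_size, remaining))
            if not chunk:   # file was shorter than the requested range
                break
            dst.write(chunk)
            remaining -= len(chunk)

# e.g. split at the 100kb and 123kb boundaries:
# extract_range("big.bin", 0, 100_000, "part1")
# extract_range("big.bin", 100_000, 123_000, "part2")
```

The same effect is possible from the shell, but a short function like this keeps the unequal-sized boundaries explicit.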
    
    
    _______________________________________________
    ilugd mailing list
    [EMAIL PROTECTED]
    http://frodo.hserus.net/mailman/listinfo/ilugd

    
