On 19.07.2018 17:24, Paul Wagner wrote:
> Dear wgetters,
> 
> apologies if this has been asked before.
> 
> I'm using wget to download DASH media files, i.e. a number of URLs in
> the form domain.com/path/segment_1.mp4, domain.com/path/segment_2.mp4,
> ..., which represent chunks of audio or video, and which are to be
> combined to form the whole programme.  I used to call individual
> instances of wget for each chunk and combine them, which was dead slow. 
> Now I tried
> 
>   { i=1; while [[ $i != 100 ]]; do
>       echo "http://domain.com/path/segment_$((i++)).mp4"
>     done } | wget -O foo.mp4 -i -
> 
> which works like a charm *as long as the 'generator process' is finite*,
> i.e. the loop is actually programmed as in the example.  The problem is
> that it would be much easier if I could let the loop run forever, let
> wget get whatever is there and then fail after the counter extends to a
> segment number not available anymore, which would in turn fail the whole
> pipe.  Turns out that
> 
>   { i=1; while true; do
>       echo "http://domain.com/path/segment_$((i++)).mp4"
>     done } | wget -O foo.mp4 -i -
> 
> hangs in the sense that the first process loops forever while wget
> doesn't even bother to start retrieving.  Am I right assuming that wget
> waits until the file specified by -i is actually fully written?  Is
> there any way to change this behaviour?
> 
> Any help appreciated.  (I'm using wget 1.19.1 under Cygwin.)

Hi Paul,

Wget2 behaves the way you need: it reads URLs from standard input as
they arrive and starts downloading immediately, so you can feed it from
an endless loop without it hanging.
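
A minimal sketch of that pattern, assuming wget2 accepts the same -O and
-i - options as wget (the URL scheme is the one from your example):

```shell
# Endless generator for the DASH segment URLs from the example.
# wget2 consumes its input (here: stdin) incrementally, so it can start
# fetching before the generator ever finishes.
gen_urls() {
  i=1
  while true; do
    echo "http://domain.com/path/segment_$((i++)).mp4"
  done
}

# gen_urls | wget2 -O foo.mp4 -i -
```

Note that, like wget, wget2 may skip a failed URL and move on rather
than abort the whole run, so check its retry/exit options if you need
the pipeline to stop at the first missing segment.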

It should build under Cygwin without problems, though my last test was
a while ago.
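
Until you have wget2 running, one workaround with wget 1.x is to invoke
it once per segment and let the loop stop at the first failure; seg_url
is just an illustrative helper using the URL scheme from your example:

```shell
# Build the URL for a given segment number.
seg_url() {
  printf 'http://domain.com/path/segment_%d.mp4' "$1"
}

# Fetch segments until one fails (e.g. a 404 makes wget exit non-zero).
# Without --content-on-error, wget writes nothing to stdout on an HTTP
# error, so foo.mp4 only ever receives complete, successful segments.
#
#   i=1
#   while wget -q -O - "$(seg_url "$i")" >> foo.mp4; do
#     i=$((i+1))
#   done
```

This is slower than a single wget run (one process and one connection
per segment), but it terminates cleanly as soon as a segment is missing.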

See https://gitlab.com/gnuwget/wget2

The latest release tarball is
https://alpha.gnu.org/gnu/wget/wget2-1.99.1.tar.gz

or get the latest source from git:
git clone https://gitlab.com/gnuwget/wget2.git
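
For building, the usual GNU autotools flow should apply; the exact
steps (in particular the bootstrap for git checkouts) are from memory,
so check the README:

```shell
# From the release tarball:
#   tar xf wget2-1.99.1.tar.gz && cd wget2-1.99.1
#   ./configure && make && make check
#   make install        # as root, or use a --prefix you can write to
#
# From a git checkout, generate the build system first:
#   git clone https://gitlab.com/gnuwget/wget2.git
#   cd wget2 && ./bootstrap && ./configure && make
```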


Regards, Tim
