Howdy,
the situation is as follows. I have a file url.txt with lines
http://host1/a.pdf
ftp://host2/b.exe
a.pdf and b.exe are big files, so I use "wget -ci url.txt" to fetch them. After downloading, I want to import the files into the WWWOFFLE cache. Of course I can do this manually with wwwoffle-write, but I was wondering whether anyone on this list has written a script to automate the process. If not, can anyone see a flaw in the following algorithm?
1) for each file in the download dir, grep url.txt for the line
containing its URL. If there is no match, go to 4).
2) cat $file | wwwoffle-write --add-header $url
3) rm $file
(unprocessed files are left in the dir and can be dealt with)
4) output the name of $file and report success or failure
5) go to the next file

Of course, this won't deal properly with index.php?fileid=abc where the resulting file is def.pdf. But will it do for "normal" situations, or at least not break things too easily?
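For what it's worth, the steps above could be sketched roughly like this. Assumptions not in my description: url.txt has one URL per line, each file's name matches the last path component of its URL, and wwwoffle-write is invoked exactly as in step 2 (the importer is stubbed out when it isn't installed, so the loop can be dry-run):

```shell
#!/bin/sh
# Sketch only: import downloaded files into the WWWOFFLE cache.
# usage: import_downloads ~/downloads ~/downloads/url.txt

# Use the real importer when present, otherwise a harmless stub
# that just swallows stdin (lets you test the matching logic).
if command -v wwwoffle-write >/dev/null 2>&1; then
    import_one() { wwwoffle-write --add-header "$1"; }
else
    import_one() { cat >/dev/null; }
fi

import_downloads() {
    dldir=$1; urllist=$2
    for file in "$dldir"/*; do
        [ -f "$file" ] || continue
        [ "$file" = "$urllist" ] && continue   # don't import the list itself
        base=$(basename "$file")
        # 1) look up the URL whose last path component matches this file
        # (shell pattern match instead of grep, to avoid regex-escaping dots)
        url=""
        while IFS= read -r line; do
            case $line in
                */"$base") url=$line; break ;;
            esac
        done < "$urllist"
        if [ -z "$url" ]; then
            # 4)/5) no match: leave the file in place and report it
            echo "SKIP: $base (no matching URL in $urllist)"
            continue
        fi
        # 2) pipe the file into the importer, 3) remove it only on success
        if import_one "$url" < "$file"; then
            rm -- "$file"
            echo "OK: $base <- $url"
        else
            echo "FAIL: $base ($url)"
        fi
    done
}
```

Note it removes a file only if the import command exits successfully, so anything unprocessed stays in the directory as in step 3.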
Best
Rolf
