Rudi Ahlers wrote:
I got lynx to work with the --dump option, and now the errors are
gone, and the cronjob works well. wget downloaded the whole website,
which was about 23 MB every time, whereas lynx gave me just the page
output, which is more usable for troubleshooting the cronjob.

wget should only download 'the whole website' if you specify the -r (recursive) option; otherwise it just fetches the one file you specified.
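
For comparison, a rough sketch of both approaches as crontab entries (the URL, schedule, and output paths are just placeholders, not anything from your setup):

    # single page with wget: without -r nothing is fetched recursively
    */15 * * * * wget -q -O /tmp/page.html http://www.example.com/page.php

    # same page rendered to plain text with lynx, easier to read in cron mail
    */15 * * * * lynx --dump http://www.example.com/page.php > /tmp/page.txt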