When requesting a large splitfile via fproxy, the request usually fails eventually with the following error:

Request failed gracefully: Next failed: Could only fetch 25 of 26 blocks in segment 1 of 1: 14 failed, total available 39

However, if I manually re-request the URI enough times, the file eventually downloads completely. I accept all the defaults when requesting:

0       Initial Hops to Live
4       Number of Retries for a Block that Failed
5       Increment HTL on retry by this amount
        Force the Browser to Save the File
        Don't Look for Blocks in Local Data Store
        Download Segments in Random Order
30      Number of Simultaneous Downloads
        Run Anonymity Filter on Download Completion (recommended)
        Make the Anonymity Filter Really Paranoid
100     % of Missing Data Blocks to Insert (Healing)
18      Hops-to-Live for the Healing Blocks
        Write directly to disk rather than sending to browser

Is there a combination of request settings (or some other method) that will instruct fproxy not to give up until the file actually completes?
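(For what it's worth, one workaround while waiting for an answer has been to script the re-requests instead of doing them by hand. Below is a minimal sketch in Python; the fproxy address and key are placeholders, and the retry helper is my own, not part of fproxy:)

```python
import time
import urllib.request


def fetch_until_complete(fetch, max_attempts=50, delay=60):
    """Retry `fetch` until it succeeds or attempts run out.

    `fetch` is any zero-argument callable that raises on failure,
    e.g. a wrapper around urllib.request.urlopen for the fproxy URI.
    `delay` is the pause in seconds between attempts.
    """
    last_exc = None
    for attempt in range(1, max_attempts + 1):
        try:
            return fetch()
        except Exception as exc:  # fproxy returns an error page / HTTP error
            last_exc = exc
            print(f"attempt {attempt} failed: {exc}")
            if attempt < max_attempts:
                time.sleep(delay)
    raise RuntimeError(f"file never completed: {last_exc}")


# Illustrative usage -- the port and key below are placeholders:
# uri = "http://127.0.0.1:8888/freenet:CHK@.../file.ext"
# data = fetch_until_complete(lambda: urllib.request.urlopen(uri).read())
```

This just automates what re-requesting in the browser does; it doesn't change how hard the node tries on any single request.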

_______________________________________________
Support mailing list
Support@freenetproject.org
http://news.gmane.org/gmane.network.freenet.support
Unsubscribe at http://dodo.freenetproject.org/cgi-bin/mailman/listinfo/support
Or mailto:[EMAIL PROTECTED]