On 5/20/11 4:16 PM, jdrewsen wrote:
On 19-05-2011 22:50, Andrei Alexandrescu wrote:
On 5/19/11 3:28 PM, jdrewsen wrote:
[snip]
Speaking of which, what's the status on recycling them buffers? :o)
I'm slowly working through all the suggestions I received in the "Curl
wrapper" thread. I still haven't implemented the buffer recycling, though.
/Jonas
Great. It's very exciting that once that is done, a two-stroke download
program that transfers data at optimal speed is a five-liner:
import std.exception, std.net.curl, std.stdio;

void main(string[] args) {
    enforce(args.length == 2, "Usage: " ~ args[0] ~ " url");
    // Stream the response body chunk by chunk straight to standard output.
    foreach (chunk; Http.byChunk(args[1])) {
        stdout.rawWrite(chunk);
    }
}
If we play our cards right, then (a) the memory allocated will be
constant in the size of the input, (b) reading and writing will not
block each other (except when buffers are full or starved), and (c) the
transfer rate will be optimal, i.e., never slower than the minimum of
the input and output rates.
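
To illustrate the kind of recycling I have in mind (only a sketch of the
general technique, not actual std.net.curl code): a fixed pool of buffers
circulates between a producer thread and the consuming thread via
std.concurrency. Memory use stays constant because the pool never grows,
and each side blocks only when the pool is starved or full. The producer
reads stdin here as a stand-in for the curl read callback, and the casts
through immutable are the usual message-passing workaround; they are
benign because exactly one thread owns a given buffer at any time:

import std.concurrency, std.stdio;

enum chunkSize = 16 * 1024;

// Stand-in producer: reads stdin instead of a curl handle. It waits for
// a recycled buffer, fills it, and sends it on; a short read means end
// of input.
void producer(Tid consumer) {
    for (;;) {
        auto buf = cast(ubyte[]) receiveOnly!(immutable(ubyte)[])();
        auto got = stdin.rawRead(buf);
        send(consumer, cast(immutable(ubyte)[]) got);
        if (got.length < buf.length) break;
    }
}

void main() {
    auto tid = spawn(&producer, thisTid);
    // Prime the pool with two buffers; they are recycled from then on,
    // so allocation is constant in the size of the input.
    foreach (i; 0 .. 2) {
        send(tid, cast(immutable(ubyte)[]) new ubyte[chunkSize]);
    }
    for (;;) {
        auto chunk = cast(ubyte[]) receiveOnly!(immutable(ubyte)[])();
        if (chunk.length) stdout.rawWrite(chunk);
        if (chunk.length < chunkSize) break; // producer hit end of input
        // Hand the full buffer back for reuse instead of allocating.
        send(tid, cast(immutable(ubyte)[]) chunk);
    }
}

Run as "./recycle < somefile > copy", this copies its input with exactly
two chunk-sized allocations, however large the input is.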
I'm really looking forward to this.
Andrei