Not sure if this is a Clojure issue or something else, but I'm seeing
surprisingly slow I/O on large text files.

For example, on a Unix machine, try this:

1. create a large file
rm -f words; for x in $(seq 300); do cat /usr/share/dict/words >> words; done

2. create a clj script, cat.clj, that just slurps stdin
#!/usr/bin/env clj

(slurp *in*)

3. time the slurp
cat words | ./cat.clj

On my machine this takes 17 seconds.

If I make a simple Java program:

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class Cat {

    public static void main(String[] args) throws Exception {
        BufferedReader reader =
            new BufferedReader(new InputStreamReader(System.in));
        StringBuffer fileData = new StringBuffer();

        // Read stdin in 4096-char chunks and accumulate it all in memory.
        char[] buffer = new char[4096];
        int numRead;

        while ((numRead = reader.read(buffer)) != -1) {
            fileData.append(buffer, 0, numRead);
        }
        reader.close();
    }
}

and try this:
cat words | java -Xmx3G Cat

It only takes 2.3 seconds.
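One guess at where the time might be going (just an assumption on my part -- I haven't read the slurp source): if slurp pulls one character at a time from an unbuffered reader, that's one stream call per char instead of one per 4096. A sketch of that per-character loop, factored into a helper (the class and method names here are made up):

```java
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;

public class CatSlow {
    // Hypothetical per-character read loop -- one stream call per
    // character, with no char buffer in front of the reader.
    static String readCharByChar(Reader in) throws IOException {
        StringBuilder data = new StringBuilder();
        int c;
        while ((c = in.read()) != -1) {
            data.append((char) c);
        }
        return data.toString();
    }

    public static void main(String[] args) throws IOException {
        // Tiny demo; for the benchmark you'd pass
        // new InputStreamReader(System.in) instead.
        System.out.println(readCharByChar(new StringReader("hello")));
    }
}
```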

I thought this might have to do with the -Xmx max-heap flag, but adding
it to the java invocation in my clj wrapper script didn't make any
difference. Line-oriented I/O seems similarly slow.
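By "line-oriented I/O" I mean a readLine loop along these lines (a sketch only; the class and method names are arbitrary):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;

public class CatLines {
    // Rebuild the input a line at a time with BufferedReader.readLine;
    // readLine strips the terminator, so we re-append '\n'.
    static String readLines(Reader in) throws IOException {
        BufferedReader reader = new BufferedReader(in);
        StringBuilder data = new StringBuilder();
        String line;
        while ((line = reader.readLine()) != null) {
            data.append(line).append('\n');
        }
        return data.toString();
    }

    public static void main(String[] args) throws IOException {
        // Small demo; pass new InputStreamReader(System.in) for the real test.
        System.out.println(readLines(new StringReader("a\nb\nc")).length()); // prints 6
    }
}
```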
