Tapestry 5 version: 5.3-beta19

I love the API of tapestry-func. Today, when I used tapestry-func to process a file with 3 million lines, I got the following exception:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    at java.util.Arrays.copyOfRange(Arrays.java:3209)
    at java.lang.String.<init>(String.java:216)
    at java.io.BufferedReader.readLine(BufferedReader.java:331)
    at java.io.BufferedReader.readLine(BufferedReader.java:362)
    at org.apache.commons.io.LineIterator.hasNext(LineIterator.java:97)
    at org.apache.tapestry5.func.LazyIterator.next(LazyIterator.java:33)
    at org.apache.tapestry5.func.LazyFlow.resolve(LazyFlow.java:78)
    at org.apache.tapestry5.func.LazyFlow.isEmpty(LazyFlow.java:61)
    at org.apache.tapestry5.func.AbstractFlow$1.hasNext(AbstractFlow.java:63)
    at org.apache.tapestry5.func.AbstractFlow.each(AbstractFlow.java:103)
    at org.apache.tapestry5.func.AbstractFlow.each(AbstractFlow.java:32)
    at com.datastream.services.flow.FileProcessor.main(FileProcessor.java:29)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:90)

Java code (using commons-io):

public static void main(String[] args) {
    try {
        final LineIterator lineIterator =
                FileUtils.lineIterator(FileUtils.getFile("c:\\tmp\\largefile.txt"));

        Iterable<String> lineIterable = new Iterable<String>() {
            public Iterator<String> iterator() {
                return lineIterator;
            }
        };

        F.flow(lineIterable).each(new Worker<String>() {
            public void work(String s) {
            }
        });

        LineIterator.closeQuietly(lineIterator);
    } catch (IOException e) {
        e.printStackTrace();
    }
}

I have looked at the code; it appears that Tapestry builds up a long chain of LazyFlow objects that uses up the heap. Is this a bug?

Thanks,
Richard
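For comparison, here is a minimal sketch of a workaround that processes the file without building any flow chain at all, reading one line at a time so only the current line is retained. The `LineWorker` interface is a hypothetical stand-in for Tapestry's `Worker<String>`, and the example uses a `BufferedReader` over an in-memory string in place of the real largefile.txt; it is not the Tapestry API, just a plain-Java sketch of constant-memory line processing:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;

public class LineWorkaround {

    // Hypothetical per-line callback, standing in for Tapestry's Worker<String>.
    interface LineWorker {
        void work(String line);
    }

    // Applies the worker to each line in turn. Nothing but the current line
    // is held in memory, so heap use is constant regardless of file size.
    static long eachLine(BufferedReader reader, LineWorker worker) throws IOException {
        long count = 0;
        String line;
        while ((line = reader.readLine()) != null) {
            worker.work(line);
            count++;
        }
        return count;
    }

    public static void main(String[] args) throws IOException {
        // Small in-memory stand-in for the real 3-million-line file.
        BufferedReader reader = new BufferedReader(new StringReader("a\nb\nc\n"));
        long n = eachLine(reader, new LineWorker() {
            public void work(String s) {
                // no-op, matching the empty Worker in the report
            }
        });
        System.out.println(n); // prints 3
    }
}
```

For a real file you would wrap a `FileReader` (or the commons-io `LineIterator`, closing it in a finally block) instead of the `StringReader`.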