I have some CSV files with a header line, so setting useMaps="true" would
be the natural thing to do. Works great.

My CSV files are very big, so using streaming/parallelProcessing would be
the natural thing to do. Also works great.

Unfortunately, using useMaps="true" AND streaming/parallelProcessing together does
not work: it results in lots of empty Lists/Maps. Which is understandable, since
tokenizing on \n hands the CSV unmarshaller a single line at a time, so every
line is interpreted as a header row with no data rows under it. Understandable,
but not nice.

>> So the question remains: How to efficiently process large CSV files that
have a header line? <<

By the way, this is my route:

<route id="CSVRoute">
    <from uri="file:/tmp/data/" />
    <split streaming="true" parallelProcessing="true">
        <tokenize token="\n" />
        <unmarshal>
            <csv delimiter=";" useMaps="true" />
        </unmarshal>
        <log message="Got ${body}"/>
        <to uri="mock:nextStageProcessor"/>
    </split>
</route>
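
One workaround I can think of (a sketch, not tested against your setup): declare the column names explicitly on the CSV data format instead of relying on the first line, and skip the header line inside the split. This assumes a Camel version whose CSV data format accepts <header> child elements, and the column names (col1, col2) are placeholders for your real header:

```xml
<route id="CSVRouteExplicitHeaders">
    <from uri="file:/tmp/data/" />
    <split streaming="true" parallelProcessing="true">
        <tokenize token="\n" />
        <!-- drop the header line itself; CamelSplitIndex is 0 for the first line -->
        <filter>
            <simple>${exchangeProperty.CamelSplitIndex} > 0</simple>
            <unmarshal>
                <!-- column names declared up front, so each single line
                     can still be mapped to a Map -->
                <csv delimiter=";" useMaps="true">
                    <header>col1</header>
                    <header>col2</header>
                </csv>
            </unmarshal>
            <log message="Got ${body}"/>
            <to uri="mock:nextStageProcessor"/>
        </filter>
    </split>
</route>
```

That keeps the streaming/parallel behaviour, at the price of hard-coding the header; on an older Camel you may need ${property.CamelSplitIndex} in the simple expression instead.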
