I don't want to replicate all the mechanics that are already baked in and hardened in the Camel File endpoint.
Doing it this way, though, changes this from a pull mechanism to a push/publish mechanism, which I prefer.
I have a couple of questions about how I might proceed with this. I'm fine
with reading the Beanio stream/definition from my blueprint bundle with
getResourceAsStream and foreg
Thanks. I'll take a look at the custom BeanioIterator implementation. On
the current project it's a nice-to-have, but the next one will likely
require it. We read in 10k to 50k records, which we can handle in memory. In
the next project that will be more like 500k records (each multi-line) and
at th
Hi Brad!
To read and split a file you need a combination of streaming unmarshalling
and splitter components. Is tmxPaymentTechIn something you wrote?
You can try to read the file with Camel as XML, splitting groups by XPath,
and unmarshalling just each group from XML to a bean model...
Unfortunately
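The XPath-splitting suggestion above might look roughly like the following Blueprint/Spring XML route. This is only a sketch: the endpoint URIs, the XPath expression, and the JAXB context path are placeholders, not names from this thread.

```xml
<!-- Sketch only: URIs, the XPath expression, and the contextPath are
     placeholders. streaming="true" asks the splitter to feed groups one
     at a time rather than building the full split list up front. -->
<route>
  <from uri="file:data/inbox"/>
  <split streaming="true">
    <xpath>/payments/group</xpath>
    <unmarshal>
      <jaxb contextPath="com.example.model"/>
    </unmarshal>
    <to uri="jms:queue:records"/>
  </split>
</route>
```

Note that a plain XPath expression still builds a DOM of the whole document, so for very large files a tokenizing or iterator-based split is usually the better fit.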
You would need to create a BeanioIterator implementation and use the
splitter EIP with it.
It would be a good addition to camel-beanio to include such an iterator.
There is an iterator example in camel-mail, and I think also in
camel-zipfile, etc.
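As a rough sketch, such an iterator would use the classic read-ahead pattern. To keep the example self-contained and runnable, a plain BufferedReader stands in here for BeanIO's BeanReader (whose read() likewise returns null at end of stream); the class and method names are illustrative, not from camel-beanio.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.io.UncheckedIOException;
import java.util.Iterator;
import java.util.NoSuchElementException;

// Sketch only: in a real BeanioIterator, readRecord() would delegate to
// BeanIO's BeanReader.read(), which also returns null at end of stream.
class RecordIterator implements Iterator<String> {
    private final BufferedReader reader;
    private String next;                 // read-ahead buffer: one record

    RecordIterator(BufferedReader reader) {
        this.reader = reader;
        this.next = readRecord();        // prime the buffer
    }

    private String readRecord() {
        try {
            return reader.readLine();    // stand-in for BeanReader.read()
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    @Override
    public boolean hasNext() {
        return next != null;
    }

    @Override
    public String next() {
        if (next == null) {
            throw new NoSuchElementException();
        }
        String current = next;
        next = readRecord();             // lazily pull the following record
        return current;
    }

    public static void main(String[] args) {
        RecordIterator it = new RecordIterator(
                new BufferedReader(new StringReader("rec1\nrec2\nrec3")));
        while (it.hasNext()) {
            System.out.println(it.next()); // prints rec1, rec2, rec3
        }
    }
}
```

With an iterator like this as the split expression and .streaming() enabled on the splitter, records are handed to the route one at a time instead of the whole file being materialized in memory.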
On Tue, Apr 5, 2016 at 12:54 AM, Brad Johnson wrote:
I have a beanio mapping that's rather standard and I read it in but it
appears that it reads the entire file into memory instead of streaming it.
Is there a way to change that behavior so that I can take items as they are
read and put them on a queue?