Hey Joe,
Sure thing.  I attached the template; I'm just taking the GDELT data set for 
the GetFile processor, which works.  The error I get is a negative array index.
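
For what it's worth, PutMongo expects each FlowFile's content to be a JSON document, so raw CSV lines have to be converted before insertion.  A minimal, hypothetical sketch of that per-row conversion (standard-library only, not NiFi's actual code):

```python
import csv
import io
import json

def csv_to_json_docs(csv_text):
    """Turn CSV text (first row = header) into one JSON string per row,
    suitable as individual MongoDB documents."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [json.dumps(row) for row in reader]

# Hypothetical sample data, not the real GDELT layout:
sample = "name,count\nalpha,1\nbeta,2\n"
docs = csv_to_json_docs(sample)
# each doc is a JSON object keyed by the header columns
```

A row missing columns relative to the header is exactly the kind of input that trips up index-based parsing, so filtering empty lines before the conversion step matters.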


> Date: Mon, 21 Sep 2015 14:24:50 -0400
> Subject: Re: CSV to Mongo
> From: [email protected]
> To: [email protected]
> 
> Adam,
> 
> Regarding moving from Storm to NiFi, I'd say they make better teammates
> than competitors.  The use case outlined above should be quite easy
> for NiFi, but there are analytic/processing functions Storm is probably
> a better answer for.  We're happy to help explore that with you as you
> progress.
> 
> If you ever run into an ArrayIndexOutOfBoundsException, then it is
> always a coding error.  Would you mind sending your
> flow.xml.gz over or making a template of the flow (assuming it
> contains nothing sensitive)?  If at all possible sample data which
> exposes the issue would be ideal.  As an alternative can you go ahead
> and send us the resulting stack trace/error that comes out?
> 
> We'll get this addressed.
> 
> Thanks
> Joe
> 
> On Mon, Sep 21, 2015 at 2:17 PM, Adam Williams
> <[email protected]> wrote:
> > Hello,
> >
> > I'm moving from Storm to NiFi and trying to do a simple test with getting a
> > large CSV file dumped into MongoDB.  The CSV file has a header with column
> > names and it is structured; my only problem is dumping it into MongoDB.  At
> > a high level, do the following processor steps look correct?  All I want is
> > to push the whole CSV file over to MongoDB without a regex or anything
> > fancy (yet).  I eventually always seem to hit trouble with array index
> > problems with the PutMongo processor:
> >
> > GetFile --> ExtractText --> RouteOnAttribute (not a null line) --> PutMongo
> >
> > Does that seem to be the right way to do this in NiFi?
> >
> > Thank you,
> > Adam
                                          

Attachment: mongoGDELT.xml
Description: XML document
