Re: How to convert unix timestamp to datetime in Apache NiFi

2019-04-18 Thread Puspak
Thanks for the suggestion 
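[Editor's note: the suggestion itself did not make it into the archive, so here is one common way to do the conversion, for future readers. In NiFi Expression Language it would look something like `${ts:toNumber():multiply(1000):format('yyyy-MM-dd HH:mm:ss')}`, where the attribute name `ts` and epoch-seconds input are assumptions. The same conversion in plain Java:]

```java
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class UnixToDatetime {
    public static void main(String[] args) {
        long unixSeconds = 1555555555L; // a unix timestamp in epoch seconds

        // Epoch seconds -> Instant -> formatted datetime string (UTC assumed).
        String formatted = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")
                .withZone(ZoneOffset.UTC)
                .format(Instant.ofEpochSecond(unixSeconds));

        System.out.println(formatted); // prints 2019-04-18 02:45:55
    }
}
```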



--
Sent from: http://apache-nifi-developer-list.39713.n7.nabble.com/


Re: Execute SQL support for nvarchar(max) datatype

2019-04-18 Thread Endre Kovács
Hi Venu,
I believe it would only be possible if the PDF were base64 encoded
before DB insertion and then base64 decoded in NiFi.
Seems possible, but it adds a bit of computing overhead to the flow.
Best regards,
Endre
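
[Editor's note: to make the encode/decode round trip concrete, a minimal sketch in plain Java. In a flow, the decode step could be a scripted processor or, I believe, NiFi's Base64EncodeContent processor in Decode mode; treat that processor name as an assumption to verify:]

```java
import java.util.Arrays;
import java.util.Base64;

public class Base64RoundTrip {
    public static void main(String[] args) {
        // Stand-in for the bytes of a PDF: "%PDF" followed by arbitrary binary.
        byte[] pdfBytes = { 0x25, 0x50, 0x44, 0x46, (byte) 0x93, (byte) 0xFF };

        // Before insertion: encode the binary as ASCII text,
        // which an nvarchar column can hold without loss.
        String columnValue = Base64.getEncoder().encodeToString(pdfBytes);

        // After ExecuteSQL pulls the row back: decode the text
        // to recover the original bytes exactly.
        byte[] restored = Base64.getDecoder().decode(columnValue);

        System.out.println(Arrays.equals(pdfBytes, restored)); // prints true
    }
}
```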

On Thu, Apr 18, 2019, 5:09 PM Venugopal Iyengar 
wrote:

> Thanks Peter.
> We are trying to avoid using nvarchar going forward. However, some schemas
> were implemented that way by consultants. If you can give me the steps to
> debug, that would be helpful. Is it possible to get this
> working by using some converters as part of the flow? I appreciate your
> help on this.
>
> Venu
>
> On Tue, Apr 16, 2019 at 5:23 AM Peter Turcsanyi
>  wrote:
>
> > Hi Venu,
> >
> > If I'm not wrong, you're using MS SQL Server and nvarchar / varbinary are
> > the database column types.
> > nvarchar can be used for storing character data (strings) but it is not
> > suitable for binary data (like pdf).
> >
> > I think NiFi tries to handle the bytes of the pdf file as Unicode
> > characters. It could be debugged but my short answer is that nvarchar
> > should not be used for binary data.
> >
> > Regards,
> > Peter
> >
> > On Mon, Apr 15, 2019 at 10:05 PM Venugopal Iyengar <
> > iyengar.g.v...@gmail.com>
> > wrote:
> >
> > > Hello there,
> > > I am using the ExecuteSQL, SplitAvro and PutFile processors to read some
> > > pdf documents stored in a SQL database and store them in the file system.
> > >
> > > When I use varbinary(max), I was able to pull and view the pdf without
> > > any issues.
> > >
> > > When the datatype is nvarchar(max) I am unable to open the pdf file.
> > >
> > > I would appreciate it if somebody could shed some light on this. How can
> > > I debug this?
> > >
> > > Thanks
> > > Venu
> > >
> > >
> > > [image: image.png]
> > >
> >
>


Re: Support Gradle for building NAR

2019-04-18 Thread Kevin Doran
Something to keep in mind is that the master branch of the nifi-maven
plugin has been updated recently to include additional logic for
generating and including metadata in the resulting NAR.
This is a stepping stone to adding support for hosting NAR extension
bundles in NiFi Registry.

I support providing/maintaining a NAR Gradle plugin for the community.
If we do so, it should support the same capabilities as the NAR Maven
plugin, including the recent work on master.

Thanks,
Kevin

On Thu, Apr 18, 2019 at 2:49 PM Matt Burgess  wrote:
>
> Franklin,
>
> What kind of support do you mean? Maybe an "official" NiFi Gradle
> plugin to build NARs, like the NAR MOJO plugin for Maven? What are the
> concerns with using the third-party library (I heard it works fine)? I
> also wrote my own inline script [1] (rather than a plugin) you could
> use in your own build.gradle file(s). IIRC we talked about writing our
> own NAR Gradle plugin, but it would involve either restructuring the
> current nifi-maven repo (into a top-level nifi-build repo or
> something) or adding a nifi-gradle repo.
>
> Or do you mean adding build.gradle files to the various modules
> allowing anyone to immediately build an existing NAR using Gradle
> instead of Maven? I imagine that would take some community discussion,
> as we'd have to collectively agree we want to maintain two build
> systems and their configurations.
>
> Regards,
> Matt
>
> [1] 
> https://github.com/mattyb149/nifi-gradle-nar-example/blob/master/nifi-gradle-built-example-nar/build.gradle
>
> On Thu, Apr 18, 2019 at 1:02 PM Franklin George
>  wrote:
> >
> > Hi,
> > Can you please provide support for building NAR files using Gradle too?
> > Currently, the only solution available is a third-party Gradle plugin,
> > https://github.com/sponiro/gradle-nar-plugin, for packaging custom
> > processors into NAR files.
> >
> > Regards,
> > Franklin George
> >
> >


Re: Support Gradle for building NAR

2019-04-18 Thread Matt Burgess
Franklin,

What kind of support do you mean? Maybe an "official" NiFi Gradle
plugin to build NARs, like the NAR MOJO plugin for Maven? What are the
concerns with using the third-party library (I heard it works fine)? I
also wrote my own inline script [1] (rather than a plugin) you could
use in your own build.gradle file(s). IIRC we talked about writing our
own NAR Gradle plugin, but it would involve either restructuring the
current nifi-maven repo (into a top-level nifi-build repo or
something) or adding a nifi-gradle repo.

Or do you mean adding build.gradle files to the various modules
allowing anyone to immediately build an existing NAR using Gradle
instead of Maven? I imagine that would take some community discussion,
as we'd have to collectively agree we want to maintain two build
systems and their configurations.

Regards,
Matt

[1] 
https://github.com/mattyb149/nifi-gradle-nar-example/blob/master/nifi-gradle-built-example-nar/build.gradle
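
[Editor's note: the inline-script approach in [1] is essentially a custom Jar task that lays out the NAR structure. A rough sketch follows, not the linked script itself: the Nar-Id/Nar-Group/Nar-Version manifest attributes match what the Maven NAR plugin writes, but the task name and attribute values here are illustrative:]

```groovy
// Sketch of a build.gradle fragment that packages a NAR: a jar with the
// project artifact and its runtime dependencies bundled under
// META-INF/bundled-dependencies/.
apply plugin: 'java'

task nar(type: Jar, dependsOn: jar) {
    extension = 'nar'
    into('META-INF/bundled-dependencies') {
        from jar
        from configurations.runtimeClasspath
    }
    manifest {
        attributes(
            'Nar-Id': "${project.name}-nar",   // illustrative value
            'Nar-Group': project.group,
            'Nar-Version': project.version
        )
    }
}
```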

On Thu, Apr 18, 2019 at 1:02 PM Franklin George
 wrote:
>
> Hi,
> Can you please provide support for building NAR files using Gradle too?
> Currently, the only solution available is a third-party Gradle plugin,
> https://github.com/sponiro/gradle-nar-plugin, for packaging custom
> processors into NAR files.
>
> Regards,
> Franklin George
>
>


Re: [EXT] Latest NiFi customs?

2019-04-18 Thread Andy LoPresto
There is substantial work being led by Jeff Storck to migrate NiFi to Java 11. 
You can see discussions about that on the mailing list [1] and in one of the 
PRs [2]. 

[1] 
https://lists.apache.org/thread.html/8af1752087480ac83bc01ea89d616316a125848b334624a3e2fce4d3@%3Cdev.nifi.apache.org%3E
 

[2] https://github.com/apache/nifi/pull/3404 



Andy LoPresto
alopre...@apache.org
alopresto.apa...@gmail.com
PGP Fingerprint: 70EC B3E5 98A6 5A3F D3C4  BACE 3C6E F65B 2F7D EF69

> On Apr 18, 2019, at 6:40 AM, Peter Wicks (pwicks)  wrote:
> 
> One other thing that seems to catch me every time I upgrade an old instance: 
> you will need to go in and allow users to read provenance data again. 
> Somewhere along the way (1.6?), provenance reading moved into a separate 
> policy, and it does not get assigned to anyone after the upgrade.
> 
> -Original Message-
> From: Lars Francke  
> Sent: Thursday, April 18, 2019 3:05 AM
> To: dev@nifi.apache.org
> Subject: [EXT] Re: Latest NiFi customs?
> 
> Hi,
> 
> I have just one data point on the version but I would suggest moving to 1.9 
> if you're just starting out and if you're using the Record based processors 
> with potentially dynamic/changing schemas.
> The automatic schema inference described in this blog post[1] makes things 
> much easier (or possible). I see no reason to start with 1.8 today if you 
> have the option of upgrading.
> 
> Java: Java 8, while outdated, is still pretty much standard almost everywhere 
> I look.
> 
> Cheers,
> Lars
> 
> [1] <
> https://medium.com/@abdelkrim.hadjidj/democratizing-nifi-record-processors-with-automatic-schemas-inference-4f2b2794c427
>> 
> 
> On Wed, Apr 17, 2019 at 4:49 PM Russell Bateman 
> wrote:
> 
>> After a couple of years absence from NiFi (prior to Java 9), I find 
>> myself just now back in a developer role in a company that uses NiFi.
>> (This is a pleasant thought, I might add, as I believe that NiFi 
>> rocks.) I have inherited an existing implementation that's sorely aged 
>> and, though I've googled mostly in vain on what I'm asking, would like 
>> to dot the /i/s and cross the /t/s.
>> 
>> *What version of NiFi?*
>> How far forward (toward) NiFi 1.9 should I push my company? I see that 
>> the Docker container is at 1.8 if that's any reference. I'm tempted 
>> right now to move to 1.8 immediately.
>> 
>> *What about Java?*
>> What is the state of Java in NiFi? It appears that it's still back on 
>> Java 8? I develop using IntelliJ IDEA. While I constrain the level of 
>> language features to 1.8, it isn't realistic to contemplate developing 
>> in IDEA without a pretty modern JDK version (I use Java 11 today 
>> because LTS). I assume, nevertheless, that if I'm careful not to 
>> permit--by setting in IDEA--the use of language constructs in my 
>> custom processors to exceed 1.8, I should be okay, right? Or, am I 
>> missing something and there are other considerations to watch out for?
>> 
>> Thanks for any and all comments, setting me straight, etc.
>> 



Re: Support Gradle for building NAR

2019-04-18 Thread Otto Fowler
Could you create a jira for this new feature?  I tried to see if there was
one already, and I couldn’t find one.


On April 18, 2019 at 13:02:13, Franklin George (
franklin.geo...@protegrity.com.invalid) wrote:

Hi,
Can you please provide support for building NAR files using Gradle too?
Currently, the only solution available is a third-party Gradle plugin,
https://github.com/sponiro/gradle-nar-plugin, for packaging custom
processors into NAR files.

Regards,
Franklin George


Support Gradle for building NAR

2019-04-18 Thread Franklin George
Hi,
Can you please provide support for building NAR files using Gradle too?
Currently, the only solution available is a third-party Gradle plugin,
https://github.com/sponiro/gradle-nar-plugin, for packaging custom
processors into NAR files.

Regards,
Franklin George




Re: Execute SQL support for nvarchar(max) datatype

2019-04-18 Thread Venugopal Iyengar
Thanks Peter.
We are trying to avoid using nvarchar going forward. However, some schemas
were implemented that way by consultants. If you can give me the steps to
debug, that would be helpful. Is it possible to get this
working by using some converters as part of the flow? I appreciate your
help on this.

Venu
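
[Editor's note: Peter's point below about why the bytes break can be reproduced in a few lines of plain Java. Decoding arbitrary binary as text silently replaces invalid byte sequences, so the original bytes are unrecoverable. UTF-8 is used here for illustration; SQL Server's nvarchar is UTF-16, but the failure mode is the same:]

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class NvarcharCorruption {
    public static void main(String[] args) {
        // Stand-in for PDF content: "%PDF" followed by bytes that do not
        // form a valid character sequence in the text encoding.
        byte[] original = { 0x25, 0x50, 0x44, 0x46, (byte) 0x93, (byte) 0xFF, (byte) 0xD8 };

        // Storing binary in a text column forces a bytes -> String decode somewhere.
        // Invalid sequences are replaced (with U+FFFD), losing the original bytes.
        String asText = new String(original, StandardCharsets.UTF_8);
        byte[] roundTripped = asText.getBytes(StandardCharsets.UTF_8);

        System.out.println(Arrays.equals(original, roundTripped)); // prints false
    }
}
```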

On Tue, Apr 16, 2019 at 5:23 AM Peter Turcsanyi
 wrote:

> Hi Venu,
>
> If I'm not wrong, you're using MS SQL Server and nvarchar / varbinary are
> the database column types.
> nvarchar can be used for storing character data (strings) but it is not
> suitable for binary data (like pdf).
>
> I think NiFi tries to handle the bytes of the pdf file as Unicode
> characters. It could be debugged but my short answer is that nvarchar
> should not be used for binary data.
>
> Regards,
> Peter
>
> On Mon, Apr 15, 2019 at 10:05 PM Venugopal Iyengar <
> iyengar.g.v...@gmail.com>
> wrote:
>
> > Hello there,
> > I am using the ExecuteSQL, SplitAvro and PutFile processors to read some
> > pdf documents stored in a SQL database and store them in the file system.
> >
> > When I use varbinary(max), I was able to pull and view the pdf without
> > any issues.
> >
> > When the datatype is nvarchar(max) I am unable to open the pdf file.
> >
> > I would appreciate it if somebody could shed some light on this. How can
> > I debug this?
> >
> > Thanks
> > Venu
> >
> >
> > [image: image.png]
> >
>


RE: [EXT] Re: Latest NiFi customs?

2019-04-18 Thread Peter Wicks (pwicks)
One other thing that seems to catch me every time I upgrade an old instance: 
you will need to go in and allow users to read provenance data again. Somewhere 
along the way (1.6?), provenance reading moved into a separate policy, and it 
does not get assigned to anyone after the upgrade.

-Original Message-
From: Lars Francke  
Sent: Thursday, April 18, 2019 3:05 AM
To: dev@nifi.apache.org
Subject: [EXT] Re: Latest NiFi customs?

Hi,

I have just one data point on the version but I would suggest moving to 1.9 if 
you're just starting out and if you're using the Record based processors with 
potentially dynamic/changing schemas.
The automatic schema inference described in this blog post[1] makes things much 
easier (or possible). I see no reason to start with 1.8 today if you have the 
option of upgrading.

Java: Java 8, while outdated, is still pretty much standard almost everywhere I 
look.

Cheers,
Lars

[1] <
https://medium.com/@abdelkrim.hadjidj/democratizing-nifi-record-processors-with-automatic-schemas-inference-4f2b2794c427
>

On Wed, Apr 17, 2019 at 4:49 PM Russell Bateman 
wrote:

> After a couple of years absence from NiFi (prior to Java 9), I find 
> myself just now back in a developer role in a company that uses NiFi.
> (This is a pleasant thought, I might add, as I believe that NiFi 
> rocks.) I have inherited an existing implementation that's sorely aged 
> and, though I've googled mostly in vain on what I'm asking, would like 
> to dot the /i/s and cross the /t/s.
>
> *What version of NiFi?*
> How far forward (toward) NiFi 1.9 should I push my company? I see that 
> the Docker container is at 1.8 if that's any reference. I'm tempted 
> right now to move to 1.8 immediately.
>
> *What about Java?*
> What is the state of Java in NiFi? It appears that it's still back on 
> Java 8? I develop using IntelliJ IDEA. While I constrain the level of 
> language features to 1.8, it isn't realistic to contemplate developing 
> in IDEA without a pretty modern JDK version (I use Java 11 today 
> because LTS). I assume, nevertheless, that if I'm careful not to 
> permit--by setting in IDEA--the use of language constructs in my 
> custom processors to exceed 1.8, I should be okay, right? Or, am I 
> missing something and there are other considerations to watch out for?
>
> Thanks for any and all comments, setting me straight, etc.
>


Re: Latest NiFi customs?

2019-04-18 Thread Lars Francke
Hi,

I have just one data point on the version but I would suggest moving to 1.9
if you're just starting out and if you're using the Record based processors
with potentially dynamic/changing schemas.
The automatic schema inference described in this blog post[1] makes things
much easier (or possible). I see no reason to start with 1.8 today if you
have the option of upgrading.

Java: Java 8, while outdated, is still pretty much standard almost
everywhere I look.

Cheers,
Lars

[1] <
https://medium.com/@abdelkrim.hadjidj/democratizing-nifi-record-processors-with-automatic-schemas-inference-4f2b2794c427
>
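
[Editor's note: one caveat on Russell's plan to constrain the language level in IDEA: that setting alone does not stop you from calling APIs that only exist in newer JDKs. If the build uses Maven, the compiler's release option (available when building on JDK 9+) checks both language level and API usage against Java 8. A sketch of the relevant pom.xml fragment, with an illustrative plugin version:]

```xml
<!-- maven-compiler-plugin: compile for Java 8 even when building on JDK 11.
     <release> verifies language level AND API usage, unlike source/target. -->
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <version>3.8.0</version>
    <configuration>
        <release>8</release>
    </configuration>
</plugin>
```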

On Wed, Apr 17, 2019 at 4:49 PM Russell Bateman 
wrote:

> After a couple of years absence from NiFi (prior to Java 9), I find
> myself just now back in a developer role in a company that uses NiFi.
> (This is a pleasant thought, I might add, as I believe that NiFi rocks.)
> I have inherited an existing implementation that's sorely aged and,
> though I've googled mostly in vain on what I'm asking, would like to dot
> the /i/s and cross the /t/s.
>
> *What version of NiFi?*
> How far forward (toward) NiFi 1.9 should I push my company? I see that
> the Docker container is at 1.8 if that's any reference. I'm tempted
> right now to move to 1.8 immediately.
>
> *What about Java?*
> What is the state of Java in NiFi? It appears that it's still back on
> Java 8? I develop using IntelliJ IDEA. While I constrain the level of
> language features to 1.8, it isn't realistic to contemplate developing
> in IDEA without a pretty modern JDK version (I use Java 11 today because
> LTS). I assume, nevertheless, that if I'm careful not to permit--by
> setting in IDEA--the use of language constructs in my custom processors
> to exceed 1.8, I should be okay, right? Or, am I missing something and
> there are other considerations to watch out for?
>
> Thanks for any and all comments, setting me straight, etc.
>