Can anyone explain the WHY of this PutHDFS logic?

2015-11-29 Thread Mark Petronic
This is the sort of "mystery" code that really should have some explicit
code comments. :) What are the underlying reasons for this retry logic?
This could definitely lead to a bottleneck if this loop has to run and
sleep numerous times. Just wondering what, in HDFS, results in the need to
do this? And what could be done in the cluster setup/config to avoid hitting
this condition, if that is possible?

Thanks

boolean renamed = false;
for (int i = 0; i < 10; i++) { // try to rename multiple times.
    if (hdfs.rename(tempCopyFile, copyFile)) {
        renamed = true;
        break; // rename was successful
    }
    Thread.sleep(200L); // try waiting to let whatever might cause rename failure to resolve
}
if (!renamed) {
    hdfs.delete(tempCopyFile, false);
    throw new ProcessException("Copied file to HDFS but could not rename dot file " + tempCopyFile
            + " to its final filename");
}


Re: Nifi startup error after I install my first custom processor

2015-11-20 Thread Mark Petronic
Bryan, thanks so much for the pointers. I got it all working now and am
ready to start on the next processor with a whole lot more confidence.
:) FWIW, maybe for someone else who needs this information, I just
added a dependency for logback-classic to get more than the default
slf4j-simple logging (which is standard error console logging only, I
believe):


<dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-classic</artifactId>
</dependency>


And then I just used the typical LoggerFactory.getLogger() to get a
logger in my helper classes and that's it.
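
For anyone following along, a minimal sketch of what that looks like (MyHelper
is just a made-up class name for illustration):

    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    public class MyHelper {
        // Resolved against whatever SLF4J binding is on the classpath
        // (logback-classic, in my case)
        private static final Logger LOG = LoggerFactory.getLogger(MyHelper.class);

        public void doWork() {
            LOG.info("doWork() called");
        }
    }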

I still get this warning in Eclipse when I run unit tests, but it just
happens to pick logback-classic (and the same when I run in production), so I
do see my log messages going to the app log. I could also play around
with logback.xml now and direct my output wherever I want. But, at
least the traces are showing up now, so that works. I did a lot of
research into how to exclude slf4j-simple from the classpath in Maven
to avoid the contention warning, but nothing I read or tried worked.
Anyway, it does pick the binding I wanted, so I will ignore it for now.
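
In case it helps anyone, the standard approach I tried was an <exclusions> block
on whichever dependency drags the simple binding in, something like this (the
enclosing dependency here is made up for the example; in my case it did not
remove the warning, since slf4j-simple appears to come in at the root of the
tree):

    <dependency>
        <groupId>some.group</groupId>
        <artifactId>some-test-helper</artifactId>
        <version>1.0</version>
        <scope>test</scope>
        <exclusions>
            <exclusion>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-simple</artifactId>
            </exclusion>
        </exclusions>
    </dependency>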

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in
[jar:file:/home/mpetronic/.m2/repository/ch/qos/logback/logback-classic/1.1.3/logback-classic-1.1.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/home/mpetronic/.m2/repository/org/slf4j/slf4j-simple/1.7.12/slf4j-simple-1.7.12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type
[ch.qos.logback.classic.util.ContextSelectorStaticBinder]

mvn dependency:tree gives this, and it appears to me that the
slf4j-simple dependency is coming from the root of that tree (test scope), so
maybe it is an internal dependency defined by the Maven test runner
or something? Very new to Maven, so just guessing.

--- maven-dependency-plugin:2.9:tree (default-cli) @ hughes-bigdata-bundle ---
[INFO] com.hughes:hughes-bigdata-bundle:pom:1.0
[INFO] +- junit:junit:jar:4.12:test
[INFO] |  \- org.hamcrest:hamcrest-core:jar:1.3:test
[INFO] +- org.mockito:mockito-core:jar:1.10.19:test
[INFO] |  \- org.objenesis:objenesis:jar:2.1:test
[INFO] \- org.slf4j:slf4j-simple:jar:1.7.12:test
[INFO]    \- org.slf4j:slf4j-api:jar:1.7.12:provided

On Fri, Nov 20, 2015 at 8:31 AM, Bryan Bende <bbe...@gmail.com> wrote:
> The dependencies can definitely be confusing to wrap your head around. A
> good rule of thumb is that you would generally not have a direct jar
> dependency on anything under nifi-nar-bundles in the source tree. This
> would mean no jar dependencies on processors, controller services,
> reporting tasks etc. Now there are cases where a NAR can depend on another
> NAR, this is common when a processor needs to use a controller service, a
> good description of that is here [1], another example is how we have the
> hadoop-libraries-nar so multiple NARs can depend on the same set of hadoop
> libraries and not have to duplicate all those jars. Any other dependencies
> like things from nifi-commons, such as processor utils and others, are fair
> game to include in your NAR. If you look at the nifi lib directory you can
> see the following jars:
>
> jcl-over-slf4j-1.7.12.jar
> jul-to-slf4j-1.7.12.jar
> log4j-over-slf4j-1.7.12.jar
> logback-classic-1.1.3.jar
> logback-core-1.1.3.jar
> nifi-api-0.4.0-SNAPSHOT.jar
> nifi-documentation-0.4.0-SNAPSHOT.jar
> nifi-nar-utils-0.4.0-SNAPSHOT.jar
> nifi-properties-0.4.0-SNAPSHOT.jar
> nifi-runtime-0.4.0-SNAPSHOT.jar
> slf4j-api-1.7.12.jar
>
> These are automatically available to every nar, anything else needs to be
> brought in by bundling the jars in your NAR, or by depending on another NAR.
>
> As for logging, nifi itself uses slf4j and logback, so the logger you get
> from getLogger() would be controlled from the logback.xml in the conf
> directory. I think for non-processor classes if you use the slf4j api that
> will be your best bet as I believe it will automatically use the same
> logback configuration, but others could correct me if I am wrong here. I
> believe it is also possible to use log4j directly within your NAR, for
> example I know some third party client libraries use log4j and by having a
> dependency on log4j-over-slf4j it can somehow route all of the log4j calls
> back through the main slf4j configuration. Hope I didn't confuse things
> more here, let us know.
>
> [1]
> https://cwiki.apache.org/confluence/display/NIFI/Maven+Projects+for+Extensions#MavenProjectsforExtensions-LinkingProcessorsandControllerServices
>
>
>
> On Fri, Nov 20, 2015 at 1:26 AM, Mark Petronic <markpetro...@gmail.com>
> wrote:
>
>> Thanks so much Bryan. It is running now. It is starting to come together
>> but I'm still a little unclear on when to include nifi 

Re: ERROR [NiFi Web Server-976] c.s.j.spi.container.ContainerResponse Mapped exception to response: 500 (Internal Server Error)

2015-11-04 Thread Mark Petronic
Thanks, guys. That's impressive service. Building now. :)

On Wed, Nov 4, 2015 at 11:44 AM, Tony Kurc <trk...@gmail.com> wrote:

> I'll review it now
>
> On Wed, Nov 4, 2015 at 11:19 AM, Matt Gilman <matt.c.gil...@gmail.com>
> wrote:
>
> > Thanks for reporting. This issue [1] has been addressed and is currently
> > waiting for review before it can be merged.
> >
> > Thanks!
> >
> > Matt
> >
> > [1] https://issues.apache.org/jira/browse/NIFI-1098
> >
> > On Wed, Nov 4, 2015 at 10:58 AM, Mark Petronic <markpetro...@gmail.com>
> > wrote:
> >
> > > Running with a latest master branch build off
> > > commit dbf0c7893fef964bfbb3a4c039c756396587ce12.
> > > I have defined two rules in the advanced config of UpdateAttribute.
> > > I now try to create another rule and want to copy from an existing. As
> > soon
> > > as I type a character into the "Copy from existing rule (optional)"
> > field,
> > > I see this exception.
> > > Let me know what I can do to help further.
> > >
> > > Thanks
> > >
> > >
> > > 2015-11-04 15:51:32,152 ERROR [NiFi Web Server-976]
> > > c.s.j.spi.container.ContainerResponse Mapped exception to response: 500
> > > (Internal Server Error)
> > > javax.ws.rs.WebApplicationException: null
> > > at org.apache.nifi.update.attributes.api.RuleResource.getCriteria(RuleResource.java:605) ~[classes/:na]
> > > at org.apache.nifi.update.attributes.api.RuleResource.searchRules(RuleResource.java:419) ~[classes/:na]
> > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_51]
> > > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_51]
> > > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_51]
> > > at java.lang.reflect.Method.invoke(Method.java:497) ~[na:1.8.0_51]
> > > at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60) ~[jersey-server-1.19.jar:1.19]
> > > at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205) ~[jersey-server-1.19.jar:1.19]
> > > at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75) ~[jersey-server-1.19.jar:1.19]
> > > at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302) ~[jersey-server-1.19.jar:1.19]
> > > at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147) ~[jersey-server-1.19.jar:1.19]
> > > at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108) ~[jersey-server-1.19.jar:1.19]
> > > at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147) ~[jersey-server-1.19.jar:1.19]
> > > at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84) ~[jersey-server-1.19.jar:1.19]
> > > at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542) [jersey-server-1.19.jar:1.19]
> > > at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473) [jersey-server-1.19.jar:1.19]
> > > at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419) [jersey-server-1.19.jar:1.19]
> > > at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409) [jersey-server-1.19.jar:1.19]
> > > at

Re: LogAttribute - Sending that output to a custom logger?

2015-11-02 Thread Mark Petronic
My primary use is for understanding Nifi. I like to direct various
processors output into both their logical next processor stage as well as
into a log attribute processor. Then I tail the Nifi app log file and watch
what happens - in real time. I do not intend to use this for long term log
retention. I agree that provenance is the right choice for that. So, the
only reason I wanted to allow configuration of a custom logger was simply
to isolate all the attribute-rich logging from the normal logging because I
was primarily interested in the attribute flows as a way to (a) better
understand what a processor emits because, frankly, the documentation of
some of the processors is very sparse. So, I learn imperatively, so to
speak. I say that as a new user. I feel I should be able to get a pretty
good understanding of a processor by reading the usage. But I am finding
that the documentation, in some cases, is more like what I like to refer to
as, "note to self" documentation. Great if you are the guy who wrote the
processor with those "insights" - not so great if you are not the
developer. So, then I need to dig up the code. That should not be needed as
the first step of understanding a processor as a new user. There are some
well-documented processors, but not all are, IMHO. (b) Validate my flows
with some test data and verify attribute values look correct and routing is
happening on them as expected, etc. Again, easier, IMO, to see in the logs
than digging into the provenance data.

Maybe this is just a good "private" feature for me so maybe I will just
create a private version to use on my own. I already have it working but
would need more polish to achieve PR status. Maybe this is the sort of
thing that others would not find beneficial? That's fine. There are other
ways I can contribute in the future. I'm still having fun! :)
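
To give a concrete idea, the local change I have working is roughly along these
lines (a sketch only - the property name, default, and wiring are mine and would
still need polish, not a final PR):

    import org.apache.nifi.components.PropertyDescriptor;
    import org.apache.nifi.processor.ProcessContext;
    import org.apache.nifi.processor.util.StandardValidators;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    // Sketch of the pieces added to LogAttribute (names/defaults are mine):
    public class LogAttributeCustomLoggerSketch {

        public static final PropertyDescriptor CUSTOM_LOGGER_NAME = new PropertyDescriptor.Builder()
                .name("Custom Logger Name")
                .description("SLF4J logger name to which the attribute dump is written")
                .defaultValue("org.apache.nifi.processors.standard.LogAttribute")
                .required(true)
                .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
                .build();

        // Called from onTrigger(): resolve the logger from the property instead
        // of always using the processor's own logger.
        static void logAttributes(final ProcessContext context, final String attributeMessage) {
            final Logger attributeLogger = LoggerFactory.getLogger(
                    context.getProperty(CUSTOM_LOGGER_NAME).getValue());
            attributeLogger.info(attributeMessage);
        }
    }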

On Sun, Nov 1, 2015 at 12:41 PM, Joe Witt <joe.w...@gmail.com> wrote:

> Mark Petronic,
>
> I share Payne's perspective on this.  But I'd also like to work with
> you to better understand the workflow.  For those of us that have used
> this tool for a long time there is a lot we take for granted from a
> new user perspective.  We believe the provenance feature to provide a
> far superior option to understanding how an item went through the
> system and the timing and what we knew when and so on.  But, it would
> be great to understand it from your perspective as someone learning
> NiFi.  Not meaning to take away from your proposed contrib - that
> would be great too.  Just want to see if the prov user experience
> solves what you're looking for and if not can we make it do that.
>
> Thanks
> Joe
>
> On Sun, Nov 1, 2015 at 11:23 AM, Mark Payne <marka...@hotmail.com> wrote:
> > Mark,
> >
> > To make sure that I understand what you're proposing, you want to add a
> property to
> > LogAttribute that allows users to provide a custom logger name?
> >
> > If that is indeed what you are suggesting then I think it's a great idea.
> >
> > That being said, in practice I rarely ever use LogAttribute and we even
> considered removing
> > it from the codebase before we open sourced, because the Data Provenance
> provides a
> > much better view of what's going on to debug your flows.
> >
> > I know you're pretty new to NiFi, so if you've not yet had a chance to
> play with the Provenance,
> > you can see the section in the User Guide at
> http://nifi.apache.org/docs/nifi-docs/html/user-guide.html#data-provenance
> <
> http://nifi.apache.org/docs/nifi-docs/html/user-guide.html#data-provenance
> >
> >
> > If you're interested in updating the LogAttribute processor, though,
> we'd be happy to have
> > that contribution added, as it does make the Processor more usable.
> >
> > Thanks
> > -Mark
> >
> >> On Oct 31, 2015, at 12:35 PM, Mark Petronic <markpetro...@gmail.com>
> wrote:
> >>
> >> From the code, it appears it cannot be done, as the attribute logging
> >> goes to the same getLogger() instance as the normal nifi-app traces. Has
> >> anyone considered making that configurable, maybe allowing you to
> >> define a different logger name for LogAttribute, then creating that
> >> logger definition in the logback conf, allowing flexibility? I'm using
> >> attribute logging heavily as I try to better learn/debug Nifi (it gives
> >> you a nice 'under the hood' view of the flow) and build up some flows,
> >> and feel it would be beneficial to be able to capture the LogAttribute
> >> messages by themselves for more clarity on what is happening. I would
> >> not mind maybe trying to implement this feature as my first crack at
> >> contributing to the project. Seems like a fairly easy one that would
> >> allow me to "go through the motions" of a full pull request process
> >> and iron out the process. Anyone have any thoughts on this?
> >
>


LogAttribute - Sending that output to a custom logger?

2015-10-31 Thread Mark Petronic
From the code, it appears it cannot be done, as the attribute logging
goes to the same getLogger() instance as the normal nifi-app traces. Has
anyone considered making that configurable, maybe allowing you to
define a different logger name for LogAttribute, then creating that
logger definition in the logback conf, allowing flexibility? I'm using
attribute logging heavily as I try to better learn/debug Nifi (it gives
you a nice 'under the hood' view of the flow) and build up some flows,
and feel it would be beneficial to be able to capture the LogAttribute
messages by themselves for more clarity on what is happening. I would
not mind maybe trying to implement this feature as my first crack at
contributing to the project. Seems like a fairly easy one that would
allow me to "go through the motions" of a full pull request process
and iron out the process. Anyone have any thoughts on this?
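
For illustration, the kind of logback.xml addition I have in mind would look
something like this (logger name, appender name, and file path are all made up
for the example):

    <appender name="ATTRIBUTES" class="ch.qos.logback.core.FileAppender">
        <file>logs/nifi-attributes.log</file>
        <encoder>
            <pattern>%date %level [%thread] %logger{40} %msg%n</pattern>
        </encoder>
    </appender>

    <logger name="my.attribute.logger" level="INFO" additivity="false">
        <appender-ref ref="ATTRIBUTES"/>
    </logger>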


Re: Recommendation on getting started as a contributor

2015-10-27 Thread Mark Petronic
On Tue, Oct 27, 2015 at 6:55 AM, Oleg Zhurakousky
 wrote:
> I was just able to reproduce your exact error by disassociating it from the 
> class path

Oleg, thanks for the response.

1. I verified that my working directory is correct and points to my
running version of Nifi:

/home/mpetronic/nifi-0.3.1-SNAPSHOT

2. I verified that src/main/resources is indeed the one and only entry
listed in my build path settings under the "Source" tab. However, I
don't see that reflected in the below command line.

3. Not sure how to dump the active classpath from the Eclipse project
configuration, but if I run Nifi under Eclipse and go to the
properties of the running instance, I see this as the command line used
to run it. The question is why all my classpath entries are pointing to files in
the Maven repository? Those are the values reflected in the project
build path under the "Libraries" tab that I got by default after
importing the nifi-ide-integration project. I did not edit anything
there.

/opt/java/jdk1.7.0_75/bin/java
-Dnifi.properties.file.path=/home/mpetronic/nifi-0.3.1-SNAPSHOT/./conf/nifi.properties
-Dfile.encoding=ANSI_X3.4-1968 -classpath
/home/mpetronic/repos/nifi-ide-integration/bin:/home/mpetronic/.m2/repository/org/apache/nifi/nifi-api/0.3.1-SNAPSHOT/nifi-api-0.3.1-SNAPSHOT.jar:/home/mpetronic/.m2/repository/org/apache/nifi/nifi-runtime/0.3.1-SNAPSHOT/nifi-runtime-0.3.1-SNAPSHOT.jar:/home/mpetronic/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-core/2.4/d99532ba3603f27bebf4cdd3653feb0e0b84cf6/log4j-core-2.4.jar:/home/mpetronic/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-api/1.7.12/8e20852d05222dc286bf1c71d78d0531e177c317/slf4j-api-1.7.12.jar:/home/mpetronic/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-log4j12/1.7.12/485f77901840cf4e8bf852f2abb9b723eb8ec29/slf4j-log4j12-1.7.12.jar:/home/mpetronic/.gradle/caches/modules-2/files-2.1/org.slf4j/jul-to-slf4j/1.7.12/8811e2e9ab9055e557598dc9aedc64fd43e0ab20/jul-to-slf4j-1.7.12.jar:/home/mpetronic/.m2/repository/org/apache/nifi/nifi-nar-utils/0.3.1-SNAPSHOT/nifi-nar-utils-0.3.1-SNAPSHOT.jar:/home/mpetronic/.m2/repository/org/apache/nifi/nifi-properties/0.3.1-SNAPSHOT/nifi-properties-0.3.1-SNAPSHOT.jar:/home/mpetronic/.m2/repository/org/apache/nifi/nifi-documentation/0.3.1-SNAPSHOT/nifi-documentation-0.3.1-SNAPSHOT.jar:/home/mpetronic/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-api/2.4/cc68e72d6d14098ba044123e10e048d203d3fd47/log4j-api-2.4.jar:/home/mpetronic/.gradle/caches/modules-2/files-2.1/log4j/log4j/1.2.17/5af35056b4d257e4b64b9e8069c0746e8b08629f/log4j-1.2.17.jar:/home/mpetronic/nifi-0.3.1-SNAPSHOT/conf
org.apache.nifi.NiFi

4. I just got it to work by removing "src/main/resources" from the
"Source" tab and adding it to the "Libraries" tab using "Add Class
Folder". Now when I run, I see console logging as expected, and my new
command line is below (where src/main/resources is now showing up as
the last entry in the path). But why did I need this extra step?

/opt/java/jdk1.7.0_75/bin/java
-Dnifi.properties.file.path=/home/mpetronic/nifi-0.3.1-SNAPSHOT/./conf/nifi.properties
-Dfile.encoding=ANSI_X3.4-1968 -classpath
/home/mpetronic/.m2/repository/org/apache/nifi/nifi-api/0.3.1-SNAPSHOT/nifi-api-0.3.1-SNAPSHOT.jar:/home/mpetronic/.m2/repository/org/apache/nifi/nifi-runtime/0.3.1-SNAPSHOT/nifi-runtime-0.3.1-SNAPSHOT.jar:/home/mpetronic/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-core/2.4/d99532ba3603f27bebf4cdd3653feb0e0b84cf6/log4j-core-2.4.jar:/home/mpetronic/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-api/1.7.12/8e20852d05222dc286bf1c71d78d0531e177c317/slf4j-api-1.7.12.jar:/home/mpetronic/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-log4j12/1.7.12/485f77901840cf4e8bf852f2abb9b723eb8ec29/slf4j-log4j12-1.7.12.jar:/home/mpetronic/.gradle/caches/modules-2/files-2.1/org.slf4j/jul-to-slf4j/1.7.12/8811e2e9ab9055e557598dc9aedc64fd43e0ab20/jul-to-slf4j-1.7.12.jar:/home/mpetronic/.m2/repository/org/apache/nifi/nifi-nar-utils/0.3.1-SNAPSHOT/nifi-nar-utils-0.3.1-SNAPSHOT.jar:/home/mpetronic/.m2/repository/org/apache/nifi/nifi-properties/0.3.1-SNAPSHOT/nifi-properties-0.3.1-SNAPSHOT.jar:/home/mpetronic/.m2/repository/org/apache/nifi/nifi-documentation/0.3.1-SNAPSHOT/nifi-documentation-0.3.1-SNAPSHOT.jar:/home/mpetronic/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-api/2.4/cc68e72d6d14098ba044123e10e048d203d3fd47/log4j-api-2.4.jar:/home/mpetronic/.gradle/caches/modules-2/files-2.1/log4j/log4j/1.2.17/5af35056b4d257e4b64b9e8069c0746e8b08629f/log4j-1.2.17.jar:/home/mpetronic/repos/nifi-ide-integration/src/main/resources
org.apache.nifi.NiFi

May I ask you a related question about environment setup for
contributing given I am new to contributing and want to get this
right?

I read the contributors/developers guides. They talk about three
repos: the ASF repo, the GitHub mirror, and a personal forked GitHub
repo. They talk about two ways to commit: patch 

Re: Recommendation on getting started as a contributor

2015-10-26 Thread Mark Petronic
On Mon, Oct 26, 2015 at 5:06 AM, Oleg Zhurakousky
 wrote:
> nifi-ide-integration

Oleg, thanks for the nifi-ide-integration GitHub project. I was able to get up
and running in the debugger in Eclipse pretty easily with this. I did
have to add this to my VM args in my run configuration, and it was not
documented in your GitHub readme:

-Dnifi.properties.file.path=/home/mpetronic/nifi-0.3.1-SNAPSHOT/./conf/nifi.properties

Wondering if I might have done something wrong, although your GitHub
readme is pretty clear? I followed those instructions exactly after
first cloning the ASF repo, building Nifi, and then installing the
target tar.gz.

One final question: I see this on my Eclipse console:

log4j:WARN No appenders could be found for logger (org.apache.nifi.NiFi).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig
for more info.

I see you have a log4j.properties file in the nifi-ide-integration
project's src/main/resources path and that path is in the class path
for my Eclipse nifi-ide-integration project. Can you please help guide
me through getting console logging working for Eclipse? I notice that
nifi-ide-integration references a log4j property file whilst the
installed Nifi conf dir has a logback.xml file.
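
For what it's worth, a minimal log4j.properties that should silence that warning
and give console output would look roughly like this (contents are my guess, not
necessarily what ships in nifi-ide-integration):

    log4j.rootLogger=INFO, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{ISO8601} %-5p [%t] %c - %m%n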


Re: Recommendation on getting started as a contributor

2015-10-26 Thread Mark Petronic
Thanks Oleg. The pointer to the committers guide was what I needed. Missed that
one. Thanks for the insight into your setup as well.
On Oct 26, 2015 5:06 AM, "Oleg Zhurakousky" <ozhurakou...@hortonworks.com>
wrote:

> Mark
>
> Yes, there are a lot of projects since, aside from core modules, there are
> out-of-the-box NAR bundles and other supporting modules, and everything is
> a separate project.
> So, when importing you can choose to have a view into the entire NiFi or
> individual modules you are interested in working on (you don’t have to import
> all of them).
> Just to give you an idea, here is what my workspace consists of at the time
> of writing this email:
>
> nifi-framework
> nifi-provenance-repository-bundle
> nifi-api
> nifi-bootstrap
> nifi-data-provenance-utils
> nifi-framework-core
> nifi-framework-core-api
> nifi-ide-integration
> nifi-nar-utils
> nifi-persistent-provenance-repository
> nifi-properties
> nifi-provenance-repository-nar
> nifi-runtime
> nifi-standard-reporting-tasks
> nifi-utils
> nifi-volatile-provenance-repository
>
> The contributors guide -
> https://cwiki.apache.org/confluence/display/NIFI/Contributor+Guide is a
> good starting point for general questions and FAQs about contributions and
> it does provide instructions on how to run NiFi in DEBUG mode (specifically
> in Eclipse).
>
> Hope this helps, otherwise let us know.
>
> Cheers
> Oleg
>
> On Oct 26, 2015, at 1:39 AM, Mark Petronic <markpetro...@gmail.com> wrote:
>
> Hey guys, I read through the developers guide and am interested in
> maybe trying my hand at contributing to an OSS project for the first
> time. I'm pretty interested in Nifi. I've done a good bit of Java
> programming using Eclipse and Ant for a multi-threaded HTTP proxy-like
> application that uses Apache NIO and am very good with Git. I guess we
> used Ant because the guy who originally started the project did it
> that way and we have very few external jar dependencies, so we never
> really moved to Maven.
>
> Looking at this project, I'm not really sure how to approach setting
> up a build environment. I'm used to being able to build and run my
> application in Eclipse and step through code in the debugger, etc., run
> unit tests from the IDE, that sort of stuff. I cloned Nifi and did the
> mvn clean build successfully against the nifi-0.3.0-RC1 tag as I
> believe that would be equivalent to the release I am running. I
> imported that into Eclipse using the import existing Maven project
> wizard. There are a LOT of projects. So, I am a bit lost as to where to
> start here.
>
> Would anyone kindly step me through how you have your dev environment
> setup so that I could maybe mirror that and get started?
>
> Thanks in advance
>
>
>


Re: Recommendation on getting started as a contributor

2015-10-26 Thread Mark Petronic
Thanks for the tips Venkatesh
Hi Mark,

Welcome.
I am a newbie to the project myself and therefore some of the tricks I used
are fresh in my mind
(though I use IntelliJ).

I suspect a key differentiator here is the use of Maven, which is
completely different from Ant and in a good way for the most part.

Here are some key points around the project organization.

a_ The project uses Maven (https://maven.apache.org/) for building the
project. It's a multi-module Maven project. I would suggest taking a quick
look at the Maven FAQ and User Guide.

b_ The root of the project is NiFi. Multi-module Maven projects are modular
in structure (hence the name). In NiFi, the module dependencies are binary,
i.e. each Maven module, from a 100-foot view, can be considered a stand-alone
project and simply imports its NiFi dependencies like any other project.
There is nothing special about them except that they all live together in a
single repo.

c_ The key files to look at in Maven are the pom.xml files present in the
root of each project. I would start with importing the project into Eclipse
as a Maven project, thereby letting Maven set the classpath for the project.
Start by looking at the pom.xml at the root, i.e. NiFi, and then go from
there.
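
For example, from the repo root you can build just the one module you care
about plus whatever it depends on (the module path below is only illustrative):

    mvn clean install -pl nifi-commons/nifi-utils -am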






Recommendation on getting started as a contributor

2015-10-25 Thread Mark Petronic
Hey guys, I read through the developers guide and am interested in
maybe trying my hand at contributing to an OSS project for the first
time. I'm pretty interested in Nifi. I've done a good bit of Java
programming using Eclipse and Ant for a multi-threaded HTTP proxy-like
application that uses Apache NIO and am very good with Git. I guess we
used Ant because the guy who originally started the project did it
that way and we have very few external jar dependencies, so we never
really moved to Maven.

Looking at this project, I'm not really sure how to approach setting
up a build environment. I'm used to being able to build and run my
application in Eclipse and step through code in the debugger, etc., run
unit tests from the IDE, that sort of stuff. I cloned Nifi and did the
mvn clean build successfully against the nifi-0.3.0-RC1 tag as I
believe that would be equivalent to the release I am running. I
imported that into Eclipse using the import existing Maven project
wizard. There are a LOT of projects. So, I am a bit lost as to where to
start here.

Would anyone kindly step me through how you have your dev environment
setup so that I could maybe mirror that and get started?

Thanks in advance