Just for some color, this is the type of hacking we are doing (see the README):

https://github.com/nsavageJVM/bigpetstoreprofiles

to try to get Pig apps to build integration tests in Maven. It's all
dependency related, I believe.

So any thoughts on this would be awesome.

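As a concrete illustration of the profiles approach, here is a minimal sketch of what I mean (the profile id, Pig version, and plugin wiring are hypothetical, not the actual bigpetstoreprofiles POM): Pig's dependency tree gets isolated behind a profile so its transitive deps don't collide with the rest of the Hadoop classpath, and the integration tests only run when that profile is active.

```xml
<!-- Hypothetical sketch: isolate Pig integration tests behind a profile
     so Pig's transitive dependencies stay off the default classpath. -->
<profiles>
  <profile>
    <id>pig-it</id> <!-- assumed profile id -->
    <dependencies>
      <dependency>
        <groupId>org.apache.pig</groupId>
        <artifactId>pig</artifactId>
        <version>0.12.0</version> <!-- assumed version -->
        <scope>test</scope>
      </dependency>
    </dependencies>
    <build>
      <plugins>
        <!-- failsafe runs *IT.java classes in the integration-test phase -->
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-failsafe-plugin</artifactId>
          <executions>
            <execution>
              <goals>
                <goal>integration-test</goal>
                <goal>verify</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
```

So a plain `mvn verify` stays clean, and `mvn verify -Ppig-it` pulls in Pig and runs its integration tests.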

On Mon, Jan 27, 2014 at 8:43 AM, Jay Vyas <jayunit...@gmail.com> wrote:

> Thanks for all this attention to my issue.... I'm trying to follow the
> debate.
>
> My first question:
>
> 1) Can you clarify what is meant by "we drive different dependencies off
> of our BOM the artifacts end up being different anyway :-("?
>
> Sorry ... Still learning the bigtop vernacular. :).
>
> 2) Now in the interim, my current thought is that Pig, Hive, etc. aren't
> ever guaranteed to have non-conflicting classpaths, and thus any Hadoop app
> which attempts to integrate them into a single build will have to use stuff
> like submodules or profiles.
>
> > On Jan 27, 2014, at 12:24 AM, Mark Grover <m...@apache.org> wrote:
> >
> >> On Sat, Jan 25, 2014 at 11:51 PM, Roman Shaposhnik <r...@apache.org>
> >> wrote:
> >>
> >>> On Fri, Jan 24, 2014 at 3:28 PM, Mark Grover <m...@apache.org> wrote:
> >>> Hey Jay,
> >>> Currently we don't patch things in Bigtop. That means when we download
> >>> and include, say, Hadoop 2.2.0 in Bigtop 0.8, our maven artifacts for
> >>> hadoop (say hadoop-common.jar) would have the version 2.2.0 - exactly
> >>> the same version as what upstream hadoop released.
> >>
> >> It is true that we stay away from patching the source, but since we
> >> drive different dependencies off of our BOM the artifacts end
> >> up being different anyway :-(
> >>
> >> IOW, an hbase jar from bigtop is NOT the same as the one published by
> >> upstream hbase. In fact it is kind of incompatible. Same goes for
> >> most of the other jars with non-trivial dependency chains.
> >
> > Ah, good point, Roman. Thanks for correcting me.
> >
> >
> >>
> >>> So, now 2 options exist for bigpetstore in my opinion:
> >>> 1. Pull upstream Hadoop artifacts from Maven. You will rely on Apache
> >>> Hadoop artifacts instead of Bigtop artifacts. However, since Bigtop
> >>> doesn't patch, java artifacts should be exactly the same from Bigtop
> >>> as compared to Apache Hadoop.
> >>> 2. Pull Bigtop artifacts from Maven. For this, we will obviously need
> >>> Bigtop to a) start updating pom files with its own versioning scheme
> >>> b) upload them to maven central or equivalent.
> >>>
> >>> As you can see, option #2 is a fairly non-trivial overhead for Bigtop,
> >>> but I would love to hear if you prefer one of the two options and if
> >>> so why.
> >>
> >> I think the only real alternative is for us to bite the bullet and start
> >> publishing bigtop artifacts.
> >>
> >> In the ideal world -- we'd be pushing right to Maven central, provided
> >> that we're careful about branding the artifacts with an explicit Bigtop
> >> version stamp. E.g.
> >>   org.apache.hadoop:hadoop-annotations:2.2.0-bigtop_0.8.0
> >
> > Yeah, I'd be ok with that. The real question is: would we be changing
> > our policy to only patch things like version numbers, or are we opening
> > up the realm of patching in general? Regardless, I think this would be a
> > non-trivial cost and we should consider aiming it for a later release
> > (0.9?).
> >
> >>
> >> Thanks,
> >> Roman.
> >
> > And, welcome back, Roman!
> > Mark
>
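For what it's worth, if Bigtop did start publishing stamped artifacts the way Roman suggests, a consumer POM would pin them along these lines (coordinates follow Roman's example above; this is just an illustration, nothing like this is published today):

```xml
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-annotations</artifactId>
  <!-- upstream version plus an explicit Bigtop stamp, per Roman's example -->
  <version>2.2.0-bigtop_0.8.0</version>
</dependency>
```

That would let projects like bigpetstore depend unambiguously on the Bigtop-built jars instead of the upstream ones, which, per Roman, are not interchangeable.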



-- 
Jay Vyas
http://jayunit100.blogspot.com
