I tried the following:

  rm -rf ~/.m2/repository/org/apache/spark/spark-core_2.10/1.3.0-SNAPSHOT/
  mvn -am -pl streaming package -DskipTests

[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM .......................... SUCCESS [4.976s]
[INFO] Spark Project Networking .......................... SUCCESS [1.279s]
[INFO] Spark Project Shuffle Streaming Service ........... SUCCESS [0.499s]
[INFO] Spark Project Core ................................ SUCCESS [1:03.302s]
[INFO] Spark Project Streaming ........................... SUCCESS [26.777s]
[INFO]
------------------------------------------------------------------------
[INFO] BUILD SUCCESS

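For the mllib question quoted below, the same -pl/-am pattern should apply. This is an untested sketch (module names taken from this thread; run from the top of a Spark checkout):

```shell
#!/bin/sh
# -pl selects the module to build; -am ("also make") also builds the
# modules it depends on within the same reactor, so nothing has to be
# installed into ~/.m2 first.
mvn -am -pl mllib package -DskipTests

# run-example uses the jar produced by the examples module, so rebuild
# that module too after changing example code ("package" is needed;
# plain "compile" will not refresh the jar).
mvn -am -pl examples package -DskipTests
```

This avoids "mvn install" entirely, so nothing extra ends up in the local repo.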
Cheers

On Fri, Dec 5, 2014 at 4:53 PM, Marcelo Vanzin <van...@cloudera.com> wrote:

> I've never used it, but reading the help it seems the "-am" option
> might help here.
>
> On Fri, Dec 5, 2014 at 4:47 PM, Sean Owen <so...@cloudera.com> wrote:
> > Maven definitely compiles "what is needed", but not if you tell it to
> > only compile one module alone. Unless you have previously built and
> > installed the other local snapshot artifacts it needs, that invocation
> > can't proceed because you have restricted it to build one module whose
> > dependencies don't exist.
> >
> > On Fri, Dec 5, 2014 at 6:44 PM, Koert Kuipers <ko...@tresata.com> wrote:
> >> I think what changed is that core now has dependencies on other
> >> sub-projects. OK... so I am forced to install stuff because Maven
> >> cannot compile "what is needed". I will install.
> >>
> >> On Fri, Dec 5, 2014 at 7:12 PM, Koert Kuipers <ko...@tresata.com> wrote:
> >>>
> >>> I also suddenly ran into the issue that Maven is trying to download
> >>> snapshots that don't exist for other sub-projects.
> >>>
> >>> Did something change in the Maven build?
> >>>
> >>> Doesn't Maven have the capability to smartly compile the other
> >>> sub-projects that a sub-project depends on?
> >>>
> >>> I'd rather avoid "mvn install", since it populates the local Maven
> >>> repo. I have been stung by that before: I spent a day trying to do
> >>> something and got weird errors because some toy version I once built
> >>> was stuck in my local Maven repo, and it somehow got priority over a
> >>> real remote repo.
> >>>
> >>> On Fri, Dec 5, 2014 at 5:28 PM, Marcelo Vanzin <van...@cloudera.com>
> >>> wrote:
> >>>>
> >>>> You can set SPARK_PREPEND_CLASSES=1 and it should pick up your new
> >>>> mllib classes whenever you compile them.
> >>>>
> >>>> I don't see anything similar for examples/, so if you modify example
> >>>> code you need to re-build the examples module ("package" or "install"
> >>>> - just "compile" won't work, since you need to build the new jar).
> >>>>
> >>>> On Thu, Dec 4, 2014 at 10:23 PM, MEETHU MATHEW <meethu2...@yahoo.co.in> wrote:
> >>>> > Hi all,
> >>>> >
> >>>> > I made some code changes in the mllib project and, as mentioned
> >>>> > in the previous mails, I did
> >>>> >
> >>>> > mvn install -pl mllib
> >>>> >
> >>>> > Now when I run a program in examples using run-example, the new
> >>>> > code is not executing. Instead, the previous code is running.
> >>>> >
> >>>> > But if I do an "mvn install" on the entire spark project, I can
> >>>> > see the new code running. But installing all of Spark takes a lot
> >>>> > of time, so it's difficult to do this each time I make some
> >>>> > changes.
> >>>> >
> >>>> > Can someone tell me how to compile mllib alone and get the changes
> >>>> > working?
> >>>> >
> >>>> > Thanks & Regards,
> >>>> > Meethu M
> >>>> >
> >>>> >
> >>>> > On Friday, 28 November 2014 2:39 PM, MEETHU MATHEW <meethu2...@yahoo.co.in> wrote:
> >>>> >
> >>>> >
> >>>> > Hi,
> >>>> > I have a similar problem. I modified the code in mllib and
> >>>> > examples. I did
> >>>> > mvn install -pl mllib
> >>>> > mvn install -pl examples
> >>>> >
> >>>> > But when I run the program in examples using run-example, the
> >>>> > older version of mllib (before the changes were made) is getting
> >>>> > executed. How do I get the changes made in mllib picked up when
> >>>> > calling it from the examples project?
> >>>> >
> >>>> > Thanks & Regards,
> >>>> > Meethu M
> >>>> >
> >>>> >
> >>>> > On Monday, 24 November 2014 3:33 PM, Yiming (John) Zhang <sdi...@gmail.com> wrote:
> >>>> >
> >>>> >
> >>>> > Thank you, Marcelo and Sean, "mvn install" is a good answer for
> >>>> > my needs.
> >>>> > -----Original Message-----
> >>>> > From: Marcelo Vanzin [mailto:van...@cloudera.com]
> >>>> > Sent: November 21, 2014, 1:47
> >>>> > To: yiming zhang
> >>>> > Cc: Sean Owen; user@spark.apache.org
> >>>> > Subject: Re: How to incrementally compile spark examples using mvn
> >>>> >
> >>>> > Hi Yiming,
> >>>> >
> >>>> > On Wed, Nov 19, 2014 at 5:35 PM, Yiming (John) Zhang <sdi...@gmail.com> wrote:
> >>>> >> Thank you for your reply. I was wondering whether there is a
> >>>> >> method of reusing locally-built components without installing
> >>>> >> them? That is, if I have successfully built the spark project as
> >>>> >> a whole, how should I configure it so that I can incrementally
> >>>> >> build (only) the "spark-examples" sub-project without the need of
> >>>> >> downloading or installation?
> >>>> >
> >>>> > As Sean suggests, you shouldn't need to install anything. After
> >>>> > "mvn install", your local repo is a working Spark installation,
> >>>> > and you can use spark-submit and other tools directly within it.
> >>>> >
> >>>> > You just need to remember to rebuild the assembly/ project when
> >>>> > modifying
> >>>> > Spark code (or the examples/ project when modifying examples).
> >>>> >
> >>>> >
> >>>> > --
> >>>> > Marcelo
> >>>> >
> >>>> >
> >>>> >
> >>>> > ---------------------------------------------------------------------
> >>>> > To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> >>>> > For additional commands, e-mail: user-h...@spark.apache.org
> >>>> >
> >>>> >
> >>>> >
> >>>> >
> >>>>
> >>>>
> >>>>
> >>>> --
> >>>> Marcelo
> >>>>
> >>>>
> >>>
> >>
> >
> >
>
>
>
> --
> Marcelo
>
>
>
