I'm killing zinc (if it's running) before each build attempt.

Trying to build as "clean" as possible.

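For reference, here is roughly the sequence I run before each attempt. The zinc path is just where build/mvn drops it in my checkout, and the Scala 2.11 steps are what I've taken from the building docs for this branch, so treat this as a sketch of my setup rather than the canonical invocation:

  # shut down any zinc compile server left over from a previous build/mvn run
  ./build/zinc-*/bin/zinc -shutdown || true

  # switch the POMs to Scala 2.11, then rebuild everything from scratch
  ./dev/change-version-to-2.11.sh
  mvn -Dscala-2.11 -DskipTests clean package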

On Mon, Apr 6, 2015 at 7:31 PM Patrick Wendell <pwend...@gmail.com> wrote:

> What if you don't run zinc? I.e. just download maven and run that "mvn
> package...". It might take longer, but I wonder if it will work.
>
> On Mon, Apr 6, 2015 at 10:26 PM, mjhb <sp...@mjhb.com> wrote:
> > Similar problem on 1.2 branch:
> >
> > [ERROR] Failed to execute goal on project spark-core_2.11: Could not
> > resolve dependencies for project
> > org.apache.spark:spark-core_2.11:jar:1.2.3-SNAPSHOT: The following
> > artifacts could not be resolved:
> > org.apache.spark:spark-network-common_2.10:jar:1.2.3-SNAPSHOT,
> > org.apache.spark:spark-network-shuffle_2.10:jar:1.2.3-SNAPSHOT:
> > Failure to find
> > org.apache.spark:spark-network-common_2.10:jar:1.2.3-SNAPSHOT in
> > http://repository.apache.org/snapshots was cached in the local
> > repository, resolution will not be reattempted until the update
> > interval of apache.snapshots has elapsed or updates are forced
> > -> [Help 1]
> > org.apache.maven.lifecycle.LifecycleExecutionException: Failed to
> > execute goal on project spark-core_2.11: Could not resolve
> > dependencies for project
> > org.apache.spark:spark-core_2.11:jar:1.2.3-SNAPSHOT: The following
> > artifacts could not be resolved:
> > org.apache.spark:spark-network-common_2.10:jar:1.2.3-SNAPSHOT,
> > org.apache.spark:spark-network-shuffle_2.10:jar:1.2.3-SNAPSHOT:
> > Failure to find
> > org.apache.spark:spark-network-common_2.10:jar:1.2.3-SNAPSHOT in
> > http://repository.apache.org/snapshots was cached in the local
> > repository, resolution will not be reattempted until the update
> > interval of apache.snapshots has elapsed or updates are forced
>
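One more thing I'll fold into the plain-Maven attempt: the error above says the failed SNAPSHOT lookup "was cached in the local repository" and won't be retried "until the update interval of apache.snapshots has elapsed or updates are forced", and Maven's -U flag forces exactly that re-check. So, without zinc, something along these lines (the profiles are just the ones I normally pass, not anything the error requires):

  # plain Maven, no zinc, forcing re-resolution of cached SNAPSHOT failures
  mvn -U -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package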
