If this is not a confirmed regression from 1.0.2, I think it's better to
report it in a separate thread or JIRA.

I believe serious regressions are generally the only reason to block a new
release. Otherwise, if this is an old issue, it should be handled
separately.

On Monday, September 1, 2014, chutium <teng....@gmail.com> wrote:

> I didn't try it with 1.0.2.
>
> It always takes too long to build the Spark assembly jars... more than 20 minutes.
>
> [info] Packaging
>
> /mnt/some-nfs/common/spark/assembly/target/scala-2.10/spark-assembly-1.1.0-SNAPSHOT-hadoop1.0.3-mapr-3.0.3.jar
> ...
> [info] Packaging
>
> /mnt/some-nfs/common/spark/examples/target/scala-2.10/spark-examples-1.1.0-SNAPSHOT-hadoop1.0.3-mapr-3.0.3.jar
> ...
> [info] Done packaging.
> [info] Done packaging.
> [success] Total time: 1582 s, completed Sep 1, 2014 1:39:21 PM
>
> Is there an easy way to exclude some modules, such as spark/examples or
> spark/external?
>
>
>
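
On the question about excluding modules: I have not tried this against the 1.1.0 build specifically, so treat the module and project names below (assembly, examples) as assumptions based on the directory layout, but building a single module is usually the quickest way to avoid re-packaging everything:

  # Maven: build only the assembly module plus the modules it depends on,
  # skipping tests (-am should leave examples/ out, since assembly does not
  # depend on it); add whatever Hadoop/MapR profile flags you already use
  mvn -pl assembly -am -DskipTests package

  # Maven 3.2.1+ can also exclude a module explicitly
  mvn -pl '!examples' -DskipTests package

  # sbt: run the assembly task for the assembly sub-project only
  sbt/sbt assembly/assembly

That skips packaging the separate spark-examples assembly jar shown in the log above.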
