Re: build error - failing test - Error while building spark 2.0 trunk from github

2016-07-31 Thread Jacek Laskowski
Hi,

Can you share the command you used to run the build? What's the OS? Which Java version?

Pozdrawiam,
Jacek Laskowski

https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Sun, Jul 31, 2016 at 6:54 PM, Rohit Chaddha wrote:
> ---
>  T E S T S
> ---
> Running org.apache.spark.api.java.OptionalSuite
> Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.052 sec -
> in org.apache.spark.api.java.OptionalSuite
> Running org.apache.spark.JavaAPISuite
> Tests run: 90, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 23.537 sec
> <<< FAILURE! - in org.apache.spark.JavaAPISuite
> wholeTextFiles(org.apache.spark.JavaAPISuite)  Time elapsed: 0.331 sec  <<<
> FAILURE!
> java.lang.AssertionError:
> expected:> but was:
> at
> org.apache.spark.JavaAPISuite.wholeTextFiles(JavaAPISuite.java:1087)
>
> Running org.apache.spark.JavaJdbcRDDSuite
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.799 sec -
> in org.apache.spark.JavaJdbcRDDSuite
> Running org.apache.spark.launcher.SparkLauncherSuite
> Tests run: 2, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 0.04 sec <<<
> FAILURE! - in org.apache.spark.launcher.SparkLauncherSuite
> testChildProcLauncher(org.apache.spark.launcher.SparkLauncherSuite)  Time
> elapsed: 0.03 sec  <<< FAILURE!
> java.lang.AssertionError: expected:<0> but was:<1>
> at
> org.apache.spark.launcher.SparkLauncherSuite.testChildProcLauncher(SparkLauncherSuite.java:110)
>
> Running org.apache.spark.memory.TaskMemoryManagerSuite
> Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.011 sec -
> in org.apache.spark.memory.TaskMemoryManagerSuite
> Running org.apache.spark.shuffle.sort.PackedRecordPointerSuite
> Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.004 sec -
> in org.apache.spark.shuffle.sort.PackedRecordPointerSuite
> Running org.apache.spark.shuffle.sort.ShuffleInMemoryRadixSorterSuite
> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.103 sec -
> in org.apache.spark.shuffle.sort.ShuffleInMemoryRadixSorterSuite
> Running org.apache.spark.shuffle.sort.ShuffleInMemorySorterSuite
> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.199 sec -
> in org.apache.spark.shuffle.sort.ShuffleInMemorySorterSuite
> Running org.apache.spark.shuffle.sort.UnsafeShuffleWriterSuite
> Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.67 sec -
> in org.apache.spark.shuffle.sort.UnsafeShuffleWriterSuite
> Running org.apache.spark.unsafe.map.BytesToBytesMapOffHeapSuite
> Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.97 sec -
> in org.apache.spark.unsafe.map.BytesToBytesMapOffHeapSuite
> Running org.apache.spark.unsafe.map.BytesToBytesMapOnHeapSuite
> Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.583 sec -
> in org.apache.spark.unsafe.map.BytesToBytesMapOnHeapSuite
> Running
> org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorterRadixSortSuite
> Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.533 sec -
> in
> org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorterRadixSortSuite
> Running
> org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorterSuite
> Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.606 sec -
> in org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorterSuite
> Running
> org.apache.spark.util.collection.unsafe.sort.UnsafeInMemorySorterRadixSortSuite
> Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.007 sec -
> in
> org.apache.spark.util.collection.unsafe.sort.UnsafeInMemorySorterRadixSortSuite
> Running
> org.apache.spark.util.collection.unsafe.sort.UnsafeInMemorySorterSuite
> Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.001 sec -
> in org.apache.spark.util.collection.unsafe.sort.UnsafeInMemorySorterSuite
> Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m;
> support was removed in 8.0
>
> Results :
>
> Failed tests:
>   JavaAPISuite.wholeTextFiles:1087 expected:> but was:
>   SparkLauncherSuite.testChildProcLauncher:110 expected:<0> but was:<1>
>
> Tests run: 189, Failures: 2, Errors: 0, Skipped: 0




build error - failing test - Error while building spark 2.0 trunk from github

2016-07-31 Thread Rohit Chaddha
---
 T E S T S
---
Running org.apache.spark.api.java.OptionalSuite
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.052 sec -
in org.apache.spark.api.java.OptionalSuite
Running org.apache.spark.JavaAPISuite
Tests run: 90, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 23.537 sec
<<< FAILURE! - in org.apache.spark.JavaAPISuite
wholeTextFiles(org.apache.spark.JavaAPISuite)  Time elapsed: 0.331 sec  <<<
FAILURE!
java.lang.AssertionError:
expected: but was:
at
org.apache.spark.JavaAPISuite.wholeTextFiles(JavaAPISuite.java:1087)

Running org.apache.spark.JavaJdbcRDDSuite
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.799 sec -
in org.apache.spark.JavaJdbcRDDSuite
Running org.apache.spark.launcher.SparkLauncherSuite
Tests run: 2, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 0.04 sec
<<< FAILURE! - in org.apache.spark.launcher.SparkLauncherSuite
testChildProcLauncher(org.apache.spark.launcher.SparkLauncherSuite)  Time
elapsed: 0.03 sec  <<< FAILURE!
java.lang.AssertionError: expected:<0> but was:<1>
at
org.apache.spark.launcher.SparkLauncherSuite.testChildProcLauncher(SparkLauncherSuite.java:110)

Running org.apache.spark.memory.TaskMemoryManagerSuite
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.011 sec -
in org.apache.spark.memory.TaskMemoryManagerSuite
Running org.apache.spark.shuffle.sort.PackedRecordPointerSuite
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.004 sec -
in org.apache.spark.shuffle.sort.PackedRecordPointerSuite
Running org.apache.spark.shuffle.sort.ShuffleInMemoryRadixSorterSuite
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.103 sec -
in org.apache.spark.shuffle.sort.ShuffleInMemoryRadixSorterSuite
Running org.apache.spark.shuffle.sort.ShuffleInMemorySorterSuite
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.199 sec -
in org.apache.spark.shuffle.sort.ShuffleInMemorySorterSuite
Running org.apache.spark.shuffle.sort.UnsafeShuffleWriterSuite
Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.67 sec -
in org.apache.spark.shuffle.sort.UnsafeShuffleWriterSuite
Running org.apache.spark.unsafe.map.BytesToBytesMapOffHeapSuite
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.97 sec -
in org.apache.spark.unsafe.map.BytesToBytesMapOffHeapSuite
Running org.apache.spark.unsafe.map.BytesToBytesMapOnHeapSuite
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.583 sec
- in org.apache.spark.unsafe.map.BytesToBytesMapOnHeapSuite
Running
org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorterRadixSortSuite
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.533 sec
- in
org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorterRadixSortSuite
Running
org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorterSuite
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.606 sec
- in org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorterSuite
Running
org.apache.spark.util.collection.unsafe.sort.UnsafeInMemorySorterRadixSortSuite
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.007 sec -
in
org.apache.spark.util.collection.unsafe.sort.UnsafeInMemorySorterRadixSortSuite
Running
org.apache.spark.util.collection.unsafe.sort.UnsafeInMemorySorterSuite
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.001 sec -
in org.apache.spark.util.collection.unsafe.sort.UnsafeInMemorySorterSuite
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option
MaxPermSize=512m; support was removed in 8.0

Results :

Failed tests:
  JavaAPISuite.wholeTextFiles:1087 expected: but was:
  SparkLauncherSuite.testChildProcLauncher:110 expected:<0> but was:<1>

Tests run: 189, Failures: 2, Errors: 0, Skipped: 0


Re: Spark build error

2015-11-17 Thread Jeff Zhang
This also bothered me for a long time. I suspect the IntelliJ builder
conflicts with the sbt/maven builder.

I resolved it by rebuilding Spark in IntelliJ. You may hit a compilation
issue when building it in IntelliJ; for that you need to put
external/flume-sink/target/java on the source build path, as in the
sketch below.
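If you drive the build from sbt, the equivalent is roughly the following
(a minimal sketch, assuming an sbt 0.13-era build definition; marking the
same directory as a source root in IntelliJ's module settings achieves the
same thing):

// sketch: expose the generated sources under external/flume-sink/target/java
// so the IDE builder can resolve them
unmanagedSourceDirectories in Compile +=
  baseDirectory.value / "external" / "flume-sink" / "target" / "java"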



On Wed, Nov 18, 2015 at 12:02 PM, Ted Yu  wrote:

> Is the Scala version in Intellij the same as the one used by sbt ?
>
> Cheers
>
> On Tue, Nov 17, 2015 at 6:45 PM, 金国栋  wrote:
>
>> Hi!
>>
>> I tried to build the spark source code from github, and I successfully
>> built it from the command line using `sbt/sbt assembly`. However, I
>> encountered an error when compiling the project in IntelliJ IDEA (v14.1.5).
>>
>>
>> The error log is below:
>> Error:scala:
>>  while compiling: /Users/ray/Documents/P01_Project/Spark-Github/spark/sql/core/src/main/scala/org/apache/spark/sql/util/QueryExecutionListener.scala
>>     during phase: jvm
>>  library version: version 2.10.5
>> compiler version: version 2.10.5
>>   reconstructed args: -nobootcp -javabootclasspath : -deprecation
>> -feature -classpath
>> /Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/lib/ant-javafx.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/lib/dt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/lib/javafx-mx.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/lib/jconsole.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/lib/sa-jdi.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/lib/tools.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/charsets.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/deploy.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/htmlconverter.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/javaws.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/jce.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/jfr.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/jfxswt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/jsse.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/management-agent.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/plugin.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/resources.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/rt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/cldrdata.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/dnsns.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/jfxrt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/localedata.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/nashorn.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/sunec.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/sunjce_provider.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/sunpkcs11.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/zipfs.jar:/Users/ray/Documents/P01_Project/Spark-Github/spark/sql/core/target/scala-2.10/classes:/Users/ray/Documents/P01_Project/Spark-Github/spark/core/target/scala-2.10/classes:/Users/ray/.m2/repository/org/apache/avro/avro-mapred/1.7.7/avro-mapred-1.7.7-hadoop2.jar:/Users/ray/.m2/repository/org/apache/avro/avro-ipc/1.7.7/avro-ipc-1.7.7.jar:/Users/ray/.m2/repository/org/apache/avro/avro-ipc/1.7.7/avro-ipc-1.7.7-tests.jar:/Users/ray/.m2/repository/com/twitter/chill_2.10/0.5.0/chill_2.10-0.5.0.jar:/Users/ray/.m2/repository/com/esotericsoftware/kryo/kryo/2.21/kryo-2.21.jar:/Users/ray/.m2/repository/com/esotericsoftware/reflectasm/reflectasm/1.07/reflectasm-1.07-shaded.jar:/Users/ray/.m2/repository/com/esotericsoftware/minlog/minlog/1.2/minlog-1.2.jar:/Users/ray/.m2/repository/com/twitter/chill-java/0.5.0/chill-java-0.5.0.jar:/Users/ray/.m2/repository/org/apache/hadoop/hadoop-client/2.2.0/hadoop-client-2.2.0.jar:/Users/ray/.m2/repository/org/apache/hadoop/hadoop-common/2.2.0/hadoop-common-2.2.0.jar:/Users/ray/.m2/repository/commons-cli/commons-cli/1.2/commons-cli-1.2.jar:/Users/ray/.m2/repository/org/apache/commons/commons-math/2.1/commons-math-2.1.jar:/Users/ray/.m2/repository/xmlenc/xmlenc/0.52/xmlenc-0.52.jar:/Users/ray/.m2/repository/commons-configuration/c
ommons-configuration/1.6/commons-configuration-1.6.jar:/Users/ray/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar:/Users/ray/.m2/repository/commons-digester/commons-digester/1.8/commons-digester-1.8.jar:/Users/ray/.m2/repository/commons-beanutils/commons-beanutils/1.7.0/com

Re: Spark build error

2015-11-17 Thread Ted Yu
Is the Scala version in Intellij the same as the one used by sbt ?

Cheers

On Tue, Nov 17, 2015 at 6:45 PM, 金国栋  wrote:

> Hi!
>
> I tried to build the spark source code from github, and I successfully
> built it from the command line using `sbt/sbt assembly`. However, I
> encountered an error when compiling the project in IntelliJ IDEA (v14.1.5).
>
>
> The error log is below:
> Error:scala:
>  while compiling: /Users/ray/Documents/P01_Project/Spark-Github/spark/sql/core/src/main/scala/org/apache/spark/sql/util/QueryExecutionListener.scala
>     during phase: jvm
>  library version: version 2.10.5
> compiler version: version 2.10.5
>   reconstructed args: -nobootcp -javabootclasspath : -deprecation -feature
> -classpath
> /Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/lib/ant-javafx.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/lib/dt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/lib/javafx-mx.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/lib/jconsole.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/lib/sa-jdi.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/lib/tools.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/charsets.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/deploy.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/htmlconverter.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/javaws.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/jce.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/jfr.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/jfxswt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/jsse.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/management-agent.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/plugin.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/resources.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/rt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/cldrdata.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/dnsns.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/jfxrt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/localedata.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/nashorn.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/sunec.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/sunjce_provider.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/sunpkcs11.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/zipfs.jar:/Users/ray/Documents/P01_Project/Spark-Github/spark/sql/core/target/scala-2.10/classes:/Users/ray/Documents/P01_Project/Spark-Github/spark/core/target/scala-2.10/classes:/Users/ray/.m2/repository/org/apache/avro/avro-mapred/1.7.7/avro-mapred-1.7.7-hadoop2.jar:/Users/ray/.m2/repository/org/apache/avro/avro-ipc/1.7.7/avro-ipc-1.7.7.jar:/Users/ray/.m2/repository/org/apache/avro/avro-ipc/1.7.7/avro-ipc-1.7.7-tests.jar:/Users/ray/.m2/repository/com/twitter/chill_2.10/0.5.0/chill_2.10-0.5.0.jar:/Users/ray/.m2/repository/com/esotericsoftware/kryo/kryo/2.21/kryo-2.21.jar:/Users/ray/.m2/repository/com/esotericsoftware/reflectasm/reflectasm/1.07/reflectasm-1.07-shaded.jar:/Users/ray/.m2/repository/com/esotericsoftware/minlog/minlog/1.2/minlog-1.2.jar:/Users/ray/.m2/repository/com/twitter/chill-java/0.5.0/chill-java-0.5.0.jar:/Users/ray/.m2/repository/org/apache/hadoop/hadoop-client/2.2.0/hadoop-client-2.2.0.jar:/Users/ray/.m2/repository/org/apache/hadoop/hadoop-common/2.2.0/hadoop-common-2.2.0.jar:/Users/ray/.m2/repository/commons-cli/commons-cli/1.2/commons-cli-1.2.jar:/Users/ray/.m2/repository/org/apache/commons/commons-math/2.1/commons-math-2.1.jar:/Users/ray/.m2/repository/xmlenc/xmlenc/0.52/xmlenc-0.52.jar:/Users/ray/.m2/repository/commons-configuration/co
mmons-configuration/1.6/commons-configuration-1.6.jar:/Users/ray/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar:/Users/ray/.m2/repository/commons-digester/commons-digester/1.8/commons-digester-1.8.jar:/Users/ray/.m2/repository/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar:/Users/ray/.m2/repository/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar:/Users/ray/.m2/repository/org/apache/hadoop/hadoop-auth/2.2.0/hadoop-auth-2.2.0.jar:/Users/ray/.m2/repository/org/apache/hadoop/hadoop-hdfs/2.2.0/hadoop-hdfs-2.2.0.jar:/Users/ray/.m2/repository/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar

Spark build error

2015-11-17 Thread 金国栋
Hi!

I tried to build the spark source code from github, and I successfully built
it from the command line using `sbt/sbt assembly`. However, I encountered an
error when compiling the project in IntelliJ IDEA (v14.1.5).


The error log is below:
Error:scala:
 while compiling: /Users/ray/Documents/P01_Project/Spark-Github/spark/sql/core/src/main/scala/org/apache/spark/sql/util/QueryExecutionListener.scala
    during phase: jvm
 library version: version 2.10.5
compiler version: version 2.10.5
  reconstructed args: -nobootcp -javabootclasspath : -deprecation -feature
-classpath
/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/lib/ant-javafx.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/lib/dt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/lib/javafx-mx.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/lib/jconsole.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/lib/sa-jdi.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/lib/tools.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/charsets.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/deploy.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/htmlconverter.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/javaws.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/jce.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/jfr.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/jfxswt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/jsse.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/management-agent.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/plugin.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/resources.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/rt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/cldrdata.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/dnsns.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/jfxrt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/localedata.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/nashorn.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/sunec.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/sunjce_provider.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/sunpkcs11.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/zipfs.jar:/Users/ray/Documents/P01_Project/Spark-Github/spark/sql/core/target/scala-2.10/classes:/Users/ray/Documents/P01_Project/Spark-Github/spark/core/target/scala-2.10/classes:/Users/ray/.m2/repository/org/apache/avro/avro-mapred/1.7.7/avro-mapred-1.7.7-hadoop2.jar:/Users/ray/.m2/repository/org/apache/avro/avro-ipc/1.7.7/avro-ipc-1.7.7.jar:/Users/ray/.m2/repository/org/apache/avro/avro-ipc/1.7.7/avro-ipc-1.7.7-tests.jar:/Users/ray/.m2/repository/com/twitter/chill_2.10/0.5.0/chill_2.10-0.5.0.jar:/Users/ray/.m2/repository/com/esotericsoftware/kryo/kryo/2.21/kryo-2.21.jar:/Users/ray/.m2/repository/com/esotericsoftware/reflectasm/reflectasm/1.07/reflectasm-1.07-shaded.jar:/Users/ray/.m2/repository/com/esotericsoftware/minlog/minlog/1.2/minlog-1.2.jar:/Users/ray/.m2/repository/com/twitter/chill-java/0.5.0/chill-java-0.5.0.jar:/Users/ray/.m2/repository/org/apache/hadoop/hadoop-client/2.2.0/hadoop-client-2.2.0.jar:/Users/ray/.m2/repository/org/apache/hadoop/hadoop-common/2.2.0/hadoop-common-2.2.0.jar:/Users/ray/.m2/repository/commons-cli/commons-cli/1.2/commons-cli-1.2.jar:/Users/ray/.m2/repository/org/apache/commons/commons-math/2.1/commons-math-2.1.jar:/Users/ray/.m2/repository/xmlenc/xmlenc/0.52/xmlenc-0.52.jar:/Users/ray/.m2/repository/commons-configuration/comm
ons-configuration/1.6/commons-configuration-1.6.jar:/Users/ray/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar:/Users/ray/.m2/repository/commons-digester/commons-digester/1.8/commons-digester-1.8.jar:/Users/ray/.m2/repository/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar:/Users/ray/.m2/repository/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar:/Users/ray/.m2/repository/org/apache/hadoop/hadoop-auth/2.2.0/hadoop-auth-2.2.0.jar:/Users/ray/.m2/repository/org/apache/hadoop/hadoop-hdfs/2.2.0/hadoop-hdfs-2.2.0.jar:/Users/ray/.m2/repository/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar:/Users/ray/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-app/2.2.0/hadoop-mapreduce-client-app-2.2.0.jar:/Users/ray/.m2/repository/org/apache

Re: Spark 1.4.0 build Error on Windows

2015-06-03 Thread pawan kumar
I got the same error message when using maven 3.3 .
On Jun 3, 2015 8:58 AM, "Ted Yu"  wrote:

> I used the same command on Linux but didn't reproduce the error.
>
> Can you include -X switch on your command line ?
>
> Also consider upgrading maven to 3.3.x
>
> Cheers
>
> On Wed, Jun 3, 2015 at 2:36 AM, Daniel Emaasit 
> wrote:
>
>> I ran into errors while trying to build Spark from the 1.4 release
>> branch: https://github.com/apache/spark/tree/branch-1.4. Any help will
>> be much appreciated. Here is the log file from my Windows 8.1 PC. (FYI, I
>> installed all the dependencies, such as Java 7 and Maven 3.2.5, and set
>> the environment variables.)
>>
>>
>> C:\Program Files\Apache Software Foundation\spark-branch-1.4>mvn
>> -Psparkr -Pyarn
>>  -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
>> [INFO] Scanning for projects...
>> [INFO]
>> 
>> [INFO] Reactor Build Order:
>> [INFO]
>> [INFO] Spark Project Parent POM
>> [INFO] Spark Launcher Project
>> [INFO] Spark Project Networking
>> [INFO] Spark Project Shuffle Streaming Service
>> [INFO] Spark Project Unsafe
>> [INFO] Spark Project Core
>> [INFO] Spark Project Bagel
>> [INFO] Spark Project GraphX
>> [INFO] Spark Project Streaming
>> [INFO] Spark Project Catalyst
>> [INFO] Spark Project SQL
>> [INFO] Spark Project ML Library
>> [INFO] Spark Project Tools
>> [INFO] Spark Project Hive
>> [INFO] Spark Project REPL
>> [INFO] Spark Project YARN
>> [INFO] Spark Project Assembly
>> [INFO] Spark Project External Twitter
>> [INFO] Spark Project External Flume Sink
>> [INFO] Spark Project External Flume
>> [INFO] Spark Project External MQTT
>> [INFO] Spark Project External ZeroMQ
>> [INFO] Spark Project External Kafka
>> [INFO] Spark Project Examples
>> [INFO] Spark Project External Kafka Assembly
>> [INFO] Spark Project YARN Shuffle Service
>> [INFO]
>> [INFO]
>> 
>> [INFO] Building Spark Project Parent POM 1.4.0-SNAPSHOT
>> [INFO]
>> 
>> [INFO]
>> [INFO] --- maven-clean-plugin:2.6.1:clean (default-clean) @ spark-parent_2.10 ---
>> [INFO]
>> [INFO] --- maven-enforcer-plugin:1.4:enforce (enforce-versions) @ spark-parent_2.10 ---
>> [INFO]
>> [INFO] --- scala-maven-plugin:3.2.0:add-source (eclipse-add-source) @ spark-parent_2.10 ---
>> [INFO] Add Source directory: C:\Program Files\Apache Software Foundation\spark-branch-1.4\src\main\scala
>> [INFO] Add Test Source directory: C:\Program Files\Apache Software Foundation\spark-branch-1.4\src\test\scala
>> [INFO]
>> [INFO] --- build-helper-maven-plugin:1.9.1:add-source (add-scala-sources) @ spark-parent_2.10 ---
>> [INFO] Source directory: C:\Program Files\Apache Software Foundation\spark-branch-1.4\src\main\scala added.
>> [INFO]
>> [INFO] --- maven-remote-resources-plugin:1.5:process (default) @ spark-parent_2.10 ---
>> [INFO]
>>
>> [INFO] Reactor Summary:
>> [INFO]
>> [INFO] Spark Project Parent POM ... FAILURE [ 23.528 s]
>> [INFO] Spark Launcher Project . SKIPPED
>> [INFO] Spark Project Networking ... SKIPPED
>> [INFO] Spark Project Shuffle Streaming Service  SKIPPED
>> [INFO] Spark Project Unsafe ... SKIPPED
>> [INFO] Spark Project Core . SKIPPED
>> [INFO] Spark Project Bagel  SKIPPED
>> [INFO] Spark Project GraphX ... SKIPPED
>> [INFO] Spark Project Streaming  SKIPPED
>> [INFO] Spark Project Catalyst . SKIPPED
>> [INFO] Spark Project SQL .. SKIPPED
>> [INFO] Spark Project ML Library ... SKIPPED
>> [INFO] Spark Project Tools  SKIPPED
>> [INFO] Spark Project Hive . SKIPPED
>> [INFO] Spark Project REPL . SKIPPED
>> [INFO] Spark Project YARN . SKIPPED
>> [INFO] Spark Project Assembly . SKIPPED
>> [INFO] Spark Project External Twitter . SKIPPED
>> [INFO] Spark Project External Flume Sink .. SKIPPED
>> [INFO] Spark Project External Flume ... SKIPPED
>> [INFO] Spark Project External MQTT  SKIPPED
>> [INFO] Spark Project External ZeroMQ .. SKIPPED
>> [INFO] Spark Project External Kafka ... SKIPPED
>> [INFO] Spark Project Examples . SKIPPED
>> [INFO] Spark Project External Kafka Assembly .. SKIPPED
>> [INFO] Spark Project YARN Shuffle Service ...

Re: Spark 1.4.0 build Error on Windows

2015-06-03 Thread Ted Yu
I used the same command on Linux but didn't reproduce the error.

Can you include -X switch on your command line ?

Also consider upgrading maven to 3.3.x

Cheers

On Wed, Jun 3, 2015 at 2:36 AM, Daniel Emaasit 
wrote:

> I ran into errors while trying to build Spark from the 1.4 release
> branch: https://github.com/apache/spark/tree/branch-1.4. Any help will be
> much appreciated. Here is the log file from my Windows 8.1 PC. (FYI, I
> installed all the dependencies, such as Java 7 and Maven 3.2.5, and set
> the environment variables.)
>
>
> C:\Program Files\Apache Software Foundation\spark-branch-1.4>mvn -Psparkr
> -Pyarn
>  -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
> [INFO] Scanning for projects...
> [INFO]
> 
> [INFO] Reactor Build Order:
> [INFO]
> [INFO] Spark Project Parent POM
> [INFO] Spark Launcher Project
> [INFO] Spark Project Networking
> [INFO] Spark Project Shuffle Streaming Service
> [INFO] Spark Project Unsafe
> [INFO] Spark Project Core
> [INFO] Spark Project Bagel
> [INFO] Spark Project GraphX
> [INFO] Spark Project Streaming
> [INFO] Spark Project Catalyst
> [INFO] Spark Project SQL
> [INFO] Spark Project ML Library
> [INFO] Spark Project Tools
> [INFO] Spark Project Hive
> [INFO] Spark Project REPL
> [INFO] Spark Project YARN
> [INFO] Spark Project Assembly
> [INFO] Spark Project External Twitter
> [INFO] Spark Project External Flume Sink
> [INFO] Spark Project External Flume
> [INFO] Spark Project External MQTT
> [INFO] Spark Project External ZeroMQ
> [INFO] Spark Project External Kafka
> [INFO] Spark Project Examples
> [INFO] Spark Project External Kafka Assembly
> [INFO] Spark Project YARN Shuffle Service
> [INFO]
> [INFO]
> 
> [INFO] Building Spark Project Parent POM 1.4.0-SNAPSHOT
> [INFO]
> 
> [INFO]
> [INFO] --- maven-clean-plugin:2.6.1:clean (default-clean) @ spark-parent_2.10 ---
> [INFO]
> [INFO] --- maven-enforcer-plugin:1.4:enforce (enforce-versions) @ spark-parent_2.10 ---
> [INFO]
> [INFO] --- scala-maven-plugin:3.2.0:add-source (eclipse-add-source) @ spark-parent_2.10 ---
> [INFO] Add Source directory: C:\Program Files\Apache Software Foundation\spark-branch-1.4\src\main\scala
> [INFO] Add Test Source directory: C:\Program Files\Apache Software Foundation\spark-branch-1.4\src\test\scala
> [INFO]
> [INFO] --- build-helper-maven-plugin:1.9.1:add-source (add-scala-sources) @ spark-parent_2.10 ---
> [INFO] Source directory: C:\Program Files\Apache Software Foundation\spark-branch-1.4\src\main\scala added.
> [INFO]
> [INFO] --- maven-remote-resources-plugin:1.5:process (default) @ spark-parent_2.10 ---
> [INFO]
>
> [INFO] Reactor Summary:
> [INFO]
> [INFO] Spark Project Parent POM ... FAILURE [ 23.528 s]
> [INFO] Spark Launcher Project . SKIPPED
> [INFO] Spark Project Networking ... SKIPPED
> [INFO] Spark Project Shuffle Streaming Service  SKIPPED
> [INFO] Spark Project Unsafe ... SKIPPED
> [INFO] Spark Project Core . SKIPPED
> [INFO] Spark Project Bagel  SKIPPED
> [INFO] Spark Project GraphX ... SKIPPED
> [INFO] Spark Project Streaming  SKIPPED
> [INFO] Spark Project Catalyst . SKIPPED
> [INFO] Spark Project SQL .. SKIPPED
> [INFO] Spark Project ML Library ... SKIPPED
> [INFO] Spark Project Tools  SKIPPED
> [INFO] Spark Project Hive . SKIPPED
> [INFO] Spark Project REPL . SKIPPED
> [INFO] Spark Project YARN . SKIPPED
> [INFO] Spark Project Assembly . SKIPPED
> [INFO] Spark Project External Twitter . SKIPPED
> [INFO] Spark Project External Flume Sink .. SKIPPED
> [INFO] Spark Project External Flume ... SKIPPED
> [INFO] Spark Project External MQTT  SKIPPED
> [INFO] Spark Project External ZeroMQ .. SKIPPED
> [INFO] Spark Project External Kafka ... SKIPPED
> [INFO] Spark Project Examples . SKIPPED
> [INFO] Spark Project External Kafka Assembly .. SKIPPED
> [INFO] Spark Project YARN Shuffle Service . SKIPPED
> [INFO]
> 
> [INFO] BUILD FAILURE
> [INFO]
> 
> [I

Re: Spark 1.4.0 build Error on Windows

2015-06-03 Thread Daniel Emaasit
I ran into errors while trying to build Spark from the 1.4 release branch:
https://github.com/apache/spark/tree/branch-1.4. Any help will be much
appreciated. Here is the log file from my Windows 8.1 PC. (FYI, I installed
all the dependencies, such as Java 7 and Maven 3.2.5, and set the
environment variables.)


C:\Program Files\Apache Software Foundation\spark-branch-1.4>mvn -Psparkr
-Pyarn
 -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
[INFO] Scanning for projects...
[INFO]

[INFO] Reactor Build Order:
[INFO]
[INFO] Spark Project Parent POM
[INFO] Spark Launcher Project
[INFO] Spark Project Networking
[INFO] Spark Project Shuffle Streaming Service
[INFO] Spark Project Unsafe
[INFO] Spark Project Core
[INFO] Spark Project Bagel
[INFO] Spark Project GraphX
[INFO] Spark Project Streaming
[INFO] Spark Project Catalyst
[INFO] Spark Project SQL
[INFO] Spark Project ML Library
[INFO] Spark Project Tools
[INFO] Spark Project Hive
[INFO] Spark Project REPL
[INFO] Spark Project YARN
[INFO] Spark Project Assembly
[INFO] Spark Project External Twitter
[INFO] Spark Project External Flume Sink
[INFO] Spark Project External Flume
[INFO] Spark Project External MQTT
[INFO] Spark Project External ZeroMQ
[INFO] Spark Project External Kafka
[INFO] Spark Project Examples
[INFO] Spark Project External Kafka Assembly
[INFO] Spark Project YARN Shuffle Service
[INFO]
[INFO]

[INFO] Building Spark Project Parent POM 1.4.0-SNAPSHOT
[INFO]

[INFO]
[INFO] --- maven-clean-plugin:2.6.1:clean (default-clean) @ spark-parent_2.10 ---
[INFO]
[INFO] --- maven-enforcer-plugin:1.4:enforce (enforce-versions) @ spark-parent_2.10 ---
[INFO]
[INFO] --- scala-maven-plugin:3.2.0:add-source (eclipse-add-source) @ spark-parent_2.10 ---
[INFO] Add Source directory: C:\Program Files\Apache Software Foundation\spark-branch-1.4\src\main\scala
[INFO] Add Test Source directory: C:\Program Files\Apache Software Foundation\spark-branch-1.4\src\test\scala
[INFO]
[INFO] --- build-helper-maven-plugin:1.9.1:add-source (add-scala-sources) @ spark-parent_2.10 ---
[INFO] Source directory: C:\Program Files\Apache Software Foundation\spark-branch-1.4\src\main\scala added.
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ spark-parent_2.10 ---
[INFO]

[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ... FAILURE [ 23.528 s]
[INFO] Spark Launcher Project . SKIPPED
[INFO] Spark Project Networking ... SKIPPED
[INFO] Spark Project Shuffle Streaming Service  SKIPPED
[INFO] Spark Project Unsafe ... SKIPPED
[INFO] Spark Project Core . SKIPPED
[INFO] Spark Project Bagel  SKIPPED
[INFO] Spark Project GraphX ... SKIPPED
[INFO] Spark Project Streaming  SKIPPED
[INFO] Spark Project Catalyst . SKIPPED
[INFO] Spark Project SQL .. SKIPPED
[INFO] Spark Project ML Library ... SKIPPED
[INFO] Spark Project Tools  SKIPPED
[INFO] Spark Project Hive . SKIPPED
[INFO] Spark Project REPL . SKIPPED
[INFO] Spark Project YARN . SKIPPED
[INFO] Spark Project Assembly . SKIPPED
[INFO] Spark Project External Twitter . SKIPPED
[INFO] Spark Project External Flume Sink .. SKIPPED
[INFO] Spark Project External Flume ... SKIPPED
[INFO] Spark Project External MQTT  SKIPPED
[INFO] Spark Project External ZeroMQ .. SKIPPED
[INFO] Spark Project External Kafka ... SKIPPED
[INFO] Spark Project Examples . SKIPPED
[INFO] Spark Project External Kafka Assembly .. SKIPPED
[INFO] Spark Project YARN Shuffle Service . SKIPPED
[INFO]

[INFO] BUILD FAILURE
[INFO]

[INFO] Total time: 24.680 s
[INFO] Finished at: 2015-06-03T02:11:35-07:00
[INFO] Final Memory: 27M/224M
[INFO]

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-remote-resources-plugin:1.5:process (default) on project spark-parent_2.10: Error finding remote resources manifests: C:\Program Files\Apache Software Foundation\spark-branch-1.4\target\maven-sh

Re: Build error

2015-01-30 Thread Tathagata Das
That is a known issue uncovered last week. It fails in certain
environments, though not on Jenkins, which is our testing environment.
There is already a PR up to fix it. For now you can build using "mvn
package -DskipTests".
TD

On Fri, Jan 30, 2015 at 8:59 PM, Andrew Musselman <
andrew.mussel...@gmail.com> wrote:

> Off master, got this error; is that typical?
>
> ---
>  T E S T S
> ---
> Running org.apache.spark.streaming.mqtt.JavaMQTTStreamSuite
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.495 sec
> - in org.apache.spark.streaming.mqtt.JavaMQTTStreamSuite
>
> Results :
>
>
>
>
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0
>
> [INFO]
> [INFO] --- scalatest-maven-plugin:1.0:test (test) @
> spark-streaming-mqtt_2.10 ---
> Discovery starting.
> Discovery completed in 498 milliseconds.
> Run starting. Expected test count is: 1
> MQTTStreamSuite:
> - mqtt input stream *** FAILED ***
>   org.eclipse.paho.client.mqttv3.MqttException: Too many publishes in progress
>   at org.eclipse.paho.client.mqttv3.internal.ClientState.send(ClientState.java:432)
>   at org.eclipse.paho.client.mqttv3.internal.ClientComms.internalSend(ClientComms.java:121)
>   at org.eclipse.paho.client.mqttv3.internal.ClientComms.sendNoWait(ClientComms.java:139)
>   at org.eclipse.paho.client.mqttv3.MqttTopic.publish(MqttTopic.java:107)
>   at org.apache.spark.streaming.mqtt.MQTTStreamSuite$$anonfun$publishData$1.apply(MQTTStreamSuite.scala:125)
>   at org.apache.spark.streaming.mqtt.MQTTStreamSuite$$anonfun$publishData$1.apply(MQTTStreamSuite.scala:124)
>   at scala.collection.immutable.Range.foreach(Range.scala:141)
>   at org.apache.spark.streaming.mqtt.MQTTStreamSuite.publishData(MQTTStreamSuite.scala:124)
>   at org.apache.spark.streaming.mqtt.MQTTStreamSuite$$anonfun$3.apply$mcV$sp(MQTTStreamSuite.scala:78)
>   at org.apache.spark.streaming.mqtt.MQTTStreamSuite$$anonfun$3.apply(MQTTStreamSuite.scala:66)
>   ...
> Exception in thread "Thread-20" org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
> at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:690)
> at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:689)
> at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
> at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:689)
> at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1384)
> at org.apache.spark.util.EventLoop.stop(EventLoop.scala:81)
> at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1319)
> at org.apache.spark.SparkContext.stop(SparkContext.scala:1250)
> at org.apache.spark.streaming.StreamingContext.stop(StreamingContext.scala:510)
> at org.apache.spark.streaming.StreamingContext.stop(StreamingContext.scala:485)
> at org.apache.spark.streaming.mqtt.MQTTStreamSuite$$anonfun$2.apply$mcV$sp(MQTTStreamSuite.scala:59)
> at org.apache.spark.streaming.mqtt.MQTTStreamSuite$$anonfun$2.apply(MQTTStreamSuite.scala:57)
> at org.apache.spark.streaming.mqtt.MQTTStreamSuite$$anonfun$2.apply(MQTTStreamSuite.scala:57)
> at org.scalatest.BeforeAndAfter$class.runTest(BeforeAndAfter.scala:210)
> at org.apache.spark.streaming.mqtt.MQTTStreamSuite.runTest(MQTTStreamSuite.scala:38)
> at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
> at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
> at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
> at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
> at scala.collection.immutable.List.foreach(List.scala:318)
> at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
> at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
> at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
> at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:208)
> at org.scalatest.FunSuite.runTests(FunSuite.scala:1555)
> at org.scalatest.Suite$class.run(Suite.scala:1424)
> at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1555)
> at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
> at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
> at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
> at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:212)
> at org.apache.spark.streaming.mqtt.MQTTStreamSuite.org$scalatest$BeforeAndAfter$$super$run(MQTTStreamSuite.scala:38)
> at org.scalatest.BeforeAndAfter$class.run(BeforeAndAfter.scala:241)

Build error

2015-01-30 Thread Andrew Musselman
Off master, got this error; is that typical?

---
 T E S T S
---
Running org.apache.spark.streaming.mqtt.JavaMQTTStreamSuite
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.495 sec -
in org.apache.spark.streaming.mqtt.JavaMQTTStreamSuite

Results :




Tests run: 1, Failures: 0, Errors: 0, Skipped: 0

[INFO]
[INFO] --- scalatest-maven-plugin:1.0:test (test) @
spark-streaming-mqtt_2.10 ---
Discovery starting.
Discovery completed in 498 milliseconds.
Run starting. Expected test count is: 1
MQTTStreamSuite:
- mqtt input stream *** FAILED ***
  org.eclipse.paho.client.mqttv3.MqttException: Too many publishes in progress
  at org.eclipse.paho.client.mqttv3.internal.ClientState.send(ClientState.java:432)
  at org.eclipse.paho.client.mqttv3.internal.ClientComms.internalSend(ClientComms.java:121)
  at org.eclipse.paho.client.mqttv3.internal.ClientComms.sendNoWait(ClientComms.java:139)
  at org.eclipse.paho.client.mqttv3.MqttTopic.publish(MqttTopic.java:107)
  at org.apache.spark.streaming.mqtt.MQTTStreamSuite$$anonfun$publishData$1.apply(MQTTStreamSuite.scala:125)
  at org.apache.spark.streaming.mqtt.MQTTStreamSuite$$anonfun$publishData$1.apply(MQTTStreamSuite.scala:124)
  at scala.collection.immutable.Range.foreach(Range.scala:141)
  at org.apache.spark.streaming.mqtt.MQTTStreamSuite.publishData(MQTTStreamSuite.scala:124)
  at org.apache.spark.streaming.mqtt.MQTTStreamSuite$$anonfun$3.apply$mcV$sp(MQTTStreamSuite.scala:78)
  at org.apache.spark.streaming.mqtt.MQTTStreamSuite$$anonfun$3.apply(MQTTStreamSuite.scala:66)
  ...
Exception in thread "Thread-20" org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:690)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:689)
at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:689)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1384)
at org.apache.spark.util.EventLoop.stop(EventLoop.scala:81)
at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1319)
at org.apache.spark.SparkContext.stop(SparkContext.scala:1250)
at org.apache.spark.streaming.StreamingContext.stop(StreamingContext.scala:510)
at org.apache.spark.streaming.StreamingContext.stop(StreamingContext.scala:485)
at org.apache.spark.streaming.mqtt.MQTTStreamSuite$$anonfun$2.apply$mcV$sp(MQTTStreamSuite.scala:59)
at org.apache.spark.streaming.mqtt.MQTTStreamSuite$$anonfun$2.apply(MQTTStreamSuite.scala:57)
at org.apache.spark.streaming.mqtt.MQTTStreamSuite$$anonfun$2.apply(MQTTStreamSuite.scala:57)
at org.scalatest.BeforeAndAfter$class.runTest(BeforeAndAfter.scala:210)
at org.apache.spark.streaming.mqtt.MQTTStreamSuite.runTest(MQTTStreamSuite.scala:38)
at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
at scala.collection.immutable.List.foreach(List.scala:318)
at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:208)
at org.scalatest.FunSuite.runTests(FunSuite.scala:1555)
at org.scalatest.Suite$class.run(Suite.scala:1424)
at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1555)
at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:212)
at org.apache.spark.streaming.mqtt.MQTTStreamSuite.org$scalatest$BeforeAndAfter$$super$run(MQTTStreamSuite.scala:38)
at org.scalatest.BeforeAndAfter$class.run(BeforeAndAfter.scala:241)
at org.apache.spark.streaming.mqtt.MQTTStreamSuite.run(MQTTStreamSuite.scala:38)
at org.scalatest.Suite$class.callExecuteOnSuite$1(Suite.scala:1492)
at org.scalatest.Suite$$anonfun$runNestedSuites$1.apply(Suite.scala:1528)
at org.scalatest.Suite$$anonfun$runNestedSuites$1.apply(Suite.scala:1526)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
at org.scalatest.Suit

Spark-1.2.0 build error

2015-01-02 Thread rapelly kartheek
Hi,

I get the following error when I build spark using sbt:

[error] Nonzero exit code (128): git clone
https://github.com/ScrapCodes/sbt-pom-reader.git
/home/karthik/.sbt/0.13/staging/ad8e8574a5bcb2d22d23/sbt-pom-reader
[error] Use 'last' for the full log.


Any help please?


Re: Build error when using spark with breeze

2014-09-26 Thread Xiangrui Meng
We removed commons-math3 from the dependencies to avoid a version
conflict with hadoop-common: hadoop-common-2.3+ depends on
commons-math3-3.1.1, while breeze depends on commons-math3-3.3, and 3.3
is not backward compatible with 3.1.1. We could remove it because the
breeze functions we use do not touch commons-math3 code. As Sean
suggested, please include breeze in the dependency set of your project
(see the sketch below). Do not rely on transitive dependencies. -Xiangrui
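For an sbt build, that direct dependency looks roughly like this (a
minimal sketch; the version numbers are illustrative for the Spark 1.1
era and are not taken from this thread):

// build.sbt (sketch): depend on breeze directly instead of relying on
// spark-core's transitive dependencies
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.1.0" % "provided",
  "org.scalanlp"     %% "breeze"     % "0.9"
)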

On Fri, Sep 26, 2014 at 9:08 AM, Jaonary Rabarisoa  wrote:
> I solved the problem by including the commons-math3 package in my sbt
> dependencies, as Sean suggested. Thanks.
>
> On Fri, Sep 26, 2014 at 6:05 PM, Ted Yu  wrote:
>>
>> You can use the runtime scope.
>>
>> See
>> http://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#Dependency_Scope
>>
>> Cheers
>>
>> On Fri, Sep 26, 2014 at 8:57 AM, Jaonary Rabarisoa 
>> wrote:
>>>
>>> Thanks, Ted. Can you tell me how to adjust the scope?
>>>
>>> On Fri, Sep 26, 2014 at 5:47 PM, Ted Yu  wrote:
>>>>
>>>> spark-core's dependency on commons-math3 is @ test scope (core/pom.xml):
>>>> <dependency>
>>>>   <groupId>org.apache.commons</groupId>
>>>>   <artifactId>commons-math3</artifactId>
>>>>   <version>3.3</version>
>>>>   <scope>test</scope>
>>>> </dependency>
>>>>
>>>> Adjusting the scope should solve the problem below.
>>>>
>>>> On Fri, Sep 26, 2014 at 8:42 AM, Jaonary Rabarisoa 
>>>> wrote:
>>>>>
>>>>> Hi all,
>>>>>
>>>>> I'm using some functions from Breeze in a spark job but I get the
>>>>> following build error :
>>>>>
>>>>> Error:scalac: bad symbolic reference. A signature in RandBasis.class
>>>>> refers to term math3
>>>>> in package org.apache.commons which is not available.
>>>>> It may be completely missing from the current classpath, or the version
>>>>> on
>>>>> the classpath might be incompatible with the version used when
>>>>> compiling RandBasis.class.
>>>>>
>>>>> In my case, I just declare a new Gaussian distribution
>>>>>
>>>>> val g = new Gaussian(0d,1d)
>>>>>
>>>>> I'm using spark 1.1
>>>>>
>>>>>
>>>>> Any ideas to fix this ?
>>>>>
>>>>>
>>>>> Best regards,
>>>>>
>>>>>
>>>>> Jao
>>>>
>>>>
>>>
>>
>




Re: Build error when using spark with breeze

2014-09-26 Thread Jaonary Rabarisoa
I solved the problem by including the commons-math3 package in my sbt
dependencies, as Sean suggested. Thanks.
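The sbt line for that direct dependency reads roughly as follows (a
sketch; 3.3 is the commons-math3 version named in Ted's pom snippet and
Xiangrui's note, so match it to your breeze version):

// build.sbt (sketch): declare commons-math3 explicitly so that breeze's
// RandBasis can resolve org.apache.commons.math3 on the classpath
libraryDependencies += "org.apache.commons" % "commons-math3" % "3.3"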

On Fri, Sep 26, 2014 at 6:05 PM, Ted Yu  wrote:

> You can use the runtime scope.
>
> See
> http://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#Dependency_Scope
>
> Cheers
>
> On Fri, Sep 26, 2014 at 8:57 AM, Jaonary Rabarisoa 
> wrote:
>
>> Thanks, Ted. Can you tell me how to adjust the scope?
>>
>> On Fri, Sep 26, 2014 at 5:47 PM, Ted Yu  wrote:
>>
>>> spark-core's dependency on commons-math3 is @ test scope (core/pom.xml):
>>> <dependency>
>>>   <groupId>org.apache.commons</groupId>
>>>   <artifactId>commons-math3</artifactId>
>>>   <version>3.3</version>
>>>   <scope>test</scope>
>>> </dependency>
>>>
>>> Adjusting the scope should solve the problem below.
>>>
>>> On Fri, Sep 26, 2014 at 8:42 AM, Jaonary Rabarisoa 
>>> wrote:
>>>
>>>> Hi all,
>>>>
>>>> I'm using some functions from Breeze in a spark job but I get the
>>>> following build error :
>>>>
>>>> Error:scalac: bad symbolic reference. A signature in RandBasis.class
>>>> refers to term math3
>>>> in package org.apache.commons which is not available.
>>>> It may be completely missing from the current classpath, or the version on
>>>> the classpath might be incompatible with the version used when
>>>> compiling RandBasis.class.
>>>>
>>>> In my case, I just declare a new Gaussian distribution
>>>>
>>>> val g = new Gaussian(0d,1d)
>>>>
>>>> I'm using spark 1.1
>>>>
>>>>
>>>> Any ideas to fix this ?
>>>>
>>>>
>>>> Best regards,
>>>>
>>>>
>>>> Jao
>>>>
>>>
>>>
>>
>


Re: Build error when using spark with breeze

2014-09-26 Thread Ted Yu
You can use the runtime scope (see the sketch below).

See
http://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#Dependency_Scope

Cheers
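In sbt terms (the asker's build tool), that suggestion would read roughly
as follows; this is a sketch of a runtime-scoped dependency, with the
version taken from the pom snippet quoted below:

// build.sbt (sketch): commons-math3 restricted to the runtime classpath
libraryDependencies += "org.apache.commons" % "commons-math3" % "3.3" % "runtime"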

On Fri, Sep 26, 2014 at 8:57 AM, Jaonary Rabarisoa 
wrote:

> Thanks, Ted. Can you tell me how to adjust the scope?
>
> On Fri, Sep 26, 2014 at 5:47 PM, Ted Yu  wrote:
>
>> spark-core's dependency on commons-math3 is @ test scope (core/pom.xml):
>> <dependency>
>>   <groupId>org.apache.commons</groupId>
>>   <artifactId>commons-math3</artifactId>
>>   <version>3.3</version>
>>   <scope>test</scope>
>> </dependency>
>>
>> Adjusting the scope should solve the problem below.
>>
>> On Fri, Sep 26, 2014 at 8:42 AM, Jaonary Rabarisoa 
>> wrote:
>>
>>> Hi all,
>>>
>>> I'm using some functions from Breeze in a spark job but I get the
>>> following build error :
>>>
>>> Error:scalac: bad symbolic reference. A signature in RandBasis.class
>>> refers to term math3
>>> in package org.apache.commons which is not available.
>>> It may be completely missing from the current classpath, or the version on
>>> the classpath might be incompatible with the version used when
>>> compiling RandBasis.class.
>>>
>>> In my case, I just declare a new Gaussian distribution
>>>
>>> val g = new Gaussian(0d,1d)
>>>
>>> I'm using spark 1.1
>>>
>>>
>>> Any ideas to fix this ?
>>>
>>>
>>> Best regards,
>>>
>>>
>>> Jao
>>>
>>
>>
>


Re: Build error when using spark with breeze

2014-09-26 Thread Sean Owen
Shouldn't the user's application depend on commons-math3 if it uses it?
It shouldn't require a Spark change. Maybe I misunderstand.

On Fri, Sep 26, 2014 at 4:47 PM, Ted Yu  wrote:
> spark-core's dependency on commons-math3 is @ test scope (core/pom.xml):
> <dependency>
>   <groupId>org.apache.commons</groupId>
>   <artifactId>commons-math3</artifactId>
>   <version>3.3</version>
>   <scope>test</scope>
> </dependency>
>
> Adjusting the scope should solve the problem below.
>
> On Fri, Sep 26, 2014 at 8:42 AM, Jaonary Rabarisoa 
> wrote:
>>
>> Hi all,
>>
>> I'm using some functions from Breeze in a spark job but I get the
>> following build error :
>>
>> Error:scalac: bad symbolic reference. A signature in RandBasis.class
>> refers to term math3
>> in package org.apache.commons which is not available.
>> It may be completely missing from the current classpath, or the version on
>> the classpath might be incompatible with the version used when compiling
>> RandBasis.class.
>>
>> In my case, I just declare a new Gaussian distribution
>>
>> val g = new Gaussian(0d,1d)
>>
>> I'm using spark 1.1
>>
>>
>> Any ideas to fix this ?
>>
>>
>> Best regards,
>>
>>
>> Jao
>
>




Re: Build error when using spark with breeze

2014-09-26 Thread Jaonary Rabarisoa
Thanks, Ted. Can you tell me how to adjust the scope?

On Fri, Sep 26, 2014 at 5:47 PM, Ted Yu  wrote:

> spark-core's dependency on commons-math3 is @ test scope (core/pom.xml):
> <dependency>
>   <groupId>org.apache.commons</groupId>
>   <artifactId>commons-math3</artifactId>
>   <version>3.3</version>
>   <scope>test</scope>
> </dependency>
>
> Adjusting the scope should solve the problem below.
>
> On Fri, Sep 26, 2014 at 8:42 AM, Jaonary Rabarisoa 
> wrote:
>
>> Hi all,
>>
>> I'm using some functions from Breeze in a spark job but I get the
>> following build error :
>>
>> Error:scalac: bad symbolic reference. A signature in RandBasis.class
>> refers to term math3
>> in package org.apache.commons which is not available.
>> It may be completely missing from the current classpath, or the version on
>> the classpath might be incompatible with the version used when
>> compiling RandBasis.class.
>>
>> In my case, I just declare a new Gaussian distribution
>>
>> val g = new Gaussian(0d,1d)
>>
>> I'm using spark 1.1
>>
>>
>> Any ideas to fix this ?
>>
>>
>> Best regards,
>>
>>
>> Jao
>>
>
>


Re: Build error when using spark with breeze

2014-09-26 Thread Ted Yu
spark-core's dependency on commons-math3 is @ test scope (core/pom.xml):

<dependency>
  <groupId>org.apache.commons</groupId>
  <artifactId>commons-math3</artifactId>
  <version>3.3</version>
  <scope>test</scope>
</dependency>

Adjusting the scope should solve the problem below.

On Fri, Sep 26, 2014 at 8:42 AM, Jaonary Rabarisoa 
wrote:

> Hi all,
>
> I'm using some functions from Breeze in a spark job but I get the
> following build error :
>
> Error:scalac: bad symbolic reference. A signature in RandBasis.class
> refers to term math3
> in package org.apache.commons which is not available.
> It may be completely missing from the current classpath, or the version on
> the classpath might be incompatible with the version used when
> compiling RandBasis.class.
>
> In my case, I just declare a new Gaussian distribution
>
> val g = new Gaussian(0d,1d)
>
> I'm using spark 1.1
>
>
> Any ideas to fix this ?
>
>
> Best regards,
>
>
> Jao
>


Build error when using spark with breeze

2014-09-26 Thread Jaonary Rabarisoa
Hi all,

I'm using some functions from Breeze in a Spark job, but I get the following
build error:

Error:scalac: bad symbolic reference. A signature in RandBasis.class
refers to term math3
in package org.apache.commons which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling
RandBasis.class.

In my case, I just declare a new Gaussian distribution

val g = new Gaussian(0d,1d)

I'm using spark 1.1


Any ideas to fix this ?


Best regards,


Jao
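For context, the snippet with its import (a minimal sketch; the comment
about RandBasis restates what the compiler error itself says):

import breeze.stats.distributions.Gaussian

// Gaussian's constructor takes an implicit RandBasis, and RandBasis.class
// references org.apache.commons.math3; that reference is the missing
// symbol the compiler complains about
val g = new Gaussian(0d, 1d)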


Spark build error

2014-08-06 Thread Priya Ch
Hi,

I am trying to build the jars using the command:

mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean package

Executing the above command throws the following error:

[INFO] Spark Project Core . FAILURE [  0.295 s]
[INFO] Spark Project Bagel  SKIPPED
[INFO] Spark Project GraphX ... SKIPPED
[INFO] Spark Project ML Library ... SKIPPED
[INFO] Spark Project Streaming  SKIPPED
[INFO] Spark Project Tools  SKIPPED
[INFO] Spark Project Catalyst . SKIPPED
[INFO] Spark Project SQL .. SKIPPED
[INFO] Spark Project Hive . SKIPPED
[INFO] Spark Project REPL . SKIPPED
[INFO] Spark Project YARN Parent POM .. SKIPPED
[INFO] Spark Project YARN Stable API .. SKIPPED
[INFO] Spark Project Assembly . SKIPPED
[INFO] Spark Project External Twitter . SKIPPED
[INFO] Spark Project External Kafka ... SKIPPED
[INFO] Spark Project External Flume ... SKIPPED
[INFO] Spark Project External ZeroMQ .. SKIPPED
[INFO] Spark Project External MQTT  SKIPPED
[INFO] Spark Project Examples . SKIPPED
[INFO] 
[INFO] BUILD FAILURE
[INFO] 
[INFO] Total time: 3.748 s
[INFO] Finished at: 2014-08-07T01:00:48+05:30
[INFO] Final Memory: 24M/175M
[INFO] 
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-remote-resources-plugin:1.5:process (default) on project spark-core_2.10: Execution default of goal org.apache.maven.plugins:maven-remote-resources-plugin:1.5:process failed: For artifact {null:null:null:jar}: The groupId cannot be empty. -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-remote-resources-plugin:1.5:process (default) on project spark-core_2.10: Execution default of goal org.apache.maven.plugins:maven-remote-resources-plugin:1.5:process failed: For artifact {null:null:null:jar}: The groupId cannot be empty.
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:224)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:120)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:347)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:154)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:584)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:213)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:157)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.plugin.PluginExecutionException: Execution default of goal org.apache.maven.plugins:maven-remote-resources-plugin:1.5:process failed: For artifact {null:null:null:jar}: The groupId cannot be empty.
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:143)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
... 19 more
Caused by: org.apache.maven.artifact.InvalidArtifactRTException: For artifact {null:null:null:jar}: The groupId cannot be empty.



Can someone help me with this?
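The failure comes from maven-remote-resources-plugin rather than from
Spark's own sources, which points at the environment more than the code.
Two things worth ruling out first (an assumption based on where the
plugin fails, not a confirmed diagnosis): an old Maven release, and a
corrupted copy of the plugin in the local repository. A sketch of that
check:

mvn -version   # Spark's build expects a recent Maven 3.x
rm -rf ~/.m2/repository/org/apache/maven/plugins/maven-remote-resources-plugin
mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean package

If the plugin re-downloads cleanly and the error persists, upgrading
Maven is the next thing to try.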


Re: spark github source build error

2014-07-23 Thread m3.sharma
> > java.lang.AssertionError: assertion failed: List(object package$DebugNode, object package$DebugNode)
> > at scala.reflect.internal.Symbols$Symbol.suchThat(Symbols.scala:1678)
> > at scala.reflect.internal.Symbols$ClassSymbol.companionModule0(Symbols.scala:2988)
> > at scala.reflect.internal.Symbols$ClassSymbol.companionModule(Symbols.scala:2991)
> > at scala.tools.nsc.backend.jvm.GenASM$JPlainBuilder.genClass(GenASM.scala:1371)
> > at scala.tools.nsc.backend.jvm.GenASM$AsmPhase.run(GenASM.scala:120)
> > at scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1583)
> > at scala.tools.nsc.Global$Run.compileUnits(Global.scala:1557)
> > at scala.tools.nsc.Global$Run.compileSources(Global.scala:1553)
> > at scala.tools.nsc.Global$Run.compile(Global.scala:1662)
> > at xsbt.CachedCompiler0.run(CompilerInterface.scala:123)
> > at xsbt.CachedCompiler0.run(CompilerInterface.scala:99)
> > at xsbt.CompilerInterface.run(CompilerInterface.scala:27)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:606)
> > at sbt.compiler.AnalyzingCompiler.call(AnalyzingCompiler.scala:102)
> > at sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:48)
> > at sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:41)
> > at sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileScala$1$1.apply$mcV$sp(AggressiveCompile.scala:99)
> > at sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileScala$1$1.apply(AggressiveCompile.scala:99)
> > at sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileScala$1$1.apply(AggressiveCompile.scala:99)
> > at sbt.compiler.AggressiveCompile.sbt$compiler$AggressiveCompile$$timed(AggressiveCompile.scala:166)
> > at sbt.compiler.AggressiveCompile$$anonfun$3.compileScala$1(AggressiveCompile.scala:98)
> > at sbt.compiler.AggressiveCompile$$anonfun$3.apply(AggressiveCompile.scala:143)
> > at sbt.compiler.AggressiveCompile$$anonfun$3.apply(AggressiveCompile.scala:87)
> > at sbt.inc.IncrementalCompile$$anonfun$doCompile$1.apply(Compile.scala:39)
> > at sbt.inc.IncrementalCompile$$anonfun$doCompile$1.apply(Compile.scala:37)
> > at sbt.inc.IncrementalCommon.cycle(Incremental.scala:99)
> > at sbt.inc.Incremental$$anonfun$1.apply(Incremental.scala:38)
> > at sbt.inc.Incremental$$anonfun$1.apply(Incremental.scala:37)
> > at sbt.inc.Incremental$.manageClassfiles(Incremental.scala:65)
> > at sbt.inc.Incremental$.compile(Incremental.scala:37)
> > at sbt.inc.IncrementalCompile$.apply(Compile.scala:27)
> > at sbt.compiler.AggressiveCompile.compile2(AggressiveCompile.scala:157)
> > at sbt.compiler.AggressiveCompile.compile1(AggressiveCompile.scala:71)
> > at sbt.compiler.AggressiveCompile.apply(AggressiveCompile.scala:46)
> > at sbt.Compiler$.apply(Compiler.scala:75)
> > at sbt.Compiler$.apply(Compiler.scala:66)
> > at sbt.Defaults$.sbt$Defaults$$compileTaskImpl(Defaults.scala:770)
> > at sbt.Defaults$$anonfun$compileTask$1.apply(Defaults.scala:762)
> > at sbt.Defaults$$anonfun$compileTask$1.apply(Defaults.scala:762)
> > at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
> > at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:42)
> > at sbt.std.Transform$$anon$4.work(System.scala:64)
> > at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
> > at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
> > at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:18)
> > at sbt.Execute.work(Execute.scala:244)
> > at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
> > at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
> > at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:160)
> > at sbt.CompletionService$$anon$2.call(CompletionService.scala:30)

Re: spark github source build error

2014-07-23 Thread Xiangrui Meng


spark github source build error

2014-07-23 Thread m3.sharma
at sbt.compiler.AnalyzingCompiler.call(AnalyzingCompiler.scala:102)
at sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:48)
at sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:41)
at sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileScala$1$1.apply$mcV$sp(AggressiveCompile.scala:99)
at sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileScala$1$1.apply(AggressiveCompile.scala:99)
at sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileScala$1$1.apply(AggressiveCompile.scala:99)
at sbt.compiler.AggressiveCompile.sbt$compiler$AggressiveCompile$$timed(AggressiveCompile.scala:166)
at sbt.compiler.AggressiveCompile$$anonfun$3.compileScala$1(AggressiveCompile.scala:98)
at sbt.compiler.AggressiveCompile$$anonfun$3.apply(AggressiveCompile.scala:143)
at sbt.compiler.AggressiveCompile$$anonfun$3.apply(AggressiveCompile.scala:87)
at sbt.inc.IncrementalCompile$$anonfun$doCompile$1.apply(Compile.scala:39)
at sbt.inc.IncrementalCompile$$anonfun$doCompile$1.apply(Compile.scala:37)
at sbt.inc.IncrementalCommon.cycle(Incremental.scala:99)
at sbt.inc.Incremental$$anonfun$1.apply(Incremental.scala:38)
at sbt.inc.Incremental$$anonfun$1.apply(Incremental.scala:37)
at sbt.inc.Incremental$.manageClassfiles(Incremental.scala:65)
at sbt.inc.Incremental$.compile(Incremental.scala:37)
at sbt.inc.IncrementalCompile$.apply(Compile.scala:27)
at sbt.compiler.AggressiveCompile.compile2(AggressiveCompile.scala:157)
at sbt.compiler.AggressiveCompile.compile1(AggressiveCompile.scala:71)
at sbt.compiler.AggressiveCompile.apply(AggressiveCompile.scala:46)
at sbt.Compiler$.apply(Compiler.scala:75)
at sbt.Compiler$.apply(Compiler.scala:66)
at sbt.Defaults$.sbt$Defaults$$compileTaskImpl(Defaults.scala:770)
at sbt.Defaults$$anonfun$compileTask$1.apply(Defaults.scala:762)
at sbt.Defaults$$anonfun$compileTask$1.apply(Defaults.scala:762)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:42)
at sbt.std.Transform$$anon$4.work(System.scala:64)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:18)
at sbt.Execute.work(Execute.scala:244)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:160)
at sbt.CompletionService$$anon$2.call(CompletionService.scala:30)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
[error] (sql/compile:compile) java.lang.AssertionError: assertion failed: List(object package$DebugNode, object package$DebugNode)
[error] Total time: 126 s, completed Jul 23, 2014 11:19:27 AM

I don't want Spark SQL; I can do without it.
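The duplicated "object package$DebugNode" in that assertion is the kind
of thing stale class files from an earlier compile can produce, so a
clean rebuild is a cheap first check (an assumption from the shape of
the error, not a confirmed fix):

sbt/sbt clean
sbt/sbt assembly

If Spark SQL really isn't needed, project-scoped tasks such as
sbt/sbt core/package can compile individual modules (a sketch; the
project ids come from SparkBuild.scala and may differ by release).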







Re: Apache Spark 0.9.0 Build Error

2014-03-18 Thread x
I tried to build 0.9.0 on Windows with Cygwin yesterday and it passed.
Did you launch it from Cygwin?

-xj
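For anyone unsure what launching it from Cygwin looks like in practice:
the build is started from a Cygwin bash shell rather than from cmd.exe,
roughly like this (the checkout path is hypothetical):

cd /cygdrive/c/work/spark-0.9.0
./sbt/sbt assembly

Cygwin can execute the sbt/sbt bash script directly, which is exactly
what plain cmd.exe cannot do.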



Re: Apache Spark 0.9.0 Build Error

2014-03-18 Thread wapisani
Hi Chen,

I tried "sbt\sbt assembly" and I got an error of " 'sbt\sbt' is not
recognized as an internal or external command, operable program or batch
file."






-- 
Will Pisani
Fourth-Year Chemical Engineering Student
Research Scholar
Honors Institute
Michigan Technological University





Re: Apache Spark 0.9.0 Build Error

2014-03-18 Thread Robin Cjc
Hi, if you run that under Windows, you should use "\" in place of "/";
sbt/sbt means the sbt file under the sbt folder.
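One caveat on the backslash advice: sbt/sbt in the Spark tree is a bash
script with no .bat or .cmd extension, so cmd.exe cannot execute it even
with backslashes, which matches the "not recognized" error reported above.
Under cmd.exe, a separately installed sbt run from the Spark root is
usually the practical route (a sketch, assuming sbt is already on the
PATH):

REM cmd.exe, from the Spark checkout directory:
sbt assembly

REM what works in a Unix-style shell (Cygwin, Linux, OS X):
./sbt/sbt assembly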


Re: Apache Spark 0.9.0 Build Error

2014-03-18 Thread wapisani
I tried that command on Fedora and got a lot of seemingly random downloads
(around 250), and it appeared that something was trying to start
BitTorrent. That command "./sbt/sbt assembly" doesn't work on Windows.

I installed sbt separately. Is there a way to determine if I'm using the
sbt that's included with Spark or the standalone version?
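As for telling the two sbt installations apart, the shell can answer that
directly. A sketch (which on Linux, where on Windows):

which sbt              # Fedora: path of the standalone sbt, if any
where sbt              # Windows cmd.exe equivalent
./sbt/sbt sbtVersion   # runs the copy bundled with Spark, from the Spark root

The ~250 downloads on Fedora are also expected on a first run: the
bundled launcher fetches sbt itself plus every build dependency into the
local ivy cache, which can look alarming but is normal.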





-- 
Will Pisani
Fourth-Year Chemical Engineering Student
Research Scholar
Honors Institute
Michigan Technological University





Re: Apache Spark 0.9.0 Build Error

2014-03-17 Thread Mark Hamstra
Try ./sbt/sbt assembly




Apache Spark 0.9.0 Build Error

2014-03-17 Thread wapisani
Good morning! I'm attempting to build Apache Spark 0.9.0 on Windows 8. I've
installed all prerequisites (except Hadoop) and ran "sbt/sbt assembly" while
in the root directory. I'm getting an error after the line "Set current
project to root ". The error is:
[error] Not a valid command: /
[error] /sbt
[error]  ^

Do you know why I'm getting this error?

Thank you very much,
Will
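A plausible reading of that output, given the replies earlier in this
thread (an interpretation, not a confirmed diagnosis): cmd.exe found a
separately installed sbt on the PATH, launched it, and handed it "/sbt"
as if it were a command, which sbt then rejected; the bundled sbt/sbt
bash script never ran at all. That squares with the advice to either run
./sbt/sbt assembly from a Unix-style shell or plain sbt assembly with a
standalone sbt.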


