bq. on one node it works but on the other it gives me the above error.

Can you tell us the difference between the environments on the two nodes?
Does the other node use Java 8?
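
One more thing worth checking: build/sbt downloads the launcher jar the first
time it runs, so an interrupted or proxied download can leave a truncated file
behind, which would produce exactly that "Invalid or corrupt jarfile" message.
Roughly along these lines (run from the Spark home directory on the failing
node; the jar path is taken from your output) you can see whether the launcher
jar is intact and which Java each node uses:

$ java -version                                # compare the JDK on both nodes
$ echo $JAVA_HOME
$ file build/sbt-launch-0.13.7.jar             # should report a zip / Java archive
$ unzip -l build/sbt-launch-0.13.7.jar | head  # lists entries if the jar is readable
$ rm build/sbt-launch-0.13.7.jar               # if it looks corrupt, delete it ...
$ ./build/sbt                                  # ... and let the script try to fetch it again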

Cheers

On Mon, Jul 27, 2015 at 11:38 AM, Rahul Palamuttam <rahulpala...@gmail.com>
wrote:

> Hi All,
>
> I hope this is the right place to post troubleshooting questions.
> I've been following the install instructions, and I get the following error
> when running this command from the Spark home directory:
>
> $./build/sbt
> Using /usr/java/jdk1.8.0_20/ as default JAVA_HOME.
> Note, this will be overridden by -java-home if it is set.
> Attempting to fetch sbt
> Launching sbt from build/sbt-launch-0.13.7.jar
> Error: Invalid or corrupt jarfile build/sbt-launch-0.13.7.jar
>
> However, when I run sbt assembly, it compiles with a couple of warnings, but
> it works nonetheless.
> Is the build/sbt script deprecated? I do notice on one node it works but on
> the other it gives me the above error.
>
> Thanks,
>
> Rahul P
