I have found the following to work for me on Windows 8.1:
1) Run sbt assembly.
2) Use Maven. You can find the Maven commands for your build in:
docs\building-spark.md
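
For reference, the Maven route typically looks something like this (a sketch based on docs\building-spark.md for Spark 1.1.0, using Windows cmd syntax; the hadoop-2.4 profile and Hadoop version are examples -- pick the ones matching your environment):

```shell
:: Give Maven enough memory first (Windows cmd syntax):
set MAVEN_OPTS=-Xmx2g -XX:MaxPermSize=512m

:: Build Spark from the source root, skipping tests.
:: The profile and hadoop.version below are illustrative;
:: see docs\building-spark.md for the combinations supported by 1.1.0.
mvn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
```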


-----Original Message-----
From: Ishwardeep Singh [mailto:ishwardeep.si...@impetus.co.in] 
Sent: Thursday, November 27, 2014 11:31 PM
To: u...@spark.incubator.apache.org
Subject: Unable to compile spark 1.1.0 on windows 8.1

Hi,

I am trying to compile Spark 1.1.0 on Windows 8.1, but I get the following
compilation errors.

[info] Compiling 3 Scala sources to
D:\myworkplace\software\spark-1.1.0\project\target\scala-2.10\sbt0.13\classes...
[error] D:\myworkplace\software\spark-1.1.0\project\SparkBuild.scala:26:
object sbt is not a member of package com.typesafe
[error] import com.typesafe.sbt.pom.{PomBuild, SbtPomKeys}
[error]                     ^
[error] D:\myworkplace\software\spark-1.1.0\project\SparkBuild.scala:53: not
found: type PomBuild
[error] object SparkBuild extends PomBuild {
[error]                           ^
[error] D:\myworkplace\software\spark-1.1.0\project\SparkBuild.scala:121:
not found: value SbtPomKeys
[error]     otherResolvers <<= SbtPomKeys.mvnLocalRepository(dotM2 => Seq(Resolver.file("dotM2", dotM2))),
[error]                        ^
[error] D:\myworkplace\software\spark-1.1.0\project\SparkBuild.scala:165:
value projectDefinitions is not a member of AnyRef
[error]     super.projectDefinitions(baseDirectory).map { x =>
[error]           ^
[error] four errors found
[error] (plugins/compile:compile) Compilation failed

I have also set up Scala 2.10.

I need help resolving this issue.

Regards,
Ishwardeep 



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Unable-to-compile-spark-1-1-0-on-windows-8-1-tp19996.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
