Hi Judy,

Thank you for your response.

When I try to compile using Maven with "mvn -Dhadoop.version=1.2.1 -DskipTests
clean package", I get the error "Error: Could not find or load main class".
I am using Maven 3.0.4.
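
In case it helps, this is the full sequence I use from a fresh Windows command
prompt. The MAVEN_OPTS line is only my attempt at the memory settings that the
"Building Spark with Maven" documentation recommends; I am not sure whether it
is needed (or correct) on Windows. The remaining lines are exactly what I type:

    REM memory settings suggested by the Spark build docs (my assumption for Windows)
    set MAVEN_OPTS=-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m
    REM change to the Spark source folder and build against Hadoop 1.2.1
    cd /d d:\myworkplace\software\spark-1.1.0
    mvn -Dhadoop.version=1.2.1 -DskipTests clean package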

And when I run the command "sbt package", I get the same exception as before.

These are the steps I have followed:

1. Downloaded spark-1.1.0.tgz from the Spark site and extracted the archive to
the folder "d:\myworkplace\software\spark-1.1.0".
2. Downloaded sbt-0.13.7.zip and extracted it to the folder
"d:\myworkplace\software\sbt".
3. Updated the PATH environment variable to include
"d:\myworkplace\software\sbt\bin".
4. Navigated to the Spark folder "d:\myworkplace\software\spark-1.1.0".
5. Ran the command "sbt assembly" (the exact commands for steps 3 to 5 are
shown after this list).
6. This command downloads a number of libraries, and then I get an initial
error that the path
C:\Users\ishwardeep.singh\.sbt\0.13\staging\ec3aa8f39111944cc5f2\sbt-pom-reader
does not exist.
7. I manually created the subfolder "ec3aa8f39111944cc5f2\sbt-pom-reader",
retried, and then hit the next error, the one described in my original message.
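
To be precise, these are the commands behind steps 3 to 5 as I run them from
the Windows command prompt (in step 3 I actually set PATH permanently through
the system environment variables; the "set PATH" line below is just the
per-session equivalent):

    REM step 3: make sbt available on the PATH for this session
    set PATH=%PATH%;d:\myworkplace\software\sbt\bin
    REM step 4: change to the Spark source folder
    cd /d d:\myworkplace\software\spark-1.1.0
    REM step 5: build the Spark assembly jar
    sbt assembly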

Is this the correct procedure to compile Spark 1.1.0? Please let me know.

Hoping to hear from you soon.

Regards,
ishwardeep


