Re: failed to run spark sample on windows

2015-09-29 Thread saurfang
See
http://stackoverflow.com/questions/26516865/is-it-possible-to-run-hadoop-jobs-like-the-wordcount-sample-in-the-local-mode,
https://issues.apache.org/jira/browse/SPARK-6961 and finally
https://issues.apache.org/jira/browse/HADOOP-10775. The easy solution is to
download a Windows Hadoop distribution and point %HADOOP_HOME% to that
location so winutils.exe can be picked up.
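As a minimal sketch of that workaround (the `C:\hadoop` path below is a placeholder, not from this thread): Hadoop's shell utilities consult the `hadoop.home.dir` system property before falling back to the `HADOOP_HOME` environment variable, so the location can also be set programmatically before Spark touches Hadoop code.

```java
// Hedged sketch: on Windows, Hadoop's Shell class looks for winutils.exe
// under <hadoop home>\bin. It resolves the home from the "hadoop.home.dir"
// system property first, then from %HADOOP_HOME%. The path here is a
// placeholder for wherever the Windows Hadoop distribution was unpacked.
public class WinutilsWorkaround {
    public static void main(String[] args) {
        System.setProperty("hadoop.home.dir", "C:\\hadoop");
        // Spark/Hadoop code initialized after this point can now locate
        // C:\hadoop\bin\winutils.exe.
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```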



--
View this message in context: 
http://apache-spark-developers-list.1001551.n3.nabble.com/failed-to-run-spark-sample-on-windows-tp14393p14407.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org



Re: [VOTE] Release Apache Spark 1.5.0 (RC3)

2015-09-03 Thread saurfang
+1. Compiled on Windows with YARN and Hive. Tested Tungsten aggregation and
observed similar (good) performance compared to 1.4 with unsafe enabled. Ran a
few workloads and tested the Spark SQL Thrift server.



--
View this message in context: 
http://apache-spark-developers-list.1001551.n3.nabble.com/VOTE-Release-Apache-Spark-1-5-0-RC3-tp13928p13953.html



Re: [VOTE] Release Apache Spark 1.5.0 (RC2)

2015-08-27 Thread saurfang
Never mind. It looks like this was fixed in
https://github.com/apache/spark/pull/8053 but didn't make the cut? Even
though the associated JIRA is targeted for 1.6, I was able to submit to YARN
from Windows without a problem in 1.4. I'm wondering whether this fix will be
merged into the 1.5 branch. Let me know if someone thinks I'm just not doing
the compile and/or spark-submit correctly.



--
View this message in context: 
http://apache-spark-developers-list.1001551.n3.nabble.com/VOTE-Release-Apache-Spark-1-5-0-RC2-tp13826p13872.html



Re: [VOTE] Release Apache Spark 1.5.0 (RC2)

2015-08-27 Thread saurfang
Compiled on Windows with YARN and Hive. However, I got an exception when
submitting an application to YARN:

java.net.URISyntaxException: Illegal character in opaque part at index 2:
D:\TEMP\spark-b32c5b5b-a9fa-4cfd-a233-3977588d4092\__spark_conf__1960856096319316224.zip
    at java.net.URI$Parser.fail(URI.java:2829)
    at java.net.URI$Parser.checkChars(URI.java:3002)
    at java.net.URI$Parser.parse(URI.java:3039)
    at java.net.URI.<init>(URI.java:595)
    at org.apache.spark.deploy.yarn.Client.org$apache$spark$deploy$yarn$Client$$distribute$1(Client.scala:321)
    at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:417)

It looks like we can either call `new File(path).toURI` here:
https://github.com/apache/spark/blob/v1.5.0-rc2/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala#L321

or make sure the file path uses the '/' separator here:
https://github.com/apache/spark/blob/v1.5.0-rc2/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala#L417
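The failure can be reproduced outside Spark: passing a raw Windows path to the `URI` constructor makes the parser read `D:` as a scheme and reject the backslash, while `File.toURI()` builds a proper `file:` URI. A minimal Java sketch (the path below is a shortened placeholder, not the actual conf archive):

```java
import java.io.File;
import java.net.URI;
import java.net.URISyntaxException;

public class WindowsUriDemo {
    public static void main(String[] args) {
        // Placeholder for the raw Windows path Client.scala hands to new URI(...)
        String path = "D:\\TEMP\\spark-conf.zip";
        try {
            // "D" parses as the scheme; the '\' that follows is illegal,
            // producing "Illegal character in opaque part at index 2: ..."
            new URI(path);
        } catch (URISyntaxException e) {
            System.out.println(e.getMessage());
        }
        // The suggested fix: File.toURI() yields a well-formed file: URI.
        URI fixed = new File(path).toURI();
        System.out.println(fixed.getScheme()); // "file"
    }
}
```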




--
View this message in context: 
http://apache-spark-developers-list.1001551.n3.nabble.com/VOTE-Release-Apache-Spark-1-5-0-RC2-tp13826p13871.html



Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

2015-06-08 Thread saurfang
+1

Built for Hadoop 2.4. Ran a few jobs on YARN and tested spark.sql.unsafe,
whose performance looks great!



--
View this message in context: 
http://apache-spark-developers-list.1001551.n3.nabble.com/VOTE-Release-Apache-Spark-1-4-0-RC4-tp12582p12671.html