Re: spark-shell not working on standalone cluster (java.io.IOException: Cannot run program compute-classpath.sh)

2013-11-25 Thread Grega Kešpret
Thanks, will try it out!

Grega
--
*Grega Kešpret*
Analytics engineer
Celtra — Rich Media Mobile Advertising
celtra.com | @celtramobile

On Mon, Nov 25, 2013 at 11:54 PM, Aaron Davidson wrote:
> There is a pull request currently to fix this exact issue, I believe, at https://github.com/apache/incubator-spark/pull/192.

Re: spark-shell not working on standalone cluster (java.io.IOException: Cannot run program compute-classpath.sh)

2013-11-25 Thread Aaron Davidson
There is a pull request currently to fix this exact issue, I believe, at https://github.com/apache/incubator-spark/pull/192. It's very small and only touches the script files, so you could apply it to your current version and distribute it to the workers. The fix here is that you add an additional

Re: spark-shell not working on standalone cluster (java.io.IOException: Cannot run program compute-classpath.sh)

2013-11-25 Thread Grega Kešpret
It seems there is already an open ticket for this - https://spark-project.atlassian.net/browse/SPARK-905, but for version 0.7.3.

Grega

Re: spark-shell not working on standalone cluster (java.io.IOException: Cannot run program compute-classpath.sh)

2013-11-25 Thread Grega Kešpret
Sorry, forgot to mention: I run Spark version "v0.8.0-incubating" from https://github.com/apache/incubator-spark.git. It seems to work when the local Spark directory is also /opt/spark, so I think this confirms my suspicion that SPARK_HOME somehow doesn't get passed to the Executor?

Grega
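
The "SPARK_HOME doesn't get passed" suspicion can be illustrated with a minimal shell sketch: a process started with a scrubbed environment (which is effectively what a remote worker daemon is, since it was not launched from your interactive shell) does not see variables you exported locally. This is a generic demonstration of the mechanism, not Spark's actual launch code:

```shell
# Export SPARK_HOME in the local (driver-side) shell, using the path
# from this thread.
export SPARK_HOME=/home/grega/mab/analyzer/target/spark

# Launch a child with an empty environment (env -i), simulating a
# daemon-started worker process: the variable is gone.
env -i sh -c 'echo "SPARK_HOME on worker side: ${SPARK_HOME:-<unset>}"'
# prints: SPARK_HOME on worker side: <unset>
```

If the executor launch command instead bakes in the driver's SPARK_HOME literally, you get the opposite problem: the worker receives a path that only exists on the driver machine, which matches the observation that everything works once both sides use /opt/spark.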

spark-shell not working on standalone cluster (java.io.IOException: Cannot run program compute-classpath.sh)

2013-11-25 Thread Grega Kešpret
Hi, I'm trying to run spark-shell and point it at a Spark standalone cluster. I have Spark locally in a different directory than on the cluster. Locally, I have it in "/home/grega/mab/analyzer/target/spark" and on the cluster I have it in "/opt/spark". When I run the spark-shell script with: SPARK_HOM
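
The IOException in the subject line can be simulated with a small sketch: the worker is asked to execute compute-classpath.sh at the driver's local path, which does not exist on the worker's filesystem. The exact script location inside SPARK_HOME is illustrative here; the path is the one from this thread:

```shell
# Simulate the worker side: it is handed the driver's SPARK_HOME and
# tries to run a script at a path that only exists on the driver machine.
SCRIPT=/home/grega/mab/analyzer/target/spark/bin/compute-classpath.sh

if [ ! -x "$SCRIPT" ]; then
    # This mirrors the shape of the Java error in the subject line
    # (java.io.IOException: Cannot run program ...).
    echo "Cannot run program \"$SCRIPT\": No such file or directory"
fi
```

The workaround that surfaces later in the thread follows directly: make the driver-side layout match the cluster's /opt/spark, so the path embedded in the executor launch command resolves on every machine.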