Re: Installing Spark on Mac

2016-03-15 Thread Aida Tefera
th me trying to change the IP address in SPARK_MASTER_IP to the IP address of the master node? If so, how would I go about doing that? Thanks, Aida Sent from my iPhone > On 11 Mar 2016, at 08:37, Jakob Odersky <ja...@odersky.com> wrote: > > regarding my previous message,
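In Spark 1.x standalone mode, SPARK_MASTER_IP is normally set in conf/spark-env.sh, created by copying the shipped template conf/spark-env.sh.template. A minimal sketch, assuming a hypothetical master address of 192.168.0.10:

```shell
# conf/spark-env.sh -- created by copying conf/spark-env.sh.template.
# 192.168.0.10 is a hypothetical address; substitute the master node's real IP.
export SPARK_MASTER_IP=192.168.0.10
```

For a single-machine setup using `--master local[*]`, this variable is not needed at all.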

Re: Installing Spark on Mac

2016-03-10 Thread Aida
Hi Gaini, thanks for your response. Please see the below contents of the files in the conf directory: 1. docker.properties.template # Licensed to the Apache Software Foundation (ASF) under one or more # contributor license agreements. See the NOTICE file distributed with # this work for

Re: Installing Spark on Mac

2016-03-10 Thread Aida Tefera
_HOME env param, downloaded a > fresh 1.6.0 tarball, > unzipped it to local dir (~/Downloads), and it ran just fine - the driver > port is some randomly generated large number. > So SPARK_HOME is definitely not needed to run this. > > Aida, you are not running this as the su

Re: Installing Spark on Mac

2016-03-09 Thread Aida Tefera
Hi Jakob, Tried running the command env|grep SPARK; nothing comes back. Tried env|grep Spark; comes back with PWD=/Users/aidatefera/Spark, which is the directory I created for Spark once I downloaded the tgz file. Tried running ./bin/spark-shell; comes back with same error as below; i.e could
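The two greps above behave differently because grep is case-sensitive by default: the pattern `SPARK` only matches upper-case names such as SPARK_HOME, while `Spark` happens to match the working-directory entry PWD=/Users/aidatefera/Spark. A minimal sketch (the SPARK_HOME value is hypothetical, purely for illustration):

```shell
# grep is case-sensitive by default, so these two pipelines match different things.
export SPARK_HOME=/opt/spark   # hypothetical value, for illustration only
env | grep SPARK               # matches SPARK_HOME
env | grep -i spark            # case-insensitive: also matches e.g. PWD=.../Spark
```

An empty result from `env | grep SPARK` simply means no SPARK_* variables are set, which is fine for running a pre-built distribution locally.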

Re: Installing Spark on Mac

2016-03-09 Thread Aida Tefera
_HOME, it should look in the current directory > for the /conf dir. > > The defaults should be relatively safe, I’ve been using them with local mode > on my Mac for a long while without any need to change them. > >> On Mar 9, 2016, at 2:20 PM, Aida Tefera <aida1.tef...

Re: Installing Spark on Mac

2016-03-09 Thread Aida Tefera
eral configuration files. >> I believe that the two that you should look at are >> spark-defaults.conf >> spark-env.sh >> >> >>> On Mar 9, 2016, at 1:45 PM, Aida Tefera <aida1.tef...@gmail.com> wrote: >>> >>> Hi Tristan, thanks for your

Re: Installing Spark on Mac

2016-03-09 Thread Aida Tefera
messages etc. do I need to do anything else? Thanks, Aida Sent from my iPhone > On 8 Mar 2016, at 22:42, Jakob Odersky <ja...@odersky.com> wrote: > > I've had some issues myself with the user-provided-Hadoop version. > If you simply just want to get started, I would recommend d

Re: Installing Spark on Mac

2016-03-09 Thread Aida
Hi everyone, thanks for all your support I went with your suggestion Cody/Jakob and downloaded a pre-built version with Hadoop this time and I think I am finally making some progress :) ukdrfs01:spark-1.6.0-bin-hadoop2.6 aidatefera$ ./bin/spark-shell --master local[2] log4j:WARN No appenders
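The log4j:WARN lines at startup usually mean the shell found no conf/log4j.properties file and fell back to defaults; they are harmless. A commonly suggested cleanup, assuming the standard layout of the pre-built tarball, is to copy the shipped template:

```shell
# Run from the top of the unpacked distribution, e.g. spark-1.6.0-bin-hadoop2.6/
cp conf/log4j.properties.template conf/log4j.properties
# Optionally reduce console noise by editing the copied file, e.g.:
#   log4j.rootCategory=WARN, console
```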

Re: Installing Spark on Mac

2016-03-08 Thread Aida Tefera
> > What happens when you do > > ./bin/spark-shell --master local[2] > > or > > ./bin/start-all.sh > > > >> On Tue, Mar 8, 2016 at 3:45 PM, Aida Tefera <aida1.tef...@gmail.com> wrote: >> Hi Cody, thanks for your reply >> >> I tried
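The two commands quoted above exercise different things: `spark-shell --master local[2]` runs the driver and executors in a single JVM with two worker threads and needs no daemons, while `sbin/start-all.sh` launches a standalone master and worker as background daemons. A sketch of each, assuming they are run from the top of an unpacked pre-built distribution:

```shell
# Local mode: everything in one JVM, two worker threads, nothing to install.
./bin/spark-shell --master "local[2]"

# Standalone mode: start master + worker daemons, then point a shell at them.
# spark://$(hostname):7077 is the default master URL; adjust if customized.
./sbin/start-all.sh
./bin/spark-shell --master "spark://$(hostname):7077"
```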

Re: Installing Spark on Mac

2016-03-08 Thread Aida Tefera
Hi Cody, thanks for your reply. I tried "sbt/sbt clean assembly" in the Terminal; somehow I still end up with errors. I have looked at the below links; they don't give much detail on how to install it before executing "./sbin/start-master.sh". Thanks, Aida Sent from my iPh

Re: Installing Spark on Mac

2016-03-08 Thread Aida
tried sbt/sbt package; seemed to run fine until it didn't. Was wondering whether the below error has to do with my JVM version. Any thoughts? Thanks ukdrfs01:~ aidatefera$ cd Spark ukdrfs01:Spark aidatefera$ cd spark-1.6.0 ukdrfs01:spark-1.6.0 aidatefera$ sbt/sbt package NOTE: The sbt/sbt script

Re: Installing Spark on Mac

2016-03-08 Thread Aida
Hi all, Thanks everyone for your responses; really appreciate it. Eduardo - I tried your suggestions but ran into some issues; please see below: ukdrfs01:Spark aidatefera$ cd spark-1.6.0 ukdrfs01:spark-1.6.0 aidatefera$ build/mvn -DskipTests clean package Using `mvn` from path: /usr/bin/mvn
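Two things commonly bite Maven builds of Spark 1.6 on a Mac: the system /usr/bin/mvn (visible in the "Using `mvn` from path" line) can be older than the Maven 3.3.3 the Spark 1.6 build requires, and Maven needs extra memory or the build can fail with PermGen/OOM errors on Java 7. A sketch of the environment setup, using the MAVEN_OPTS values recommended in the Spark 1.6 building documentation:

```shell
# From the top of the Spark source tree (e.g. ~/Spark/spark-1.6.0).
# MAVEN_OPTS values are the ones recommended by the Spark 1.6 build docs.
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
build/mvn -DskipTests clean package
```

Note that `build/mvn` can also bootstrap a suitable Maven of its own when no usable one is found on the PATH.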

Installing Spark on Mac

2016-03-04 Thread Aida
not planning to at this stage. I also downloaded Scala from the Scala website; do I need to download anything else? I am very eager to learn more about Spark but am unsure about the best way to do it. I would be happy for any suggestions or ideas. Many thanks, Aida -- View this message in context