Re: Installing Spark on Mac

2016-03-15 Thread Aida Tefera
Hi Jakob, sorry for my late reply

I tried to run the below; it came back with "netstat: lunt: unknown or 
uninstrumented protocol".

I also tried uninstalling version 1.6.0 and installing version 1.5.2 with Java 7 
and Scala version 2.10.6; I got the same error messages.
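
For what it's worth, the "netstat: lunt" error happens because macOS ships the BSD netstat, which doesn't understand the Linux-style combined `-plunt` flags and parses "lunt" as a protocol name. A rough macOS equivalent (a sketch, assuming the `lsof` that ships with OS X; the commands are guarded so they don't abort if a tool is missing) would be:

```shell
# "-plunt" is Linux netstat syntax; BSD netstat on macOS parses "lunt" as a
# protocol name, hence the error. Roughly equivalent checks on macOS:

# List listening TCP sockets together with the owning process
# (prefix with sudo to also see other users' processes):
lsof -iTCP -sTCP:LISTEN -n -P || true

# Or with BSD netstat (addresses/ports only, no process names):
netstat -an -f inet 2>/dev/null | grep LISTEN || true
```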

Do you think it would be worth trying to change SPARK_MASTER_IP to the IP 
address of the master node? If so, how would I go about doing that? 
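
For future readers: SPARK_MASTER_IP would be set in conf/spark-env.sh, though for a purely local spark-shell it shouldn't be needed at all; for bind errors like the one in this thread, SPARK_LOCAL_IP is usually the more relevant variable. A sketch only (the addresses below are placeholders, not values verified for this setup):

```shell
# conf/spark-env.sh -- illustrative sketch; addresses are placeholders.
# SPARK_MASTER_IP only matters for a standalone master, not for local mode:
export SPARK_MASTER_IP=192.168.1.10
# For "could not bind" errors, forcing loopback is the usual first try:
export SPARK_LOCAL_IP=127.0.0.1
```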

Thanks, 

Aida

Sent from my iPhone

> On 11 Mar 2016, at 08:37, Jakob Odersky  wrote:
> 
> regarding my previous message, I forgot to mention to run netstat as
> root (sudo netstat -plunt)
> sorry for the noise
> 
>> On Fri, Mar 11, 2016 at 12:29 AM, Jakob Odersky  wrote:
>> Some more diagnostics/suggestions:
>> 
>> 1) are other services listening to ports in the 4000 range (run
>> "netstat -plunt")? Maybe there is an issue with the error message
>> itself.
>> 
>> 2) are you sure the correct java version is used? java -version
>> 
>> 3) can you revert all installation attempts you have done so far,
>> including files installed by brew/macports or maven and try again?
>> 
>> 4) are there any special firewall rules in place, forbidding
>> connections on localhost?
>> 
>> This is very weird behavior you're seeing. Spark is supposed to work
>> out-of-the-box with ZERO configuration necessary for running a local
>> shell. Again, my prime suspect is a previous, failed Spark
>> installation messing up your config.
>> 
>>> On Thu, Mar 10, 2016 at 12:24 PM, Tristan Nixon  
>>> wrote:
>>> If you type ‘whoami’ in the terminal, and it responds with ‘root’ then 
>>> you’re the superuser.
>>> However, as mentioned below, I don’t think it's a relevant factor.
>>> 
 On Mar 10, 2016, at 12:02 PM, Aida Tefera  wrote:
 
 Hi Tristan,
 
 I'm afraid I wouldn't know whether I'm running it as super user.
 
 I have Java version 1.8.0_73 and Scala version 2.11.7
 
 Sent from my iPhone
 
> On 9 Mar 2016, at 21:58, Tristan Nixon  wrote:
> 
> That’s very strange. I just un-set my SPARK_HOME env param, downloaded a 
> fresh 1.6.0 tarball,
> unzipped it to local dir (~/Downloads), and it ran just fine - the driver 
> port is some randomly generated large number.
> So SPARK_HOME is definitely not needed to run this.
> 
> Aida, you are not running this as the super-user, are you?  What versions 
> of Java & Scala do you have installed?
> 
>> On Mar 9, 2016, at 3:53 PM, Aida Tefera  wrote:
>> 
>> Hi Jakob,
>> 
>> Tried running the command env|grep SPARK; nothing comes back
>> 
>> Tried env|grep Spark; which is the directory I created for Spark once I 
>> downloaded the tgz file; comes back with PWD=/Users/aidatefera/Spark
>> 
>> Tried running ./bin/spark-shell ; comes back with same error as below; 
>> i.e could not bind to port 0 etc.
>> 
>> Sent from my iPhone
>> 
>>> On 9 Mar 2016, at 21:42, Jakob Odersky  wrote:
>>> 
>>> As Tristan mentioned, it looks as though Spark is trying to bind on
>>> port 0 and then 1 (which is not allowed). Could it be that some
>>> environment variables from your previous installation attempts are
>>> polluting your configuration?
>>> What does running "env | grep SPARK" show you?
>>> 
>>> Also, try running just "./bin/spark-shell" (without the --master
>>> argument), maybe your shell is doing some funky stuff with the
>>> brackets.
>> 
>> -
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
 
 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org

Re: Installing Spark on Mac

2016-03-11 Thread Jakob Odersky
regarding my previous message, I forgot to mention to run netstat as
root (sudo netstat -plunt)
sorry for the noise

On Fri, Mar 11, 2016 at 12:29 AM, Jakob Odersky  wrote:
> Some more diagnostics/suggestions:
>
> 1) are other services listening to ports in the 4000 range (run
> "netstat -plunt")? Maybe there is an issue with the error message
> itself.
>
> 2) are you sure the correct java version is used? java -version
>
> 3) can you revert all installation attempts you have done so far,
> including files installed by brew/macports or maven and try again?
>
> 4) are there any special firewall rules in place, forbidding
> connections on localhost?
>
> This is very weird behavior you're seeing. Spark is supposed to work
> out-of-the-box with ZERO configuration necessary for running a local
> shell. Again, my prime suspect is a previous, failed Spark
> installation messing up your config.
>
> On Thu, Mar 10, 2016 at 12:24 PM, Tristan Nixon  wrote:
>> If you type ‘whoami’ in the terminal, and it responds with ‘root’ then 
>> you’re the superuser.
>> However, as mentioned below, I don’t think it's a relevant factor.
>>
>>> On Mar 10, 2016, at 12:02 PM, Aida Tefera  wrote:
>>>
>>> Hi Tristan,
>>>
>>> I'm afraid I wouldn't know whether I'm running it as super user.
>>>
>>> I have Java version 1.8.0_73 and Scala version 2.11.7
>>>
>>> Sent from my iPhone
>>>
 On 9 Mar 2016, at 21:58, Tristan Nixon  wrote:

 That’s very strange. I just un-set my SPARK_HOME env param, downloaded a 
 fresh 1.6.0 tarball,
 unzipped it to local dir (~/Downloads), and it ran just fine - the driver 
 port is some randomly generated large number.
 So SPARK_HOME is definitely not needed to run this.

 Aida, you are not running this as the super-user, are you?  What versions 
 of Java & Scala do you have installed?

> On Mar 9, 2016, at 3:53 PM, Aida Tefera  wrote:
>
> Hi Jakob,
>
> Tried running the command env|grep SPARK; nothing comes back
>
> Tried env|grep Spark; which is the directory I created for Spark once I 
> downloaded the tgz file; comes back with PWD=/Users/aidatefera/Spark
>
> Tried running ./bin/spark-shell ; comes back with same error as below; 
> i.e could not bind to port 0 etc.
>
> Sent from my iPhone
>
>> On 9 Mar 2016, at 21:42, Jakob Odersky  wrote:
>>
>> As Tristan mentioned, it looks as though Spark is trying to bind on
>> port 0 and then 1 (which is not allowed). Could it be that some
>> environment variables from your previous installation attempts are
>> polluting your configuration?
>> What does running "env | grep SPARK" show you?
>>
>> Also, try running just "./bin/spark-shell" (without the --master
>> argument), maybe your shell is doing some funky stuff with the
>> brackets.
>
> -
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org

>>>
>>> -
>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>> For additional commands, e-mail: user-h...@spark.apache.org
>>>
>>
>

Re: Installing Spark on Mac

2016-03-11 Thread Jakob Odersky
Some more diagnostics/suggestions:

1) are other services listening to ports in the 4000 range (run
"netstat -plunt")? Maybe there is an issue with the error message
itself.

2) are you sure the correct java version is used? java -version

3) can you revert all installation attempts you have done so far,
including files installed by brew/macports or maven and try again?

4) are there any special firewall rules in place, forbidding
connections on localhost?

This is very weird behavior you're seeing. Spark is supposed to work
out-of-the-box with ZERO configuration necessary for running a local
shell. Again, my prime suspect is a previous, failed Spark
installation messing up your config.
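
The four checks above can be sketched as shell commands; note that on macOS `netstat -plunt` won't work (BSD netstat), so `lsof` stands in for step 1. The firewall path is the usual macOS one but is an assumption here, and everything is guarded so a missing tool doesn't abort the script:

```shell
# 1) services listening on ports in the 4000 range (lsof instead of the
#    Linux-only "netstat -plunt"):
lsof -iTCP -sTCP:LISTEN -n -P 2>/dev/null | grep -E ':4[0-9]{3} ' || true

# 2) which Java the shell actually resolves:
java -version 2>&1 || true

# 3) leftovers from previous install attempts via package managers:
brew list 2>/dev/null | grep -i spark || true
port installed 2>/dev/null | grep -i spark || true

# 4) state of the macOS application firewall (path assumed):
/usr/libexec/ApplicationFirewall/socketfilterfw --getglobalstate 2>/dev/null || true
```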

On Thu, Mar 10, 2016 at 12:24 PM, Tristan Nixon  wrote:
> If you type ‘whoami’ in the terminal, and it responds with ‘root’ then you’re 
> the superuser.
> However, as mentioned below, I don’t think it's a relevant factor.
>
>> On Mar 10, 2016, at 12:02 PM, Aida Tefera  wrote:
>>
>> Hi Tristan,
>>
>> I'm afraid I wouldn't know whether I'm running it as super user.
>>
>> I have Java version 1.8.0_73 and Scala version 2.11.7
>>
>> Sent from my iPhone
>>
>>> On 9 Mar 2016, at 21:58, Tristan Nixon  wrote:
>>>
>>> That’s very strange. I just un-set my SPARK_HOME env param, downloaded a 
>>> fresh 1.6.0 tarball,
>>> unzipped it to local dir (~/Downloads), and it ran just fine - the driver 
>>> port is some randomly generated large number.
>>> So SPARK_HOME is definitely not needed to run this.
>>>
>>> Aida, you are not running this as the super-user, are you?  What versions 
>>> of Java & Scala do you have installed?
>>>
 On Mar 9, 2016, at 3:53 PM, Aida Tefera  wrote:

 Hi Jakob,

 Tried running the command env|grep SPARK; nothing comes back

 Tried env|grep Spark; which is the directory I created for Spark once I 
 downloaded the tgz file; comes back with PWD=/Users/aidatefera/Spark

 Tried running ./bin/spark-shell ; comes back with same error as below; i.e 
 could not bind to port 0 etc.

 Sent from my iPhone

> On 9 Mar 2016, at 21:42, Jakob Odersky  wrote:
>
> As Tristan mentioned, it looks as though Spark is trying to bind on
> port 0 and then 1 (which is not allowed). Could it be that some
> environment variables from your previous installation attempts are
> polluting your configuration?
> What does running "env | grep SPARK" show you?
>
> Also, try running just "./bin/spark-shell" (without the --master
> argument), maybe your shell is doing some funky stuff with the
> brackets.

 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org
>>>
>>
>> -
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>>
>


Re: Installing Spark on Mac

2016-03-10 Thread Tristan Nixon
If you type ‘whoami’ in the terminal, and it responds with ‘root’ then you’re 
the superuser.
However, as mentioned below, I don’t think it's a relevant factor.

> On Mar 10, 2016, at 12:02 PM, Aida Tefera  wrote:
> 
> Hi Tristan, 
> 
> I'm afraid I wouldn't know whether I'm running it as super user. 
> 
> I have Java version 1.8.0_73 and Scala version 2.11.7
> 
> Sent from my iPhone
> 
>> On 9 Mar 2016, at 21:58, Tristan Nixon  wrote:
>> 
>> That’s very strange. I just un-set my SPARK_HOME env param, downloaded a 
>> fresh 1.6.0 tarball, 
>> unzipped it to local dir (~/Downloads), and it ran just fine - the driver 
>> port is some randomly generated large number.
>> So SPARK_HOME is definitely not needed to run this.
>> 
>> Aida, you are not running this as the super-user, are you?  What versions of 
>> Java & Scala do you have installed?
>> 
>>> On Mar 9, 2016, at 3:53 PM, Aida Tefera  wrote:
>>> 
>>> Hi Jakob,
>>> 
>>> Tried running the command env|grep SPARK; nothing comes back 
>>> 
>>> Tried env|grep Spark; which is the directory I created for Spark once I 
>>> downloaded the tgz file; comes back with PWD=/Users/aidatefera/Spark
>>> 
>>> Tried running ./bin/spark-shell ; comes back with same error as below; i.e 
>>> could not bind to port 0 etc.
>>> 
>>> Sent from my iPhone
>>> 
 On 9 Mar 2016, at 21:42, Jakob Odersky  wrote:
 
 As Tristan mentioned, it looks as though Spark is trying to bind on
 port 0 and then 1 (which is not allowed). Could it be that some
 environment variables from your previous installation attempts are
 polluting your configuration?
 What does running "env | grep SPARK" show you?
 
 Also, try running just "./bin/spark-shell" (without the --master
 argument), maybe your shell is doing some funky stuff with the
 brackets.
>>> 
>>> -
>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>> For additional commands, e-mail: user-h...@spark.apache.org
>> 
> 
> -
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> 


-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Installing Spark on Mac

2016-03-10 Thread Aida
# - SPARK_WORKER_CORES, to set the number of cores to use on this machine
# - SPARK_WORKER_MEMORY, to set how much total memory workers have to give
executors (e.g. 1000m, 2g)
# - SPARK_WORKER_PORT / SPARK_WORKER_WEBUI_PORT, to use non-default ports
for the worker
# - SPARK_WORKER_INSTANCES, to set the number of worker processes per node
# - SPARK_WORKER_DIR, to set the working directory of worker processes
# - SPARK_WORKER_OPTS, to set config properties only for the worker (e.g.
"-Dx=y")
# - SPARK_DAEMON_MEMORY, to allocate to the master, worker and history
server themselves (default: 1g).
# - SPARK_HISTORY_OPTS, to set config properties only for the history server
(e.g. "-Dx=y")
# - SPARK_SHUFFLE_OPTS, to set config properties only for the external
shuffle service (e.g. "-Dx=y")
# - SPARK_DAEMON_JAVA_OPTS, to set config properties for all daemons (e.g.
"-Dx=y")
# - SPARK_PUBLIC_DNS, to set the public dns name of the master or workers

# Generic options for the daemons used in the standalone deploy mode
# - SPARK_CONF_DIR  Alternate conf dir. (Default: ${SPARK_HOME}/conf)
# - SPARK_LOG_DIR   Where log files are stored.  (Default:
${SPARK_HOME}/logs)
# - SPARK_PID_DIR   Where the pid file is stored. (Default: /tmp)
# - SPARK_IDENT_STRING  A string representing this instance of spark.
(Default: $USER)
# - SPARK_NICENESS  The scheduling priority for daemons. (Default: 0)
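
A minimal conf/spark-env.sh using a few of the options from the template above might look like this (the values are illustrative only, not a recommendation for any particular machine):

```shell
# conf/spark-env.sh -- illustrative values only
export SPARK_WORKER_CORES=2          # cores each worker may use
export SPARK_WORKER_MEMORY=2g        # total memory workers may give executors
export SPARK_WORKER_PORT=7078        # non-default worker port
export SPARK_LOG_DIR=/tmp/spark-logs # where log files are stored
```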




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Installing-Spark-on-Mac-tp26397p26450.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Installing Spark on Mac

2016-03-10 Thread Aida Tefera
Hi Tristan, 

I'm afraid I wouldn't know whether I'm running it as super user. 

I have Java version 1.8.0_73 and Scala version 2.11.7

Sent from my iPhone

> On 9 Mar 2016, at 21:58, Tristan Nixon  wrote:
> 
> That’s very strange. I just un-set my SPARK_HOME env param, downloaded a 
> fresh 1.6.0 tarball, 
> unzipped it to local dir (~/Downloads), and it ran just fine - the driver 
> port is some randomly generated large number.
> So SPARK_HOME is definitely not needed to run this.
> 
> Aida, you are not running this as the super-user, are you?  What versions of 
> Java & Scala do you have installed?
> 
>> On Mar 9, 2016, at 3:53 PM, Aida Tefera  wrote:
>> 
>> Hi Jakob,
>> 
>> Tried running the command env|grep SPARK; nothing comes back 
>> 
>> Tried env|grep Spark; which is the directory I created for Spark once I 
>> downloaded the tgz file; comes back with PWD=/Users/aidatefera/Spark
>> 
>> Tried running ./bin/spark-shell ; comes back with same error as below; i.e 
>> could not bind to port 0 etc.
>> 
>> Sent from my iPhone
>> 
>>> On 9 Mar 2016, at 21:42, Jakob Odersky  wrote:
>>> 
>>> As Tristan mentioned, it looks as though Spark is trying to bind on
>>> port 0 and then 1 (which is not allowed). Could it be that some
>>> environment variables from your previous installation attempts are
>>> polluting your configuration?
>>> What does running "env | grep SPARK" show you?
>>> 
>>> Also, try running just "./bin/spark-shell" (without the --master
>>> argument), maybe your shell is doing some funky stuff with the
>>> brackets.
>> 
>> -
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
> 

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Installing Spark on Mac

2016-03-09 Thread Tristan Nixon
It really shouldn’t; if anything, running as superuser should ALLOW you to bind 
to ports 0, 1, etc.
It seems very strange that it should even be trying to bind to these ports - 
maybe a JVM issue?
I wonder if the old Apple JVM implementations could have used some different 
native libraries for core networking like this...

> On Mar 10, 2016, at 12:40 AM, Gaini Rajeshwar  
> wrote:
> 
> It works just fine as super-user as well.



Re: Installing Spark on Mac

2016-03-09 Thread Gaini Rajeshwar
It should just work with these steps. You don't need to configure much. As
mentioned, some settings on your machine are overriding default spark
settings.

Even running as super-user should not be a problem. It works just fine as
super-user as well.

Can you tell us what version of Java you are using? Also, can you post the
contents of each file in the conf directory?
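
To gather what's being asked for here, something like the following would do (run from the unpacked Spark directory; the loop skips the stock `.template` files so only customized config is printed):

```shell
# Report the Java version and the customized config files in conf/:
java -version 2>&1 || true
ls conf/ || true
for f in conf/*; do
  [ -e "$f" ] || continue           # nothing matched the glob
  case "$f" in
    *.template) ;;                  # skip unmodified templates from the tarball
    *) echo "== $f =="; cat "$f" ;;
  esac
done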



On Thu, Mar 10, 2016 at 3:28 AM, Tristan Nixon 
wrote:

> That’s very strange. I just un-set my SPARK_HOME env param, downloaded a
> fresh 1.6.0 tarball,
> unzipped it to local dir (~/Downloads), and it ran just fine - the driver
> port is some randomly generated large number.
> So SPARK_HOME is definitely not needed to run this.
>
> Aida, you are not running this as the super-user, are you?  What versions
> of Java & Scala do you have installed?
>
> > On Mar 9, 2016, at 3:53 PM, Aida Tefera  wrote:
> >
> > Hi Jakob,
> >
> > Tried running the command env|grep SPARK; nothing comes back
> >
> > Tried env|grep Spark; which is the directory I created for Spark once I
> downloaded the tgz file; comes back with PWD=/Users/aidatefera/Spark
> >
> > Tried running ./bin/spark-shell ; comes back with same error as below;
> i.e could not bind to port 0 etc.
> >
> > Sent from my iPhone
> >
> >> On 9 Mar 2016, at 21:42, Jakob Odersky  wrote:
> >>
> >> As Tristan mentioned, it looks as though Spark is trying to bind on
> >> port 0 and then 1 (which is not allowed). Could it be that some
> >> environment variables from your previous installation attempts are
> >> polluting your configuration?
> >> What does running "env | grep SPARK" show you?
> >>
> >> Also, try running just "./bin/spark-shell" (without the --master
> >> argument), maybe your shell is doing some funky stuff with the
> >> brackets.
> >
> > -
> > To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> > For additional commands, e-mail: user-h...@spark.apache.org
> >
>
>
> -
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>


Re: Installing Spark on Mac

2016-03-09 Thread Tristan Nixon
That’s very strange. I just un-set my SPARK_HOME env param, downloaded a fresh 
1.6.0 tarball, 
unzipped it to local dir (~/Downloads), and it ran just fine - the driver port 
is some randomly generated large number.
So SPARK_HOME is definitely not needed to run this.

Aida, you are not running this as the super-user, are you?  What versions of 
Java & Scala do you have installed?

> On Mar 9, 2016, at 3:53 PM, Aida Tefera  wrote:
> 
> Hi Jakob,
> 
> Tried running the command env|grep SPARK; nothing comes back 
> 
> Tried env|grep Spark; which is the directory I created for Spark once I 
> downloaded the tgz file; comes back with PWD=/Users/aidatefera/Spark
> 
> Tried running ./bin/spark-shell ; comes back with same error as below; i.e 
> could not bind to port 0 etc.
> 
> Sent from my iPhone
> 
>> On 9 Mar 2016, at 21:42, Jakob Odersky  wrote:
>> 
>> As Tristan mentioned, it looks as though Spark is trying to bind on
>> port 0 and then 1 (which is not allowed). Could it be that some
>> environment variables from your previous installation attempts are
>> polluting your configuration?
>> What does running "env | grep SPARK" show you?
>> 
>> Also, try running just "./bin/spark-shell" (without the --master
>> argument), maybe your shell is doing some funky stuff with the
>> brackets.
> 
> -
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> 


-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Installing Spark on Mac

2016-03-09 Thread Aida Tefera
Hi Jakob,

Tried running the command env | grep SPARK; nothing comes back.

Tried env | grep Spark; Spark is the directory I created once I 
downloaded the tgz file; it comes back with PWD=/Users/aidatefera/Spark

Tried running ./bin/spark-shell; it comes back with the same error as below, i.e. 
could not bind to port 0 etc.

Sent from my iPhone

> On 9 Mar 2016, at 21:42, Jakob Odersky  wrote:
> 
> As Tristan mentioned, it looks as though Spark is trying to bind on
> port 0 and then 1 (which is not allowed). Could it be that some
> environment variables from your previous installation attempts are
> polluting your configuration?
> What does running "env | grep SPARK" show you?
> 
> Also, try running just "./bin/spark-shell" (without the --master
> argument), maybe your shell is doing some funky stuff with the
> brackets.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Installing Spark on Mac

2016-03-09 Thread Jakob Odersky
nelHandlerContext.java:415)
>
> at
>
> io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
>
> at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
>
> at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
>
> at
>
> io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
>
> at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
>
> at
>
> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
>
> at java.lang.Thread.run(Thread.java:745)
>
>
> java.lang.NullPointerException
>
> at
>
> org.apache.spark.sql.SQLContext$.createListenerAndUI(SQLContext.scala:1367)
>
> at org.apache.spark.sql.hive.HiveContext.(HiveContext.scala:101)
>
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>
> at
>
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>
> at
>
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>
> at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>
> at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
>
> at $iwC$$iwC.(:15)
>
> at $iwC.(:24)
>
> at (:26)
>
> at .(:30)
>
> at .()
>
> at .(:7)
>
> at .()
>
> at $print()
>
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>
> at
>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>
> at
>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>
> at java.lang.reflect.Method.invoke(Method.java:497)
>
> at
>
> org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>
> at
>
> org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
>
> at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>
> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>
> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>
> at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
>
> at
>
> org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
>
> at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
>
> at
>
> org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
>
> at
>
> org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
>
> at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
>
> at
>
> org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
>
> at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
>
> at
>
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
>
> at
>
> org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
>
> at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
>
> at
>
> org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
>
> at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
>
> at
>
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
>
> at
>
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>
> at
>
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>
> at
>
> scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>
> at
>
> org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>
> at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>
> at org.apache.spark.repl.Main$.main(Main.scala:31)
>
> at org.apache.spark.repl.Main.main(Main.scala)
>
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>
> at
>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>
> at
>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>
> at java.lang.reflect.Method.invoke(Method.java:497)
>
> at
>
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>
> at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>
> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>
> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>
> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
>
> :16: error: not found: value sqlContext
>
> import sqlContext.implicits._
>
>^
>
> :16: error: not found: value sqlContext
>
> import sqlContext.sql
>
>^
>
>
> scala>
>
>
>
>
>
> --
>
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Installing-Spark-on-Mac-tp26397p26446.html
>
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
>
> -
>
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>
> For additional commands, e-mail: user-h...@spark.apache.org
>

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
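
The `java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries` in the trace above is commonly reported when the machine's hostname doesn't resolve to a bindable address. A frequently cited workaround (an assumption here, not verified against this particular setup) is to pin Spark to loopback:

```shell
# Pin the driver to loopback so it stops trying to bind an unresolvable
# address (common workaround for the 'sparkDriver ... 16 retries' failure):
export SPARK_LOCAL_IP=127.0.0.1
# then launch:           ./bin/spark-shell
# or per-invocation:     ./bin/spark-shell --conf spark.driver.host=127.0.0.1
```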



Re: Installing Spark on Mac

2016-03-09 Thread Jakob Odersky
As Tristan mentioned, it looks as though Spark is trying to bind on
port 0 and then 1 (which is not allowed). Could it be that some
environment variables from your previous installation attempts are
polluting your configuration?
What does running "env | grep SPARK" show you?

Also, try running just "./bin/spark-shell" (without the --master
argument), maybe your shell is doing some funky stuff with the
brackets.
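
A sketch of that check, plus clearing any leftovers for the current shell session (the variable names are the common Spark ones; adjust to whatever the grep actually shows):

```shell
# Show any Spark-related variables inherited from earlier install attempts:
env | grep -i spark || echo "no Spark-related environment variables set"
# Clear the usual suspects for this shell session only:
unset SPARK_HOME SPARK_LOCAL_IP SPARK_MASTER_IP SPARK_CONF_DIR
```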

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Installing Spark on Mac

2016-03-09 Thread Aida Tefera
rverSocketChannel.java:125)
>>>>>>>> at
>>>>>>>> io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
>>>>>>>> at
>>>>>>>> io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
>>>>>>>> at
>>>>>>>> io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
>>>>>>>> at
>>>>>>>> io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
>>>>>>>> at
>>>>>>>> io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
>>>>>>>> at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
>>>>>>>> at 
>>>>>>>> io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
>>>>>>>> at
>>>>>>>> io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
>>>>>>>> at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
>>>>>>>> at
>>>>>>>> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
>>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>> java.net.BindException: Can't assign requested address: Service
>>>>>>>> 'sparkDriver' failed after 16 retries!
>>>>>>>> at sun.nio.ch.Net.bind0(Native Method)
>>>>>>>> at sun.nio.ch.Net.bind(Net.java:433)
>>>>>>>> at sun.nio.ch.Net.bind(Net.java:425)
>>>>>>>> at
>>>>>>>> sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
>>>>>>>> at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
>>>>>>>> at
>>>>>>>> io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
>>>>>>>> at
>>>>>>>> io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
>>>>>>>> at
>>>>>>>> io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
>>>>>>>> at
>>>>>>>> io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
>>>>>>>> at
>>>>>>>> io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
>>>>>>>> at
>>>>>>>> io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
>>>>>>>> at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
>>>>>>>> at 
>>>>>>>> io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
>>>>>>>> at
>>>>>>>> io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
>>>>>>>> at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
>>>>>>>> at
>>>>>>>> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
>>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>> 
>>>>>>>> java.lang.NullPointerException
>>>>>>>> at
>>>>>>>> org.apache.spark.sql.SQLContext$.createListenerAndUI(SQLContext.scala:1367)
>>>>>>>> at org.apache.spark.sql.hive.HiveContext.(HiveContext.scala:101)
>>>>>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native 
>>>>>>>> Method)
>>>>>>>> at
>>>>>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>>>>>>>> at
>>>>>>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>>>>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>>>>>>>> at 
>>>>>>>> org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
>>>>>>>> at $iwC$$iwC.(:15)
>>>>>>>

Re: Installing Spark on Mac

2016-03-09 Thread Tristan Nixon
>>>>>>> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>> java.net.BindException: Can't assign requested address: Service
>>>>>>> 'sparkDriver' failed after 16 retries!
>>>>>>> at sun.nio.ch.Net.bind0(Native Method)
>>>>>>> at sun.nio.ch.Net.bind(Net.java:433)
>>>>>>> at sun.nio.ch.Net.bind(Net.java:425)
>>>>>>> at
>>>>>>> sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
>>>>>>> at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
>>>>>>> at
>>>>>>> io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
>>>>>>> at
>>>>>>> io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
>>>>>>> at
>>>>>>> io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
>>>>>>> at
>>>>>>> io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
>>>>>>> at
>>>>>>> io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
>>>>>>> at
>>>>>>> io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
>>>>>>> at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
>>>>>>> at 
>>>>>>> io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
>>>>>>> at
>>>>>>> io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
>>>>>>> at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
>>>>>>> at
>>>>>>> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>> 
>>>>>>> java.lang.NullPointerException
>>>>>>> at
>>>>>>> org.apache.spark.sql.SQLContext$.createListenerAndUI(SQLContext.scala:1367)
>>>>>>> at org.apache.spark.sql.hive.HiveContext.(HiveContext.scala:101)
>>>>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>>>>> at
>>>>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>>>>>>> at
>>>>>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>>>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>>>>>>> at 
>>>>>>> org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
>>>>>>> at $iwC$$iwC.(:15)
>>>>>>> at $iwC.(:24)
>>>>>>> at (:26)
>>>>>>> at .(:30)
>>>>>>> at .()
>>>>>>> at .(:7)
>>>>>>> at .()
>>>>>>> at $print()
>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>> at
>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>>>>> at
>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>> at java.lang.reflect.Method.invoke(Method.java:497)
>>>>>>> at
>>>>>>> org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>>>>>>> at
>>>>>>> org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
>>>>>>> at 
>>>>>>> org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>>>>>>> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>>>>>>> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>>>>>>> at 
>>>>>>> org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
>>>>>>> at
>>>>>>> org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
>>>>>>> at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
>>>>>>> at
>>>>>>> org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
>>>>>>> at
>>>>>>> org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
>>>>>>> at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
>>>>>>> at
>>>>>>> org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
>>>>>>> at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
>>>>>>> at
>>>>>>> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
>>>>>>> at
>>>>>>> org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
>>>>>>> at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
>>>>>>> at
>>>>>>> org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
>>>>>>> at 
>>>>>>> org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
>>>>>>> at
>>>>>>> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
>>>>>>> at
>>>>>>> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>>>>>>> at
>>>>>>> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>>>>>>> at
>>>>>>> scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>>>>>>> at
>>>>>>> org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>>>>>>> at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>>>>>>> at org.apache.spark.repl.Main$.main(Main.scala:31)
>>>>>>> at org.apache.spark.repl.Main.main(Main.scala)
>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>> at
>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>>>>> at
>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>> at java.lang.reflect.Method.invoke(Method.java:497)
>>>>>>> at
>>>>>>> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>>>>>>> at 
>>>>>>> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>>>>>>> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>>>>>>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>>>>>>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>>>>>> 
>>>>>>> :16: error: not found: value sqlContext
>>>>>>>   import sqlContext.implicits._
>>>>>>>  ^
>>>>>>> :16: error: not found: value sqlContext
>>>>>>>   import sqlContext.sql
>>>>>>>  ^
>>>>>>> 
>>>>>>> scala> 
>>>>>>> 
>>>>>>> 
>>>>>>> 
>>>>>>> 
>>>>>>> --
>>>>>>> View this message in context: 
>>>>>>> http://apache-spark-user-list.1001560.n3.nabble.com/Installing-Spark-on-Mac-tp26397p26446.html
>>>>>>> Sent from the Apache Spark User List mailing list archive at Nabble.com.



Re: Installing Spark on Mac

2016-03-09 Thread Aida Tefera



Re: Installing Spark on Mac

2016-03-09 Thread Tristan Nixon



Re: Installing Spark on Mac

2016-03-09 Thread Tristan Nixon



Re: Installing Spark on Mac

2016-03-09 Thread Tristan Nixon



Re: Installing Spark on Mac

2016-03-09 Thread Tristan Nixon



Re: Installing Spark on Mac

2016-03-09 Thread Aida Tefera
>>>>>>> Java HotSpot(TM) 64-Bit Server VM warning: ignoring option
>>>>>>> MaxPermSize=512M;
>>>>>>> support was removed in 8.0
>>>>>>> [INFO] Scanning for projects...
>>>>>>> [INFO]
>>>>>>> 
>>>>>>> [INFO] Reactor Build Order:
>>>>>>> [INFO]
>>>>>>> [INFO] Spark Project Parent POM
>>>>>>> [INFO] Spark Project Test Tags
>>>>>>> [INFO] Spark Project Launcher
>>>>>>> [INFO] Spark Project Networking
>>>>>>> [INFO] Spark Project Shuffle Streaming Service
>>>>>>> [INFO] Spark Project Unsafe
>>>>>>> [INFO] Spark Project Core
>>>>>>> [INFO] Spark Project Bagel
>>>>>>> [INFO] Spark Project GraphX
>>>>>>> [INFO] Spark Project Streaming
>>>>>>> [INFO] Spark Project Catalyst
>>>>>>> [INFO] Spark Project SQL
>>>>>>> [INFO] Spark Project ML Library
>>>>>>> [INFO] Spark Project Tools
>>>>>>> [INFO] Spark Project Hive
>>>>>>> [INFO] Spark Project Docker Integration Tests
>>>>>>> [INFO] Spark Project REPL
>>>>>>> [INFO] Spark Project Assembly
>>>>>>> [INFO] Spark Project External Twitter
>>>>>>> [INFO] Spark Project External Flume Sink
>>>>>>> [INFO] Spark Project External Flume
>>>>>>> [INFO] Spark Project External Flume Assembly
>>>>>>> [INFO] Spark Project External MQTT
>>>>>>> [INFO] Spark Project External MQTT Assembly
>>>>>>> [INFO] Spark Project External ZeroMQ
>>>>>>> [INFO] Spark Project External Kafka
>>>>>>> [INFO] Spark Project Examples
>>>>>>> [INFO] Spark Project External Kafka Assembly
>>>>>>> [INFO]
>>>>>>> [INFO]
>>>>>>> 
>>>>>>> [INFO] Building Spark Project Parent POM 1.6.0
>>>>>>> [INFO]
>>>>>>> 
>>>>>>> [INFO]
>>>>>>> [INFO] --- maven-clean-plugin:2.6.1:clean (default-clean) @
>>>>>>> spark-parent_2.10 ---
>>>>>>> [INFO]
>>>>>>> [INFO] --- maven-enforcer-plugin:1.4:enforce (enforce-versions) @
>>>>>>> spark-parent_2.10 ---
>>>>>>> [WARNING] Rule 0: org.apache.maven.plugins.enforcer.RequireMavenVersion
>>>>>>> failed with message:
>>>>>>> Detected Maven Version: 3.0.3 is not in the allowed range 3.3.3.
>>>>>>> [INFO]
>>>>>>> 
>>>>>>> [INFO] Reactor Summary:
>>>>>>> [INFO]
>>>>>>> [INFO] Spark Project Parent POM .. FAILURE
>>>>>>> [0.821s]
>>>>>>> [INFO] Spark Project Test Tags ... SKIPPED
>>>>>>> [INFO] Spark Project Launcher  SKIPPED
>>>>>>> [INFO] Spark Project Networking .. SKIPPED
>>>>>>> [INFO] Spark Project Shuffle Streaming Service ... SKIPPED
>>>>>>> [INFO] Spark Project Unsafe .. SKIPPED
>>>>>>> [INFO] Spark Project Core  SKIPPED
>>>>>>> [INFO] Spark Project Bagel ... SKIPPED
>>>>>>> [INFO] Spark Project GraphX .. SKIPPED
>>>>>>> [INFO] Spark Project Streaming ... SKIPPED
>>>>>>> [INFO] Spark Project Catalyst  SKIPPED
>>>>>>> [INFO] Spark Project SQL . SKIPPED
>>>>>>> [INFO] Spark Project ML Library .. SKIPPED
>>>>>>> [INFO] Spark Project Tools ... SKIPPED
>>>>>>> [INFO] Spark Project Hive  SKIPPED
>>>>>>> [INFO] Spark Project Docker I

Re: Installing Spark on Mac

2016-03-09 Thread Aida
ontext.sql
^

scala> 




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Installing-Spark-on-Mac-tp26397p26446.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Installing Spark on Mac

2016-03-09 Thread Steve Loughran

> On 8 Mar 2016, at 18:06, Aida  wrote:
> 
> Detected Maven Version: 3.0.3 is not in the allowed range 3.3.3.

I'd look at that error message and fix it
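The error Steve points at is the enforcer plugin reporting that the system Maven (3.0.3) is older than the 3.3.3 minimum the Spark 1.6 POM requires. A hedged sketch of checking this from the shell; the "3.0.3" value is taken from the log in this thread, and in practice you would substitute whatever `mvn -version` prints. It assumes a `sort` that understands `-V` (GNU coreutils; recent BSD sorts as well):

```shell
required="3.3.3"   # minimum allowed by the Spark 1.6 enforcer rule
detected="3.0.3"   # value from the error in this thread; substitute `mvn -version` output
# sort -V orders version strings numerically; if the smaller of the two
# is not the required version, the detected Maven is too old
lowest=$(printf '%s\n%s\n' "$required" "$detected" | sort -V | head -n1)
if [ "$lowest" != "$required" ]; then
  echo "Maven $detected is older than $required; upgrade, or build with ./build/mvn"
fi
```

The Spark source tree also ships a `build/mvn` wrapper that downloads a suitable Maven on first use, which sidesteps the system version entirely.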


-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Installing Spark on Mac

2016-03-08 Thread Jakob Odersky
>>>>>> [INFO] Spark Project Streaming
>>>>>> [INFO] Spark Project Catalyst
>>>>>> [INFO] Spark Project SQL
>>>>>> [INFO] Spark Project ML Library
>>>>>> [INFO] Spark Project Tools
>>>>>> [INFO] Spark Project Hive
>>>>>> [INFO] Spark Project Docker Integration Tests
>>>>>> [INFO] Spark Project REPL
>>>>>> [INFO] Spark Project Assembly
>>>>>> [INFO] Spark Project External Twitter
>>>>>> [INFO] Spark Project External Flume Sink
>>>>>> [INFO] Spark Project External Flume
>>>>>> [INFO] Spark Project External Flume Assembly
>>>>>> [INFO] Spark Project External MQTT
>>>>>> [INFO] Spark Project External MQTT Assembly
>>>>>> [INFO] Spark Project External ZeroMQ
>>>>>> [INFO] Spark Project External Kafka
>>>>>> [INFO] Spark Project Examples
>>>>>> [INFO] Spark Project External Kafka Assembly
>>>>>> [INFO]
>>>>>> [INFO]
>>>>>> 
>>>>>> [INFO] Building Spark Project Parent POM 1.6.0
>>>>>> [INFO]
>>>>>> 
>>>>>> [INFO]
>>>>>> [INFO] --- maven-clean-plugin:2.6.1:clean (default-clean) @
>>>>>> spark-parent_2.10 ---
>>>>>> [INFO]
>>>>>> [INFO] --- maven-enforcer-plugin:1.4:enforce (enforce-versions) @
>>>>>> spark-parent_2.10 ---
>>>>>> [WARNING] Rule 0: org.apache.maven.plugins.enforcer.RequireMavenVersion
>>>>>> failed with message:
>>>>>> Detected Maven Version: 3.0.3 is not in the allowed range 3.3.3.
>>>>>> [INFO]
>>>>>> 
>>>>>> [INFO] Reactor Summary:
>>>>>> [INFO]
>>>>>> [INFO] Spark Project Parent POM .. FAILURE
>>>>>> [0.821s]
>>>>>> [INFO] Spark Project Test Tags ... SKIPPED
>>>>>> [INFO] Spark Project Launcher  SKIPPED
>>>>>> [INFO] Spark Project Networking .. SKIPPED
>>>>>> [INFO] Spark Project Shuffle Streaming Service ... SKIPPED
>>>>>> [INFO] Spark Project Unsafe .. SKIPPED
>>>>>> [INFO] Spark Project Core  SKIPPED
>>>>>> [INFO] Spark Project Bagel ... SKIPPED
>>>>>> [INFO] Spark Project GraphX .. SKIPPED
>>>>>> [INFO] Spark Project Streaming ... SKIPPED
>>>>>> [INFO] Spark Project Catalyst  SKIPPED
>>>>>> [INFO] Spark Project SQL . SKIPPED
>>>>>> [INFO] Spark Project ML Library .. SKIPPED
>>>>>> [INFO] Spark Project Tools ... SKIPPED
>>>>>> [INFO] Spark Project Hive  SKIPPED
>>>>>> [INFO] Spark Project Docker Integration Tests  SKIPPED
>>>>>> [INFO] Spark Project REPL  SKIPPED
>>>>>> [INFO] Spark Project Assembly  SKIPPED
>>>>>> [INFO] Spark Project External Twitter  SKIPPED
>>>>>> [INFO] Spark Project External Flume Sink . SKIPPED
>>>>>> [INFO] Spark Project External Flume .. SKIPPED
>>>>>> [INFO] Spark Project External Flume Assembly . SKIPPED
>>>>>> [INFO] Spark Project External MQTT ... SKIPPED
>>>>>> [INFO] Spark Project External MQTT Assembly .. SKIPPED
>>>>>> [INFO] Spark Project External ZeroMQ . SKIPPED
>>>>>> [INFO] Spark Project External Kafka .. SKIPPED
>>>>>> [INFO] Spark Project Examples  SKIPPED
>>>>>> [INFO] Spark Project External Kafka Assembly . SKIPPED
>>>>>> [INFO]
>>>>>> 
>>>>>> [INFO] BUILD FAILURE
>>>>>> [INFO]
>>>>>> 
>>>>>> [INFO] Total time: 1.745s
>>>>>> [INFO] Finished at: Tue Mar 08 18:01:48 GMT 2016
>>>>>> [INFO] Final Memory: 19M/183M
>>>>>> [INFO]
>>>>>> 
>>>>>> [ERROR] Failed to execute goal
>>>>>> org.apache.maven.plugins:maven-enforcer-plugin:1.4:enforce
>>>>>> (enforce-versions) on project spark-parent_2.10: Some Enforcer rules have
>>>>>> failed. Look above for specific messages explaining why the rule failed.
>>>>>> ->
>>>>>> [Help 1]
>>>>>> [ERROR]
>>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven with the
>>>>>> -e
>>>>>> switch.
>>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>>>>> [ERROR]
>>>>>> [ERROR] For more information about the errors and possible solutions,
>>>>>> please
>>>>>> read the following articles:
>>>>>> [ERROR] [Help 1]
>>>>>> http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
>>>>>> ukdrfs01:spark-1.6.0 aidatefera$
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> View this message in context:
>>>>>> http://apache-spark-user-list.1001560.n3.nabble.com/Installing-Spark-on-Mac-tp26397p26431.html
>>>>>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>>>>
>>>>>> -
>>>>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>>>>> For additional commands, e-mail: user-h...@spark.apache.org
>>>>>
>>>>> Informativa sulla Privacy: http://www.unibs.it/node/8155
>
> -
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Installing Spark on Mac

2016-03-08 Thread Cody Koeninger



Re: Installing Spark on Mac

2016-03-08 Thread Aida Tefera



Re: Installing Spark on Mac

2016-03-08 Thread Cody Koeninger



Re: Installing Spark on Mac

2016-03-08 Thread Aida Tefera



Re: Installing Spark on Mac

2016-03-08 Thread Cody Koeninger



Re: Installing Spark on Mac

2016-03-08 Thread Eduardo Costa Alfaia


Re: Installing Spark on Mac

2016-03-08 Thread Aida
)
at
sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:60)
at
sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:60)
at sbt.Command$.process(Command.scala:92)
at sbt.MainLoop$$anonfun$1$$anonfun$apply$1.apply(MainLoop.scala:98)
at sbt.MainLoop$$anonfun$1$$anonfun$apply$1.apply(MainLoop.scala:98)
at sbt.State$$anon$1.process(State.scala:184)
at sbt.MainLoop$$anonfun$1.apply(MainLoop.scala:98)
at sbt.MainLoop$$anonfun$1.apply(MainLoop.scala:98)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
at sbt.MainLoop$.next(MainLoop.scala:98)
at sbt.MainLoop$.run(MainLoop.scala:91)
at sbt.MainLoop$$anonfun$runWithNewLog$1.apply(MainLoop.scala:70)
at sbt.MainLoop$$anonfun$runWithNewLog$1.apply(MainLoop.scala:65)
at sbt.Using.apply(Using.scala:24)
at sbt.MainLoop$.runWithNewLog(MainLoop.scala:65)
at sbt.MainLoop$.runAndClearLast(MainLoop.scala:48)
at sbt.MainLoop$.runLoggedLoop(MainLoop.scala:32)
at sbt.MainLoop$.runLogged(MainLoop.scala:24)
at sbt.StandardMain$.runManaged(Main.scala:53)
at sbt.xMain.run(Main.scala:28)
at xsbt.boot.Launch$$anonfun$run$1.apply(Launch.scala:109)
at xsbt.boot.Launch$.withContextLoader(Launch.scala:128)
at xsbt.boot.Launch$.run(Launch.scala:109)
at xsbt.boot.Launch$$anonfun$apply$1.apply(Launch.scala:35)
at xsbt.boot.Launch$.launch(Launch.scala:117)
at xsbt.boot.Launch$.apply(Launch.scala:18)
at xsbt.boot.Boot$.runImpl(Boot.scala:41)
at xsbt.boot.Boot$.main(Boot.scala:17)
at xsbt.boot.Boot.main(Boot.scala)
[error] Nonzero exit code (139): git clone
https://github.com/ScrapCodes/sbt-pom-reader.git
/Users/aidatefera/.sbt/0.13/staging/ad8e8574a5bcb2d22d23/sbt-pom-reader
[error] Use 'last' for the full log.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore? 




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Installing-Spark-on-Mac-tp26397p26432.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Installing Spark on Mac

2016-03-08 Thread Aida







--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Installing-Spark-on-Mac-tp26397p26431.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Installing Spark on Mac

2016-03-04 Thread Vishnu Viswanath
Installing Spark on a Mac is similar to installing it on Linux.

I use a Mac and have written a blog post on how to install Spark; here is the
link: http://vishnuviswanath.com/spark_start.html

Hope this helps.



Re: Installing Spark on Mac

2016-03-04 Thread Eduardo Costa Alfaia
Hi Aida

Run only "build/mvn -DskipTests clean package"
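If the build (or a prebuilt package) works but the local shell still fails with address-binding errors, a theme elsewhere in this thread, one thing worth trying is pinning Spark to the loopback interface in `conf/spark-env.sh`. This is a sketch under the assumption of a single-machine, local-only standalone setup; on an actual cluster the master node's reachable IP would go here instead:

```shell
# conf/spark-env.sh -- assumption: single machine, local-only standalone mode
SPARK_LOCAL_IP=127.0.0.1   # interface Spark binds to on this host
SPARK_MASTER_IP=127.0.0.1  # address the standalone master advertises
```

Both variables are read by the Spark 1.x launch scripts; they are not needed at all when the out-of-the-box defaults already resolve correctly.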

BR

Eduardo Costa Alfaia
Ph.D. Student in Telecommunications Engineering
Università degli Studi di Brescia
Tel: +39 3209333018













Re: Installing Spark on Mac

2016-03-04 Thread Simon Hafner
I'd try `brew install spark` or `apache-spark` and see where that gets
you. https://github.com/Homebrew/homebrew




Installing Spark on Mac

2016-03-04 Thread Aida
Hi all,

I am a complete novice and was wondering whether anyone would be willing to
provide me with a step by step guide on how to install Spark on a Mac; on
standalone mode btw.

I downloaded a prebuilt version, the second version from the top. However, I
have not installed Hadoop and am not planning to at this stage.

I also downloaded Scala from the Scala website, do I need to download
anything else?

I am very eager to learn more about Spark but am unsure about the best way
to do it.

I would be happy for any suggestions or ideas

Many thanks,

Aida



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Installing-Spark-on-Mac-tp26397.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org