Hi,
What do you get when running just 'sudo netstat'?
Also, what is the output of 'jps -mlv' while your Spark application is running?
Can you post the contents of the files in $SPARK_HOME/conf?
Are there any special firewall rules in place forbidding connections
on localhost?
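
In case plain netstat is also awkward on OS X (the "lunt" error you saw is
most likely because the BSD netstat treats -p as a protocol flag), here is a
rough sketch of equivalent diagnostics; the exact flags are an assumption on
my part and may need tweaking for your system:

  # list processes listening on TCP ports (OS X / BSD userland)
  sudo lsof -iTCP -sTCP:LISTEN -n -P

  # running JVMs, with their main class and JVM options
  jps -mlv

  # anything lingering in the Spark config directory?
  ls -la $SPARK_HOME/conf

  # is the OS X application firewall enabled?
  sudo /usr/libexec/ApplicationFirewall/socketfilterfw --getglobalstate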

Regarding the IP address change, what do you mean by "the IP address
of the master node"? I thought you were running the Spark shell locally
(in which case the IP is simply localhost).
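
Just for reference: SPARK_MASTER_IP only matters when you start a standalone
master (sbin/start-master.sh). It would normally be set in
$SPARK_HOME/conf/spark-env.sh, roughly like this (the address below is purely
hypothetical):

  # $SPARK_HOME/conf/spark-env.sh -- standalone mode only
  export SPARK_MASTER_IP=192.168.1.10

A plain local shell (./bin/spark-shell) does not need any of this, so I would
not start changing it before we understand the binding error.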

Please also consider creating a new user account and retrying; there
is most likely something wrong with your environment.
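
A minimal way to retest from a clean account, assuming the pre-built
Hadoop 2.6 package (the file name below may differ for your download):

  # as the new user, in an empty directory
  tar xzf spark-1.6.0-bin-hadoop2.6.tgz
  cd spark-1.6.0-bin-hadoop2.6
  env | grep -i spark    # should print nothing
  ./bin/spark-shell      # should pick a random driver port and start

If that works, the problem is almost certainly in your environment or
configuration rather than in Spark itself.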

regards,
--Jakob


On Tue, Mar 15, 2016 at 5:41 AM, Aida Tefera <aida1.tef...@gmail.com> wrote:
> Hi Jakob, sorry for my late reply
>
> I tried to run the below; it came back with "netstat: lunt: unknown or
> uninstrumented protocol"
>
> I also tried uninstalling version 1.6.0 and installing version 1.5.2 with Java
> 7 and SCALA version 2.10.6; got the same error messages
>
> Do you think it would be worth me trying to change the IP address in 
> SPARK_MASTER_IP to the IP address of the master node? If so, how would I go 
> about doing that?
>
> Thanks,
>
> Aida
>
> Sent from my iPhone
>
>> On 11 Mar 2016, at 08:37, Jakob Odersky <ja...@odersky.com> wrote:
>>
>> regarding my previous message, I forgot to mention to run netstat as
>> root (sudo netstat -plunt)
>> sorry for the noise
>>
>>> On Fri, Mar 11, 2016 at 12:29 AM, Jakob Odersky <ja...@odersky.com> wrote:
>>> Some more diagnostics/suggestions:
>>>
>>> 1) are other services listening to ports in the 4000 range (run
>>> "netstat -plunt")? Maybe there is an issue with the error message
>>> itself.
>>>
>>> 2) are you sure the correct java version is used? java -version
>>>
>>> 3) can you revert all installation attempts you have done so far,
>>> including files installed by brew/macports or maven and try again?
>>>
>>> 4) are there any special firewall rules in place, forbidding
>>> connections on localhost?
>>>
>>> This is very weird behavior you're seeing. Spark is supposed to work
>>> out-of-the-box with ZERO configuration necessary for running a local
>>> shell. Again, my prime suspect is a previous, failed Spark
>>> installation messing up your config.
>>>
>>>> On Thu, Mar 10, 2016 at 12:24 PM, Tristan Nixon <st...@memeticlabs.org> 
>>>> wrote:
>>>> If you type ‘whoami’ in the terminal, and it responds with ‘root’ then 
>>>> you’re the superuser.
>>>> However, as mentioned below, I don’t think it’s a relevant factor.
>>>>
>>>>> On Mar 10, 2016, at 12:02 PM, Aida Tefera <aida1.tef...@gmail.com> wrote:
>>>>>
>>>>> Hi Tristan,
>>>>>
>>>>> I'm afraid I wouldn't know whether I'm running it as super user.
>>>>>
>>>>> I have java version 1.8.0_73 and SCALA version 2.11.7
>>>>>
>>>>> Sent from my iPhone
>>>>>
>>>>>> On 9 Mar 2016, at 21:58, Tristan Nixon <st...@memeticlabs.org> wrote:
>>>>>>
>>>>>> That’s very strange. I just un-set my SPARK_HOME env param, downloaded a 
>>>>>> fresh 1.6.0 tarball,
>>>>>> unzipped it to local dir (~/Downloads), and it ran just fine - the 
>>>>>> driver port is some randomly generated large number.
>>>>>> So SPARK_HOME is definitely not needed to run this.
>>>>>>
>>>>>> Aida, you are not running this as the super-user, are you?  What 
>>>>>> versions of Java & Scala do you have installed?
>>>>>>
>>>>>>> On Mar 9, 2016, at 3:53 PM, Aida Tefera <aida1.tef...@gmail.com> wrote:
>>>>>>>
>>>>>>> Hi Jakob,
>>>>>>>
>>>>>>> Tried running the command env|grep SPARK; nothing comes back
>>>>>>>
>>>>>>> Tried env|grep Spark; which is the directory I created for Spark once I 
>>>>>>> downloaded the tgz file; comes back with PWD=/Users/aidatefera/Spark
>>>>>>>
>>>>>>> Tried running ./bin/spark-shell ; comes back with same error as below; 
>>>>>>> i.e could not bind to port 0 etc.
>>>>>>>
>>>>>>> Sent from my iPhone
>>>>>>>
>>>>>>>> On 9 Mar 2016, at 21:42, Jakob Odersky <ja...@odersky.com> wrote:
>>>>>>>>
>>>>>>>> As Tristan mentioned, it looks as though Spark is trying to bind on
>>>>>>>> port 0 and then 1 (which is not allowed). Could it be that some
>>>>>>>> environment variables from your previous installation attempts are
>>>>>>>> polluting your configuration?
>>>>>>>> What does running "env | grep SPARK" show you?
>>>>>>>>
>>>>>>>> Also, try running just "/bin/spark-shell" (without the --master
>>>>>>>> argument), maybe your shell is doing some funky stuff with the
>>>>>>>> brackets.
>>>

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
