Re: IP error on starting spark-shell on Windows 7

2015-12-13 Thread Akhil Das
It's a warning, not an error. What happens if you don't specify
SPARK_LOCAL_IP at all? If the Spark shell still comes up, try *netstat
-np* and see which address the driver is binding to.
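For example, once the shell is up you can check the bound address like
this (a minimal sketch; 4040 is Spark's default driver web UI port, so
adjust the filter if yours differs):

    rem Show listening sockets numerically, with owning process IDs,
    rem then filter for the driver web UI port
    netstat -ano | findstr :4040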

Thanks
Best Regards

IP error on starting spark-shell on Windows 7

2015-12-09 Thread Stefan Karos
On starting spark-shell I see this just before the scala prompt:

WARN : Your hostname, BloomBear-SSD resolves to a loopback/non-reachable
address: fe80:0:0:0:0:5efe:c0a8:317%net10, but we couldn't find any
external IP address!

I get this error even when the firewall is disabled.
I also tried setting the environment variable SPARK_LOCAL_IP to each of
the values listed below:

SPARK_LOCAL_IP=localhost
SPARK_LOCAL_IP=127.0.0.1
SPARK_LOCAL_IP=192.168.1.88   (my local machine's IPv4 address)
SPARK_LOCAL_IP=fe80::eda5:a1a7:be1e:13cb%14  (my local machine's IPv6
address)

I still get this annoying error! How can I resolve this?
See below for my environment.
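For reference, a minimal sketch of making such a setting persistent,
assuming Spark's stock Windows launch scripts (bin\load-spark-env.cmd
reads conf\spark-env.cmd if it exists; the address is this machine's
IPv4 address from the list above):

    rem %SPARK_HOME%\conf\spark-env.cmd
    rem Bind the driver explicitly to the LAN IPv4 address
    set SPARK_LOCAL_IP=192.168.1.88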

Environment
Windows 7, 64-bit
Spark 1.5.2
Scala 2.10.6
Python 2.7.10 (from Anaconda)

PATH includes:
C:\Users\Stefan\spark-1.5.2-bin-hadoop2.6\bin
C:\ProgramData\Oracle\Java\javapath
C:\Users\Stefan\scala\bin
C:\Users\Stefan\hadoop-2.6.0\bin
(where winutils.exe resides)

SYSTEM variables set are:
SPARK_HOME=C:\Users\Stefan\spark-1.5.2-bin-hadoop2.6
JAVA_HOME=C:\Program Files\Java\jre1.8.0_65
HADOOP_HOME=C:\Users\Stefan\hadoop-2.6.0
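For reference, a minimal sketch of setting these persistently from a
command prompt (setx writes them to the user environment; the values
are the ones above):

    setx SPARK_HOME C:\Users\Stefan\spark-1.5.2-bin-hadoop2.6
    setx JAVA_HOME "C:\Program Files\Java\jre1.8.0_65"
    setx HADOOP_HOME C:\Users\Stefan\hadoop-2.6.0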

A \tmp\hive directory exists at the root of the C: drive with full
permissions, e.g.
>winutils ls \tmp\hive
drwxrwxrwx 1 BloomBear-SSD\Stefan BloomBear-SSD\None 0 Dec  8 2015 \tmp\hive
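For reference, those permissions can be granted with winutils itself
(a minimal sketch, assuming winutils.exe is on the PATH as above):

    rem Create Hive's scratch directory and open up its permissions
    mkdir C:\tmp\hive
    winutils.exe chmod -R 777 C:\tmp\hive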