It appears I had an issue in my /etc/hosts... it seems OK now.
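For anyone hitting the same warning: on Ubuntu the installer typically maps the hostname to 127.0.1.1 in /etc/hosts, which is exactly what triggers the "resolves to a loopback address" message below. A sketch of the likely fix, assuming micha should resolve to its LAN address (verify the IP on your own machine):

```
# /etc/hosts -- before (typical Ubuntu default)
127.0.0.1     localhost
127.0.1.1     micha

# /etc/hosts -- after
127.0.0.1     localhost
10.0.100.120  micha
```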
> On Jul 10, 2016, at 2:13 PM, Jean Georges Perrin <j...@jgp.net> wrote:
>
> I tested that:
>
> I set:
>
> _JAVA_OPTIONS=-Djava.net.preferIPv4Stack=true
> SPARK_LOCAL_IP=10.0.100.120
> I still have the warning in the log:
>
> 16/07/10 14:10:13 WARN Utils: Your hostname, micha resolves to a loopback
> address: 127.0.1.1; using 10.0.100.120 instead (on interface eno1)
> 16/07/10 14:10:13 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to
> another address
> and the connection is still refused...
>
> No luck so far.
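For reference, the two variables above are typically exported in conf/spark-env.sh so the standalone scripts pick them up; a sketch, assuming the standard layout (restart the master afterwards with sbin/stop-master.sh and sbin/start-master.sh):

```sh
# conf/spark-env.sh -- sourced by the start scripts before launching daemons
export SPARK_LOCAL_IP=10.0.100.120
export _JAVA_OPTIONS=-Djava.net.preferIPv4Stack=true
```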
>
>> On Jul 10, 2016, at 1:26 PM, Jean Georges Perrin <j...@jgp.net> wrote:
>>
>> Hi,
>>
>> So far I have been using Spark "embedded" in my app. Now, I'd like to run it
>> on a dedicated server.
>>
>> This is how far I've gotten:
>> - fresh Ubuntu 16, server name is micha / IP 10.0.100.120, installed Scala
>> 2.10, installed Spark 1.6.2, recompiled
>> - Pi test works
>> - UI on port 8080 works
>>
>> Log says:
>> Spark Command: /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java -cp
>> /opt/apache-spark-1.6.2/conf/:/opt/apache-spark-1.6.2/assembly/target/scala-2.10/spark-assembly-1.6.2-hadoop2.2.0.jar:/opt/apache-spark-1.6.2/lib_managed/jars/datanucleus-rdbms-3.2.9.jar:/opt/apache-spark-1.6.2/lib_managed/jars/datanucleus-core-3.2.10.jar:/opt/apache-spark-1.6.2/lib_managed/jars/datanucleus-api-jdo-3.2.6.jar
>> -Xms1g -Xmx1g org.apache.spark.deploy.master.Master --ip micha --port 7077
>> --webui-port 8080
>> ========================================
>> Using Spark's default log4j profile:
>> org/apache/spark/log4j-defaults.properties
>> 16/07/10 13:03:55 INFO Master: Registered signal handlers for [TERM, HUP,
>> INT]
>> 16/07/10 13:03:55 WARN Utils: Your hostname, micha resolves to a loopback
>> address: 127.0.1.1; using 10.0.100.120 instead (on interface eno1)
>> 16/07/10 13:03:55 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to
>> another address
>> 16/07/10 13:03:55 WARN NativeCodeLoader: Unable to load native-hadoop
>> library for your platform... using builtin-java classes where applicable
>> 16/07/10 13:03:55 INFO SecurityManager: Changing view acls to: root
>> 16/07/10 13:03:55 INFO SecurityManager: Changing modify acls to: root
>> 16/07/10 13:03:55 INFO SecurityManager: SecurityManager: authentication
>> disabled; ui acls disabled; users with view permissions: Set(root); users
>> with modify permissions: Set(root)
>> 16/07/10 13:03:56 INFO Utils: Successfully started service 'sparkMaster' on
>> port 7077.
>> 16/07/10 13:03:56 INFO Master: Starting Spark master at spark://micha:7077
>> 16/07/10 13:03:56 INFO Master: Running Spark version 1.6.2
>> 16/07/10 13:03:56 INFO Server: jetty-8.y.z-SNAPSHOT
>> 16/07/10 13:03:56 INFO AbstractConnector: Started
>> SelectChannelConnector@0.0.0.0:8080
>> 16/07/10 13:03:56 INFO Utils: Successfully started service 'MasterUI' on
>> port 8080.
>> 16/07/10 13:03:56 INFO MasterWebUI: Started MasterWebUI at
>> http://10.0.100.120:8080
>> 16/07/10 13:03:56 INFO Server: jetty-8.y.z-SNAPSHOT
>> 16/07/10 13:03:56 INFO AbstractConnector: Started
>> SelectChannelConnector@micha:6066
>> 16/07/10 13:03:56 INFO Utils: Successfully started service on port 6066.
>> 16/07/10 13:03:56 INFO StandaloneRestServer: Started REST server for
>> submitting applications on port 6066
>> 16/07/10 13:03:56 INFO Master: I have been elected leader! New state: ALIVE
>>
>>
>> In my app, i changed the config to:
>> SparkConf conf = new
>> SparkConf().setAppName("myapp").setMaster("spark://10.0.100.120:6066
>> <spark://10.0.100.120:6066>");
>>
>>
>> (also tried 7077)
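Worth noting: in a standalone deployment, 6066 is the REST submission endpoint used by spark-submit in cluster mode, while a SparkContext normally connects to the master's RPC port, 7077 by default. A sketch of the usual driver configuration, mirroring the snippet above:

```java
// Point the driver at the master's RPC port (7077 by default),
// not the REST submission port (6066).
SparkConf conf = new SparkConf()
    .setAppName("myapp")
    .setMaster("spark://10.0.100.120:7077");
```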
>>
>>
>> On the client:
>> 16-07-10 13:22:58:300 INFO org.spark-project.jetty.server.AbstractConnector
>> - Started SelectChannelConnector@0.0.0.0:4040
>> 16-07-10 13:22:58:300 DEBUG
>> org.spark-project.jetty.util.component.AbstractLifeCycle - STARTED
>> SelectChannelConnector@0.0.0.0:4040
>> 16-07-10 13:22:58:300 DEBUG
>> org.spark-project.jetty.util.component.AbstractLifeCycle - STARTED
>> org.spark-project.jetty.server.Server@3eb292cd
>> 16-07-10 13:22:58:301 INFO org.apache.spark.util.Utils - Successfully
>> started service 'SparkUI' on port 4040.
>> 16-07-10 13:22:58:306 INFO org.apache.spark.ui.SparkUI - Started SparkUI at
>> http://10.0.100.100:4040
>> 16-07-10 13:22:58:621 INFO
>> org.apache.spark.deploy.client.AppClient$ClientEndpoint - Connecting to
>> master spark://10.0.100.120:6066...
>> 16-07-10 13:22:58:648 DEBUG
>> org.apache.spark.network.client.TransportClientFactory - Creating new
>> connection to /10.0.100.120:6066
>> 16-07-10 13:22:58:689 DEBUG io.netty.util.ResourceLeakDetector -
>> -Dio.netty.leakDetectionLevel: simple
>> 16-07-10 13:22:58:714 WARN
>> org.apache.spark.deploy.client.AppClient$ClientEndpoint - Failed to connect
>> to master 10.0.100.120:6066
>> java.io.IOException: Failed to connect to /10.0.100.120:6066
>> at
>> org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:216)
>>
>> and if I try to telnet:
>>
>> $ telnet 10.0.100.120 6066
>> Trying 10.0.100.120...
>> telnet: connect to address 10.0.100.120: Connection refused
>> telnet: Unable to connect to remote host
>>
>> $ telnet 10.0.100.120 7077
>> Trying 10.0.100.120...
>> telnet: connect to address 10.0.100.120: Connection refused
>> telnet: Unable to connect to remote host
>>
>> On the server, I checked with netstat:
>> jgp@micha:/opt/apache-spark$ netstat -a | grep 6066
>> tcp6       0      0 micha.nc.rr.com:6066    [::]:*    LISTEN
>> jgp@micha:/opt/apache-spark$ netstat -a | grep 7077
>> tcp6       0      0 micha.nc.rr.com:7077    [::]:*    LISTEN
>>
>> If I interpret this correctly, it looks like it is listening on IPv6 and not IPv4...
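A note on those tcp6 lines: on Linux, a tcp6 socket bound with the default dual-stack setting usually accepts IPv4 connections as well, so the more likely culprit is that the master bound to whatever the hostname resolves to (here, the 127.0.1.1 entry from /etc/hosts). The resolution check itself can be illustrated with plain Java (a self-contained sketch, not Spark's actual code):

```java
import java.net.InetAddress;

public class HostCheck {
    public static void main(String[] args) throws Exception {
        // Spark's warning fires when the local hostname resolves to a
        // loopback address (e.g. Ubuntu's default 127.0.1.1 entry).
        InetAddress hostsEntry = InetAddress.getByName("127.0.1.1");
        System.out.println(hostsEntry.isLoopbackAddress()); // the whole 127/8 block is loopback

        InetAddress lan = InetAddress.getByName("10.0.100.120");
        System.out.println(lan.isLoopbackAddress()); // a LAN address is not
    }
}
```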
>>
>> Any clue would be very helpful. I don't think I'm far off, but...
>>
>> Thanks
>>
>>
>> jg
>>
>