Hello all,

I wrote some Java code that uses Spark, but for some reason I can't run it
from the command line. I am running Spark on a single node (my
workstation). The program stops running after this line is executed:

SparkContext sparkContext = new SparkContext("spark://myworkstation:7077",
"sparkbase");
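For reference, the more common Java idiom in Spark 1.x goes through SparkConf and JavaSparkContext rather than the Scala SparkContext constructor. This is only a sketch (the class name SparkBase is a placeholder; the app name and master URL are taken from the snippet above), and it needs the Spark jars on the classpath to compile:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class SparkBase {
    public static void main(String[] args) {
        // Same master URL and app name as in the snippet above.
        SparkConf conf = new SparkConf()
                .setAppName("sparkbase")
                .setMaster("spark://myworkstation:7077");
        JavaSparkContext sc = new JavaSparkContext(conf);
        // ... job code here ...
        sc.stop();
    }
}
```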

When that line is executed, this is printed to the screen:
15/01/12 15:56:19 WARN util.Utils: Your hostname, myworkstation resolves to
a loopback address: 127.0.1.1; using 10.211.55.3 instead (on interface eth0)
15/01/12 15:56:19 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind
to another address
15/01/12 15:56:19 INFO spark.SecurityManager: Changing view acls to:
myusername
15/01/12 15:56:19 INFO spark.SecurityManager: Changing modify acls to:
myusername
15/01/12 15:56:19 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(myusername); users with modify permissions: Set(myusername)

After it writes this to the screen, the program stops executing without
reporting an exception.

What's odd is that when I run this code from Eclipse, the same lines are
printed to the screen, but the program keeps executing.
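In case it helps narrow things down: when launching against a standalone master from the command line, applications are usually started with spark-submit rather than plain java, so that the Spark runtime is on the classpath. A hedged sketch of such an invocation (the main class and jar path are placeholders, not taken from this post):

```shell
# Placeholder class name and jar path; adjust to your project.
spark-submit \
  --master spark://myworkstation:7077 \
  --class com.example.SparkBase \
  target/sparkbase-jar-with-dependencies.jar
```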

I don't know if it matters, but I'm using the Maven Assembly Plugin, which
includes the dependencies in the JAR.
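For context, a typical maven-assembly-plugin setup for this looks roughly like the fragment below. This is a sketch of the common jar-with-dependencies configuration, not my actual pom; one thing worth noting is that Spark itself is often given provided scope so that spark-submit supplies it at run time instead of the assembly:

```xml
<!-- Sketch of a typical configuration; your pom may differ. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
</plugin>
```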

Here are the versions I'm using:
Cloudera: 2.5.0-cdh5.2.1
Hadoop: 2.5.0-cdh5.2.1
HBase: 0.98.6-cdh5.2.1
Java: 1.7.0_65
Ubuntu: 14.04.1 LTS
Spark: 1.2