Re: Can't run Spark Java code from command line

2015-01-13 Thread Ye Xianjin
There is no binding issue here. Spark picks the right IP, 10.211.55.3, for you;
the printed message is just informational. However, I have no idea why your
program hangs or stops.

Sent from my iPhone



Re: Can't run Spark Java code from command line

2015-01-13 Thread Akhil Das
It's just a binding issue with the hostnames in your /etc/hosts file. You can
set SPARK_LOCAL_IP and SPARK_MASTER_IP in your conf/spark-env.sh file and
restart your cluster. (In that case, spark://myworkstation:7077 will change to
the IP address that you provided, e.g. spark://10.211.55.3.)
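
A minimal sketch of that change in conf/spark-env.sh, assuming 10.211.55.3 is
the address you want Spark to bind to:

# conf/spark-env.sh
# Bind Spark to this address instead of whatever the hostname
# resolves to via /etc/hosts (here the loopback 127.0.1.1):
export SPARK_LOCAL_IP=10.211.55.3
export SPARK_MASTER_IP=10.211.55.3

Then restart the master and workers (e.g. sbin/stop-all.sh followed by
sbin/start-all.sh) so the new address takes effect.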

Thanks
Best Regards



Can't run Spark Java code from command line

2015-01-13 Thread jeremy p
Hello all,

I wrote some Java code that uses Spark, but for some reason I can't run it
from the command line. I am running Spark on a single node (my
workstation). The program stops running after this line is executed:

SparkContext sparkContext = new SparkContext("spark://myworkstation:7077",
"sparkbase");

When that line is executed, this is printed to the screen:
15/01/12 15:56:19 WARN util.Utils: Your hostname, myworkstation resolves to
a loopback address: 127.0.1.1; using 10.211.55.3 instead (on interface eth0)
15/01/12 15:56:19 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind
to another address
15/01/12 15:56:19 INFO spark.SecurityManager: Changing view acls to:
myusername
15/01/12 15:56:19 INFO spark.SecurityManager: Changing modify acls to:
myusername
15/01/12 15:56:19 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(myusername); users with modify permissions: Set(myusername)

After it writes this to the screen, the program stops executing without
reporting an exception.

What's odd is that when I run this code from Eclipse, the same lines are
printed to the screen, but the program keeps executing.

Don't know if it matters, but I'm using the Maven assembly plugin, which
includes the dependencies in the JAR.
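
For reference, the plugin configuration I mean is roughly the following (a
generic sketch using the standard jar-with-dependencies descriptor, not a
copy of my actual pom.xml):

<!-- pom.xml (excerpt): bundle all dependencies into a single JAR -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>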

Here are the versions I'm using:
Cloudera: 2.5.0-cdh5.2.1
Hadoop: 2.5.0-cdh5.2.1
HBase: 0.98.6-cdh5.2.1
Java: 1.7.0_65
Ubuntu: 14.04.1 LTS
Spark: 1.2