Hi all,

I reinstalled Spark and rebooted the system, but I am still not able to start the 
workers. It's throwing the following exception:

Exception in thread "main" org.jboss.netty.channel.ChannelException: Failed to 
bind to: master/192.168.125.174:0

I suspect the problem is with 192.168.125.174:0. Even though the command contains 
master:7077, why is it showing port 0 in the log?

java -cp 
::/usr/local/spark-1.0.0/conf:/usr/local/spark-1.0.0/assembly/target/scala-2.10/spark-assembly-1.0.0-hadoop1.2.1.jar
 -XX:MaxPermSize=128m -Dspark.akka.logLifecycleEvents=true -Xms512m -Xmx512m 
org.apache.spark.deploy.worker.Worker spark://master:7077
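
One more thing I am wondering (not sure if it is related): does the 0 mean the 
worker tried to bind to a random port? Would pinning it in conf/spark-env.sh on 
each worker make any difference? For example (the port value is just an example):

export SPARK_WORKER_PORT=8881   # fixed worker port instead of a random one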

Can somebody suggest a solution?
 
Thanks & Regards, 
Meethu M


On Friday, 27 June 2014 4:28 PM, MEETHU MATHEW <meethu2...@yahoo.co.in> wrote:
 


Hi,
Yes, I tried setting another port as well, but I get the same problem.
master is set in /etc/hosts.
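
The entry is of this form (the same IP as in the worker log):

192.168.125.174   master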
 
Thanks & Regards, 
Meethu M


On Friday, 27 June 2014 3:23 PM, Akhil Das <ak...@sigmoidanalytics.com> wrote:
 


That's strange. Did you try setting the master port to something else (using 
SPARK_MASTER_PORT)?
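
For example, something like this in conf/spark-env.sh on the master (the port 
number here is only an example; any free port should do):

export SPARK_MASTER_PORT=7078   # instead of the default 7077

followed by a restart with sbin/stop-all.sh and sbin/start-all.sh.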

Also, you said you are able to start it from the java command line:

java -cp 
::/usr/local/spark-1.0.0/conf:/usr/local/spark-1.0.0/assembly/target/scala-2.10/spark-assembly-1.0.0-hadoop1.2.1.jar
 -XX:MaxPermSize=128m -Dspark.akka.logLifecycleEvents=true -Xms512m -Xmx512m 
org.apache.spark.deploy.worker.Worker spark://master:7077


What is the master IP specified here? Do you have an entry for master in 
/etc/hosts?
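
A quick way to check (assuming a Linux box; the output shown is hypothetical):

$ getent hosts master
192.168.125.174   master

The IP it returns should be one that is actually assigned to an interface on 
that machine.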


Thanks
Best Regards


On Fri, Jun 27, 2014 at 3:09 PM, MEETHU MATHEW <meethu2...@yahoo.co.in> wrote:

Hi Akhil,
>
>
>I am running it on a LAN. The IP of the master is given correctly.
> 
>Thanks & Regards, 
>Meethu M
>
>
>
>On Friday, 27 June 2014 2:51 PM, Akhil Das <ak...@sigmoidanalytics.com> wrote:
> 
>
>
>Why is it binding to port 0? 192.168.125.174:0 :/
>
>
>Check the IP address of that master machine (with ifconfig); it looks like the 
>IP address has changed (assuming you are running these machines on a LAN).
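>
>For example, a quick comparison on the master (output is hypothetical):
>
>$ ifconfig | grep 'inet '    # is 192.168.125.174 still listed here?
>$ grep master /etc/hosts     # does the master entry still point to that IP?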
>
>
>Thanks
>Best Regards
>
>
>On Fri, Jun 27, 2014 at 12:00 PM, MEETHU MATHEW <meethu2...@yahoo.co.in> wrote:
>
>Hi all,
>>
>>
>>My Spark (standalone mode) was running fine until yesterday, but now I am 
>>getting the following exception when I run start-slaves.sh or start-all.sh:
>>
>>
>>slave3: failed to launch org.apache.spark.deploy.worker.Worker:
>>slave3:   at 
>>java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
>>slave3:   at java.lang.Thread.run(Thread.java:662)
>>
>>
>>The log files have the following lines:
>>
>>
>>14/06/27 11:06:30 INFO SecurityManager: Using Spark's default log4j profile: 
>>org/apache/spark/log4j-defaults.properties
>>14/06/27 11:06:30 INFO SecurityManager: Changing view acls to: hduser
>>14/06/27 11:06:30 INFO SecurityManager: SecurityManager: authentication 
>>disabled; ui acls disabled; users with view permissions: Set(hduser)
>>14/06/27 11:06:30 INFO Slf4jLogger: Slf4jLogger started
>>14/06/27 11:06:30 INFO Remoting: Starting remoting
>>Exception in thread "main" org.jboss.netty.channel.ChannelException: Failed 
>>to bind to: master/192.168.125.174:0
>>at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
>>...
>>Caused by: java.net.BindException: Cannot assign requested address
>>...
>>I have seen the same error reported before and tried the following solutions:
>>
>>
>>I set the variable SPARK_LOCAL_IP and changed SPARK_MASTER_PORT to a different 
>>number, but nothing is working.
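>>
>>Roughly like this in conf/spark-env.sh (the exact values differ per machine; 
>>shown here only to illustrate the form of the settings):
>>
>>export SPARK_LOCAL_IP=192.168.125.xxx   # IP of the local machine
>>export SPARK_MASTER_PORT=8081           # a port other than the default 7077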
>>
>>
>>When I try to start the worker from the respective machines using the 
>>following java command, it runs without any exception:
>>
>>
>>java -cp 
>>::/usr/local/spark-1.0.0/conf:/usr/local/spark-1.0.0/assembly/target/scala-2.10/spark-assembly-1.0.0-hadoop1.2.1.jar
>> -XX:MaxPermSize=128m -Dspark.akka.logLifecycleEvents=true -Xms512m -Xmx512m 
>>org.apache.spark.deploy.worker.Worker spark://master:7077
>>
>>
>>
>>Could somebody please suggest a solution?
>> 
>>Thanks & Regards, 
>>Meethu M
>
>
>
