Hi Akhil,

The IP is correct, and the workers start fine when we launch them directly as a java 
command. The address only becomes 192.168.125.174:0 when we call from the scripts.
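
For what it's worth, this is the quick check we run on the master to see what the scripts resolve (a minimal sketch; the install path is taken from our classpath, and the fallbacks mirror the defaults the sbin scripts use when the variables are unset):

# Show the master URL the launch scripts would pass to each worker
. /usr/local/spark-1.0.0/conf/spark-env.sh
echo "spark://${SPARK_MASTER_IP:-$(hostname)}:${SPARK_MASTER_PORT:-7077}"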


 
Thanks & Regards, 
Meethu M


On Friday, 27 June 2014 1:49 PM, Akhil Das <ak...@sigmoidanalytics.com> wrote:
 


Why is it binding to port 0? 192.168.125.174:0 :/

Check the IP address of that master machine (ifconfig); it looks like the IP 
address has changed (assuming you are running these machines on a LAN).
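
Something like this on the master should show whether the two still agree (the interface name and conf path here are assumptions):

# Compare the machine's actual IP with the one configured for the scripts
ifconfig eth0 | grep 'inet addr'
grep -E 'SPARK_MASTER_IP|SPARK_LOCAL_IP' /usr/local/spark-1.0.0/conf/spark-env.sh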


Thanks
Best Regards


On Fri, Jun 27, 2014 at 12:00 PM, MEETHU MATHEW <meethu2...@yahoo.co.in> wrote:

Hi all,
>
>
>My Spark (standalone mode) was running fine till yesterday, but now I am getting 
> the following exception when I run start-slaves.sh or start-all.sh:
>
>
>slave3: failed to launch org.apache.spark.deploy.worker.Worker:
>slave3:   at 
>java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
>slave3:   at java.lang.Thread.run(Thread.java:662)
>
>
>The log files have the following lines:
>
>
>14/06/27 11:06:30 INFO SecurityManager: Using Spark's default log4j profile: 
>org/apache/spark/log4j-defaults.properties
>14/06/27 11:06:30 INFO SecurityManager: Changing view acls to: hduser
>14/06/27 11:06:30 INFO SecurityManager: SecurityManager: authentication 
>disabled; ui acls disabled; users with view permissions: Set(hduser)
>14/06/27 11:06:30 INFO Slf4jLogger: Slf4jLogger started
>14/06/27 11:06:30 INFO Remoting: Starting remoting
>Exception in thread "main" org.jboss.netty.channel.ChannelException: Failed to 
>bind to: master/192.168.125.174:0
>at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
>...
>Caused by: java.net.BindException: Cannot assign requested address
>...
>I have seen the same error reported before and tried the following solutions:
>
>
>I set the variable SPARK_LOCAL_IP and changed SPARK_MASTER_PORT to a different 
>number, but nothing is working.
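>
>For reference, this is roughly what those two settings looked like in conf/spark-env.sh (the values below are only examples):
>
>export SPARK_LOCAL_IP=192.168.125.174    # bind address of this node (example value)
>export SPARK_MASTER_PORT=7078            # non-default master port we tried (example value)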
>
>
>When I try to start the worker from the respective machines using the 
>following java command, it runs without any exception:
>
>
>java -cp 
>::/usr/local/spark-1.0.0/conf:/usr/local/spark-1.0.0/assembly/target/scala-2.10/spark-assembly-1.0.0-hadoop1.2.1.jar
> -XX:MaxPermSize=128m -Dspark.akka.logLifecycleEvents=true -Xms512m -Xmx512m 
>org.apache.spark.deploy.worker.Worker spark://master:7077
>
>
>
>Could somebody please suggest a solution?
> 
>Thanks & Regards, 
>Meethu M
