Hi,

I have read that doc several times now, but I am still stuck on the error
message below when I run ./spark-shell --master yarn --deploy-mode client.

I have HADOOP_CONF_DIR set to /usr/local/hadoop-2.7.3/etc/hadoop and
SPARK_HOME set to /usr/local/spark on all three machines (one node for the
ResourceManager and NameNode, two nodes for the NodeManagers and DataNodes).
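For completeness, this is the environment I have on each machine (the paths are exactly as described above; exactly where they are exported, e.g. ~/.bashrc vs. conf/spark-env.sh, is just one common choice):

```shell
# Set identically on all three machines (paths from the description above).
# HADOOP_CONF_DIR tells spark-shell where to find yarn-site.xml / core-site.xml.
export HADOOP_CONF_DIR=/usr/local/hadoop-2.7.3/etc/hadoop
export SPARK_HOME=/usr/local/spark
```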

Any idea?



18/03/13 00:19:13 INFO LineBufferedStream: stdout: org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /user/centos/.sparkStaging/application_1520898664848_0003/__spark_libs__2434167523839846774.zip could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and no node(s) are excluded in this operation.


18/03/13 00:19:13 INFO LineBufferedStream: stdout:  at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1571)
18/03/13 00:19:13 INFO LineBufferedStream: stdout:  at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNewBlockTargets(FSNamesystem.java:3107)
18/03/13 00:19:13 INFO LineBufferedStream: stdout:  at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3031)
18/03/13 00:19:13 INFO LineBufferedStream: stdout:  at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:725)
18/03/13 00:19:13 INFO LineBufferedStream: stdout:  at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:492)
18/03/13 00:19:13 INFO LineBufferedStream: stdout:  at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
18/03/13 00:19:13 INFO LineBufferedStream: stdout:  at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
18/03/13 00:19:13 INFO LineBufferedStream: stdout:  at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
18/03/13 00:19:13 INFO LineBufferedStream: stdout:  at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
18/03/13 00:19:13 INFO LineBufferedStream: stdout:  at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
18/03/13 00:19:13 INFO LineBufferedStream: stdout:  at java.security.AccessController.doPrivileged(Native Method)
18/03/13 00:19:13 INFO LineBufferedStream: stdout:  at javax.security.auth.Subject.doAs(Subject.java:422)
18/03/13 00:19:13 INFO LineBufferedStream: stdout:  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
18/03/13 00:19:13 INFO LineBufferedStream: stdout:  at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)
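For context (not from the trace itself): "could only be replicated to 0 nodes" means the NameNode could not find a usable DataNode for the block, so the usual first checks are whether the DataNodes are registered and have free space, and whether the NameNode is stuck in safe mode. A sketch of those checks using the standard HDFS CLI (guarded so it is a no-op on a machine without hdfs on the PATH):

```shell
# Diagnostics for "could only be replicated to 0 nodes instead of minReplication".
# Standard HDFS CLI commands; guarded so the snippet is a no-op where hdfs is absent.
if command -v hdfs >/dev/null 2>&1; then
  hdfs dfsadmin -report        # both DataNodes should appear as live, with non-zero "DFS Remaining"
  hdfs dfsadmin -safemode get  # should print "Safe mode is OFF"
fi
```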


Thanks!


On Mon, Mar 12, 2018 at 4:46 PM, Marcelo Vanzin <van...@cloudera.com> wrote:

> That's not an error, just a warning. The docs [1] have more info about
> the config options mentioned in that message.
>
> [1] http://spark.apache.org/docs/latest/running-on-yarn.html
>
> On Mon, Mar 12, 2018 at 4:42 PM, kant kodali <kanth...@gmail.com> wrote:
> > Hi All,
> >
> > I am trying to use YARN for the very first time. I believe I configured
> > the resource manager and name node correctly, and then I run the command
> > below:
> >
> > ./spark-shell --master yarn --deploy-mode client
> >
> > I get the output below and it hangs there forever (I have been waiting
> > over 10 minutes):
> >
> > 18/03/12 23:36:32 WARN Client: Neither spark.yarn.jars nor
> > spark.yarn.archive is set, falling back to uploading libraries under
> > SPARK_HOME.
> >
> > Any idea?
> >
> > Thanks!
>
>
>
> --
> Marcelo
>
