It seems there is a Hadoop 1 jar somewhere on your classpath.
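One way to check is to force the Hadoop client version explicitly in the build. The sketch below assumes an sbt build (the thread doesn't say which build tool is used; a Maven `pom.xml` would need the equivalent `<dependency>` entries), with the versions taken from the thread:

```scala
// build.sbt sketch -- hypothetical example, not the poster's actual build.
// Marking Spark as "provided" keeps its transitive Hadoop 1.x default out
// of the application assembly, and pinning hadoop-client to the cluster's
// HDFS version (2.5.1 per the thread) avoids the IPC version mismatch.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-streaming" % "1.1.0" % "provided",
  "org.apache.hadoop" % "hadoop-client"   % "2.5.1"
)
```

Running the application against a Spark build compiled for Hadoop 2.x, and inspecting the resolved dependency tree for any `hadoop-core` 1.x artifact, should confirm where the old client is coming from.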

On Fri, Dec 19, 2014, 21:24 Sean Owen <so...@cloudera.com> wrote:

> Yes, but your error indicates that your application is actually using
> Hadoop 1.x of some kind. Check your dependencies, especially
> hadoop-client.
>
> On Fri, Dec 19, 2014 at 2:11 PM, Haopu Wang <hw...@qilinsoft.com> wrote:
> > I’m using Spark 1.1.0 built for HDFS 2.4.
> >
> > My application enables checkpointing (to HDFS 2.5.1) and it builds, but
> > when I run it, I get the error below:
> >
> >
> > Exception in thread "main" org.apache.hadoop.ipc.RemoteException:
> > Server IPC version 9 cannot communicate with client version 4
> >     at org.apache.hadoop.ipc.Client.call(Client.java:1070)
> >     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
> >     at com.sun.proxy.$Proxy6.getProtocolVersion(Unknown Source)
> >     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
> >     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
> >     at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
> >     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
> >     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
> >     at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
> >     at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
> >     at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
> >     at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
> >     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
> >     at org.apache.hadoop.fs.Path.getFileSystem(Path.java:187)
> >     at org.apache.spark.streaming.StreamingContext.checkpoint(StreamingContext.scala:201)
> >
> > Does that mean I have to use HDFS 2.4 to save checkpoints? Thank you!
> >
> >
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>