Mariia  created HADOOP-17533:
--------------------------------

             Summary: Server IPC version 9 cannot communicate with client version 4
                 Key: HADOOP-17533
                 URL: https://issues.apache.org/jira/browse/HADOOP-17533
             Project: Hadoop Common
          Issue Type: Bug
            Reporter: Mariia 


I want to connect to HDFS from Java, like this:

import java.io.InputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

String url = "hdfs://c7301.ambari.apache.org:8020/file.txt";

FileSystem fs = null;
InputStream in = null;
try {
    Configuration conf = new Configuration();
    fs = FileSystem.get(URI.create(url), conf, "admin");

    in = fs.open(new Path(url));

    IOUtils.copyBytes(in, System.out, 4096, false);
} catch (Exception e) {
    e.printStackTrace();
} finally {
    IOUtils.closeStream(in);
    IOUtils.closeStream(fs);
}

The error I got:

[2021-02-17 20:02:06,115] ERROR PriviledgedActionException as:admin cause:org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 4 (org.apache.hadoop.security.UserGroupInformation:1124)
org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 4
        at org.apache.hadoop.ipc.Client.call(Client.java:1070)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
        at com.sun.proxy.$Proxy4.getProtocolVersion(Unknown Source)
        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
        at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
        at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
        at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:117)
        at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:115)
        at java.base/java.security.AccessController.doPrivileged(Native Method)
        at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:115)
        at Main.main(Main.java:38)
I tried different solutions to the problem, but nothing helped.
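One check that might narrow this down (an assumption on my part, not a confirmed diagnosis): "client version 4" is the IPC version spoken by Hadoop 1.x clients, so an old 1.x jar may be leaking onto the runtime classpath through a transitive dependency. The Maven dependency tree can be filtered to the Hadoop artifacts to look for one:

```shell
# Show every org.apache.hadoop artifact the build resolves;
# a 1.x entry here (e.g. an old hadoop-core) would explain the mismatch.
mvn dependency:tree -Dincludes=org.apache.hadoop
```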

This is my pom.xml file:

 
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>org.example</groupId>
    <artifactId>producer</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <maven.compiler.source>11</maven.compiler.source>
        <maven.compiler.target>11</maven.compiler.target>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
            <version>0.10.0.0</version>
        </dependency>

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>3.2.0</version>
        </dependency>

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs</artifactId>
            <version>3.2.0</version>
        </dependency>

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-yarn-common</artifactId>
            <version>3.2.0</version>
        </dependency>

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-common</artifactId>
            <version>3.2.0</version>
        </dependency>

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>3.2.0</version>
        </dependency>

    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>3.2.4</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <transformers>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                    <mainClass>Main</mainClass>
                                </transformer>
                            </transformers>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>

</project>
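A side note on the pom, as a sketch only (assuming the application should track the cluster's Hadoop 3.x line): pulling in several individual hadoop-* artifacts makes it easy for transitive dependencies to drag in a second Hadoop version. The single hadoop-client aggregator, pinned to one version, avoids that:

```xml
<!-- Sketch: replaces the individual hadoop-common/hadoop-hdfs/... entries
     with one aggregator artifact pinned to a single version. -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>3.2.0</version>
</dependency>
```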

And this is the HDFS version:

Hadoop 3.1.1.3.1.4.0-315
Source code repository g...@github.com:hortonworks/hadoop.git -r 58d0fd3d8ce58b10149da3c717c45e5e57a60d14
Compiled by jenkins on 2019-08-23T05:15Z
Compiled with protoc 2.5.0
From source with checksum fcbd146ffa6d48fef0ed81332f9d6f0
This command was run using /usr/ddp/3.1.4.0-315/hadoop/hadoop-common-3.1.1.3.1.4.0-315.jar

 

If someone has run into a similar problem, please help.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
