[ https://issues.apache.org/jira/browse/HDDS-1014?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Elek, Marton resolved HDDS-1014.
--------------------------------
    Resolution: Duplicate

Thanks for the report [~bharatviswa].

For MapReduce, please use hadoop-ozone-filesystem-lib-current-0.5.0-SNAPSHOT.jar or hadoop-ozone-filesystem-lib-legacy-0.5.0-SNAPSHOT.jar instead of hadoop-ozone-filesystem-0.5.0-SNAPSHOT.jar. The legacy and current artifacts are the shaded jars; the plain hadoop-ozone-filesystem jar contains only the ozonefs classes, which is just enough to make the "ozone fs" command work.

BTW, the legacy/current jar files were broken at the time of this report, which made it harder to find the right jars, but they will be fixed by HDDS-1525 and HDDS-1717 very soon...
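To make that concrete, here is a minimal sketch of wiring one of the shaded jars into a MapReduce client: put it on HADOOP_CLASSPATH for the submitting JVM and ship the same jar to the tasks. The jar location, the examples jar name, and the o3fs URIs below are placeholders and not taken from this issue:

{code:bash}
# Placeholder path; point this at wherever your Ozone install keeps the shaded jar.
OZONE_FS_JAR=/opt/ozone/share/ozone/lib/hadoop-ozone-filesystem-lib-current-0.5.0-SNAPSHOT.jar

# Make the o3fs classes (and their shaded dependencies) visible to the client side.
export HADOOP_CLASSPATH=${OZONE_FS_JAR}:${HADOOP_CLASSPATH}

# Ship the same jar to the MR tasks via -libjars (placeholder job name and paths).
hadoop jar hadoop-mapreduce-examples.jar wordcount \
  -libjars ${OZONE_FS_JAR} \
  o3fs://bucket.volume/input o3fs://bucket.volume/output
{code}

The point of the lib-current/lib-legacy variants is that, being shaded, they are meant to bundle the relocated Ratis/protobuf (and Bouncy Castle) dependencies, so none of the extra jars mentioned in the description below should have to be added by hand.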
> hadoop-ozone-filesystem is missing required jars
> ------------------------------------------------
>
>                 Key: HDDS-1014
>                 URL: https://issues.apache.org/jira/browse/HDDS-1014
>             Project: Hadoop Distributed Data Store
>          Issue Type: Bug
>            Reporter: Bharat Viswanadham
>            Assignee: Bharat Viswanadham
>            Priority: Major
>
> https://hadoop.apache.org/ozone/docs/0.3.0-alpha/ozonefs.html
> After following the steps mentioned there, I still get the error below:
> {code:java}
> 19/01/25 17:15:28 ERROR client.OzoneClientFactory: Couldn't create protocol class org.apache.hadoop.ozone.client.rpc.RpcClient exception: java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>     at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>     at org.apache.hadoop.ozone.client.OzoneClientFactory.getClientProtocol(OzoneClientFactory.java:291)
>     at org.apache.hadoop.ozone.client.OzoneClientFactory.getRpcClient(OzoneClientFactory.java:169)
>     at org.apache.hadoop.fs.ozone.OzoneFileSystem.initialize(OzoneFileSystem.java:128)
>     at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3354)
>     at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:124)
>     at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3403)
>     at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3371)
>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:477)
>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:226)
>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:461)
>     at org.apache.hadoop.fs.Path.getFileSystem(Path.java:361)
>     at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:352)
>     at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:250)
>     at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:233)
>     at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:104)
>     at org.apache.hadoop.fs.shell.Command.run(Command.java:177)
>     at org.apache.hadoop.fs.FsShell.run(FsShell.java:328)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
>     at org.apache.hadoop.fs.FsShell.main(FsShell.java:391)
> Caused by: java.lang.BootstrapMethodError: java.lang.NoClassDefFoundError: org/apache/ratis/thirdparty/com/google/protobuf/ByteString
>     at org.apache.ratis.protocol.RaftId.<init>(RaftId.java:64)
>     at org.apache.ratis.protocol.ClientId.<init>(ClientId.java:47)
>     at org.apache.ratis.protocol.ClientId.randomId(ClientId.java:31)
>     at org.apache.hadoop.ozone.client.rpc.RpcClient.<init>(RpcClient.java:115)
>     ... 24 more
> Caused by: java.lang.NoClassDefFoundError: org/apache/ratis/thirdparty/com/google/protobuf/ByteString
>     ... 28 more
> Caused by: java.lang.ClassNotFoundException: org.apache.ratis.thirdparty.com.google.protobuf.ByteString
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>     ... 28 more
> {code}
> So I proceeded and added the ratis-thirdparty-misc jar.
> After that I got an error about a missing Ratis proto class, and then about missing Bouncy Castle.
> After adding all of those jars I am able to run dfs and MapReduce jobs.
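For reference, once the shaded artifacts are fixed per HDDS-1525/HDDS-1717, a quick way to check which jar to put on the classpath is to look for the relocated class from the trace above inside each jar. A small sketch; the jar names come from this issue but the working directory is an assumption:

{code:bash}
# The relocated class the ClassNotFoundException above complains about.
MISSING=org/apache/ratis/thirdparty/com/google/protobuf/ByteString.class

# The plain ozonefs jar is not expected to contain it (hence the error)...
jar tf hadoop-ozone-filesystem-0.5.0-SNAPSHOT.jar | grep "$MISSING"

# ...while the shaded lib-current (or lib-legacy) jar should bundle it.
jar tf hadoop-ozone-filesystem-lib-current-0.5.0-SNAPSHOT.jar | grep "$MISSING"
{code}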