Hi,

Try to make sure spark-assembly-1.5.2-hadoop2.6.0.jar is present in the Oozie Spark
share lib; the Spark assembly jar must be located in the Oozie Spark share lib.
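For example, assuming the default share lib location under /user/oozie/share/lib,
an Oozie server at http://localhost:11000/oozie, and the assembly jar under
$SPARK_HOME/lib (all placeholders; adjust for your cluster), something along these
lines should let you check and refresh the share lib:

# list the jars currently in the spark share lib
# (lib_<timestamp> is the directory created when the share lib was installed)
hdfs dfs -ls /user/oozie/share/lib/lib_*/spark

# if the assembly jar is missing, copy it in from the Spark distribution
hdfs dfs -put $SPARK_HOME/lib/spark-assembly-1.5.2-hadoop2.6.0.jar \
    /user/oozie/share/lib/lib_<timestamp>/spark/

# ask the Oozie server to reload the share lib and confirm what it sees
oozie admin -oozie http://localhost:11000/oozie -sharelibupdate
oozie admin -oozie http://localhost:11000/oozie -shareliblist spark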
 
-----Original Message-----
From: "Rohit Mishra"<rohitkmis...@mindagroup.com> 
To: <user@hadoop.apache.org>; 
Cc: 
Sent: 2017-01-16 (Mon) 15:04:26
Subject: oozie issue java.lang.UnsupportedOperationException: Not implemented 
by the TFS FileSystem implementation
 
Hello,

I am new to Hadoop and I am having an issue running a Spark job through Oozie. I
am able to run the Spark job on its own, but when it is launched by Oozie I get
the following error:

2017-01-12 13:51:57,696 INFO [main] org.apache.hadoop.service.AbstractService: Service org.apache.hadoop.mapreduce.v2.app.MRAppMaster failed in state INITED; cause: java.lang.UnsupportedOperationException: Not implemented by the TFS FileSystem implementation
java.lang.UnsupportedOperationException: Not implemented by the TFS FileSystem implementation
    at org.apache.hadoop.fs.FileSystem.getScheme(FileSystem.java:216)
    at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2564)
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2574)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2591)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:169)
    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.getFileSystem(MRAppMaster.java:497)
    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:281)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$4.run(MRAppMaster.java:1499)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1496)
    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1429)

Spark version: spark-1.5.2-bin-hadoop2.6
Hadoop: hadoop-2.6.2
HBase: hbase-1.1.5
Oozie: oozie-4.2.0

A snapshot of my pom.xml is:

<dependency>
   <groupId>org.apache.zookeeper</groupId>
   <artifactId>zookeeper</artifactId>
   <version>3.4.8</version>
   <type>pom</type>
</dependency>
<dependency>
   <groupId>org.apache.hbase</groupId>
   <artifactId>hbase-common</artifactId>
   <version>1.1.5</version>
   <exclusions>
      <exclusion>
         <groupId>org.slf4j</groupId>
         <artifactId>slf4j-log4j12</artifactId>
      </exclusion>
   </exclusions>
</dependency>

<dependency>
   <groupId>org.apache.hbase</groupId>
   <artifactId>hbase-client</artifactId>
   <version>1.1.5</version>
   <exclusions>
      <exclusion>
         <groupId>org.slf4j</groupId>
         <artifactId>slf4j-log4j12</artifactId>
      </exclusion>
   </exclusions>
</dependency>

<dependency>
   <groupId>org.apache.hbase</groupId>
   <artifactId>hbase-server</artifactId>
   <version>1.1.5</version>
   <exclusions>
      <exclusion>
         <groupId>org.slf4j</groupId>
         <artifactId>slf4j-log4j12</artifactId>
      </exclusion>
   </exclusions>
</dependency>
<dependency>
   <groupId>org.apache.hbase</groupId>
   <artifactId>hbase-testing-util</artifactId>
   <version>1.1.5</version>
</dependency>
<dependency>
   <groupId>org.apache.spark</groupId>
   <artifactId>spark-core_2.11</artifactId>
   <version>1.5.2</version>
   <exclusions>
      <exclusion>
         <artifactId>javax.servlet</artifactId>
         <groupId>org.eclipse.jetty.orbit</groupId>
      </exclusion>
   </exclusions>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.10 
-->
<dependency>
   <groupId>org.apache.spark</groupId>
   <artifactId>spark-sql_2.11</artifactId>
   <version>1.5.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-yarn_2.10 
-->
<dependency>
   <groupId>org.apache.spark</groupId>
   <artifactId>spark-yarn_2.11</artifactId>
   <version>1.5.2</version>
</dependency>


<!-- 
https://mvnrepository.com/artifact/org.mongodb.mongo-hadoop/mongo-hadoop-core 
-->
<dependency>
   <groupId>org.mongodb.mongo-hadoop</groupId>
   <artifactId>mongo-hadoop-core</artifactId>
   <version>1.5.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common 
-->
<dependency>
   <groupId>org.apache.hadoop</groupId>
   <artifactId>hadoop-common</artifactId>
   <version>2.6.2</version>
   <exclusions>
      <exclusion>
         <artifactId>servlet-api</artifactId>
         <groupId>javax.servlet</groupId>
      </exclusion>
      <exclusion>
         <artifactId>jetty-util</artifactId>
         <groupId>org.mortbay.jetty</groupId>
      </exclusion>
      <exclusion>
         <artifactId>jsp-api</artifactId>
         <groupId>javax.servlet.jsp</groupId>
      </exclusion>
   </exclusions>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-client 
-->
<dependency>
   <groupId>org.apache.hadoop</groupId>
   <artifactId>hadoop-client</artifactId>
   <version>2.6.2</version>
   <exclusions>
      <exclusion>
         <artifactId>jetty-util</artifactId>
         <groupId>org.mortbay.jetty</groupId>
      </exclusion>
   </exclusions>
</dependency>
<!-- 
https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-mapreduce-client-core
 -->
<dependency>
   <groupId>org.apache.hadoop</groupId>
   <artifactId>hadoop-mapreduce-client-core</artifactId>
   <version>2.6.2</version>
</dependency>
<dependency>
   <groupId>org.mongodb</groupId>
   <artifactId>mongo-java-driver</artifactId>
   <version>3.2.1</version>
</dependency>
<!-- hadoop dependency -->
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-core -->
<dependency>
   <groupId>org.apache.hadoop</groupId>
   <artifactId>hadoop-core</artifactId>
   <version>1.2.1</version>
   <exclusions>
      <exclusion>
         <artifactId>jetty-util</artifactId>
         <groupId>org.mortbay.jetty</groupId>
      </exclusion>
   </exclusions>
</dependency>

Till now I have searched several blogs. What I understand from reading them is that
there is some issue with the tachyon jar that is embedded in
spark-assembly-1.5.2-hadoop2.6.0.jar. I tried removing tachyon-0.5.0.jar and
tachyon-client-0.5.0.jar from the Oozie shared library (they were present under the
spark library), but then I started getting this error:

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain], main() threw exception, org.apache.spark.util.Utils$.DEFAULT_DRIVER_MEM_MB()I
java.lang.NoSuchMethodError: org.apache.spark.util.Utils$.DEFAULT_DRIVER_MEM_MB()I

Please help me debug and solve it.

Thanks,
Rohit
