[ https://issues.apache.org/jira/browse/SPARK-15368?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15287658#comment-15287658 ]

Dawson Choong edited comment on SPARK-15368 at 5/17/16 9:45 PM:
----------------------------------------------------------------

I see, sorry for the confusion. May I ask how I would specify an argument to 
the history server? I need to add a library of JARs to the history server 
classpath. [~srowen]
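
For example, would an export along these lines in {{conf/spark-env.sh}} be the intended way to do it (the directory below is only a placeholder for wherever our client JARs actually live)?

{code}
# conf/spark-env.sh -- sourced when the history server daemon is started
# Placeholder path; append every JAR in the library directory to the classpath.
export SPARK_CLASSPATH="$SPARK_CLASSPATH:/path/to/client/libs/*"
{code}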


was (Author: dawson.choong):
I see, sorry for the confusion. May I ask how I would specify an argument to 
the history server? I need to add a library of JARs to the history server 
classpath [~srowen]

> Spark History Server does not pick up extraClasspath
> ----------------------------------------------------
>
>                 Key: SPARK-15368
>                 URL: https://issues.apache.org/jira/browse/SPARK-15368
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.6.1
>         Environment: HDP-2.4
> CentOS
>            Reporter: Dawson Choong
>
> We've encountered a problem where the Spark History Server is not picking up 
> the {{spark.driver.extraClassPath}} parameter set in the {{Custom 
> spark-defaults}} section inside Ambari. Because the needed JARs never make it 
> onto the classpath, the history server fails with a {{ClassNotFoundException}}. 
> (Our current workaround is to manually export the JARs in spark-env.sh.)
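> For reference, the entry has the following form in the {{Custom spark-defaults}} 
> section (the path here is a placeholder for our actual library directory):
> {code}
> spark.driver.extraClassPath /path/to/client/libs/*
> {code}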
> Log file:
> Spark Command: /usr/java/default/bin/java -Dhdp.version=2.4.0.0-169 -cp 
> /usr/hdp/2.4.0.0-169/spark/sbin/../conf/:/usr/hdp/2.4.0.0-169/spark/lib/spark-assembly-1.6.0.2.4.0.0-169-hadoop2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/spark/lib/datanucleus-core-3.2.10.jar:/usr/hdp/2.4.0.0-169/spark/lib/datanucleus-rdbms-3.2.9.jar:/usr/hdp/2.4.0.0-169/spark/lib/datanucleus-api-jdo-3.2.6.jar:/usr/hdp/current/hadoop-client/conf/
>  -Xms1g -Xmx1g -XX:MaxPermSize=256m 
> org.apache.spark.deploy.history.HistoryServer
> ========================================
> 16/04/12 12:23:44 INFO HistoryServer: Registered signal handlers for [TERM, 
> HUP, INT]
> 16/04/12 12:23:45 WARN NativeCodeLoader: Unable to load native-hadoop library 
> for your platform... using builtin-java classes where applicable
> 16/04/12 12:23:45 INFO SecurityManager: Changing view acls to: spark
> 16/04/12 12:23:45 INFO SecurityManager: Changing modify acls to: spark
> 16/04/12 12:23:45 INFO SecurityManager: SecurityManager: authentication 
> disabled; ui acls disabled; users with view permissions: Set(spark); users 
> with modify permissions: Set(spark)
> Exception in thread "main" java.lang.reflect.InvocationTargetException
>       at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>       at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>       at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>       at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>       at 
> org.apache.spark.deploy.history.HistoryServer$.main(HistoryServer.scala:235)
>       at 
> org.apache.spark.deploy.history.HistoryServer.main(HistoryServer.scala)
> Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: 
> Class com.wandisco.fs.client.FusionHdfs not found
>       at 
> org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2195)
>       at 
> org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2638)
>       at 
> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2651)
>       at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:92)
>       at 
> org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2687)
>       at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2669)
>       at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:371)
>       at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:362)
>       at org.apache.spark.util.Utils$.getHadoopFileSystem(Utils.scala:1650)
>       at org.apache.spark.util.Utils$.getHadoopFileSystem(Utils.scala:1657)
>       at 
> org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:71)
>       at 
> org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:49)
>       ... 6 more
> Caused by: java.lang.ClassNotFoundException: Class 
> com.wandisco.fs.client.FusionHdfs not found
>       at 
> org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2101)
>       at 
> org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2193)
>       ... 17 more


