[ https://issues.apache.org/jira/browse/SPARK-7819?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14590010#comment-14590010 ]

Yin Huai commented on SPARK-7819:
---------------------------------

[~mandoskippy] Yeah, it is the official way to work around the problem. In 
1.4.0, we introduced isolated class loaders so that Spark SQL can connect to 
different versions of the Hive metastore. Basically, our metastore client uses 
a different class loader than the rest of Spark SQL. I am attaching the doc for 
this conf ({{spark.sql.hive.metastore.sharedPrefixes}}); hope this is helpful.

{quote}
A comma separated list of class prefixes that should be loaded using the 
classloader that is shared between Spark SQL and a specific version of Hive. An 
example of classes that should be shared is JDBC drivers that are needed to 
talk to the metastore. Other classes that need to be shared are those that 
interact with classes that are already shared.  For example, custom appenders 
that are used by log4j.
{quote}
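
As a minimal PySpark sketch of setting this conf before the HiveContext is 
created: the {{com.mapr.fs}} prefix below is only a placeholder for whatever 
prefix covers the MapR classes that load the native library, and the JDBC 
prefixes are kept in case setting this conf replaces the built-in defaults.

{noformat}
from pyspark import SparkConf, SparkContext
from pyspark.sql import HiveContext

# Classes under these prefixes are loaded by the class loader shared between
# Spark SQL and the Hive metastore client, so the MapR native library is only
# loaded once. "com.mapr.fs" is a placeholder prefix; adjust it to match the
# classes in your MapR client jars.
conf = (SparkConf()
        .set("spark.sql.hive.metastore.sharedPrefixes",
             "com.mysql.jdbc,org.postgresql,com.microsoft.sqlserver,"
             "oracle.jdbc,com.mapr.fs"))

sc = SparkContext(conf=conf)
sqlContext = HiveContext(sc)
{noformat}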


> Isolated Hive Client Loader appears to cause Native Library 
> libMapRClient.4.0.2-mapr.so already loaded in another classloader error
> -----------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-7819
>                 URL: https://issues.apache.org/jira/browse/SPARK-7819
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.0
>            Reporter: Fi
>            Assignee: Yin Huai
>            Priority: Critical
>             Fix For: 1.4.0
>
>         Attachments: invalidClassException.log, stacktrace.txt, test.py
>
>
> In reference to the pull request: https://github.com/apache/spark/pull/5876
> I have been running the Spark 1.3 branch for some time with no major hiccups, 
> and recently switched to the Spark 1.4 branch.
> I build my Spark distribution with the following build command:
> {noformat}
> make-distribution.sh --tgz --skip-java-test --with-tachyon -Phive 
> -Phive-0.13.1 -Pmapr4 -Pspark-ganglia-lgpl -Pkinesis-asl -Phive-thriftserver
> {noformat}
> When running a python script containing a series of smoke tests I use to 
> validate the build, I encountered an error under the following conditions:
> * start a spark context
> * start a hive context
> * run any hive query
> * stop the spark context
> * start a second spark context
> * run any hive query
> ** ERROR
> From what I can tell, the Isolated Class Loader is hitting a MapR class that 
> loads its native library (presumably as part of a static initializer).
> Unfortunately, the JVM prohibits loading the same native library from a 
> second class loader.
> I would think that shutting down the SparkContext would clear out any 
> vestiges in the JVM, so I'm surprised that this would even be a problem.
> Note: all other smoke tests we are running pass fine.
> I will attach the stacktrace and a python script reproducing the issue (at 
> least for my environment and build).
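> For reference, a minimal sketch of such a repro (assuming the PySpark 1.4 
> APIs; the attached test.py is the authoritative version):
> {noformat}
> from pyspark import SparkContext
> from pyspark.sql import HiveContext
>
> # First context: the Hive query works.
> sc = SparkContext(appName="smoke-test")
> HiveContext(sc).sql("SHOW TABLES").collect()
> sc.stop()
>
> # Second context in the same JVM: the Hive query fails with
> # "Native Library libMapRClient.4.0.2-mapr.so already loaded in another classloader".
> sc = SparkContext(appName="smoke-test-2")
> HiveContext(sc).sql("SHOW TABLES").collect()
> {noformat}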


