Are you using RC4?

On Wed, Jun 3, 2015 at 10:58 PM, Night Wolf <nightwolf...@gmail.com> wrote:

> Thanks Yin, that seems to work with the shell, but a compiled application
> submitted with spark-submit still fails with the same exception.
>
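> For the compiled job, I assume the equivalent is to pass the setting via
> --conf on the spark-submit command line, roughly like the sketch below
> (the master, class, and jar names here are placeholders, not from my
> actual job):
>
> ./bin/spark-submit \
>   --master yarn-client \
>   --class com.example.MyApp \
>   --conf spark.sql.hive.metastore.sharedPrefixes=com.mysql.jdbc,org.postgresql,com.microsoft.sqlserver,oracle.jdbc,com.mapr.fs.shim.LibraryLoader,com.mapr.security.JNISecurity,com.mapr.fs.jni \
>   my-app.jar
>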
> On Thu, Jun 4, 2015 at 2:46 PM, Yin Huai <yh...@databricks.com> wrote:
>
>> Can you put the following setting in spark-defaults.conf and try again?
>>
>> spark.sql.hive.metastore.sharedPrefixes
>> com.mysql.jdbc,org.postgresql,com.microsoft.sqlserver,oracle.jdbc,com.mapr.fs.shim.LibraryLoader,com.mapr.security.JNISecurity,com.mapr.fs.jni
>>
>> https://issues.apache.org/jira/browse/SPARK-7819 has more context about
>> it.
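>>
>> For background: the JVM only lets a given native library be loaded by a
>> single classloader, and in 1.4 the Hive metastore client runs inside an
>> isolated classloader, so without the shared prefixes the MapR shim ends
>> up trying to load libMapRClient a second time and fails. If editing
>> spark-defaults.conf is inconvenient, the same value can presumably also
>> be set on the SparkConf before the contexts are created; an untested
>> sketch:
>>
>> import org.apache.spark.{SparkConf, SparkContext}
>> import org.apache.spark.sql.hive.HiveContext
>>
>> val conf = new SparkConf()
>>   .setAppName("hive-metastore-test") // placeholder app name
>>   // Class-name prefixes delegated to the root classloader instead of
>>   // the isolated Hive client loader, so the classes that load native
>>   // libraries are only ever loaded once.
>>   .set("spark.sql.hive.metastore.sharedPrefixes",
>>     "com.mysql.jdbc,org.postgresql,com.microsoft.sqlserver,oracle.jdbc," +
>>     "com.mapr.fs.shim.LibraryLoader,com.mapr.security.JNISecurity,com.mapr.fs.jni")
>>
>> val sc = new SparkContext(conf) // master supplied by spark-submit
>> val sqlContext = new HiveContext(sc) // metastore client is created lazily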
>>
>> On Wed, Jun 3, 2015 at 9:38 PM, Night Wolf <nightwolf...@gmail.com>
>> wrote:
>>
>>> Hi all,
>>>
>>> Trying out Spark 1.4 RC4 on MapR4/Hadoop 2.5.1 running in yarn-client
>>> mode with Hive support.
>>>
>>> *Build command:*
>>> ./make-distribution.sh --name mapr4.0.2_yarn_j6_2.10 --tgz -Pyarn
>>> -Pmapr4 -Phadoop-2.4 -Phive -Phadoop-provided
>>> -Dhadoop.version=2.5.1-mapr-1501 -Dyarn.version=2.5.1-mapr-1501 -DskipTests
>>> -e -X
>>>
>>>
>>> When running a Hive query in the spark-shell, e.g. *sqlContext.sql("show
>>> tables")*, I get the following exception:
>>>
>>> scala> sqlContext.sql("show tables")
>>> 15/06/04 04:33:16 INFO hive.HiveContext: Initializing
>>> HiveMetastoreConnection version 0.13.1 using Spark classes.
>>> java.lang.reflect.InvocationTargetException
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at com.mapr.fs.ShimLoader.loadNativeLibrary(ShimLoader.java:323)
>>> at com.mapr.fs.ShimLoader.load(ShimLoader.java:198)
>>> at
>>> org.apache.hadoop.conf.CoreDefaultProperties.<clinit>(CoreDefaultProperties.java:59)
>>> at java.lang.Class.forName0(Native Method)
>>> at java.lang.Class.forName(Class.java:274)
>>> at
>>> org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:1857)
>>> at
>>> org.apache.hadoop.conf.Configuration.getProperties(Configuration.java:2072)
>>> at
>>> org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2282)
>>> at
>>> org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2234)
>>> at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2151)
>>> at org.apache.hadoop.conf.Configuration.set(Configuration.java:1002)
>>> at org.apache.hadoop.conf.Configuration.set(Configuration.java:974)
>>> at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:518)
>>> at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:536)
>>> at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:430)
>>> at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:1366)
>>> at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:1332)
>>> at
>>> org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:99)
>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>> at
>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>> at
>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>> at
>>> org.apache.spark.sql.hive.client.IsolatedClientLoader.liftedTree1$1(IsolatedClientLoader.scala:170)
>>> at
>>> org.apache.spark.sql.hive.client.IsolatedClientLoader.<init>(IsolatedClientLoader.scala:166)
>>> at
>>> org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:212)
>>> at
>>> org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:175)
>>> at
>>> org.apache.spark.sql.hive.HiveContext$$anon$2.<init>(HiveContext.scala:370)
>>> at
>>> org.apache.spark.sql.hive.HiveContext.catalog$lzycompute(HiveContext.scala:370)
>>> at org.apache.spark.sql.hive.HiveContext.catalog(HiveContext.scala:369)
>>> at
>>> org.apache.spark.sql.hive.HiveContext$$anon$1.<init>(HiveContext.scala:382)
>>> at
>>> org.apache.spark.sql.hive.HiveContext.analyzer$lzycompute(HiveContext.scala:382)
>>> at org.apache.spark.sql.hive.HiveContext.analyzer(HiveContext.scala:381)
>>> at
>>> org.apache.spark.sql.SQLContext$QueryExecution.assertAnalyzed(SQLContext.scala:901)
>>> at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:131)
>>> at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
>>> at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:725)
>>> at
>>> $line37.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:21)
>>> at $line37.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:26)
>>> at $line37.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:28)
>>> at $line37.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:30)
>>> at $line37.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:32)
>>> at $line37.$read$$iwC$$iwC$$iwC.<init>(<console>:34)
>>> at $line37.$read$$iwC$$iwC.<init>(<console>:36)
>>> at $line37.$read$$iwC.<init>(<console>:38)
>>> at $line37.$read.<init>(<console>:40)
>>> at $line37.$read$.<init>(<console>:44)
>>> at $line37.$read$.<clinit>(<console>)
>>> at $line37.$eval$.<init>(<console>:7)
>>> at $line37.$eval$.<clinit>(<console>)
>>> at $line37.$eval.$print(<console>)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at
>>> org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>>> at
>>> org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
>>> at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>>> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>>> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>>> at
>>> org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
>>> at
>>> org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
>>> at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
>>> at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
>>> at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
>>> at
>>> org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
>>> at
>>> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
>>> at
>>> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>>> at
>>> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>>> at
>>> scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>>> at
>>> org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>>> at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>>> at org.apache.spark.repl.Main$.main(Main.scala:31)
>>> at org.apache.spark.repl.Main.main(Main.scala)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at
>>> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
>>> at
>>> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
>>> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
>>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
>>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>> Caused by: java.lang.UnsatisfiedLinkError: Native Library
>>> /tmp/mapr-nw-libMapRClient.1.4.0-SNAPSHOT.so already loaded in another
>>> classloader
>>> at java.lang.ClassLoader.loadLibrary1(ClassLoader.java:1931)
>>> at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1890)
>>> at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1851)
>>> at java.lang.Runtime.load0(Runtime.java:795)
>>> at java.lang.System.load(System.java:1062)
>>> at com.mapr.fs.shim.LibraryLoader.load(LibraryLoader.java:29)
>>> ... 86 more
>>> java.lang.UnsatisfiedLinkError: Native Library
>>> /tmp/mapr-nw-libMapRClient.1.4.0-SNAPSHOT.so already loaded in another
>>> classloader
>>> at java.lang.ClassLoader.loadLibrary1(ClassLoader.java:1931)
>>> at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1890)
>>> at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1851)
>>> at java.lang.Runtime.load0(Runtime.java:795)
>>> at java.lang.System.load(System.java:1062)
>>> at com.mapr.fs.shim.LibraryLoader.load(LibraryLoader.java:29)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at com.mapr.fs.ShimLoader.loadNativeLibrary(ShimLoader.java:323)
>>> at com.mapr.fs.ShimLoader.load(ShimLoader.java:198)
>>> at
>>> org.apache.hadoop.conf.CoreDefaultProperties.<clinit>(CoreDefaultProperties.java:59)
>>> at java.lang.Class.forName0(Native Method)
>>> at java.lang.Class.forName(Class.java:274)
>>> at
>>> org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:1857)
>>> at
>>> org.apache.hadoop.conf.Configuration.getProperties(Configuration.java:2072)
>>> at
>>> org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2282)
>>> at
>>> org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2234)
>>> at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2151)
>>> at org.apache.hadoop.conf.Configuration.set(Configuration.java:1002)
>>> at org.apache.hadoop.conf.Configuration.set(Configuration.java:974)
>>> at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:518)
>>> at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:536)
>>> at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:430)
>>> at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:1366)
>>> at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:1332)
>>> at
>>> org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:99)
>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>> at
>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>> at
>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>> at
>>> org.apache.spark.sql.hive.client.IsolatedClientLoader.liftedTree1$1(IsolatedClientLoader.scala:170)
>>> at
>>> org.apache.spark.sql.hive.client.IsolatedClientLoader.<init>(IsolatedClientLoader.scala:166)
>>> at
>>> org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:212)
>>> at
>>> org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:175)
>>> at
>>> org.apache.spark.sql.hive.HiveContext$$anon$2.<init>(HiveContext.scala:370)
>>> at
>>> org.apache.spark.sql.hive.HiveContext.catalog$lzycompute(HiveContext.scala:370)
>>> at org.apache.spark.sql.hive.HiveContext.catalog(HiveContext.scala:369)
>>> at
>>> org.apache.spark.sql.hive.HiveContext$$anon$1.<init>(HiveContext.scala:382)
>>> at
>>> org.apache.spark.sql.hive.HiveContext.analyzer$lzycompute(HiveContext.scala:382)
>>> at org.apache.spark.sql.hive.HiveContext.analyzer(HiveContext.scala:381)
>>> at
>>> org.apache.spark.sql.SQLContext$QueryExecution.assertAnalyzed(SQLContext.scala:901)
>>> at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:131)
>>> at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
>>> at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:725)
>>> at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:21)
>>> at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:26)
>>> at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:28)
>>> at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:30)
>>> at $iwC$$iwC$$iwC$$iwC.<init>(<console>:32)
>>> at $iwC$$iwC$$iwC.<init>(<console>:34)
>>> at $iwC$$iwC.<init>(<console>:36)
>>> at $iwC.<init>(<console>:38)
>>> at <init>(<console>:40)
>>> at .<init>(<console>:44)
>>> at .<clinit>(<console>)
>>> at .<init>(<console>:7)
>>> at .<clinit>(<console>)
>>> at $print(<console>)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at
>>> org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>>> at
>>> org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
>>> at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>>> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>>> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>>> at
>>> org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
>>> at
>>> org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
>>> at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
>>> at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
>>> at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
>>> at
>>> org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
>>> at
>>> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
>>> at
>>> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>>> at
>>> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>>> at
>>> scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>>> at
>>> org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>>> at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>>> at org.apache.spark.repl.Main$.main(Main.scala:31)
>>> at org.apache.spark.repl.Main.main(Main.scala)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at
>>> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
>>> at
>>> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
>>> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
>>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
>>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>>
>>> Any ideas what is causing this?
>>>
>>> Cheers,
>>> ~N
>>>
>>
>>
>
