Hi,

Rather than passing in the JARs using the '--jars' flag, what happens if you
include them all in the 'extraClassPath' settings, or in the SPARK_CLASSPATH
environment variable? That specific class, PhoenixConfigurationUtil, is in
the phoenix-core JAR; maybe do an 'unzip -l' on it to make sure it is in
fact included?
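For what it's worth, a rough sketch of what I mean, reusing the paths from
your mail (the exact list is up to you, but phoenix-core must be on both the
driver and executor classpaths, and the files must exist at those paths on
every node):

    # check that the class is really inside the phoenix-core JAR
    unzip -l /data/public/spark/libs/phoenix-core-4.6.0-HBase-1.1.jar \
        | grep PhoenixConfigurationUtil

    # colon-separated list, passed to both driver and executors
    PHOENIX_JARS=/data/public/spark/libs/phoenix-core-4.6.0-HBase-1.1.jar:/data/public/spark/libs/phoenix-spark-4.6.0-HBase-1.1.jar

    spark-submit \
        --conf "spark.driver.extraClassPath=$PHOENIX_JARS" \
        --conf "spark.executor.extraClassPath=$PHOENIX_JARS" \
        ...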

Alternatively, if you're familiar with git/patch/maven, you can check out
the 4.6.0-HBase-1.1 release, apply the patch from PHOENIX-2503, and use the
resulting 'client-spark' JAR in your classpath.
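If it helps, the rough shape of that workflow is below; the repository URL,
tag name, and patch file name are assumptions on my part, so check the JIRA
for the actual attachment:

    # clone Phoenix and check out the release tag (tag name assumed)
    git clone https://github.com/apache/phoenix.git
    cd phoenix
    git checkout v4.6.0-HBase-1.1

    # apply the patch attached to PHOENIX-2503 (file name illustrative)
    git apply PHOENIX-2503.patch

    # build; the client-spark JAR should land under phoenix-assembly/target
    mvn clean package -DskipTests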

Josh



On Wed, Dec 30, 2015 at 8:24 AM, sac...@outlook.com <sac...@outlook.com>
wrote:

> Hi Josh:
>
> I did add the JARs using --jars; the 'com.fasterxml.jackson' error
> disappeared, but now a new exception is raised:
>
> Exception in thread "main" java.lang.NoClassDefFoundError:
> org/apache/phoenix/mapreduce/util/PhoenixConfigurationUtil
>         at org.apache.phoenix.spark.PhoenixRDD.getPhoenixConfiguration(PhoenixRDD.scala:73)
>         at org.apache.phoenix.spark.PhoenixRDD.phoenixConf$lzycompute(PhoenixRDD.scala:39)
>         at org.apache.phoenix.spark.PhoenixRDD.phoenixConf(PhoenixRDD.scala:38)
>         at org.apache.phoenix.spark.PhoenixRDD.<init>(PhoenixRDD.scala:42)
>         .....
>
> My conf is as follows:
>
> --conf "spark.executor.extraClassPath=/data/public/spark/libs/phoenix-spark-4.6.0-HBase-1.1.jar" \
> --conf "spark.driver.extraClassPath=/data/public/spark/libs/phoenix-spark-4.6.0-HBase-1.1.jar" \
> --jars /data/public/mengfei/spark/libs/guava-12.0.1.jar, \
> /data/public/spark/libs/hbase-client-1.1.0.jar, \
> /data/public/spark/libs/hbase-common-1.1.0.jar, \
> /data/public/spark/libs/hbase-protocol-1.1.0.jar, \
> /data/public/spark/libs/hbase-server-1.1.0.jar, \
> /data/public/spark/libs/htrace-core-3.1.0-incubating.jar, \
> /data/public/spark/libs/phoenix-4.6.0-HBase-1.1-client.jar, \
> /data/public/spark/libs/phoenix-core-4.6.0-HBase-1.1.jar, \
> /data/public/spark/libs/phoenix-spark-4.6.0-HBase-1.1.jar
>
>
> Thank you
> ------------------------------
> sac...@outlook.com
>
>
> *From:* Josh Mahonin <jmaho...@gmail.com>
> *Date:* 2015-12-30 00:56
> *To:* user <user@phoenix.apache.org>
> *Subject:* Re: error when get data from Phoenix 4.5.2 on CDH 5.5.x by
> spark 1.5
> Hi,
>
> This issue is fixed with the following patch, and using the resulting
> 'client-spark' JAR after compilation:
> https://issues.apache.org/jira/browse/PHOENIX-2503
>
> As an alternative, you may also have some luck including updated
> com.fasterxml.jackson jackson-databind JARs in your app that are in sync
> with Spark's versions. Unfortunately, the client JAR right now ships
> fasterxml JARs that conflict with the Spark runtime.
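>
> A minimal sketch of that idea (the 2.4.x version below is only a guess;
> match whatever jackson version your Spark distribution actually ships):
>
>     # ship jackson JARs that agree with the Spark runtime (paths/versions illustrative)
>     spark-submit \
>       --jars /path/to/jackson-core-2.4.4.jar,/path/to/jackson-databind-2.4.4.jar \
>       ...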
>
> Another user has also had success by bundling their own Phoenix
> dependencies, if you want to try that out instead:
>
> http://mail-archives.apache.org/mod_mbox/incubator-phoenix-user/201512.mbox/%3c0f96d592-74d7-431a-b301-015374a6b...@sandia.gov%3E
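>
> That route amounts to building your application as a fat JAR that bundles
> the Phoenix dependencies itself, then submitting just that one artifact
> (the class and JAR names here are placeholders):
>
>     # the assembly JAR already contains phoenix-core / phoenix-spark
>     spark-submit --class com.example.MyApp my-app-assembly-0.1.jar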
>
> Josh
>
>
>
> On Tue, Dec 29, 2015 at 9:11 AM, sac...@outlook.com <sac...@outlook.com>
> wrote:
>
>> The error is:
>>
>> java.lang.NoSuchMethodError:
>> com.fasterxml.jackson.databind.Module$SetupContext.setClassIntrospector(Lcom/fasterxml/jackson/databind/introspect/ClassIntrospector;)V
>>         at com.fasterxml.jackson.module.scala.introspect.ScalaClassIntrospectorModule$$anonfun$1.apply(ScalaClassIntrospector.scala:32)
>>         at com.fasterxml.jackson.module.scala.introspect.ScalaClassIntrospectorModule$$anonfun$1.apply(ScalaClassIntrospector.scala:32)
>>         at com.fasterxml.jackson.module.scala.JacksonModule$$anonfun$setupModule$1.apply(JacksonModule.scala:47)
>>       .....
>>
>> The Scala code is:
>>
>> val df = sqlContext.load(
>>   "org.apache.phoenix.spark",
>>   Map("table" -> "AIS ", "zkUrl" -> "cdhzk1.ccco.com:2181")
>> )
>>
>>
>> Maybe I have found the reason: the Phoenix 4.5.2 on CDH 5.5.x is built
>> with Spark 1.4, and CDH 5.5's default Spark version is 1.5.
>>
>> So what should I do? Rebuild a Phoenix 4.5.2 version with Spark 1.5, or
>> change the CDH Spark to 1.4? Apparently these are difficult for me. Could
>> someone help me? Thank you very much.
>>
>>
>>
>
>
