Hi,

The error "java.sql.SQLException: No suitable driver found..." is typically
thrown when the worker nodes can't find Phoenix on the class path.
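
A quick way to confirm is to try loading the driver class on the executors
(just a sketch; run it in spark-shell, where 'sc' is already defined):

sc.parallelize(1 to 4, 4)
  .map(_ => Class.forName("org.apache.phoenix.jdbc.PhoenixDriver").getName)
  .collect()
// any worker missing the JAR will throw ClassNotFoundException here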

I'm not certain whether passing those values with '--conf' actually works
in Spark. I tend to set them in 'spark-defaults.conf' in the Spark
configuration folder; restarting the master and workers may be required as
well.
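
For example, in spark-defaults.conf (using the JAR path from your
spark-submit script; adjust if it lives elsewhere on each node):

spark.executor.extraClassPath  /data/public/mengfei/lib/phoenix-1.2.0-client-spark.jar
spark.driver.extraClassPath    /data/public/mengfei/lib/phoenix-1.2.0-client-spark.jar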

Josh

On Tue, Jan 5, 2016 at 5:38 AM, mengfei <sac...@outlook.com> wrote:

> hi josh:
>
> Thank you for your advice; it did work. I built the client-spark JAR by
> applying the patch to the CDH code, and it succeeded.
> Then I ran some code in "local" mode and the result was correct. But in
> "yarn-client" mode, this error occurred:
>
>
> java.lang.RuntimeException: java.sql.SQLException: No suitable driver found
> for jdbc:phoenix:cdhzk1.boloomo.com,cdhzk2.boloomo.com,cdhzk3.boloomo.com:2181;
>
> I tried every approach I know of or could find in the community, but none
> helped. So I'm asking for your help; thank you for your patience.
>
> My code is:
>
> import org.apache.spark.SparkContext
> import org.apache.spark.sql.SQLContext
> import org.apache.phoenix.spark._   // adds phoenixTableAsRDD to SparkContext
> import org.apache.phoenix.jdbc.PhoenixDriver
> import java.sql.DriverManager
>
> // register the Phoenix JDBC driver explicitly
> DriverManager.registerDriver(new PhoenixDriver)
>
> val pred = "MMSI = '002190048'"
>
> val rdd = sc.phoenixTableAsRDD(
>   "AIS_WS",
>   Seq("MMSI", "LON", "LAT", "RID"),
>   predicate = Some(pred),
>   zkUrl = Some("cdhzk1.boloomo.com,cdhzk2.boloomo.com,cdhzk3.boloomo.com"))
>
> println(rdd.count())
>
>
> My scripts are:
>
> spark-submit \
>   --master yarn-cluster \
>   --driver-class-path "/data/public/mengfei/lib/phoenix-1.2.0-client-spark.jar" \
>   --conf "spark.executor.extraClassPath=/data/public/mengfei/lib/phoenix-1.2.0-client-spark.jar" \
>   --conf "spark.driver.extraClassPath=/data/public/mengfei/lib/phoenix-1.2.0-client-spark.jar" \
>   --jars /data/public/mengfei/lib/phoenix-1.2.0-client-spark.jar \
>
>
> spark-shell \
>   --master yarn-client -v \
>   --driver-class-path "/opt/cloudera/parcels/CLABS_PHOENIX/lib/phoenix/phoenix-1.2.0-client-spark.jar" \
>   --conf "spark.executor.extraClassPath=/opt/cloudera/parcels/CLABS_PHOENIX/lib/phoenix/phoenix-1.2.0-client-spark.jar" \
>   --conf "spark.driver.extraClassPath=/opt/cloudera/parcels/CLABS_PHOENIX/lib/phoenix/phoenix-1.2.0-client-spark.jar" \
>   --jars /opt/cloudera/parcels/CLABS_PHOENIX/lib/phoenix/phoenix-1.2.0-client-spark.jar
>
>
>
> P.S. I did copy the JAR to every node, and even gave it 777 permissions.
>
> ------------------------------
> sac...@outlook.com
>
>
> *From:* Josh Mahonin <jmaho...@gmail.com>
> *Date:* 2015-12-30 00:56
> *To:* user <user@phoenix.apache.org>
> *Subject:* Re: error when get data from Phoenix 4.5.2 on CDH 5.5.x by
> spark 1.5
> Hi,
>
> This issue is fixed by the following patch; use the resulting
> 'client-spark' JAR after compilation:
> https://issues.apache.org/jira/browse/PHOENIX-2503
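>
> If it helps, building that JAR from the patched source is roughly (a
> sketch; the exact module layout may differ per branch):
>
> git apply PHOENIX-2503.patch   # hypothetical patch file name
> mvn clean package -DskipTests
> # then pick up the phoenix-*-client-spark.jar from phoenix-assembly/target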
>
> As an alternative, you may also have some luck including updated
> com.fasterxml.jackson jackson-databind JARs in your app that are in sync
> with Spark's versions. Unfortunately, the client JAR currently ships
> fasterxml JARs that conflict with the Spark runtime.
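>
> For example, in sbt (a sketch; 2.4.4 matches what Spark 1.5 ships, but
> verify against your distribution):
>
> libraryDependencies ++= Seq(
>   // pin jackson to Spark's version to avoid the runtime conflict
>   "com.fasterxml.jackson.core" % "jackson-databind" % "2.4.4",
>   "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.4.4"
> )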
>
> Another user has also had success by bundling their own Phoenix
> dependencies, if you want to try that out instead:
>
> http://mail-archives.apache.org/mod_mbox/incubator-phoenix-user/201512.mbox/%3c0f96d592-74d7-431a-b301-015374a6b...@sandia.gov%3E
>
> Josh
>
>
>
> On Tue, Dec 29, 2015 at 9:11 AM, sac...@outlook.com <sac...@outlook.com>
> wrote:
>
>> The error is
>>
>> java.lang.NoSuchMethodError:
>> com.fasterxml.jackson.databind.Module$SetupContext.setClassIntrospector(Lcom/fasterxml/jackson/databind/introspect/ClassIntrospector;)V
>>
>>         at
>> com.fasterxml.jackson.module.scala.introspect.ScalaClassIntrospectorModule$$anonfun$1.apply(ScalaClassIntrospector.scala:32)
>>
>>         at
>> com.fasterxml.jackson.module.scala.introspect.ScalaClassIntrospectorModule$$anonfun$1.apply(ScalaClassIntrospector.scala:32)
>>
>>         at
>> com.fasterxml.jackson.module.scala.JacksonModule$$anonfun$setupModule$1.apply(JacksonModule.scala:47)
>>
>>       …..
>>
>> The scala code is
>>
>> val df = sqlContext.load(
>>   "org.apache.phoenix.spark",
>>   Map("table" -> "AIS ", "zkUrl" -> "cdhzk1.ccco.com:2181")
>> )
>>
>>
>>
>> Maybe I found the reason: Phoenix 4.5.2 on CDH 5.5.x is built against
>> Spark 1.4, and CDH 5.5's default Spark version is 1.5.
>>
>> So what should I do: rebuild Phoenix 4.5.2 against Spark 1.5, or change
>> the CDH Spark to 1.4? Apparently both are difficult for me. Could someone
>> help me? Thank you very much.
>>
>>
>>
>
>
