(1) You only need the client jar (phoenix-xxxx-client.jar).
(2) Set spark.executor.extraClassPath in spark-defaults.conf to point at that
client jar.
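
As a minimal sketch, using the jar location from your spark-submit command
below (adjust the path to wherever the client jar actually lives on your
nodes), the spark-defaults.conf entries would look something like:

```
# spark-defaults.conf
# Note: extraClassPath is a local filesystem path, so the jar must exist
# at this location on every node that runs an executor (and on the driver
# host if you also set the driver property).
spark.executor.extraClassPath  /root/apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-4.8.0-HBase-1.2-client.jar
spark.driver.extraClassPath    /root/apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-4.8.0-HBase-1.2-client.jar
```

Unlike --jars, extraClassPath does not ship the jar to the cluster; it only
prepends an existing local path to the JVM classpath, which is why the jar
has to be present on each worker.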
Hope that helps.

Thanks,
Sergey

On Tue, Oct 25, 2016 at 9:31 PM, min zou <[email protected]> wrote:

> Dear all, I use Spark to do data analysis, then save the result to Phoenix.
> When I run the application in IntelliJ IDEA in local mode, the application
> runs fine, but when I run it with spark-submit (spark-submit --class
> com.bigdata.main.RealTimeMain --master yarn --driver-memory 2G
> --executor-memory 2G --num-executors 5 /home/zt/rt-analyze-1.0-SNAPSHOT.jar)
> on my cluster, I get an error: Caused by: java.lang.ClassNotFoundException:
> Class org.apache.phoenix.mapreduce.PhoenixOutputFormat not found.
>
> Exception in thread "main" java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.phoenix.mapreduce.PhoenixOutputFormat not found
>     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2112)
>     at org.apache.hadoop.mapreduce.task.JobContextImpl.getOutputFormatClass(JobContextImpl.java:232)
>     at org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopDataset(PairRDDFunctions.scala:971)
>     at org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopFile(PairRDDFunctions.scala:903)
>     at org.apache.phoenix.spark.ProductRDDFunctions.saveToPhoenix(ProductRDDFunctions.scala:51)
>     at com.mypackage.save(DAOImpl.scala:41)
>     at com.mypackage.ProtoStreamingJob.execute(ProtoStreamingJob.scala:58)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:606)
>     at com.mypackage.SparkApplication.sparkRun(SparkApplication.scala:95)
>     at com.mypackage.SparkApplication$delayedInit$body.apply(SparkApplication.scala:112)
>     at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
>     at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
>     at scala.App$$anonfun$main$1.apply(App.scala:71)
>     at scala.App$$anonfun$main$1.apply(App.scala:71)
>     at scala.collection.immutable.List.foreach(List.scala:318)
>     at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
>     at scala.App$class.main(App.scala:71)
>     at com.mypackage.SparkApplication.main(SparkApplication.scala:15)
>     at com.mypackage.ProtoStreamingJobRunner.main(ProtoStreamingJob.scala)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:606)
>     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
>     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
>     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
>     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
>     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: java.lang.ClassNotFoundException: Class org.apache.phoenix.mapreduce.PhoenixOutputFormat not found
>     at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2018)
>     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2110)
>     ... 30 more
>
>
> Then I tried spark-submit with --jars (spark-submit --class
> com.bigdata.main.RealTimeMain --master yarn --jars
> /root/apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-spark-4.8.0-HBase-1.2.jar,/root/apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-4.8.0-HBase-1.2-client.jar,/root/apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-core-4.8.0-HBase-1.2.jar
> --driver-memory 2G --executor-memory 2G --num-executors 5
> /home/zm/rt-analyze-1.0-SNAPSHOT.jar) and got the same error. My cluster is
> CDH 5.7, Phoenix 4.8.0, HBase 1.2, Spark 1.6. How can I solve the problem?
> Please help me. Thanks.
>
