Make sure to double-check your imports. Note the following from 
https://phoenix.apache.org/phoenix_spark.html


import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext
import org.apache.phoenix.spark._
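
For reference, here's a minimal write/read sketch in the spirit of that page. The 
table name OUTPUT_TEST_TABLE, its ID/COL1/COL2 columns, and the localhost:2181 
ZooKeeper quorum below are placeholders I've assumed, so adjust them to your cluster:

val sc = new SparkContext("local", "phoenix-test")
val sqlContext = new SQLContext(sc)

// Save an RDD of tuples to a Phoenix table. Assumes the table was created as
// CREATE TABLE OUTPUT_TEST_TABLE (ID BIGINT PRIMARY KEY, COL1 VARCHAR, COL2 INTEGER)
val dataSet = List((1L, "1", 1), (2L, "2", 2), (3L, "3", 3))
sc.parallelize(dataSet)
  .saveToPhoenix("OUTPUT_TEST_TABLE", Seq("ID", "COL1", "COL2"),
    zkUrl = Some("localhost:2181"))

// Load the same table back as a DataFrame and print it
val df = sqlContext.phoenixTableAsDataFrame("OUTPUT_TEST_TABLE",
  Array("ID", "COL1", "COL2"), zkUrl = Some("localhost:2181"))
df.show()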

There's also a sample repository here: 
https://github.com/jmahonin/spark-graphx-phoenix

From: Hardika Catur Sapta
Reply-To: "user@phoenix.apache.org<mailto:user@phoenix.apache.org>"
Date: Tuesday, September 29, 2015 at 5:28 AM
To: "user@phoenix.apache.org<mailto:user@phoenix.apache.org>"
Subject: Re: integration Phoenix and Spark

/spark/Project Spark$ scala SavingPhoenix.scala
/home/hduser/spark/Project Spark/SavingPhoenix.scala:1: error: object spark is 
not a member of package phoenix.org.apache
import phoenix.org.apache.spark.SparkContext
                          ^
/home/hduser/spark/Project Spark/SavingPhoenix.scala:4: error: not found: type 
SparkContext
val sc = new SparkContext("local", "phoenix-test")
             ^
two errors found


2015-09-29 16:20 GMT+07:00 Konstantinos Kougios 
<kostas.koug...@googlemail.com>:
Hi,

Just to add that, at least for hadoop-2.7.1 and Phoenix 4.5.2-HBase-1.1, the Hadoop 
guava lib has to be patched to 14.0.1 (under hadoop/share/hadoop/common/lib); 
otherwise Spark tasks might fail due to missing guava methods.

Cheers


On 29/09/15 10:17, Hardika Catur Sapta wrote:
Spark setup

  1.  Ensure that all requisite Phoenix / HBase platform dependencies are 
available on the classpath for the Spark executors and drivers

  2.  One method is to add the phoenix-4.4.0-client.jar to ‘SPARK_CLASSPATH’ in 
spark-env.sh, or to set both ‘spark.executor.extraClassPath’ and 
‘spark.driver.extraClassPath’ in spark-defaults.conf so they include the client jar

  3.  To help your IDE, you may want to add the following ‘provided’ dependency

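In sbt form, the ‘provided’ dependency the docs refer to would look roughly like the 
line below (the docs show a Maven equivalent; the 4.4.0-HBase-1.1 version string is an 
assumption, so match it to your Phoenix/HBase build):

// build.sbt -- "provided" so the phoenix client jar on SPARK_CLASSPATH is used at runtime
libraryDependencies += "org.apache.phoenix" % "phoenix-spark" % "4.4.0-HBase-1.1" % "provided"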

Sorry for my bad English.

How exactly do I do steps 2 and 3?

Please explain step by step.


Thanks.


