/spark/Project Spark$ scala SavingPhoenix.scala
/home/hduser/spark/Project Spark/SavingPhoenix.scala:1: error: object spark
is not a member of package phoenix.org.apache
import phoenix.org.apache.spark.SparkContext
                          ^
/home/hduser/spark/Project Spark/SavingPhoenix.scala:4: error: not found:
type SparkContext
val sc = new SparkContext("local", "phoenix-test")
             ^
two errors found
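
The first error points at the import itself: SparkContext lives under
org.apache.spark, not phoenix.org.apache.spark, and the Phoenix integration is
brought in separately through org.apache.phoenix.spark._. Below is a minimal
sketch of what SavingPhoenix.scala presumably intends; the table name, column
names, sample rows and ZooKeeper quorum are only placeholders, and the script
still needs Spark and the Phoenix client jar on its classpath (e.g. run via
spark-submit or spark-shell rather than the bare scala command).

    // Corrected imports: SparkContext from Spark itself, plus the
    // phoenix-spark implicits that add saveToPhoenix to RDDs of tuples.
    import org.apache.spark.SparkContext
    import org.apache.phoenix.spark._

    object SavingPhoenix {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext("local", "phoenix-test")

        // Placeholder data, table and ZooKeeper quorum -- adjust to your setup.
        val rows = Seq((1L, "foo"), (2L, "bar"))
        sc.parallelize(rows).saveToPhoenix(
          "OUTPUT_TABLE",          // assumed Phoenix table name
          Seq("ID", "COL1"),       // assumed column names
          zkUrl = Some("localhost:2181"))

        sc.stop()
      }
    }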


2015-09-29 16:20 GMT+07:00 Konstantinos Kougios <
kostas.koug...@googlemail.com>:

> Hi,
>
> Just to add that, at least for hadoop-2.7.1 and phoenix 4.5.2-HBase-1.1,
> the Guava jar that ships with Hadoop (under hadoop/share/hadoop/common/lib)
> has to be replaced with guava 14.0.1, otherwise Spark tasks may fail due to
> missing Guava methods.
>
> Cheers
>
>
> On 29/09/15 10:17, Hardika Catur Sapta wrote:
>
> Spark setup
>
>    1. Ensure that all requisite Phoenix / HBase platform dependencies are
>       available on the classpath for the Spark executors and drivers.
>    2. One method is to add the phoenix-4.4.0-client.jar to ‘SPARK_CLASSPATH’
>       in spark-env.sh, or to set both ‘spark.executor.extraClassPath’ and
>       ‘spark.driver.extraClassPath’ in spark-defaults.conf to point at the
>       Phoenix client jar.
>    3. To help your IDE, you may want to add the following ‘provided’
>       dependency.
>
>
> Sorry for my bad English.
>
> How do I do steps 2 and 3?
>
> Please explain step by step.
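
For step 2, if you do not want to use SPARK_CLASSPATH, the usual alternative
is to point both classpath properties in spark-defaults.conf at the Phoenix
client jar. A sketch, assuming a 4.5.2 install under /usr/local/phoenix
(adjust the path and version to your environment):

    spark.executor.extraClassPath   /usr/local/phoenix/phoenix-4.5.2-HBase-1.1-client.jar
    spark.driver.extraClassPath     /usr/local/phoenix/phoenix-4.5.2-HBase-1.1-client.jar

For step 3, the ‘provided’ dependency only lets your build tool and IDE
compile against the phoenix-spark classes; it is not packaged with the job,
because the client jar above already supplies those classes at runtime. In
sbt syntax (the Phoenix docs show the Maven equivalent) it would look roughly
like:

    // build.sbt -- version shown is just an example, match your Phoenix install
    libraryDependencies += "org.apache.phoenix" % "phoenix-spark" % "4.5.2-HBase-1.1" % "provided"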
>
>
> Thanks.
>
>
>
>
