Re: integration Phoenix and Spark

2015-09-29 Thread Josh Mahonin
From: Hardika Catur Sapta
Reply-To: user@phoenix.apache.org
Date: Tuesday, September 29, 2015 at 5:28 AM
To: user@phoenix.apache.org
Subject: Re: integration Phoenix and Spark

Re: integration Phoenix and Spark

2015-09-29 Thread Hardika Catur Sapta
/spark/Project Spark$ scala SavingPhoenix.scala
/home/hduser/spark/Project Spark/SavingPhoenix.scala:1: error: object spark is not a member of package phoenix.org.apache
import phoenix.org.apache.spark.SparkContext
       ^
/home/hduser/spark/Project Spark/SavingPhoenix.scala:4:
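The first error comes from a reversed package path in the import: `phoenix.org.apache.spark` is not a real package. A minimal corrected header, assuming the phoenix-spark integration module and the Spark jars are on the classpath, would look like:

```scala
// Corrected imports: Spark's context lives under org.apache.spark, and the
// Phoenix integration (which provides saveToPhoenix / phoenixTableAsRDD via
// implicit conversions) lives under org.apache.phoenix.spark. The reversed
// "phoenix.org.apache.spark..." path in the failing script does not exist.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.phoenix.spark._
```

Note also that the script is being launched with plain `scala`, so even with the imports fixed, the Spark and Phoenix client jars must be on the classpath (e.g. via `-classpath` or the setup steps discussed elsewhere in this thread) for it to compile.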

Re: integration Phoenix and Spark

2015-09-29 Thread Konstantinos Kougios
Hi, Just to add that, at least for hadoop-2.7.1 and phoenix 4.5.2-HBase-1.1, the Hadoop Guava lib has to be patched to 14.0.1 (under hadoop/share/hadoop/common/lib), otherwise Spark tasks might fail due to missing Guava methods. Cheers On 29/09/15 10:17, Hardika Catur Sapta wrote: Spa
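A sketch of the workaround described above; the paths and the bundled Guava version are assumptions about a typical Hadoop 2.7.1 install (which usually ships guava-11.0.2.jar), so adjust to your layout:

```shell
# Swap Hadoop's bundled Guava for 14.0.1 so Spark tasks see the newer
# Guava methods Phoenix expects. Paths and version are illustrative.
cd "$HADOOP_HOME/share/hadoop/common/lib"
mv guava-11.0.2.jar guava-11.0.2.jar.bak   # back up the bundled jar; version may differ
curl -O https://repo1.maven.org/maven2/com/google/guava/guava/14.0.1/guava-14.0.1.jar
```

Any running daemons need a restart to pick up the replaced jar.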

integration Phoenix and Spark

2015-09-29 Thread Hardika Catur Sapta
Spark setup
1. Ensure that all requisite Phoenix / HBase platform dependencies are available on the classpath for the Spark executors and drivers.
2. One method is to add the phoenix-4.4.0-client.jar to 'SPARK_CLASSPATH' in spark-env.sh, or to set both 'spark.executor.
3.
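A sketch of the two classpath options from step 2. The jar path is a placeholder, and the `spark.executor.extraClassPath` / `spark.driver.extraClassPath` property names are an assumption about what the truncated step refers to (they are the standard Spark classpath properties):

```shell
# Option A: spark-env.sh
# (note that SPARK_CLASSPATH is deprecated in later Spark releases)
export SPARK_CLASSPATH=/path/to/phoenix-4.4.0-client.jar

# Option B: spark-defaults.conf, using the standard classpath properties,
# which must be set for both executors and the driver
spark.executor.extraClassPath  /path/to/phoenix-4.4.0-client.jar
spark.driver.extraClassPath    /path/to/phoenix-4.4.0-client.jar
```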