Interesting.
The Phoenix dependency wasn't shown in the classpath in your previous email.
On Thu, Apr 28, 2016 at 4:12 AM, pierre lacave wrote:
> Narrowed down to some version incompatibility with Phoenix 4.7,
>
> Including
Narrowed it down to some version incompatibility with Phoenix 4.7.
Including $SPARK_HOME/lib/phoenix-4.7.0-HBase-1.1-client-spark.jar in
extraClassPath is what triggers the issue above.
I'll have a go at adding the individual dependencies instead of this fat
jar and see how it goes.
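Something along these lines in spark-defaults.conf is what I have in mind,
the exact jar list being a guess until I try it (paths are placeholders):

    # spark-defaults.conf - individual Phoenix jars instead of the fat client jar
    spark.driver.extraClassPath     /opt/phoenix/phoenix-core-4.7.0-HBase-1.1.jar:/opt/phoenix/phoenix-spark-4.7.0-HBase-1.1.jar
    spark.executor.extraClassPath   /opt/phoenix/phoenix-core-4.7.0-HBase-1.1.jar:/opt/phoenix/phoenix-spark-4.7.0-HBase-1.1.jar

The idea being to keep whatever Hadoop/HDFS classes the fat client jar
bundles off Spark's classpath.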
Thanks
Thanks Ted,
I am actually using the Hadoop-free version of Spark
(spark-1.5.0-bin-without-hadoop) over Hadoop 2.6.1, so it could very well be
related indeed.
I have configured spark-env.sh with
export SPARK_DIST_CLASSPATH=$($HADOOP_PREFIX/bin/hadoop classpath),
which points to the only version of Hadoop installed.
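Concretely, spark-env.sh has roughly this (the install path below is
illustrative, not my actual one):

    # spark-env.sh
    export HADOOP_PREFIX=/opt/hadoop-2.6.1        # illustrative install path
    export SPARK_DIST_CLASSPATH=$("$HADOOP_PREFIX"/bin/hadoop classpath)

    # sanity check: this reports 2.6.1 here
    "$HADOOP_PREFIX"/bin/hadoop version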
Can you check that the DFSClient Spark uses is the same version as on the
server side?
The client and server (NameNode) negotiate a "crypto protocol version" -
this is a forward-looking feature.
Please note:
bq. Client provided: []
Meaning the client didn't provide any supported crypto protocol version.
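As a quick check (paths below are only examples), compare the hdfs client
bits on the Spark side with the NameNode, and look for any fat jar on the
classpath that shades its own DFSClient:

    # on the host running the Spark driver, assuming the standard binary layout
    ls "$HADOOP_PREFIX"/share/hadoop/hdfs/hadoop-hdfs-*.jar

    # does any jar added to spark.{driver,executor}.extraClassPath bundle DFSClient?
    unzip -l /path/to/extra-client.jar | grep 'hdfs/DFSClient'

    # on the NameNode host
    hadoop version

If the client classes come from an older Hadoop, the client can end up
advertising no crypto protocol versions at all, which would match the empty
list above.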
Hi
I am trying to use Spark to write to a protected zone in HDFS. I am
able to create and list files using the hdfs client, but when writing
via Spark I get this exception.
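For reference, roughly what I am doing (paths and names here are
placeholders, not the real ones):

    # fine with the hdfs client
    hdfs dfs -put localfile.txt /secure/zone/
    hdfs dfs -ls /secure/zone/

    # the same zone written from Spark fails with the CryptoProtocolVersion error, e.g.
    echo 'sc.parallelize(1 to 10).saveAsTextFile("/secure/zone/out")' | spark-shell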
I could not find any mention of CryptoProtocolVersion in the Spark docs.
Any idea what could have gone wrong?