I'm sharing all the config files and code here, please have a look.
Following is the code I'm using:
System.setProperty("java.security.krb5.conf", "/etc/krb5.conf")
System.setProperty("java.security.auth.login.config", "/etc/hbase/conf/zk-jaas.conf")
val hconf = HBaseConfiguration.create()
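For context, besides the JVM properties above, a kerberized HBase client typically also needs Kerberos settings in hbase-site.xml; a sketch with placeholder principal names (not values taken from this thread):

```xml
<!-- Sketch: common client-side entries for a kerberized HBase.
     The principals below are placeholders, not this cluster's values. -->
<property>
  <name>hbase.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hbase.master.kerberos.principal</name>
  <value>hbase/_HOST@EXAMPLE.COM</value>
</property>
<property>
  <name>hbase.regionserver.kerberos.principal</name>
  <value>hbase/_HOST@EXAMPLE.COM</value>
</property>
```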
Hi Subroto,
I checked this. When I set the property in the spark-defaults.conf file and log
its value from SparkConf, it says "No Such Element Found". And when I set
it through SparkConf explicitly, the previous issue is still not resolved.
I'm trying hard to get it done but have found no workaround yet!
Thanks
Not sure what the problem could be, but I would suggest you double-check
that the said property is part of the SparkConf object being created in the
code (just by logging it).
Cheers,
Subroto Sanyal
On Wed, Aug 10, 2016 at 1:39 PM, Aneela Saleem
wrote:
The property was already set in the spark-defaults.conf file but I am still
facing the same error.
On Wed, Aug 10, 2016 at 4:35 PM, Subroto Sanyal
wrote:
Yes... you can set the property in the conf file, or you can set the property
explicitly in the Spark Configuration object used while creating the
SparkContext/JavaSparkContext.
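Concretely, the two options above can be sketched as follows, using the token-renewal property mentioned later in this thread (spark.yarn.security.tokens.hbase.enabled, available in Spark on YARN from 1.5 onwards; whether it is the fix for this cluster is an assumption):

```
# Option 1 - spark-defaults.conf:
spark.yarn.security.tokens.hbase.enabled  true

# Option 2 - pass it at submit time instead:
spark-submit --conf spark.yarn.security.tokens.hbase.enabled=true ...

# Option 2 can equivalently be done in code, before the context is built:
#   val conf = new SparkConf().set("spark.yarn.security.tokens.hbase.enabled", "true")
```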
Cheers,
Subroto Sanyal
On Wed, Aug 10, 2016 at 12:09 PM, Aneela Saleem
wrote:
Thanks Subroto,
Do I need to set it to 'true' in the spark-defaults.conf file?
On Wed, Aug 10, 2016 at 2:59 PM, Subroto Sanyal
wrote:
Hi Aneela,
By any chance are you missing the property
spark.yarn.security.tokens.hbase.enabled?
This was introduced as part of the fix:
https://github.com/apache/spark/pull/8134/files
Cheers,
Subroto Sanyal
On Wed, Aug 10, 2016 at 11:53 AM, Aneela Saleem
wrote:
And I'm using the Apache distribution of Spark, not Cloudera's.
On Wed, Aug 10, 2016 at 12:06 PM, Aneela Saleem
wrote:
Thanks Nkechi,
I added this dependency as an external jar; when I compile the code,
unfortunately I get the following error:
error: object cloudera is not a member of package com
[ERROR] import com.cloudera.spark.hbase.HBaseContext
On Tue, Aug 9, 2016 at 7:51 PM, Nkechi Achara
wrote:
hi,
Due to the fact we are not on HBase 2.0.0, we are using SparkOnHBase.
Dependency:
<dependency>
  <groupId>com.cloudera</groupId>
  <artifactId>spark-hbase</artifactId>
  <version>0.0.2-clabs</version>
</dependency>
It is quite a small snippet of code, for a general scan using a start and
stop time as the scan time range.
val conf = new S
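The snippet above is cut off in the archive; it presumably continues along these lines. This is only a sketch against the clabs SparkOnHBase 0.0.2 API (HBaseContext and its hbaseRDD method come from that library); the table name and time range are placeholders, not values from this thread, and running it requires a Spark and HBase installation:

```scala
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Scan
import com.cloudera.spark.hbase.HBaseContext
import org.apache.spark.{SparkConf, SparkContext}

// Sketch only: "t1" and the timestamps below are placeholders.
val conf = new SparkConf().setAppName("HBaseTimeRangeScan")
val sc = new SparkContext(conf)

// HBaseContext wraps the HBase configuration for use from Spark
val hbaseConf = HBaseConfiguration.create()
val hbaseContext = new HBaseContext(sc, hbaseConf)

// Restrict the scan to cells written in [startTime, stopTime), epoch millis
val startTime = 0L
val stopTime = System.currentTimeMillis()
val scan = new Scan()
scan.setTimeRange(startTime, stopTime)

// Materialise the matching rows as an RDD and count them
val rdd = hbaseContext.hbaseRDD("t1", scan)
println(rdd.count())
```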
Thanks Nkechi,
Can you please direct me to some code snippet with the hbase-on-spark module?
I've been trying that for the last few days but have not found a workaround.
On Tue, Aug 9, 2016 at 6:13 PM, Nkechi Achara
wrote:
Hey,
Have you tried hbase on spark module, or the spark-hbase module to connect?
The principal and keytab options should work out of the box for kerberized
access. I can attempt your code if you don't have the ability to use those
modules.
Thanks
K
On 9 Aug 2016 2:25 p.m., "Aneela Saleem" wrote:
Hi all,
I'm trying to connect to HBase with security enabled using a Spark job. I
have kinit'd from the command line. When I run the following job, i.e.,
/usr/local/spark-2/bin/spark-submit --keytab /etc/hadoop/conf/spark.keytab
--principal spark/hadoop-master@platalyticsrealm --class
com.platalytics.ex