It is in the yarn module:
"org.apache.spark.deploy.yarn.security.ServiceCredentialProvider".
2018-03-23 15:10 GMT+08:00 Jorge Machado :
Hi Jerry,

where do you see that class in Spark? I only found
HadoopDelegationTokenManager, and I don't see any way to add my provider into
it.
    private def getDelegationTokenProviders
        : Map[String, HadoopDelegationTokenProvider] = {
      val providers = List(new
I think you can build your own Accumulo credential provider, similar to the
HadoopDelegationTokenProvider, outside of Spark. Spark already provides an
interface, "ServiceCredentialProvider", for users to plug in a customized
credential provider.
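Since Spark loads these providers through java.util.ServiceLoader, plugging one in means shipping a services file inside your jar; the provider class name below is a hypothetical example:

```
src/main/resources/META-INF/services/org.apache.spark.deploy.yarn.security.ServiceCredentialProvider:

com.example.accumulo.AccumuloCredentialProvider
```

A provider registered this way can be toggled per service with
spark.security.credentials.<serviceName>.enabled (older releases used the
spark.yarn.security.credentials.<serviceName>.enabled prefix).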
Thanks
Jerry
2018-03-23 14:29 GMT+08:00 Jorge Machado :
Hi Guys,
I'm in the middle of writing a Spark DataSource connector for Apache Spark to
connect to Accumulo tablets. Because we have Kerberos it gets a little tricky,
since Spark only handles the delegation tokens for HBase, Hive and HDFS.
Would be a PR for an implementation of
You can simply use a custom InputFormat (AccumuloInputFormat) with the
Hadoop RDDs (sc.newAPIHadoopFile, etc.) for that; all you need to do is
pass the jobConfs. Here's a pretty clean discussion:
http://stackoverflow.com/questions/29244530/how-do-i-create-a-spark-rdd-from-accumulo-1-6-in-spark
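To make the approach above concrete, a rough sketch using sc.newAPIHadoopRDD with AccumuloInputFormat might look like this. The app name and the elided connector/instance configuration are placeholders, not a definitive setup:

```scala
import org.apache.accumulo.core.client.mapreduce.AccumuloInputFormat
import org.apache.accumulo.core.data.{Key, Value}
import org.apache.hadoop.mapreduce.Job
import org.apache.spark.{SparkConf, SparkContext}

object AccumuloRddSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("accumulo-rdd"))

    // AccumuloInputFormat is configured through a Hadoop Job configuration.
    val job = Job.getInstance(sc.hadoopConfiguration)
    // Set connector info, ZooKeeper instance, input table, ranges, etc.
    // on `job` here via AccumuloInputFormat's static setters (elided).

    // Each record comes back as an Accumulo (Key, Value) pair.
    val rdd = sc.newAPIHadoopRDD(
      job.getConfiguration,
      classOf[AccumuloInputFormat],
      classOf[Key],
      classOf[Value])

    println(s"rows: ${rdd.count()}")
    sc.stop()
  }
}
```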
Thanks
Best Regards
On Tue, Apr 21, 2015 at 9:55 AM, madhvi madhvi.gu...@orkash.com wrote:
Hi all,

Is there anything to integrate Spark with Accumulo, or to make Spark process
over Accumulo data?
Thanks
Madhvi Gupta