It is in the yarn module:
"org.apache.spark.deploy.yarn.security.ServiceCredentialProvider".
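A minimal sketch of what plugging in such a provider looks like. The `ServiceCredentialProvider` trait below is a simplified local stand-in mirroring the shape of Spark's yarn-module interface (the real trait also receives Hadoop and Spark configuration plus a `Credentials` object), and `AccumuloDelegationTokenProvider` with its token logic is entirely hypothetical; the config-based filtering mirrors the `getDelegationTokenProviders` snippet quoted below.

```scala
// Simplified stand-in for Spark's ServiceCredentialProvider trait (yarn module).
// The real Spark trait additionally takes Hadoop/Spark conf and a Credentials
// object in its obtainCredentials signature.
trait ServiceCredentialProvider {
  def serviceName: String
  // Returns the next renewal time in epoch millis, if the token is renewable.
  def obtainCredentials(): Option[Long]
}

// Hypothetical Accumulo provider: a real implementation would call Accumulo's
// client API to obtain a delegation token for the current Kerberos user.
class AccumuloDelegationTokenProvider extends ServiceCredentialProvider {
  override def serviceName: String = "accumulo"
  override def obtainCredentials(): Option[Long] =
    Some(System.currentTimeMillis() + 3600000L)
}

object ProviderDemo {
  // Mirrors the filtering in getDelegationTokenProviders: a provider is kept
  // unless spark.security.credentials.{service}.enabled is set to "false".
  def enabledProviders(
      providers: Seq[ServiceCredentialProvider],
      conf: Map[String, String]): Map[String, ServiceCredentialProvider] = {
    providers
      .filter { p =>
        conf.getOrElse(
          s"spark.security.credentials.${p.serviceName}.enabled", "true") != "false"
      }
      .map(p => p.serviceName -> p)
      .toMap
  }
}
```

In a real deployment the implementation class is not wired up in code: it is listed in a `META-INF/services/org.apache.spark.deploy.yarn.security.ServiceCredentialProvider` resource file on the classpath and discovered through `java.util.ServiceLoader`.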

2018-03-23 15:10 GMT+08:00 Jorge Machado <jom...@me.com>:

> Hi Jerry,
>
> Where do you see that class in Spark? I only found
> HadoopDelegationTokenManager, and I don’t see any way to add my provider
> into it.
>
> private def getDelegationTokenProviders: Map[String,
>     HadoopDelegationTokenProvider] = {
>   val providers = List(new HadoopFSDelegationTokenProvider(fileSystems),
>     new HiveDelegationTokenProvider,
>     new HBaseDelegationTokenProvider)
>
>   // Filter out providers for which
>   // spark.security.credentials.{service}.enabled is false.
>   providers
>     .filter { p => isServiceEnabled(p.serviceName) }
>     .map { p => (p.serviceName, p) }
>     .toMap
> }
>
>
> If you could give me a tip, that would be great.
> Thanks
>
> Jorge Machado
>
>
> On 23 Mar 2018, at 07:38, Saisai Shao <sai.sai.s...@gmail.com> wrote:
>
> I think you can build your own Accumulo credential provider, similar to
> HadoopDelegationTokenProvider, outside of Spark. Spark already provides an
> interface, "ServiceCredentialProvider", for users to plug in a customized
> credential provider.
>
> Thanks
> Jerry
>
> 2018-03-23 14:29 GMT+08:00 Jorge Machado <jom...@me.com>:
>
> Hi Guys,
>
> I’m in the middle of writing a Spark DataSource connector for Apache Spark
> to connect to Accumulo tablets. Because we have Kerberos it gets a little
> tricky, since Spark only handles the delegation tokens for HBase, Hive, and
> HDFS.
>
> Would a PR for an implementation of HadoopDelegationTokenProvider for
> Accumulo be accepted?
>
>
> Jorge Machado
>
>
