Re: Spark and Accumulo Delegation tokens

2018-03-23 Thread Saisai Shao
It is in the yarn module:
"org.apache.spark.deploy.yarn.security.ServiceCredentialProvider".
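A minimal sketch of such a provider, assuming the Spark 2.x yarn-module interface (serviceName / credentialsRequired / obtainCredentials); the package name is a placeholder, and the Accumulo-side calls mentioned in the comments (DelegationTokenConfig, SecurityOperations.getDelegationToken) are from the Accumulo 1.7+ client API:

```scala
package com.example.accumulo // hypothetical package

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.security.{Credentials, UserGroupInformation}
import org.apache.spark.SparkConf
import org.apache.spark.deploy.yarn.security.ServiceCredentialProvider

// Sketch of a pluggable credential provider for Accumulo.
// Spark can disable it via spark.security.credentials.accumulo.enabled.
class AccumuloCredentialProvider extends ServiceCredentialProvider {

  override def serviceName: String = "accumulo"

  override def credentialsRequired(hadoopConf: Configuration): Boolean = {
    // Only fetch tokens when the cluster actually uses Kerberos.
    UserGroupInformation.isSecurityEnabled
  }

  override def obtainCredentials(
      hadoopConf: Configuration,
      sparkConf: SparkConf,
      creds: Credentials): Option[Long] = {
    // Here you would connect to Accumulo as the Kerberos principal and
    // request a delegation token, then add it to `creds`, e.g. (1.7+):
    //
    //   val token = connector.securityOperations()
    //     .getDelegationToken(new DelegationTokenConfig())
    //
    // Returning the expected renewal time lets Spark schedule renewal;
    // None means no renewal time is known.
    None
  }
}
```

Spark discovers implementations of this trait through java.util.ServiceLoader, so the class also has to be registered in a META-INF/services file on the classpath.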



Re: Spark and Accumulo Delegation tokens

2018-03-23 Thread Jorge Machado
Hi Jerry,

Where do you see that class in Spark? I only found
HadoopDelegationTokenManager, and I don't see any way to add my provider to
it.

private def getDelegationTokenProviders: Map[String, HadoopDelegationTokenProvider] = {
  val providers = List(new HadoopFSDelegationTokenProvider(fileSystems),
    new HiveDelegationTokenProvider,
    new HBaseDelegationTokenProvider)

  // Filter out providers for which
  // spark.security.credentials.{service}.enabled is false.
  providers
    .filter { p => isServiceEnabled(p.serviceName) }
    .map { p => (p.serviceName, p) }
    .toMap
}
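For reference, the hardcoded list above is separate from the pluggable path: on YARN, Spark loads extra ServiceCredentialProvider implementations through java.util.ServiceLoader, so a custom provider is registered with a services file rather than by editing this method. A sketch of that registration (the implementation class name is hypothetical):

```
# src/main/resources/META-INF/services/org.apache.spark.deploy.yarn.security.ServiceCredentialProvider
com.example.accumulo.AccumuloCredentialProvider
```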

If you could give me a tip, that would be great.
Thanks

Jorge Machado








Re: Spark and Accumulo Delegation tokens

2018-03-23 Thread Saisai Shao
I think you can build your own Accumulo credential provider, similar to
HadoopDelegationTokenProvider, outside of Spark. Spark already provides an
interface, "ServiceCredentialProvider", for users to plug in a customized
credential provider.

Thanks
Jerry



Spark and Accumulo Delegation tokens

2018-03-23 Thread Jorge Machado
Hi Guys, 

I’m in the middle of writing a Spark DataSource connector for Apache Spark to
connect to Accumulo tablets. Because we use Kerberos it gets a little tricky,
because Spark only handles the delegation tokens for HBase, Hive, and HDFS.

Would a PR for an implementation of HadoopDelegationTokenProvider for
Accumulo be accepted?


Jorge Machado







Re: Spark and accumulo

2015-04-21 Thread Akhil Das
You can simply use a custom InputFormat (AccumuloInputFormat) with the
Hadoop RDDs (sc.newAPIHadoopFile etc.) for that; all you need to do is
pass the job confs. Here's a pretty clean discussion:
http://stackoverflow.com/questions/29244530/how-do-i-create-a-spark-rdd-from-accumulo-1-6-in-spark-notebook#answers-header
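A rough sketch of that approach, assuming the Accumulo 1.6-era mapreduce API; the instance name, ZooKeeper hosts, user, password, and table name are all placeholders:

```scala
import org.apache.accumulo.core.client.ClientConfiguration
import org.apache.accumulo.core.client.mapreduce.AccumuloInputFormat
import org.apache.accumulo.core.client.security.tokens.PasswordToken
import org.apache.accumulo.core.data.{Key, Value}
import org.apache.hadoop.mapreduce.Job
import org.apache.spark.{SparkConf, SparkContext}

object AccumuloRddExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("accumulo-rdd"))

    // Build the job configuration that AccumuloInputFormat reads from.
    val job = Job.getInstance()
    AccumuloInputFormat.setConnectorInfo(job, "user", new PasswordToken("pass"))
    AccumuloInputFormat.setZooKeeperInstance(job,
      ClientConfiguration.loadDefault()
        .withInstance("instance")
        .withZkHosts("zk1:2181"))
    AccumuloInputFormat.setInputTableName(job, "mytable")

    // Hand the configured InputFormat to Spark as a (Key, Value) RDD.
    val rdd = sc.newAPIHadoopRDD(job.getConfiguration,
      classOf[AccumuloInputFormat], classOf[Key], classOf[Value])

    println(rdd.count())
    sc.stop()
  }
}
```

On a kerberized cluster the PasswordToken would be replaced with a Kerberos or delegation token, which is exactly the gap the later thread discusses.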

Thanks
Best Regards





Re: Spark and accumulo

2015-04-21 Thread andy petrella
Hello Madhvi,

Some work has been done by @pomadchin using the Spark Notebook; maybe you
should come to https://gitter.im/andypetrella/spark-notebook and poke him?
He made some discoveries that might be helpful to know.

Also, you can poke @lossyrob from Azavea; he did that for GeoTrellis.

my 0.2c
andy






Spark and accumulo

2015-04-20 Thread madhvi

Hi all,

Is there anything to integrate Spark with Accumulo, or to make Spark
process Accumulo data?


Thanks
Madhvi Gupta

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org