James Srinivasan wrote:
However, I seem to get this when trying to use the DelegationToken:

scala>  rdd.count()
17/05/19 21:30:55  INFO UserGroupInformation: Login successful for user
[email protected]  using keytab file
/tmp/accumulo.headless.keytab
java.lang.NullPointerException
   at org.apache.accumulo.core.client.mapreduce.lib.impl.ConfiguratorBase.unwrapAuthenticationToken(ConfiguratorBase.java:493)
   at org.apache.accumulo.core.client.mapreduce.AbstractInputFormat.validateOptions(AbstractInputFormat.java:390)
   at org.apache.accumulo.core.client.mapreduce.AbstractInputFormat.getSplits(AbstractInputFormat.java:668)
   at org.locationtech.geomesa.jobs.mapreduce.GeoMesaAccumuloInputFormat.getSplits(GeoMesaAccumuloInputFormat.scala:174)
   at org.apache.spark.rdd.NewHadoopRDD.getPartitions(NewHadoopRDD.scala:121)

Looking over the code, I can't see an obvious reason it would be null
on those lines. Any help is much appreciated!

Delegation tokens are serialized into the Job's "credentials" section and distributed securely that way.

When your job needs to construct its input splits, it first has to pull the delegation token back out of the Job. For whatever reason, the serialized DelegationToken we expected to find in the Job's credentials is missing or malformed.
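The wrap/unwrap pattern described above can be sketched without any Accumulo or Hadoop dependencies. Everything below is hypothetical (the key name, the `Credentials` type alias, and the method names are stand-ins, not the real Accumulo API); it only illustrates why a token that never makes it into the credentials surfaces as a null on the worker side:

```scala
import java.util.Base64
import java.nio.charset.StandardCharsets

// Library-free sketch of the pattern (all names hypothetical): a token is
// serialized into the job's credentials on the client, then unwrapped on
// the worker before input splits are computed.
object TokenRoundTrip {
  // Stand-in for the Job's "credentials" section.
  type Credentials = Map[String, String]

  // Client side: serialize the token under a well-known key.
  def wrapToken(creds: Credentials, token: String): Credentials =
    creds + ("accumulo.token" ->
      Base64.getEncoder.encodeToString(token.getBytes(StandardCharsets.UTF_8)))

  // Worker side: look the token back up. Returns null when the key is
  // missing -- the same shape of failure as the NPE in the stack trace,
  // where unwrapAuthenticationToken found nothing usable to deserialize.
  def unwrapToken(creds: Credentials): String =
    creds.get("accumulo.token")
      .map(b => new String(Base64.getDecoder.decode(b), StandardCharsets.UTF_8))
      .orNull

  def main(args: Array[String]): Unit = {
    val creds = wrapToken(Map.empty, "secret-delegation-token")
    println(unwrapToken(creds))             // prints: secret-delegation-token
    println(unwrapToken(Map.empty) == null) // prints: true
  }
}
```

The point of the sketch: the unwrap side trusts that the wrap side (or anything copying configuration between the two) preserved the credentials entry intact.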

Perhaps when you copy the Configuration you're blowing away the credentials somewhere? I'm not sure.
