[image: Inline image 1]

This is what we are on.

On Wed, Nov 22, 2017 at 12:33 PM, KhajaAsmath Mohammed <
mdkhajaasm...@gmail.com> wrote:

> We use Oracle JDK. We are on Unix.
>
> On Wed, Nov 22, 2017 at 12:31 PM, Georg Heiler <georg.kf.hei...@gmail.com>
> wrote:
>
>> Do you use Oracle or OpenJDK? We recently had an issue with OpenJDK:
>> formerly, the Java security extensions were installed by default; that is
>> no longer the case on CentOS 7.3.
>>
>> Are these installed?
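
One common way to check this is to ask the running JDK for the maximum permitted AES key length: 128 indicates the restricted default policy, while a larger value means the unlimited-strength JCE policy files are installed. A minimal check (assuming the JDK's `jrunscript` tool is on the PATH):

```shell
# Query the max AES key length allowed by the installed JCE policy.
# 128        -> restricted (default) policy
# 2147483647 -> unlimited-strength policy files installed
jrunscript -e 'print(javax.crypto.Cipher.getMaxAllowedKeyLength("AES"))'
```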
>>
>> KhajaAsmath Mohammed <mdkhajaasm...@gmail.com> wrote on Wed, 22 Nov
>> 2017 at 19:29:
>>
>>> I passed a keytab, and renewal is enabled: a script renews the user's
>>> ticket every eight hours.
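
Instead of renewing tickets with an external script, Spark on YARN can manage login and token renewal itself when given the principal and keytab at submit time. A minimal sketch (the principal, paths, class, and jar names here are placeholders for illustration, not taken from this thread):

```shell
# Let Spark/YARN re-login from the keytab and renew delegation tokens
# for a long-running streaming job, instead of an external cron script.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --principal va_dflt@EXAMPLE.COM \
  --keytab /etc/security/keytabs/va_dflt.keytab \
  --class com.example.StreamingJob \
  my-streaming-job.jar
```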
>>>
>>> On Wed, Nov 22, 2017 at 12:27 PM, Georg Heiler <
>>> georg.kf.hei...@gmail.com> wrote:
>>>
>>>> Did you pass a keytab? Is renewal enabled in your KDC?
>>>> KhajaAsmath Mohammed <mdkhajaasm...@gmail.com> wrote on Wed, 22 Nov
>>>> 2017 at 19:25:
>>>>
>>>>> Hi,
>>>>>
>>>>> I have written a Spark streaming job that runs successfully for
>>>>> around 36 hours, after which it fails with a Kerberos issue. Any
>>>>> suggestions on how to resolve it?
>>>>>
>>>>> org.apache.spark.SparkException: Task failed while writing rows.
>>>>>     at org.apache.spark.sql.hive.SparkHiveDynamicPartitionWriterContainer.writeToFile(hiveWriterContainers.scala:328)
>>>>>     at org.apache.spark.sql.hive.execution.InsertIntoHiveTable$$anonfun$saveAsHiveFile$3.apply(InsertIntoHiveTable.scala:210)
>>>>>     at org.apache.spark.sql.hive.execution.InsertIntoHiveTable$$anonfun$saveAsHiveFile$3.apply(InsertIntoHiveTable.scala:210)
>>>>>     at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
>>>>>     at org.apache.spark.scheduler.Task.run(Task.scala:99)
>>>>>     at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:322)
>>>>>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>     at java.lang.Thread.run(Thread.java:745)
>>>>> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: org.apache.hadoop.security.token.SecretManager$InvalidToken: token (kms-dt owner=va_dflt, renewer=yarn, realUser=, issueDate=1511262017635, maxDate=1511866817635, sequenceNumber=1854601, masterKeyId=3392) is expired
>>>>>     at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:248)
>>>>>     at org.apache.spark.sql.hive.SparkHiveDynamicPartitionWriterContainer.newOutputWriter$1(hiveWriterContainers.scala:346)
>>>>>     at org.apache.spark.sql.hive.SparkHiveDynamicPartitionWriterContainer.writeToFile(hiveWriterContainers.scala:304)
>>>>>     ... 8 more
>>>>> Caused by: java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: org.apache.hadoop.security.token.SecretManager$InvalidToken: token (kms-dt owner=va_dflt, renewer=yarn, realUser=, issueDate=1511262017635, maxDate=1511866817635, sequenceNumber=1854601, masterKeyId=3392) is expired
>>>>>     at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider.decryptEncryptedKey(LoadBalancingKMSClientProvider.java:216)
>>>>>     at org.apache.hadoop.crypto.key.KeyProviderCryptoExtension.decryptEncryptedKey(KeyProviderCryptoExtension.java:388)
>>>>>     at org.apache.hadoop.hdfs.DFSClient.decryptEncryptedDataEncryptionKey(DFSClient.java:1440)
>>>>>     at org.apache.hadoop.hdfs.DFSClient.createWrappedOutputStream(DFSClient.java:1542)
>>>>>     at org.apache.hadoop.hdfs.DFSClient.createWrappedOutputStream(DFSClient.java:1527)
>>>>>     at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:428)
>>>>>     at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:421)
>>>>>     at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>>>>>     at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:421)
>>>>>     at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:362)
>>>>>     at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:925)
>>>>>     at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:906)
>>>>>     at parquet.hadoop.ParquetFileWriter.<init>(ParquetFileWriter.java:220)
>>>>>     at parquet.hadoop.ParquetOutputFormat.getRecordWriter(ParquetOutputFormat.java:311)
>>>>>     at parquet.hadoop.ParquetOutputFormat.getRecordWriter(ParquetOutputFormat.java:287)
>>>>>     at org.apache.hadoop.hive.ql.io.parquet.write.ParquetRecordWriterWrapper.<init>(ParquetRecordWriterWrapper.java:65)
>>>>>     at org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat.getParquerRecordWriterWrapper(MapredParquetOutputFormat.java:125)
>>>>>     at org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat.getHiveRecordWriter(MapredParquetOutputFormat.java:114)
>>>>>     at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getRecordWriter(HiveFileFormatUtils.java:260)
>>>>>     at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:245)
>>>>>     ... 10 more
>>>>> Caused by: org.apache.hadoop.security.authentication.client.AuthenticationException: org.apache.hadoop.security.token.SecretManager$InvalidToken: token (kms-dt owner=va_dflt, renewer=yarn, realUser=, issueDate=1511262017635, maxDate=1511866817635, sequenceNumber=1854601, masterKeyId=3392) is expired
>>>>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>>>>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>>>     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>>>>     at org.apache.hadoop.util.HttpExceptionUtils.validateResponse(HttpExceptionUtils.java:157)
>>>>>     at org.apache.hadoop.crypto.key.kms.KMSClientProvider.call(KMSClientProvider.java:627)
>>>>>     at org.apache.hadoop.crypto.key.kms.KMSClientProvider.call(KMSClientProvider.java:585)
>>>>>     at org.apache.hadoop.crypto.key.kms.KMSClientProvider.decryptEncryptedKey(KMSClientProvider.java:852)
>>>>>     at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider$5.call(LoadBalancingKMSClientProvider.java:209)
>>>>>     at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider$5.call(LoadBalancingKMSClientProvider.java:205)
>>>>>     at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider.doOp(LoadBalancingKMSClientProvider.java:94)
>>>>>     at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider.decryptEncryptedKey(LoadBalancingKMSClientProvider.java:205)
>>>>>     ... 29 more
>>>>>
>>>>>
>>>>> Thanks,
>>>>>
>>>>> Asmath
>>>>>
>>>>
>>>
>
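One detail worth noting from the trace: the kms-dt token's maxDate minus issueDate is exactly seven days, the Hadoop default maximum delegation-token lifetime, while the default renew interval is 24 hours. A failure after roughly 36 hours therefore suggests the token was never renewed, not that it reached its maximum lifetime. The arithmetic, using the timestamps from the trace:

```python
# Lifetime encoded in the failing kms-dt token (ms since the Unix epoch),
# copied from the issueDate/maxDate fields in the stack trace above.
issue_date_ms = 1511262017635
max_date_ms = 1511866817635

ms_per_day = 1000 * 60 * 60 * 24
max_lifetime_days = (max_date_ms - issue_date_ms) / ms_per_day
print(max_lifetime_days)  # 7.0 -> the Hadoop default delegation-token max lifetime
```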
