Hi Yaqian Zhang,

Thanks for your reply. I have another query; please help:

*Is Kylin tested against HDFS encryption or with Ranger KMS?*


 You can reach me at
 Email: [email protected]

 With regards,
 Sonu Kumar Singh




On Fri, Mar 18, 2022 at 12:44 PM Yaqian Zhang <[email protected]> wrote:

> Hi Singh:
>
> Kylin has not been tested with, and is not supported in, a Hadoop
> environment with KMS. Judging from the error report, you may need to
> modify and debug the source code to run Kylin normally in your
> environment.
>
> > On Mar 18, 2022, at 10:18 AM, Singh Sonu <[email protected]> wrote:
> >
> > Hi Experts, we need your help with the query below:
> >
> > We enabled HDFS encryption using Hadoop KMS (Hadoop 3.1.0). hdfs-site.xml
> > and core-site.xml were copied to $KYLIN_HOME/spark/conf with the details
> > of the KMS key provider.
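> >
> > For reference, here is a minimal sketch of the key-provider wiring we
> > added (the property name is stock Hadoop; the KMS host and port are
> > placeholders for our environment):
> >
> >   <!-- core-site.xml; older clients read the equivalent property
> >        dfs.encryption.key.provider.uri from hdfs-site.xml -->
> >   <property>
> >     <name>hadoop.security.key.provider.path</name>
> >     <value>kms://http@kms-host:9600/kms</value>
> >   </property>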
> >
> > (Kylin 3.1.2) After this, the Spark cube build fails at the job step
> > "Convert Cuboid Data to HFile", because the cuboid files that should be
> > generated by the "Build Cube with Spark" step are never created. In the
> > YARN logs we see the errors below.
> >
> > This is from the build cuboid step (the WARN):
> > ----------------------------------------------------------------------
> > 2022-03-16 07:32:45,140 INFO spark.SparkCubingByLayer: RDD input path: hdfs://dev-enc:8020/apps/kylin/kylin/kylin-b122587f-6391-7943-7dfa-84f4898c94b1/qubz_intermediate_qubz_hive_metrics_query_rpc_qa_589ee76a_201b_932f_d691_5ce8714ff503
> > 2022-03-16 07:32:45,140 INFO spark.SparkCubingByLayer: RDD Output path: hdfs://dev-enc:8020/apps/kylin/kylin/kylin-b122587f-6391-7943-7dfa-84f4898c94b1/QUBZ_HIVE_METRICS_QUERY_RPC_QA/cuboid/
> > 2022-03-16 07:32:45,206 INFO compress.CodecPool: Got brand-new decompressor [.deflate]
> > 2022-03-16 07:32:45,240 INFO spark.SparkCubingByLayer: All measure are normal (agg on all cuboids) ? : true
> > 2022-03-16 07:32:45,257 WARN util.HadoopUtil: Read sequence file .hive-staging_hive_2022-03-16_07-29-50_524_5412701170850172132-1 failed.
> > java.io.FileNotFoundException: Path is not a file: /apps/kylin/kylin/kylin-b122587f-6391-7943-7dfa-84f4898c94b1/qubz_intermediate_qubz_hive_metrics_query_rpc_qa_589ee76a_201b_932f_d691_5ce8714ff503/.hive-staging_hive_2022-03-16_07-29-50_524_5412701170850172132-1
> >   at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:90)
> >   at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:76)
> >   at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getBlockLocations(FSDirStatAndListingOp.java:153)
> >   at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1927)
> >   at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:738)
> >   at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:424)
> >   at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
> >   at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:523)
> >   at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991)
> >   at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:869)
> >   at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:815)
> >   at java.security.AccessController.doPrivileged(Native Method)
> >   at javax.security.auth.Subject.doAs(Subject.java:422)
> >   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1682)
> >   at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2675)
> > -------------------------------------------------------------------------
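> >
> > As far as we understand, the WARN above is Kylin's HadoopUtil attempting
> > to read Hive's transient .hive-staging_* directory as a sequence file.
> > For illustration only (this is not Kylin's actual code; the input path is
> > a placeholder), a listing that skips such hidden entries would use the
> > same "." / "_" prefix filter that Hadoop's FileInputFormat applies:
> >
> >   import org.apache.hadoop.conf.Configuration;
> >   import org.apache.hadoop.fs.FileStatus;
> >   import org.apache.hadoop.fs.FileSystem;
> >   import org.apache.hadoop.fs.Path;
> >
> >   public class ListVisibleInputs {
> >       public static void main(String[] args) throws Exception {
> >           // Picks up core-site.xml/hdfs-site.xml from the classpath.
> >           Configuration conf = new Configuration();
> >           Path inputDir = new Path(args[0]); // e.g. the qubz_intermediate_* dir
> >           FileSystem fs = inputDir.getFileSystem(conf);
> >           // Skip hidden entries such as .hive-staging_*, which are
> >           // directories left behind by Hive, not readable sequence files.
> >           FileStatus[] visible = fs.listStatus(inputDir,
> >                   p -> !p.getName().startsWith(".") && !p.getName().startsWith("_"));
> >           for (FileStatus s : visible) {
> >               System.out.println(s.getPath());
> >           }
> >       }
> >   }
> >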
> > This is also from the build cuboid step (the fatal ERROR):
> > -----------------------------------------------------------------------
> > 2022-03-16 15:30:14,343 ERROR spark.SparkCubingByLayer: Exception
> > java.lang.RuntimeException: java.io.IOException: java.lang.ClassCastException: org.apache.hadoop.fs.FsUrlConnection cannot be cast to java.net.HttpURLConnection
> >   at org.apache.kylin.dict.DictionaryManager.getDictionaryInfo(DictionaryManager.java:103)
> >   at org.apache.kylin.cube.CubeManager$DictionaryAssist.getDictionary(CubeManager.java:1210)
> >   at org.apache.kylin.cube.CubeManager.getDictionary(CubeManager.java:1132)
> >   at org.apache.kylin.cube.CubeSegment.getDictionary(CubeSegment.java:382)
> >   at org.apache.kylin.cube.kv.CubeDimEncMap.getDictionary(CubeDimEncMap.java:79)
> >   at org.apache.kylin.cube.kv.CubeDimEncMap.get(CubeDimEncMap.java:58)
> >   at org.apache.kylin.engine.mr.common.CubeStatsReader.getCuboidSizeMapFromRowCount(CubeStatsReader.java:213)
> >   at org.apache.kylin.engine.mr.common.CubeStatsReader.getCuboidSizeMap(CubeStatsReader.java:172)
> >   at org.apache.kylin.engine.mr.common.CubeStatsReader.getCuboidSizeMap(CubeStatsReader.java:168)
> >   at org.apache.kylin.engine.mr.common.CubeStatsReader.estimateLayerSize(CubeStatsReader.java:397)
> >   at org.apache.kylin.engine.spark.SparkUtil.estimateLayerPartitionNum(SparkUtil.java:104)
> >   at org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:179)
> >   at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:29)
> >   at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:36)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >   at java.lang.reflect.Method.invoke(Method.java:498)
> >   at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:685)
> > Caused by: java.io.IOException: java.lang.ClassCastException: org.apache.hadoop.fs.FsUrlConnection cannot be cast to java.net.HttpURLConnection
> >   at org.apache.hadoop.crypto.key.kms.KMSClientProvider.createConnection(KMSClientProvider.java:491)
> >   at org.apache.hadoop.crypto.key.kms.KMSClientProvider.decryptEncryptedKey(KMSClientProvider.java:776)
> >   at org.apache.hadoop.crypto.key.KeyProviderCryptoExtension.decryptEncryptedKey(KeyProviderCryptoExtension.java:388)
> >   at org.apache.hadoop.hdfs.DFSClient.decryptEncryptedDataEncryptionKey(DFSClient.java:1381)
> >   at org.apache.hadoop.hdfs.DFSClient.createWrappedInputStream(DFSClient.java:1451)
> >   at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:305)
> >   at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:299)
> >   at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
> >   at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:312)
> >   at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:769)
> >   at org.apache.kylin.common.persistence.HDFSResourceStore.getResourceImpl(HDFSResourceStore.java:182)
> >   at org.apache.kylin.common.persistence.ResourceStore.lambda$getResourceWithRetry$1(ResourceStore.java:310)
> >   at org.apache.kylin.common.persistence.ExponentialBackoffRetry.doWithRetry(ExponentialBackoffRetry.java:45)
> >   at org.apache.kylin.common.persistence.ResourceStore.getResourceWithRetry(ResourceStore.java:310)
> >   at org.apache.kylin.common.persistence.ResourceStore.getResource(ResourceStore.java:287)
> >   at org.apache.kylin.common.persistence.ResourceStore.getResource(ResourceStore.java:278)
> >   at org.apache.kylin.dict.DictionaryManager.load(DictionaryManager.java:432)
> >   at org.apache.kylin.dict.DictionaryManager$1.load(DictionaryManager.java:75)
> >   at org.apache.kylin.dict.DictionaryManager$1.load(DictionaryManager.java:72)
> >   at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
> >   at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
> >   at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
> >   at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2257)
> >   at com.google.common.cache.LocalCache.get(LocalCache.java:4000)
> >   at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4004)
> >   at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
> >   at org.apache.kylin.dict.DictionaryManager.getDictionaryInfo(DictionaryManager.java:96)
> >   ... 18 more
> > Caused by: java.lang.ClassCastException: org.apache.hadoop.fs.FsUrlConnection cannot be cast to java.net.HttpURLConnection
> >   at org.apache.hadoop.security.authentication.client.AuthenticatedURL.openConnection(AuthenticatedURL.java:216)
> >   at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticatedURL.openConnection(DelegationTokenAuthenticatedURL.java:322)
> >   at org.apache.hadoop.crypto.key.kms.KMSClientProvider$1.run(KMSClientProvider.java:483)
> >   at org.apache.hadoop.crypto.key.kms.KMSClientProvider$1.run(KMSClientProvider.java:478)
> >   at java.security.AccessController.doPrivileged(Native Method)
> >   at javax.security.auth.Subject.doAs(Subject.java:422)
> >   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
> >   at org.apache.hadoop.crypto.key.kms.KMSClientProvider.createConnection(KMSClientProvider.java:478)
> >   ... 44 more
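> >
> > From the root cause above, KMSClientProvider.createConnection() casts the
> > result of URL.openConnection() to HttpURLConnection but receives Hadoop's
> > FsUrlConnection instead. Our understanding is that this happens once
> > something in the JVM has registered Hadoop's FsUrlStreamHandlerFactory
> > and the hadoop-common on the classpath still routes http:// URLs through
> > it (the problem tracked as HADOOP-14598). A minimal sketch of the
> > mechanism (the KMS URL is a placeholder; whether the cast fails depends
> > on the hadoop-common version):
> >
> >   import java.net.HttpURLConnection;
> >   import java.net.URL;
> >   import org.apache.hadoop.fs.FsUrlStreamHandlerFactory;
> >
> >   public class KmsCastRepro {
> >       public static void main(String[] args) throws Exception {
> >           // Allowed only once per JVM; Spark/Kylin register it so that
> >           // hdfs:// URLs can be opened via java.net.URL.
> >           URL.setURLStreamHandlerFactory(new FsUrlStreamHandlerFactory());
> >
> >           // On an affected hadoop-common, openConnection() now returns an
> >           // FsUrlConnection for http:// URLs, so this cast throws the same
> >           // ClassCastException seen in KMSClientProvider.createConnection():
> >           URL kmsUrl = new URL("http://kms-host:9600/kms/v1/");
> >           HttpURLConnection conn = (HttpURLConnection) kmsUrl.openConnection();
> >           System.out.println(conn.getResponseCode());
> >       }
> >   }
> >
> > If that is indeed the cause, aligning the Hadoop client jars under
> > $KYLIN_HOME/spark/jars with a release that contains the HADOOP-14598 fix
> > might be worth trying.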
> >
> >
> >
> > You can reach me at
> > Mb. No: 7092292112
> > Email: [email protected]
> >
> > With regards,
> > Sonu Kumar Singh
>
>
