Hey Kamil,

Have you followed this guide to set up Kerberos authentication [1]?
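
In particular, make sure the keytab and principal are configured in
flink-conf.yaml. A minimal sketch (the values are placeholders for your
environment, and "Client" is the ZooKeeper JAAS context that the HBase
client typically uses):

    security.kerberos.login.use-ticket-cache: false
    security.kerberos.login.keytab: /path/to/user.keytab
    security.kerberos.login.principal: flink-user@EXAMPLE.COM
    security.kerberos.login.contexts: Client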

Best,

Dawid

[1]
https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/deployment/security/security-kerberos/

On 14/01/2022 17:09, Kamil ty wrote:
> Hello all,
> I have a flink job that is using the HbaseSinkFunction as specified
> here: flink/flink-connectors/flink-connector-hbase-2.2 at master ·
> a0x8o/flink (github.com)
> <https://github.com/a0x8o/flink/tree/master/flink-connectors/flink-connector-hbase-2.2#writing-into-hbase-tables-from-datastreams>
>
> I'm deploying the job to the cluster in YARN per-job mode, using
> flink run -d job.jar.
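>
> With the deployment target spelled out explicitly, that is roughly
> equivalent to (a sketch, not a verbatim copy of my invocation):
>
>     flink run -t yarn-per-job -d job.jar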
>
> The job gets accepted and I get the address of the web UI, but in the
> UI the job stays in the CREATED state and never actually runs. After
> some time it stops.
>
> This error stands out when looking at the logs:
> WARN [main] org.apache.hadoop.security.LdapGroupsMapping: Exception while trying to get password for alias hadoop.security.group.mapping.ldap.bind.password:
> java.io.IOException: Configuration problem with provider path.
>         at org.apache.hadoop.conf.Configuration.getPasswordFromCredentialProviders(Configuration.java:2428)
>         at org.apache.hadoop.conf.Configuration.getPassword(Configuration.java:2347)
>         at org.apache.hadoop.security.LdapGroupsMapping.getPassword(LdapGroupsMapping.java:797)
>         at org.apache.hadoop.security.LdapGroupsMapping.setConf(LdapGroupsMapping.java:680)
>         at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:77)
>         at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:137)
>         at org.apache.hadoop.security.Groups.<init>(Groups.java:105)
>         at org.apache.hadoop.security.Groups.<init>(Groups.java:101)
>         at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:476)
>         at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:352)
>         at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:314)
>         at org.apache.hadoop.security.UserGroupInformation.doSubjectLogin(UserGroupInformation.java:1996)
>         at org.apache.hadoop.security.UserGroupInformation.createLoginUser(UserGroupInformation.java:743)
>         at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:693)
>         at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:604)
>         at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer.main(ContainerLocalizer.java:468)
> Caused by: java.nio.file.AccessDeniedException: /var/run/.../process/1546359139-yarn-NODEMANAGER/creds.localjceks
>         at sun.nio.fs.UnixException.translateToIOException(UnixException.java:84)
>         at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
>         at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
>         at sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:214)
>         at java.nio.file.Files.newByteChannel(Files.java:361)
>         at java.nio.file.Files.newByteChannel(Files.java:407)
>         at java.nio.file.spi.FileSystemProvider.newInputStream(FileSystemProvider.java:384)
>         at java.nio.file.Files.newInputStream(Files.java:152)
>         at org.apache.hadoop.security.alias.LocalKeyStoreProvider.getInputStreamForFile(LocalKeyStoreProvider.java:76)
>         at org.apache.hadoop.security.alias.AbstractJavaKeyStoreProvider.locateKeystore(AbstractJavaKeyStoreProvider.java:325)
>         at org.apache.hadoop.security.alias.AbstractJavaKeyStoreProvider.<init>(AbstractJavaKeyStoreProvider.java:86)
>         at org.apache.hadoop.security.alias.LocalKeyStoreProvider.<init>(LocalKeyStoreProvider.java:56)
>         at org.apache.hadoop.security.alias.LocalJavaKeyStoreProvider.<init>(LocalJavaKeyStoreProvider.java:42)
>         at org.apache.hadoop.security.alias.LocalJavaKeyStoreProvider.<init>(LocalJavaKeyStoreProvider.java:34)
>         at org.apache.hadoop.security.alias.LocalJavaKeyStoreProvider$Factory.createProvider(LocalJavaKeyStoreProvider.java:68)
>         at org.apache.hadoop.security.alias.CredentialProviderFactory.getProviders(CredentialProviderFactory.java:73)
>         at org.apache.hadoop.conf.Configuration.getPasswordFromCredentialProviders(Configuration.java:2409)
>         ... 15 more
>
> It seems as if it is trying to use password-based authentication, but
> only Kerberos-based authentication should be used on the cluster.
>
> The log output when scheduling the job might be a clue:
> org.apache.flink.yarn.Utils         [] - Attempting to obtain Kerberos security token for HBase
> org.apache.flink.yarn.Utils         [] - HBase is not available (not packaged with this application): ClassNotFoundException : "org.apache.hadoop.hbase.HBaseConfiguration".
>
> The flink-connector-hbase-2.2 dependency is specified in the compile
> scope and hadoop-common in the provided scope in the job's pom.xml. I
> have also tried including additional dependencies such as hbase-common
> and hbase-client, without any luck.
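> For reference, a sketch of the relevant pom.xml entries (the Scala
> suffix and the version numbers are illustrative, not copied verbatim
> from my build):
>
>     <!-- HBase connector: bundled into the job jar (compile scope) -->
>     <dependency>
>         <groupId>org.apache.flink</groupId>
>         <artifactId>flink-connector-hbase-2.2_2.11</artifactId>
>         <version>1.14.2</version>
>     </dependency>
>
>     <!-- Hadoop classes: expected from the cluster (provided scope) -->
>     <dependency>
>         <groupId>org.apache.hadoop</groupId>
>         <artifactId>hadoop-common</artifactId>
>         <version>3.1.1</version>
>         <scope>provided</scope>
>     </dependency>
>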
> I have also tried setting the HBASE_HOME, HBASE_CONF_DIR,
> HADOOP_CLASSPATH and HBASE_CLASSPATH environment variables.
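> For example (the paths below are placeholders for the actual
> installation directories on the host):
>
>     export HBASE_HOME=/usr/lib/hbase
>     export HBASE_CONF_DIR=/etc/hbase/conf
>     export HBASE_CLASSPATH=$(hbase classpath)
>     export HADOOP_CLASSPATH=$(hadoop classpath)
>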
> I have also made sure that the host where the job is deployed (flink
> run is called) is also an HBase gateway node.
>
> Looking at the Hadoop classpath, I haven't found hbase-common.jar.
> Could this be the issue? I'm unsure whether YARN looks for this
> dependency inside the compiled job jar or on the Hadoop classpath.
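>
> For reference, one way to inspect it:
>
>     hadoop classpath | tr ':' '\n' | grep -i hbase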
>
> Any help with this issue would be appreciated. 
>
> Best Regards
> Kamil
