dnskr opened a new pull request, #6875: URL: https://github.com/apache/kyuubi/pull/6875
### Why are the changes needed?

The PR adds support for providing Hadoop configuration files to be used by Apache Kyuubi, Apache Spark, etc. It is a continuation of PR https://github.com/apache/kyuubi/pull/6521 and relates to issue https://github.com/apache/kyuubi/issues/6123. A sketch of one possible pod-level wiring for these files is included at the end of this description.

### How was this patch tested?

1. Create a `hadoop-configs.yaml` file (a ConfigMap with `core-site.xml` and `hive-site.xml` entries):
   ```yaml
   apiVersion: v1
   kind: ConfigMap
   metadata:
     name: hadoop-configs
   data:
     'core-site.xml': |
       <?xml version="1.0"?>
       <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
       <configuration>
         <property>
           <name>hadoop.pr.test</name>
           <value>configmap</value>
         </property>
       </configuration>
     'hive-site.xml': |
       <?xml version="1.0"?>
       <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
       <configuration>
         <property>
           <name>hive.pr.test</name>
           <value>configmap</value>
         </property>
       </configuration>
   ```
2. Create the ConfigMap from the `hadoop-configs.yaml` file:
   ```shell
   kubectl create -f hadoop-configs.yaml
   ```
3. Create a custom `values-hadoop.yaml` (its `core-site.xml` entry overlaps with the one provided by the ConfigMap):
   ```yaml
   image:
     repository: apache/kyuubi
     tag: 1.10.0-spark

   rbac:
     create: true
     rules:
       - apiGroups: [""]
         resources: ["pods", "configmaps", "services"]
         verbs: ["create", "list", "delete", "watch", "deletecollection", "get"]

   hadoopConf:
     files:
       'core-site.xml': |
         <?xml version="1.0"?>
         <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
         <configuration>
           <property>
             <name>hadoop.pr.test</name>
             <value>values</value>
           </property>
         </configuration>
       'hdfs-site.xml': |
         <?xml version="1.0"?>
         <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
         <configuration>
           <property>
             <name>hdfs.pr.test</name>
             <value>values</value>
           </property>
         </configuration>
     filesFrom:
       - configMap:
           name: hadoop-configs

   sparkConf:
     files:
       'spark-defaults.conf': |
         spark.submit.deployMode=client
         spark.kubernetes.container.image=apache/spark:3.5.2
         spark.kubernetes.authenticate.driver.serviceAccountName=kyuubi
   ```
4. Install the chart:
   ```shell
   helm install kyuubi charts/kyuubi -f values-hadoop.yaml
   ```
5. Check that there are 3 files in the Hadoop configuration directory:
   ```shell
   kubectl exec kyuubi-0 -- ls /opt/hadoop/conf
   core-site.xml
   hdfs-site.xml
   hive-site.xml
   ```
6. Check that `/opt/hadoop/conf/core-site.xml` has the content from the ConfigMap:
   ```shell
   kubectl exec kyuubi-0 -- cat /opt/hadoop/conf/core-site.xml
   <?xml version="1.0"?>
   <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
   <configuration>
     <property>
       <name>hadoop.pr.test</name>
       <value>configmap</value>
     </property>
   </configuration>
   ```
7. Check that `/opt/hadoop/conf/hdfs-site.xml` has the content from the values:
   ```shell
   kubectl exec kyuubi-0 -- cat /opt/hadoop/conf/hdfs-site.xml
   <?xml version="1.0"?>
   <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
   <configuration>
     <property>
       <name>hdfs.pr.test</name>
       <value>values</value>
     </property>
   </configuration>
   ```
8. Check that `/opt/hadoop/conf/hive-site.xml` has the content from the ConfigMap:
   ```shell
   kubectl exec kyuubi-0 -- cat /opt/hadoop/conf/hive-site.xml
   <?xml version="1.0"?>
   <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
   <configuration>
     <property>
       <name>hive.pr.test</name>
       <value>configmap</value>
     </property>
   </configuration>
   ```
9. Check the configuration values from a Spark session:
   ```shell
   kubectl exec kyuubi-0 -- ./bin/beeline -u 'jdbc:hive2://kyuubi-thrift-binary:10009' -e 'set hadoop.pr.test;'
   +-----------------+------------+
   |       key       |   value    |
   +-----------------+------------+
   | hadoop.pr.test  | configmap  |
   +-----------------+------------+

   kubectl exec kyuubi-0 -- ./bin/beeline -u 'jdbc:hive2://kyuubi-thrift-binary:10009' -e 'set hdfs.pr.test;'
   +---------------+---------+
   |      key      |  value  |
   +---------------+---------+
   | hdfs.pr.test  | values  |
   +---------------+---------+

   kubectl exec kyuubi-0 -- ./bin/beeline -u 'jdbc:hive2://kyuubi-thrift-binary:10009' -e 'set hive.pr.test;'
   +---------------+------------+
   |      key      |   value    |
   +---------------+------------+
   | hive.pr.test  | configmap  |
   +---------------+------------+
   ```

### Was this patch authored or co-authored using generative AI tooling?

No
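For reference, below is a minimal sketch of how entries from `hadoopConf.files` and `hadoopConf.filesFrom` could end up together under `/opt/hadoop/conf`. It assumes a projected volume and a chart-managed ConfigMap named `kyuubi-hadoop-conf` rendered from `hadoopConf.files`; these names and the exact wiring are illustrative assumptions, not the chart's actual template output.

```yaml
# Illustrative standalone Pod, not the chart's rendered output.
# "kyuubi-hadoop-conf" is a hypothetical ConfigMap assumed to hold the
# hadoopConf.files entries; "hadoop-configs" is the ConfigMap referenced
# via hadoopConf.filesFrom in the test above.
apiVersion: v1
kind: Pod
metadata:
  name: kyuubi-hadoop-conf-demo
spec:
  containers:
    - name: kyuubi
      image: apache/kyuubi:1.10.0-spark
      env:
        # Point Kyuubi (and Spark in client mode) at the merged configuration
        # directory via the standard Hadoop environment variable.
        - name: HADOOP_CONF_DIR
          value: /opt/hadoop/conf
      volumeMounts:
        - name: hadoop-conf
          mountPath: /opt/hadoop/conf
  volumes:
    # A projected volume combines several ConfigMaps into a single directory,
    # which matches what step 5 above observes. Overlapping file names
    # (core-site.xml appears in both sources in the test) may be resolved
    # differently by the real chart; this sketch does not model that precedence.
    - name: hadoop-conf
      projected:
        sources:
          - configMap:
              name: kyuubi-hadoop-conf   # hypothetical: from hadoopConf.files
          - configMap:
              name: hadoop-configs       # from hadoopConf.filesFrom
```

The sketch is only meant to show why `/opt/hadoop/conf` is the directory inspected in steps 5 through 8; the chart's actual templates may use a different mechanism.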
