dnskr commented on PR #4776:
URL: https://github.com/apache/kyuubi/pull/4776#issuecomment-1542881873

   Thanks for the comments!
   These are experimental changes that are not fully working, so I created
the PR as a draft. Apologies for the confusion and for the delayed response.
   
   > Seems we had such an idea about the structure of `values.yaml`, but decided
to reject it.
   
   Right, we discussed it
[here](https://github.com/apache/kyuubi/pull/4147#discussion_r1068996020). I'll
continue with the flat structure in a separate PR. As I mentioned above, this is
more of an experimental PR to track my attempts and demo a different approach.
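   To illustrate what I mean by a flat structure (key names here are hypothetical, not taken from the actual chart):

   ```yaml
   # Hypothetical sketch of a flat values.yaml layout (key names illustrative):
   serverConfDir: /opt/kyuubi/conf
   serverImage: apache/kyuubi

   # rather than a deeply nested layout such as:
   # server:
   #   conf:
   #     dir: /opt/kyuubi/conf
   #   image: apache/kyuubi
   ```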
   
   > In practice, if Spark uses HDFS as storage and HMS as metastore, 
typically, the user should provide `hive-site.xml` `core-site.xml` 
`hdfs-site.xml` etc. under HADOOP_CONF_DIR, which would be shared by both 
Kyuubi server and Spark engine (other engines may require it too)
   
   Got it! I'll add these files as well. Am I right that there is no default
`HADOOP_CONF_DIR` path in the Kyuubi server or the Kyuubi Docker image? If not,
could you please suggest how to set it in the chart (add an env variable, a
property, etc.)?
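   For context, one way I could imagine wiring it in the chart — a minimal sketch assuming a `hadoopConfDir` values key and a ConfigMap holding `core-site.xml`, `hdfs-site.xml`, `hive-site.xml`; all names here are hypothetical, not the chart's actual keys:

   ```yaml
   # values.yaml (hypothetical key)
   hadoopConfDir: /opt/hadoop/conf
   ```

   ```yaml
   # fragment of the pod spec in the deployment/statefulset template (hypothetical names)
   env:
     - name: HADOOP_CONF_DIR
       value: {{ .Values.hadoopConfDir }}
   volumeMounts:
     - name: hadoop-conf
       mountPath: {{ .Values.hadoopConfDir }}
   volumes:
     - name: hadoop-conf
       configMap:
         name: {{ include "kyuubi.fullname" . }}-hadoop-conf
   ```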


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
