[ https://issues.apache.org/jira/browse/SPARK-25742?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16652064#comment-16652064 ]
Yinan Li commented on SPARK-25742:
----------------------------------

The k8s secrets you add through the {{spark.kubernetes.driver.secrets.}} config option will also get mounted into the init-container in the driver pod. You can use that to pass credentials for pulling dependencies into the driver init-container.

> Is there a way to pass the Azure blob storage credentials to the spark for
> k8s init-container?
> ----------------------------------------------------------------------------------------------
>
>                 Key: SPARK-25742
>                 URL: https://issues.apache.org/jira/browse/SPARK-25742
>             Project: Spark
>          Issue Type: Question
>          Components: Kubernetes
>    Affects Versions: 2.3.2
>            Reporter: Oscar Bonilla
>            Priority: Minor
>
> I'm trying to run Spark on a Kubernetes cluster in Azure. The idea is to
> store the Spark application jars and dependencies in a container in Azure
> Blob Storage.
> I've tried this with a public container and it works OK, but with a
> private Blob Storage container, the spark-init init-container doesn't
> download the jars.
> The equivalent in AWS S3 is as simple as adding the key_id and secret as
> environment variables, but I don't see how to do this for Azure Blob Storage.
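A rough sketch of the suggestion above, for Spark 2.3.x on Kubernetes. All concrete names here (the secret name, mount path, storage account, container, and jar path) are hypothetical placeholders, not values from the issue; the Hadoop WASB key property is the standard way to hand Azure Blob Storage credentials to Hadoop's `wasb://` filesystem, which the init-container uses to fetch remote jars:

```shell
# 1. Store the Azure storage account key in a Kubernetes secret
#    (secret and key names are made up for this example).
kubectl create secret generic azure-blob-creds \
  --from-literal=azure-storage-key='<storage-account-key>'

# 2. Mount the secret into the driver pod; per the comment above, the same
#    mount also appears in the driver's init-container. The Hadoop config
#    line gives the WASB connector the account key so the init-container
#    can download jars from the private Blob Storage container.
spark-submit \
  --master k8s://https://<api-server-host>:443 \
  --deploy-mode cluster \
  --conf spark.kubernetes.driver.secrets.azure-blob-creds=/etc/azure \
  --conf spark.hadoop.fs.azure.account.key.<account>.blob.core.windows.net='<storage-account-key>' \
  --class com.example.Main \
  wasb://<container>@<account>.blob.core.windows.net/jars/app.jar
```

Note this passes the key as a Spark/Hadoop config value rather than reading it from the mounted secret file; wiring the mounted file into the Hadoop configuration would need an additional step not covered by the comment.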