[ https://issues.apache.org/jira/browse/SPARK-25742?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-25742.
----------------------------------
    Resolution: Invalid

Questions should go to the mailing list. Please see
https://spark.apache.org/community.html

> Is there a way to pass Azure Blob Storage credentials to the Spark for
> k8s init-container?
> --------------------------------------------------------------------------
>
>                 Key: SPARK-25742
>                 URL: https://issues.apache.org/jira/browse/SPARK-25742
>             Project: Spark
>          Issue Type: Question
>          Components: Kubernetes
>    Affects Versions: 2.3.2
>          Reporter: Oscar Bonilla
>          Priority: Minor
>
> I'm trying to run Spark on a Kubernetes cluster in Azure. The idea is to
> store the Spark application jars and dependencies in a container in Azure
> Blob Storage.
> I've tried this with a public container and it works fine, but with a
> private Blob Storage container the spark-init init-container doesn't
> download the jars.
> The equivalent in AWS S3 is as simple as adding the key_id and secret as
> environment variables, but I don't see how to do this for Azure Blob
> Storage.
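For reference, one way this could plausibly be wired up is to pass the
storage account key as a Hadoop property at submit time, mirroring the S3
key_id/secret approach. The sketch below assumes the hadoop-azure (WASB)
connector and its dependencies are on the Spark image classpath, and that
spark.hadoop.* properties reach the filesystem code that fetches remote
jars; the API server address, image, main class, container, and storage
account are placeholders, and this has not been verified against the
2.3.2 init-container.

    # Sketch only: supply the Azure storage account key as a Hadoop property.
    # Angle-bracketed values are placeholders; the key is read from an
    # environment variable rather than written on the command line.
    spark-submit \
      --master k8s://https://<k8s-apiserver>:443 \
      --deploy-mode cluster \
      --name azure-blob-example \
      --class <main-class> \
      --conf spark.kubernetes.container.image=<spark-image> \
      --conf spark.hadoop.fs.azure.account.key.<storage-account>.blob.core.windows.net=$AZURE_STORAGE_KEY \
      wasbs://<container>@<storage-account>.blob.core.windows.net/jars/app.jar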