Nicholas Chammas created SPARK-33436:
----------------------------------------

             Summary: PySpark equivalent of SparkContext.hadoopConfiguration
                 Key: SPARK-33436
                 URL: https://issues.apache.org/jira/browse/SPARK-33436
             Project: Spark
          Issue Type: Improvement
          Components: PySpark
    Affects Versions: 3.1.0
            Reporter: Nicholas Chammas


PySpark should expose {{hadoopConfiguration}} on {{SparkContext}} to [match Scala's|http://spark.apache.org/docs/latest/api/scala/org/apache/spark/SparkContext.html#hadoopConfiguration:org.apache.hadoop.conf.Configuration].
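For context, the common workaround today is to reach through the SparkContext's internal Py4J handle. A minimal sketch; note that {{_jsc}} is an internal attribute rather than a public API, and the public accessor in the trailing comment is hypothetical:

{code:python}
from pyspark import SparkContext

sc = SparkContext.getOrCreate()

# Workaround: go through the internal Py4J handle to reach the
# underlying org.apache.hadoop.conf.Configuration object.
# _jsc is not a public API, so this can break across releases.
hadoop_conf = sc._jsc.hadoopConfiguration()
hadoop_conf.set("mapreduce.fileoutputcommitter.algorithm.version", "2")
print(hadoop_conf.get("mapreduce.fileoutputcommitter.algorithm.version"))

# A public equivalent (hypothetical, the subject of this issue) might read:
# hadoop_conf = sc.hadoopConfiguration()
{code}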

Setting Hadoop configs from within a job is handy for configurations that are not appropriate as global defaults, or that are not known until run time. The various {{fs.s3a.*}} configs are a good example, as sketched below.



