Github user ericl commented on a diff in the pull request:

    https://github.com/apache/spark/pull/12248#discussion_r59076114
  
    --- Diff: core/src/main/scala/org/apache/spark/scheduler/ShuffleMapTask.scala ---
    @@ -42,20 +43,22 @@ import org.apache.spark.shuffle.ShuffleWriter
      * @param _initialAccums initial set of accumulators to be used in this task for tracking
      *                       internal metrics. Other accumulators will be registered later when
      *                       they are deserialized on the executors.
    + * @param localProperties copy of thread-local properties set by the user on the driver side.
      */
     private[spark] class ShuffleMapTask(
         stageId: Int,
         stageAttemptId: Int,
         taskBinary: Broadcast[Array[Byte]],
         partition: Partition,
         @transient private var locs: Seq[TaskLocation],
    -    _initialAccums: Seq[Accumulator[_]])
    -  extends Task[MapStatus](stageId, stageAttemptId, partition.index, _initialAccums)
    +    _initialAccums: Seq[Accumulator[_]],
    +    localProperties: Properties)
    +  extends Task[MapStatus](stageId, stageAttemptId, partition.index, _initialAccums, localProperties)
       with Logging {

       /** A constructor used only in test suites. This does not require passing in an RDD. */
       def this(partitionId: Int) {
    -    this(0, 0, null, new Partition { override def index: Int = 0 }, null, null)
    +    this(0, 0, null, new Partition { override def index: Int = 0 }, null, null, new Properties)
    --- End diff --
    
    It seemed safer to make it required. I can change this to an `Option` if you think creating a `Properties` for each task is too much overhead.
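
    For context, here is a minimal sketch of the `Option[Properties]` variant being discussed, assuming nothing beyond plain Scala and `java.util.Properties` (the `TaskSketch` and `userProperty` names are hypothetical, not from this PR):

    ```scala
    import java.util.Properties

    // Hypothetical sketch of the Option-based alternative: callers with no
    // user-set properties pass None instead of allocating an empty Properties.
    class TaskSketch(localProperties: Option[Properties]) {

      // Materialize an empty Properties only on the None path, at point of use.
      private def props: Properties = localProperties.getOrElse(new Properties)

      def userProperty(key: String): Option[String] =
        Option(props.getProperty(key))
    }

    object TaskSketch {
      def main(args: Array[String]): Unit = {
        val p = new Properties
        p.setProperty("spark.job.description", "example")

        println(new TaskSketch(Some(p)).userProperty("spark.job.description")) // Some(example)
        println(new TaskSketch(None).userProperty("spark.job.description"))    // None
      }
    }
    ```

    The tradeoff is a None-handling branch at every use site versus one small allocation per task, which is why requiring the parameter seemed simpler.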

