Hi Spark Devs,

I'm curious what your take on this code [1] would be if you were in my
shoes, trying to understand it:

      (0 until 1).flatMap { _ =>
        (splitPoints :+ numMappers).sliding(2).map {
          case Seq(start, end) =>
            CoalescedMapperPartitionSpec(start, end, numReducers)
        }
      }

There's something important going on here, but it's convoluted enough that
my Scala skills don't quite seem up to it (not to mention my AQE skills).
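
To make it concrete for myself, here's a tiny sketch of what the inner part
computes, with made-up splitPoints and numMappers (hypothetical values,
just for illustration):

      // Hypothetical inputs, just to see what sliding(2) produces.
      val splitPoints = Seq(0, 3, 6)
      val numMappers = 10

      // Consecutive [start, end) mapper ranges:
      // List(0, 3), List(3, 6), List(6, 10)
      (splitPoints :+ numMappers).sliding(2).foreach(println)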

I'm tempted to change (0 until 1) to Seq(0), but Seq(0).flatMap feels
awkward too. Is this flatMap even needed?! Even with no splitPoints, we've
still got numReducers > 0.

Looks like the above is as simple as:

      (splitPoints :+ numMappers).sliding(2).map {
        case Seq(start, end) =>
          CoalescedMapperPartitionSpec(start, end, numReducers)
      }

Correct?
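
FWIW, a quick REPL sanity check (with a stand-in Seq, since I don't have
the real partition specs handy) suggests the flatMap over a one-element
range is an element-wise no-op:

      // Stand-in for the result of sliding(2).map (hypothetical values).
      val specs = Seq("spec1", "spec2", "spec3")
      val wrapped = (0 until 1).flatMap { _ => specs }
      assert(wrapped == specs)  // same elements, same order

      // The one difference I can see: flatMap materializes the result into
      // an IndexedSeq, while sliding(2).map alone gives an Iterator.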

I'm mentally worn out and can't seem to think straight. Would a PR with
such a change be acceptable? (Sean, I'm looking at you :D)

[1]
https://github.com/apache/spark/blob/8d817dcf3084d56da22b909d578a644143f775d5/sql/core/src/main/scala/org/apache/spark/sql/execution/adaptive/OptimizeShuffleWithLocalRead.scala#L89-L93

Best regards,
Jacek Laskowski
----
https://about.me/JacekLaskowski
"The Internals Of" Online Books <https://books.japila.pl/>
Follow me on https://twitter.com/jaceklaskowski
