[ https://issues.apache.org/jira/browse/SPARK-21369?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Shixiong Zhu updated SPARK-21369:
---------------------------------
    Description: 
Right now the external shuffle service uses Scala Tuple2. However, the Scala 
library is not shaded into the yarn shuffle assembly jar, so when that code is 
called it throws ClassNotFoundException.

Right now it's safe because spark.reducer.maxReqSizeShuffleToMem is disabled 
by default. However, to allow YARN users to enable 
spark.reducer.maxReqSizeShuffleToMem, we should remove all usages of Scala 
Tuples.
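The fix can be sketched as follows: wherever the Java shuffle code keys a map on a scala.Tuple2, substitute a small immutable Java class with proper equals/hashCode, so the assembly jar has no runtime dependency on the Scala library. This is a minimal illustrative sketch; the class name AppExecId and its fields are assumptions here, not necessarily what the actual patch uses:

```java
// Sketch: a plain-Java replacement for a Tuple2<String, String> key, so the
// yarn shuffle assembly jar does not need the (unshaded) Scala library.
// The name AppExecId and its fields are illustrative assumptions.
class AppExecId {
    final String appId;
    final String execId;

    AppExecId(String appId, String execId) {
        this.appId = appId;
        this.execId = execId;
    }

    // Value semantics so the class can be used as a HashMap key,
    // just like the Tuple2 it replaces.
    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof AppExecId)) return false;
        AppExecId other = (AppExecId) o;
        return appId.equals(other.appId) && execId.equals(other.execId);
    }

    @Override
    public int hashCode() {
        return 31 * appId.hashCode() + execId.hashCode();
    }

    @Override
    public String toString() {
        return "AppExecId[appId=" + appId + ", execId=" + execId + "]";
    }
}
```

Because the class only uses java.lang types, nothing in it can trigger a ClassNotFoundException for a Scala class at runtime.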

  was:
Right now the external shuffle service uses Scala Tuple2. However, the Scala 
library is not shaded into the yarn shuffle assembly jar, so when that code is 
called it throws ClassNotFoundException.

Right now it's safe because spark.reducer.maxReqSizeShuffleToMem is disabled 
by default. However, to allow using spark.reducer.maxReqSizeShuffleToMem, we 
should remove all usages of Scala Tuples.


> Don't use Scala classes in external shuffle service
> ---------------------------------------------------
>
>                 Key: SPARK-21369
>                 URL: https://issues.apache.org/jira/browse/SPARK-21369
>             Project: Spark
>          Issue Type: Bug
>          Components: Shuffle, YARN
>    Affects Versions: 2.2.0
>            Reporter: Shixiong Zhu
>            Assignee: Shixiong Zhu
>
> Right now the external shuffle service uses Scala Tuple2. However, the Scala 
> library is not shaded into the yarn shuffle assembly jar, so when that code 
> is called it throws ClassNotFoundException.
> Right now it's safe because spark.reducer.maxReqSizeShuffleToMem is disabled 
> by default. However, to allow YARN users to enable 
> spark.reducer.maxReqSizeShuffleToMem, we should remove all usages of Scala 
> Tuples.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
