[ 
https://issues.apache.org/jira/browse/SPARK-5682?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14315784#comment-14315784
 ] 

liyunzhang_intel commented on SPARK-5682:
-----------------------------------------

How to test the encrypted shuffle feature on the spark on yarn framework:
# build (needed because I modified the pom.xml): mvn package -Pyarn -Dyarn.version=2.6.0 -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive -DskipTests
# to enable encrypted shuffle, add the following to conf/spark-defaults.conf:
spark.encrypted.shuffle          true
# start the master and workers: sbin/start-all.sh
# run the example in yarn-cluster and yarn-client mode:
## ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn-cluster --num-executors 3 --driver-memory 1g --executor-memory 1g --executor-cores 1 wordcount.jar
## ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn-client --num-executors 3 --driver-memory 1g --executor-memory 1g --executor-cores 1 wordcount.jar
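The two submit commands in step 4 differ only in the --master value, so they can be folded into one small wrapper. This is a minimal sketch, not part of the original test procedure: the SPARK_HOME default and the positional arguments are assumptions, and the script only assembles and prints the command rather than executing it.

```shell
#!/bin/sh
# Sketch: build the spark-submit command from step 4 for either deploy mode.
# SPARK_HOME default, argument order, and the jar name are assumptions.
SPARK_HOME="${SPARK_HOME:-.}"      # assumed spark install dir
MASTER="${1:-yarn-client}"         # yarn-client or yarn-cluster
APP_JAR="${2:-wordcount.jar}"      # application jar from the steps above

CMD="$SPARK_HOME/bin/spark-submit \
--class org.apache.spark.examples.SparkPi \
--master $MASTER \
--num-executors 3 --driver-memory 1g --executor-memory 1g --executor-cores 1 \
$APP_JAR"

# Print the command so it can be inspected (or piped to sh) before running.
echo "$CMD"
```

Running it with no arguments prints the yarn-client variant; passing yarn-cluster as the first argument prints the other.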


> Reuse hadoop encrypted shuffle algorithm to enable spark encrypted shuffle
> --------------------------------------------------------------------------
>
>                 Key: SPARK-5682
>                 URL: https://issues.apache.org/jira/browse/SPARK-5682
>             Project: Spark
>          Issue Type: New Feature
>          Components: Shuffle
>            Reporter: liyunzhang_intel
>         Attachments: Design Document of Encrypted Spark Shuffle_20150209.docx
>
>
> Encrypted shuffle is enabled in hadoop 2.6, which makes the transfer of shuffle 
> data safer. This feature is also needed in spark. We reuse the hadoop encrypted 
> shuffle feature in spark, and because ugi credential info is required by 
> encrypted shuffle, we first enable encrypted shuffle on the spark-on-yarn 
> framework.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
