[ https://issues.apache.org/jira/browse/SPARK-22166?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

吴志龙 updated SPARK-22166:
------------------------
    Description: 
${SPARK_HOME}/bin/spark-sql --master=yarn --queue lx_etl --driver-memory 4g \
  --driver-java-options -XX:MaxMetaspaceSize=512m \
  --num-executors 12 \
  --executor-memory 3g \
  --hiveconf hive.cli.print.header=false \
  --conf spark.executor.extraJavaOptions="-Xmn768m -XX:+UseG1GC -XX:MaxMetaspaceSize=512m -XX:MaxGCPauseMillis=400 -XX:G1ReservePercent=30 -XX:SoftRefLRUPolicyMSPerMB=0 -XX:InitiatingHeapOccupancyPercent=35" \
  -e ""


!http://example.com/image.png!
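
For reference, the same JVM and memory settings could also be kept in conf/spark-defaults.conf rather than passed on the command line. A minimal sketch, using standard Spark configuration keys with the values copied from the command above:

# conf/spark-defaults.conf -- equivalent to the flags in the spark-sql command above
spark.executor.extraJavaOptions  -Xmn768m -XX:+UseG1GC -XX:MaxMetaspaceSize=512m -XX:MaxGCPauseMillis=400 -XX:G1ReservePercent=30 -XX:SoftRefLRUPolicyMSPerMB=0 -XX:InitiatingHeapOccupancyPercent=35
spark.driver.extraJavaOptions    -XX:MaxMetaspaceSize=512m
spark.executor.memory            3g
spark.driver.memory              4g
spark.executor.instances         12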

> java.lang.OutOfMemoryError: error while calling spill() 
> --------------------------------------------------------
>
>                 Key: SPARK-22166
>                 URL: https://issues.apache.org/jira/browse/SPARK-22166
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.2.0
>         Environment: spark 2.2
> hadoop 2.6.0
> jdk 1.8
>            Reporter: 吴志龙
>
> ${SPARK_HOME}/bin/spark-sql --master=yarn --queue lx_etl --driver-memory 4g \
>   --driver-java-options -XX:MaxMetaspaceSize=512m \
>   --num-executors 12 \
>   --executor-memory 3g \
>   --hiveconf hive.cli.print.header=false \
>   --conf spark.executor.extraJavaOptions="-Xmn768m -XX:+UseG1GC -XX:MaxMetaspaceSize=512m -XX:MaxGCPauseMillis=400 -XX:G1ReservePercent=30 -XX:SoftRefLRUPolicyMSPerMB=0 -XX:InitiatingHeapOccupancyPercent=35" \
>   -e ""
> !http://example.com/image.png!


