Hi,

Can you print out the Environment tab from your Spark UI?

By default spark-sql runs in local mode, which means you have only one
driver and one executor in a single JVM. You can increase the executor
memory through

SET spark.executor.memory=xG

then adjust it and run the SQL again.
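
For example, this is roughly how you could start spark-sql against a
cluster with an explicit executor memory rather than the local-mode
default (a minimal sketch; yarn, 4g and 4 executors are illustrative
values for this thread, not recommendations):

spark-sql --master yarn --executor-memory 4g --num-executors 4

The Environment tab on the Spark UI will then show the value that was
actually picked up, and you can rerun the join to compare timings.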


HTH

Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


Disclaimer: Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.



On 18 July 2016 at 08:16, leezy <lizhenm...@163.com> wrote:

> hi:
> I am running a join operation in spark-sql, but when I increase the
> executor-memory, the run time becomes longer. In the Spark UI, I can see
> that the shuffle becomes slower as the memory gets bigger. How can I tune
> it?
