14,000 partitions is likely too many to perform well unless the data set is 
very large. How much data does a single partition contain?
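
One way to bound the memory used per write is to split the insert by
partition key and write a chunk of partitions at a time. Below is a minimal
sketch in Scala (Spark 2.x SparkSession API; on 1.x the same idea works via
HiveContext). The table and column names (source_table, target_table,
part_col) are placeholders, not taken from your job:

import org.apache.spark.sql.SparkSession

// Sketch only: assumes a target table partitioned by a single
// column, part_col, with a manageable number of distinct values.
val spark = SparkSession.builder()
  .appName("BatchedInsert")
  .enableHiveSupport()
  .getOrCreate()

val df = spark.table("source_table")

// Collect the distinct partition key values on the driver.
val partValues = df.select("part_col").distinct().collect().map(_.get(0))

// Write 100 partitions' worth of rows per insert to cap memory use.
partValues.grouped(100).foreach { batch =>
  df.filter(df("part_col").isin(batch: _*))
    .write
    .mode("append")
    .insertInto("target_table")
}

Each pass re-reads the source, so this trades extra scans for a smaller
footprint per insert. If the writes still fail, check the per-partition data
volume and the dynamic partition settings (e.g.
hive.exec.dynamic.partition.mode=nonstrict) first.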

> On 22 May 2016, at 09:34, SRK <swethakasire...@gmail.com> wrote:
> 
> Hi,
> 
> In my Spark SQL query to insert data, I have around 14,000 partitions of
> data, which seem to be causing memory issues. How can I insert the data for
> 100 partitions at a time to avoid memory issues?
