Re: Whether Spark will use disk when the memory is not enough on MEMORY_ONLY Storage Level
Jone: For #3, consider asking on the vendor's mailing list.

On Fri, Oct 30, 2015 at 7:11 AM, Akhil Das wrote:
> You can set the level to MEMORY_AND_DISK; in that case data will fall
> back to disk when it exceeds the available memory.
Re: Whether Spark will use disk when the memory is not enough on MEMORY_ONLY Storage Level
No, with MEMORY_ONLY, partitions that do not fit in memory are dropped and
recomputed from their lineage when they are needed again; nothing spills to
disk. You can set the level to MEMORY_AND_DISK; in that case data will fall
back to disk when it exceeds the available memory.

Thanks
Best Regards

On Fri, Oct 23, 2015 at 9:52 AM, JoneZhang wrote:
> 1. Will Spark use disk when memory is not enough at the MEMORY_ONLY
>    storage level?
> 2. If not, how can I set the storage level when I use Hive on Spark?
> 3. Does Spark have any plans to choose dynamically between Hive on
>    MapReduce and Hive on Spark, based on SQL features?
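For reference, a minimal sketch of what that suggestion looks like on a plain
RDD (the object name, app name, and data are illustrative, not from this
thread); note it applies to RDDs you persist yourself, not to Hive on Spark's
internal caching:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.storage.StorageLevel

    object PersistExample {
      def main(args: Array[String]): Unit = {
        // Local master for a quick test; in a cluster this would come from
        // spark-submit instead.
        val conf = new SparkConf().setAppName("persist-example").setMaster("local[*]")
        val sc = new SparkContext(conf)

        val rdd = sc.parallelize(1 to 1000000)

        // MEMORY_ONLY (the default for rdd.cache()) drops partitions that do
        // not fit in memory and recomputes them from lineage when they are
        // needed again; MEMORY_AND_DISK spills those partitions to local disk.
        rdd.persist(StorageLevel.MEMORY_AND_DISK)

        println(rdd.count())
        sc.stop()
      }
    }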
Whether Spark will use disk when the memory is not enough on MEMORY_ONLY Storage Level
1. Will Spark use disk when memory is not enough at the MEMORY_ONLY storage
   level?
2. If not, how can I set the storage level when I use Hive on Spark?
3. Does Spark have any plans to choose dynamically between Hive on MapReduce
   and Hive on Spark, based on SQL features?

Thanks in advance
Best regards