1. Will Spark spill to disk when memory is not enough at the MEMORY_ONLY
storage level? (A sketch of the storage levels I mean follows below.)
2. If not, how can I set the storage level when I use Hive on Spark?
3. Is there any plan to choose dynamically between Hive on MapReduce and
Hive on Spark, based on the features of the SQL being run?
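
For context, here is a minimal Scala sketch of the standalone-Spark
behaviour I am asking about in question 1, using the public RDD.persist
API (the app name and data are just illustrative):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.storage.StorageLevel

object StorageLevelDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("StorageLevelDemo").setMaster("local[*]"))

    val rdd = sc.parallelize(1 to 1000000)

    // MEMORY_ONLY: partitions that do not fit in memory are not written
    // to disk; they are dropped and recomputed from lineage when needed.
    rdd.persist(StorageLevel.MEMORY_ONLY)

    // MEMORY_AND_DISK would instead spill partitions that do not fit:
    // rdd.persist(StorageLevel.MEMORY_AND_DISK)

    println(rdd.count())
    sc.stop()
  }
}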
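
For question 2, the only mechanism I know of is setting spark.* properties
inside the Hive session, as below; whether any such property controls the
cache storage level is exactly what I am unsure about (the 4g value is just
an example):

set hive.execution.engine=spark;
-- Spark properties can be passed per session like this:
set spark.executor.memory=4g;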

Thanks in advance
Best regards


