Hi, I have a Spark job that creates Hive table partitions. Since switching to Spark 1.5.1, the job creates many Hive staging files and does not delete them after it finishes. Is this a bug, or do I need to disable something to prevent the staging files from being created, or at least have them deleted? The staging files look like the following:
.hive-staging_hive_blabla

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-1-5-1-hadoop-2-4-does-not-clear-hive-staging-files-after-job-finishes-tp25203.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
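Until the cause is identified, one workaround is to remove the leftover staging directories by name pattern. A minimal sketch, assuming the warehouse path is accessible on a local filesystem (the demo path and directory names are hypothetical; on HDFS the same pattern would be deleted with `hadoop fs -rm -r` instead of `find`/`rm`):

```shell
# Demo: simulate a warehouse containing a leftover staging dir, then clean it up.
WAREHOUSE=$(mktemp -d)                       # hypothetical stand-in for the real warehouse path
mkdir -p "$WAREHOUSE/mydb/.hive-staging_hive_2015-11-01_12-00-00_000_123"
mkdir -p "$WAREHOUSE/mydb/part=1"            # real partition data, must survive

# The cleanup itself: delete only directories matching the staging-dir pattern.
find "$WAREHOUSE" -depth -type d -name '.hive-staging_hive_*' -exec rm -rf {} +

ls "$WAREHOUSE/mydb"                         # the partition directory remains
```

Be careful to run such a cleanup only when no job is actively writing, since an in-flight job legitimately uses its staging directory until commit.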