*From:* "Xin Jinhan" <18183124...@163.com>;
*Date:* Thu, Jul 2, 2020 08:39 PM
*To:* "user";
*Subject:* Re: File Not Found: /tmp/spark-events in Spark 3.0
Hi,
First, '/tmp/spark-events' is the default storage location for the Spark
event log, but the log is written there only when
'spark.eventLog.enabled' is true, which your Spark 2.4.6 may have set to
false. So you can try setting it to false and the error may disappear.
Second, I suggest enable
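The first suggestion above can be sketched as a configuration change (a sketch, assuming you manage settings through spark-defaults.conf or spark-submit flags; adjust for however your Zeppelin interpreter passes Spark properties):

```
# conf/spark-defaults.conf — disable event logging entirely:
spark.eventLog.enabled   false

# or equivalently per job, on the command line:
#   spark-submit --conf spark.eventLog.enabled=false ...
```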
This could be the result of not setting the location of the eventLog
properly. By default it is /tmp/spark-events, and since the files in the
/tmp directory are cleaned up regularly, you could hit this problem.
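If you want to keep the event log (e.g. for the history server) rather than disable it, the point above suggests moving it out of /tmp. A sketch, assuming a Linux driver host; /var/spark-events is a hypothetical path of your choosing, and any durable, pre-created directory works:

```
# Create a durable location once on the driver host:
#   mkdir -p /var/spark-events
# conf/spark-defaults.conf — point the event log away from /tmp:
spark.eventLog.enabled   true
spark.eventLog.dir       file:/var/spark-events
```

Note that Spark does not create spark.eventLog.dir for you; it must exist before the job starts, which is exactly the File Not Found failure discussed in this thread.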
--Original--
From: "Xin Jinhan" <18183124...@163.com>;
This should only be needed if the spark.eventLog.enabled property was set
to true. Is it possible the job configuration is different between your
two environments?
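To compare the two environments as suggested, one quick check is to print the effective settings from a running session (a sketch; run inside spark-shell or a Zeppelin Spark paragraph in each environment — `spark` is the active SparkSession, and the second argument to `conf.get` is the fallback when the key is unset):

```scala
// Effective event-log settings for the current session;
// spark.eventLog.enabled defaults to false when unset.
println(spark.conf.get("spark.eventLog.enabled", "false"))
println(spark.conf.get("spark.eventLog.dir", "(unset)"))
```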
On Mon, Jun 29, 2020 at 9:21 AM ArtemisDev wrote:
While launching a Spark job from Zeppelin against a standalone Spark
cluster (Spark 3.0 with multiple workers, without Hadoop), we
encountered a Spark interpreter exception caused by an I/O File Not Found
exception due to the non-existence of the /tmp/spark-events directory.
We had to