This directory should only be needed if the spark.eventLog.enabled
property is set to true. Is it possible the job configuration differs
between your two environments?
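
For what it's worth, the directory comes from the spark.eventLog.dir
property, which defaults to file:///tmp/spark-events when event logging
is enabled, and Spark expects that directory to already exist. If you do
want event logs, you can point the property at a pre-created location in
spark-defaults.conf; a minimal sketch (the path below is just an example,
adjust it for your cluster):

    # spark-defaults.conf (sketch; /var/log/spark-events is an example
    # path that must already exist and be writable)
    spark.eventLog.enabled   true
    spark.eventLog.dir       file:///var/log/spark-events

If you don't need the history server, leaving spark.eventLog.enabled at
its default of false avoids the directory requirement entirely.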

On Mon, Jun 29, 2020 at 9:21 AM ArtemisDev <arte...@dtechspace.com> wrote:

> While launching a spark job from Zeppelin against a standalone spark
> cluster (Spark 3.0 with multiple workers without hadoop), we have
> encountered a Spark interpreter exception caused by an I/O File Not Found
> exception due to the non-existence of the /tmp/spark-events directory.
> We had to create the /tmp/spark-events directory manually in order to
> resolve the problem.
>
> For reference, the same notebook code ran on Spark 2.4.6 (also a
> standalone cluster) without any problems.
>
> What is /tmp/spark-events for, and is there any way to pre-define this
> directory via a config parameter so we don't end up manually adding it
> in /tmp?
>
> Thanks!
>
> -- ND
>
>
